Duplicate Content from WordPress Category Base?
-
I recently changed my category base in WordPress, and instead of redirecting or deleting the old base, WordPress kept the old pages live. So I now have duplicate content at two different URLs: one on the old category base, one on the new.
How should I handle this situation? The site is only a couple weeks old, if that makes any difference.
-
For a few links I have set up redirects to the other post, but they still show up as warnings with the same 404 error.
-
I wonder why WordPress doesn't handle this automatically. It automatically 301s posts when you change their URL, so why not a category base?
-
I would put in a 301 redirect from the old category URL to the new category URL.
You can learn more about redirection with this Moz guide, but what the 301 will effectively do is stop the old version of each page from being indexed alongside the new one, resolving the duplicate content issue.
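If the site runs on Apache, the redirect can go in the site's .htaccess file. A minimal sketch, assuming the old base was "category" and the new base is "topics" (both are hypothetical placeholders; substitute your actual bases):

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# 301 anything under the old category base to the same path under the new one
RewriteRule ^category/(.*)$ /topics/$1 [R=301,L]
</IfModule>
```

Alternatively, the Redirection plugin for WordPress can manage 301s without editing .htaccess directly.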
Related Questions
-
Duplicate content? - Ecommerce reviews loading the same products on every page
Hello there! I use a plugin on my ecom site that shows customer reviews - not product reviews but general shopping experience reviews. The plugin also loads links and short descriptions of products those customers bought. Having installed it site-wide, on every page there are short descriptions of the same products. Of course, as people leave new reviews the content changes (but it doesn't happen very often). So the question is: Is having links and short descriptions of the same products on every page harmful for SEO in this case? I'd be grateful for any insight into this matter.
On-Page Optimization | thpchlk
-
Duplicate Content - Pricing Plan tables
Hey guys, we're faced with a problem that we want to solve. We're designing a few pages for a drag & drop email builder we're currently building, and we will have the same pricing table on several of them (much like Moz does). We're worried that Google will take this as duplicate content and not be very fond of it. Any ideas on how we could keep the same flow without potentially harming ranking efforts? And NO, rewriting the content for each table is not an option. It would do nothing but confuse the heck out of our clients. 😄 Thanks everybody!
On-Page Optimization | andy.bigbangthemes
-
Duplicate Page Titles
It seems as though we are being flagged for duplicate page titles when really they are slightly different. Is it better to remove "dart board" or "dart board backboard" from all the product titles? We were doing this for optimal SEO, to rank for the search "dart board", but is it really hurting us? For example, our product titles are: Obama dart board backboard, Texas dart board backboard, Oklahoma dart board backboard, etc. Yet they are being flagged as duplicate titles.
On-Page Optimization | DartsDecor
-
Duplicate Issue
Hello Mozzers! We have a client going through a website revamp. The client is The Michelangelo Hotel, and they are part of Star Hotels. Star Hotels plans to create a section on their site for The Michelangelo, as opposed to maintaining a standalone site. They will then take the michelangelohotel.com domain and point it to the corresponding pages on the Star site. The guest will key in www.michelangelohotel.com and will see the same content that can be found at www.starhotels.com/en/michelangelo-hotel-new-york.

The problem we have is this: essentially the same content will be indexed twice, once on starhotels.com and once on michelangelohotel.com. This would seem to cause a duplicate content issue. What are your thoughts?

Edit: I apologize, because I was not nearly clear enough here. The Star Hotels site will have 5 pages dedicated to The Michelangelo Hotel. The content will sit solely on that server as those 5 pages. Those 5 pages will each be indexed as 2 URLs:

www.michelangelohotel.com <-> www.starhotels.com/en/michelangelo/
www.michelangelohotel.com/accommodations <-> www.starhotels.com/en/michelangelo/accommodations

And so on. Thanks!
On-Page Optimization | FrankSweeney
-
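One common mitigation for the cross-domain setup described in the hotel question above is a cross-domain rel=canonical: each page on the vanity domain points at its counterpart on the main site, so only one version gets indexed. A sketch for the accommodations pair, using the URLs from the question:

```html
<!-- Placed in the <head> of www.michelangelohotel.com/accommodations -->
<link rel="canonical" href="https://www.starhotels.com/en/michelangelo/accommodations" />
```

If the client doesn't actually need both URLs to resolve with content, a domain-level 301 from michelangelohotel.com to the starhotels.com pages would be cleaner still.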
I'm looking to put a quite lengthy FAQ tab on product pages on an ecommerce site. Am I likely to have duplicate content issues?
On an ecommerce site we have unique content on the product pages (i.e. descriptions), as well as the usual delivery and returns tabs for customer convenience. From this we haven't had any duplicate content issues or warnings, which seems to be the case industry-wide. However, we're looking to add a more lengthy FAQs tab which is still highly relevant to the customer but contains a lot more text than the other tabs. The product descriptions are also relatively small. Do you think this will cause potential duplicate content issues or should it be treated the same as a delivery tab, for instance?
On-Page Optimization | creativemay
-
Index.php getting Duplicate page content.
I am quite new to SEO and have just got my first crawl results. All of my index.php pages are being returned as duplicate page content, e.g.:

blue-widgets/index.php duplicates blue-widgets/
green-widgets/large/index.php duplicates green-widgets/large/

How do I solve this issue?
On-Page Optimization | ivoryred
-
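A common Apache-level fix for the index.php duplication described above is to 301 any request ending in index.php to its directory form, so only one version of each URL remains crawlable. A sketch, assuming the site runs on Apache with mod_rewrite available:

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# Match the original client request (not an internal rewrite) for .../index.php
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.php[?\ ] [NC]
# 301 /blue-widgets/index.php to /blue-widgets/ (and /index.php to /)
RewriteRule ^(.*/)?index\.php$ /%1 [R=301,L]
</IfModule>
```

The THE_REQUEST condition prevents a redirect loop when Apache internally serves index.php as the directory index.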
Locating Duplicate Pages
Hi,

Our website consists of approximately 15,000 pages, yet according to our Google Webmaster Tools account Google has around 26,000 pages for us in its index. I have run the site through half a dozen sitemap generators and they all discover only the 15,000 pages that we know about. I have also gone through the site thoroughly, without success, to find any sections where we might be inadvertently generating duplicate pages.

It has been over six months since we made any structural changes (at which point we put in 301s to the new locations), so I'd like to think that the majority of the old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.

I'm fairly certain it's nothing to worry about, but for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, but that only returns the first 1,000 results, which all check out fine.

I was wondering if anybody knew of any methods or tools we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't fallen out of the index yet and that they're not going to cause us a problem?

Thanks guys!
On-Page Optimization | ChrisHolgate
-
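One way to chase down the extra 11,000 URLs in the question above is to collect whatever URL lists you can get (batched site: results, server log URLs, referral reports) and diff them against the sitemap programmatically. A minimal sketch in Python; the example URLs are placeholders:

```python
import xml.etree.ElementTree as ET

# Sitemap protocol namespace used by standard sitemap.xml files
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> entry from a sitemap XML string."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def extra_urls(found_urls, sitemap):
    """URLs discovered elsewhere (logs, site: exports) that are missing from the sitemap."""
    return sorted(set(found_urls) - set(sitemap))
```

Feeding extra_urls the URLs scraped from batched site: queries (e.g. site:domain.com/section/) would surface any indexed pages the sitemap generators never found.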
Filtered Navigation, Duplicate content issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent search engines from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution.

For example, you have a page that lists 12 products out of 100: companyname.com/productcategory/page1.htm. And then you filter those products: companyname.com/productcategory/filters/page1.htm. The filtered page may or may not contain items from the original page, but it does contain items that appear on the unfiltered navigation pages.

How do you help the search engine determine which of these pages it should crawl and index? I can't use rel=canonical, because the exact set of products on a filtered page may not appear on any unfiltered page. What about robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about a meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure sacrificing usability is worth it in order to remove duplicate content.

I've read a bunch of blogs and articles, and seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
On-Page Optimization | 13375auc3
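For the faceted-navigation question above, one frequently recommended pattern is to let the filtered pages be crawled but keep them out of the index with a robots meta tag. Using noindex, follow (rather than a robots.txt block) lets link equity continue to flow through the filtered pages while removing the duplicates from the index. A sketch, using the URL pattern from the question:

```html
<!-- In the <head> of filtered URLs such as /productcategory/filters/page1.htm -->
<meta name="robots" content="noindex, follow">
```

By contrast, blocking the filtered paths in robots.txt stops crawling entirely, which also stops PageRank from passing through those pages, the exact concern raised in the question.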