50+ duplicate content pages - Do we remove them all or 301?
-
We are working on a site that has 50+ pages that all have duplicate content (1 for each state, pretty much).
Should we 301 all 50 of the URLs to one URL or should we just completely get rid of all the pages?
Are there any steps to take when completely removing pages? (submit an updated sitemap to Google Webmaster Tools, etc.)
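For context, I assume the consolidation option would be a single blanket rule like this in .htaccess (a hypothetical sketch: Apache is assumed, and the /states/ and /service-areas/ paths are made up, not the site's real structure):

# Send every state page to one consolidated URL with a 301
# (assumes Apache with mod_alias; both paths are placeholders)
RedirectMatch 301 ^/states/ /service-areas/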
thanks!
-
I would keep them on the same URLs and explain the duplicate content risk to the client. Then the client can decide to upgrade them himself, have you source new content, or allow them to run "as is".
I know of a lot of sites that have pages like this that are working well and make nice money.
-
They have gotten some OK traffic. They're being clicked on, but the content is exactly the same on each page.
There are no links coming in to any of them, however. The URLs themselves across the site are pretty weird, so I will be doing a complete restructuring of those as well.
Based on the traffic they've gotten, I agree it would be good to do unique content for each state, but that obviously couldn't be completed for quite some time.
With all that in mind, what should we do?
-
"We are working on a site that has 50+ pages that all have duplicate content (1 for each state, pretty much)."
There are a lot of sites that make a lot of money by having one page for each state (or county, or country, or city, or whatever). Although it is labor-intensive to develop content for fifty different pages, it can be done eloquently.
There are also a lot of successful sites that get away with fifty pages of cookie-cutter content (replace Alabama with Alaska, etc. on every page).
I would look at what kind of traffic the site is currently getting on those pages, how many sales they are producing, what visitors are doing on them, and whether they have any links.
These pages could be highly successful, and if that is the case, I would work to improve them instead of removing them. And if they have any inbound links at present, I might improve them on the same URLs rather than removing them and rebuilding on new URLs.
Related Questions
-
Duplicate Content from WordPress Template
Hi, wondering if anyone can help: my site has been flagged for duplicate content on almost every page. I think this is because the person who set up the site created a lot of template pages which use the same code but have slightly different features. How would I go about resolving this? Would I need to recode every template page they have created?
Technical SEO | Alix_SEO
-
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages that come up with a word count of zero in the Screaming Frog report. I need to know whether search engines will consider them 'thin' content. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz
-
Search/Search Results Page & Duplicate Content
If you have a page whose only purpose is to allow searches, and the search results can be generated by any keyword entered, should all of those search result URLs be noindexed or given a rel=canonical? Thanks.
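For what it's worth, a minimal sketch of the noindex option at the server level, assuming Apache 2.4 with mod_headers and a hypothetical q= search parameter:

# Send noindex, follow on any URL carrying the search query parameter
# (the q= parameter name is a placeholder for the site's real one)
<If "%{QUERY_STRING} =~ /(^|&)q=/">
    Header set X-Robots-Tag "noindex, follow"
</If>

This keeps the result pages crawlable while keeping them out of the index.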
Technical SEO | cakelady
-
Noindex, follow duplicate pages
I have a series of websites that all feature a library of the same content. These pages don't make up the majority of the sites' content, maybe 10-15% of the total pages. Most of our clients won't take the time to rewrite the content, but it's valuable to their sites. So I decided to noindex, follow all of those pages. Outside of convincing them all to write their own versions of the content, is this the best method? I could also block the pages with robots.txt, but then I couldn't pass any link juice through the pages. Any thoughts?
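A minimal sketch of how that noindex, follow could be applied in one place instead of in every page template, assuming Apache with mod_headers and a made-up /library/ path (this would go in the vhost config):

# Send noindex, follow on everything under the shared content library
# (the /library/ path is a placeholder for wherever the duplicated pages live)
<LocationMatch "^/library/">
    Header set X-Robots-Tag "noindex, follow"
</LocationMatch>

Unlike a robots.txt block, the pages can still be crawled, so their links continue to pass equity.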
Technical SEO | vforvinnie
-
Is there an easier way from the server to prevent duplicate page content?
I know that using either a 301 or a 302 will fix the problem of duplicate page content. My question would be: is there an easier way of preventing duplicate page content when it's an issue with the URL? For example: http://example.com vs. http://www.example.com. My guess would be, like it says here, that it's a settings issue with the server. If anyone has some pointers on how to prevent this from occurring, it would be greatly appreciated.
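The usual server-side fix is a hostname rewrite that 301s one version to the other. A minimal sketch, assuming Apache with mod_rewrite in the root .htaccess (substitute the real domain):

# Force the www hostname so only one version of each URL resolves
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]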
Technical SEO | brianhughes
-
Duplicate page content issue needs resolution.
After my last crawl report, I received a warning about "duplicate page content". One page was http://anycompany.com and the other was http://anycompany.com/home.html. How do I correct this so these pages aren't competing with each other, or is this even a problem?
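A minimal sketch of the usual fix, assuming Apache with mod_rewrite; matching against THE_REQUEST avoids a redirect loop if home.html is also the site's DirectoryIndex:

# 301 explicit browser requests for /home.html to the root URL
# (the internal DirectoryIndex rewrite won't match THE_REQUEST, so no loop)
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/home\.html[\s?] [NC]
RewriteRule ^home\.html$ http://anycompany.com/ [R=301,L]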
Technical SEO | JamesSagerser
-
We are still seeing duplicate content on SEOmoz even though we have marked those pages as "noindex, follow." Any ideas why?
We have many pages on our website that have been set to "noindex, follow." However, SEOmoz is still flagging them as duplicate content. Why is that?
Technical SEO | cmaseattle
-
Solution for duplicate content not working
I'm getting a duplicate content error for http://www.website.com and http://www.website.com/default.htm. I searched the Q&A for a solution and found: access the .htaccess file and add this line: redirect 301 /default.htm http://www.website.com. I added the redirect to my .htaccess and then got the following error from Chrome when trying to access the http://www.website.com/default.htm page: "This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer." "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects." How can I correct this? Thanks
Technical SEO | Joeuspe
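The loop usually means default.htm is also the site's DirectoryIndex: the server internally rewrites / to /default.htm, the redirect fires again, and the cycle repeats. A minimal sketch of a loop-safe alternative, assuming Apache with mod_rewrite:

# Only redirect when the browser explicitly requested /default.htm,
# not when the server internally serves it as the directory index
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/default\.htm[\s?] [NC]
RewriteRule ^default\.htm$ http://www.website.com/ [R=301,L]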