New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the city being targeted. So far the site has not been affected by the Penguin or Panda updates, and the client wants to keep the pages because they bring in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short-term to avoid losing it in the long-term. Again, I think the long-term risk is serious (and it's not that easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400 page site or a 40,000 page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
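That triage step can be sketched in a few lines, assuming you can export page-level sessions from analytics. This is only an illustration; the page paths and the 80% cutoff are made-up examples, not a prescription:

```python
def triage_city_pages(sessions_by_page, keep_share=0.8):
    """Split city pages into a 'keep and beef up' set that covers
    keep_share of total traffic, and a 'temporarily NOINDEX' set."""
    total = sum(sessions_by_page.values())
    # Rank pages by traffic, highest first
    ranked = sorted(sessions_by_page.items(), key=lambda kv: kv[1], reverse=True)
    keep, noindex, running = [], [], 0
    for page, sessions in ranked:
        if running < total * keep_share:
            keep.append(page)      # worth rewriting with unique content
            running += sessions
        else:
            noindex.append(page)   # temporarily NOINDEX until rebuilt
    return keep, noindex
```

The pages in the `keep` list are the ones to invest unique content in first; the rest get a NOINDEX until they can be rebuilt.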
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do your analytics show city-specific search terms landing on those city-specific pages, or on the home page (or the canonical version of the duplicate content page)?
If it is the latter, then you should certainly work those city-specific keyword terms into the single version of the content in some creative fashion.
Regardless, you should still remove the duplicate content, preferably sooner rather than later, because they are certainly low-value pages!
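If you do retire duplicates in favor of a single canonical page, a 301 redirect from each removed city URL preserves whatever links and traffic those pages had. A minimal sketch that emits Apache-style redirect rules (the paths here are hypothetical examples, not from the client's site):

```python
def redirect_rules(retired_paths, canonical_path):
    """Build 301 redirect lines (Apache mod_alias style) that map each
    retired city page to the single canonical version."""
    return [f"Redirect 301 {path} {canonical_path}" for path in retired_paths]
```

The resulting lines can be pasted into an .htaccess file or translated into whatever redirect mechanism the site's server uses.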
-
I agree with Ricky - I would slowly make all of those pages unique in some way. I still find it beneficial to rank separate city pages, as long as they have strong content. Otherwise, Google will eventually sift through them and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply have 1 page ranking as the canonical page. If they are seeing organic traffic and rankings for multiple pages, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.