Duplicate Content on Website with Multiple Locations
-
Hi there,
I've spent hours reading posts on duplicate content and googling this, but I'm still not sure what to do.
We created a site for a company with two different locations using two WP installs - the landing page is website.com, which links to WP install 1 (website.com/city1) and WP install 2 (website.com/city2). They specifically wanted two separate sites so each could be managed by staff at its own location. However, some of the pages have the same content - i.e. services, policies, etc. - so all of those are showing errors for duplicate content. All pages have different city-specific URLs and meta descriptions, but that clearly doesn't help.
We can't redirect the "duplicate" pages because that would take the user to the other city's site. Is there anything we can do? Is this going to significantly damage rankings?
Thanks kindly for any help you can provide.
-
Hi,
It doesn't redirect the user, no. It tells Google which URL you prefer to have indexed. Again, I don't believe this is the best option, as you want BOTH cities to rank. Does it affect rankings? Yes, because you are saying Page A is more valuable than the duplicated Page B, leaving Page B out in the wild as the less important page for ranking.
So, again, having two versions of the site (based on the cities) isn't beneficial; both should be under one domain (one WP installation) with a "locations" page added, to reduce the self-competition.
-
Thank you for your help, Tammy. I read through your links when you first replied and they helped a lot.
Pardon my lack of knowledge here, but I just want to make sure I understand correctly: if I go the rel=canonical route for the "duplicate" pages, it won't actually redirect the user, but will just tell Google where the "original" page lives and that it's duplicated on purpose, correct? Does that then hurt the rankings for the site that's not showing as the "original" content?
Thanks again for your help.
-
While not appealing, you should rewrite all the content to be 100% unique. For pages like the privacy policy, terms of service, etc., you can noindex them to reduce the duplication. Otherwise, your options are limited. I realize that the products/services will be similar in nature, but writing them in a different way reduces the significantly similar content.
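For the policy pages, noindexing comes down to one meta tag in each page's head (in WordPress this is usually set per-page through an SEO plugin rather than edited by hand). A minimal sketch, using a hypothetical privacy-policy URL:

```html
<!-- In the <head> of website.com/city1/privacy-policy (and the city2 copy) -->
<!-- Tells search engines not to index this page, while still following its links -->
<meta name="robots" content="noindex, follow">
```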
Alternatively, you can add a cross-domain canonical tag; this tells Google that the content is duplicated intentionally and points to the preferred URL.
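As a sketch, assuming you pick the city1 services page as the preferred version (the URLs here are placeholders based on your described structure), the city2 copy would carry a tag like this in its head:

```html
<!-- In the <head> of website.com/city2/services -->
<!-- Tells Google to treat the city1 version as the one to index -->
<link rel="canonical" href="https://website.com/city1/services">
```

Note this does not redirect visitors; it only consolidates indexing signals onto the canonical URL.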
Here are a few articles about that:
https://yoast.com/rel-canonical/
http://webmasters.stackexchange.com/questions/56326/canonical-urls-with-multiple-domains
Next, focus on building local links to the individual city pages to further differentiate the cities and the intent. Also, add schema.org "LocalBusiness" markup to each version of the URLs. And again, I will say this is not an ideal situation; the best-case scenario would be to put that content on ONE domain, just with different location pages in a subdirectory format.
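A minimal LocalBusiness sketch for one of the city pages, as JSON-LD in the page head (the business name, address, and phone number below are placeholders - swap in each location's real details):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company - City1",
  "url": "https://website.com/city1/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "City1",
    "addressRegion": "ST",
    "postalCode": "00000"
  }
}
</script>
```

Giving each city's pages its own accurate address and phone markup helps search engines tie each subdirectory to a distinct physical location.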