How To Handle Duplicate Content Regarding A Corp With Multiple Sites and Locations?
-
I have a client that has 800 locations; 50 of them are mine. The corporation provides a standard website for each location, and the only thing that differs is the location info on each page. The majority of the content is identical across every location's website.
What can be done to minimize the impact/penalty of having "duplicate or near-duplicate" content on their sites, assuming corporate won't allow the pages to be altered?
-
Cool, keep me posted.
-
Thanks again and let me know if you have any other ideas... I will keep you posted on what happens...
-
Thank you for the info... Some of that is similar to what I was thinking. I feel that corporate is pretty stiff but I will have to try to make a case with them.
-
It is time to speak to the corporation and tell them to get some professional advice. If they have 800 sites that are all virtually identical, then at best most of them will fail to get indexed, and at worst they could face some kind of penalty or de-listing across the entire network (are these sites linked together?).
What can you do?
Well, if it were me, I would want to do one of the following:
- Substantially rewrite the content across all of the sites and noindex any pages that can't be rewritten
- Not use these corporate sites and build my own. If we can't have the corporate site taken down, at least noindex it.
- Noindex the duplicate pages and build your own unique content on the site to attract traffic
- Create one unique, global site that lists all locations in a directory-style setup - maybe even giving each location its own portal or subdomain with a blog and location-specific content.
- Create unique social profiles on Facebook / Google Plus and try to make those the main target for search users
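To make the noindex options above concrete, here is a minimal sketch (the comment and any page names are illustrative, not from your actual sites). A meta robots tag in each duplicate page's head keeps it out of the index while still letting crawlers follow its links:

```html
<!-- Place in the <head> of each duplicate location page.
     "noindex" keeps the page out of Google's index;
     "follow" still lets crawlers pass through its links. -->
<meta name="robots" content="noindex, follow">
```

If corporate won't let you touch the HTML but you control the server, the same directive can usually be sent as an `X-Robots-Tag: noindex, follow` HTTP header instead.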
As ever, without a link I have to add the caveat that it's tough to give specific, laser-targeted advice for sites we have not seen, but certainly this is problematic and will undermine any SEO efforts you make on these sites.
In a nutshell - you need to resolve this, or give up on these sites as a way of generating search traffic.
Hope that helps!
Marcus -
Well, similar content certainly gets penalized; more than 40-50% similarity is even worse. A robots.txt disallow will stop Google from crawling the pages, but note it does not reliably keep them out of the index - blocked URLs can still be indexed if other pages link to them. To keep a page out of the index, it needs a noindex directive that Google can actually crawl and see.
The content has to be changed. Contact-us and about-us pages are the same on many websites, which seems to be OK with Google - it can't be avoided, and Google understands that. But having the home page and other important pages be similar too is not good. It will be good to see what others have to say here.
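To sketch the robots.txt point above (the path is hypothetical): a robots.txt disallow only blocks crawling, not indexing, so it is the wrong tool if the goal is to keep duplicate pages out of the index:

```
# robots.txt - blocks CRAWLING of these paths, but the URLs can
# still appear in Google's index if other sites link to them:
User-agent: *
Disallow: /duplicate-location-pages/

# To keep pages OUT of the index instead, leave them crawlable and
# add a meta robots "noindex" tag to each page. Do not also block
# them here, or Google will never see that tag.
```

This is why the noindex approach requires the duplicate pages to remain crawlable.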