Implementation advice on fighting international duplicate content
-
Hi All,
Let me start by explaining that I am aware of the rel="canonical" and **rel="alternate" hreflang="x"** tags but I need advice on implementation.
The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with a .asp suffix
The sites are quite big so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know the tag should be applied on a page level not site level).
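For context, the page-level markup we have in place on the four sites we manage looks something like this (the URLs here are placeholders, not our actual domains) — each page's `<head>` references every variant of that page, including itself:

```html
<head>
  <!-- Example only: every language/region variant of this page,
       including the page itself, gets a link element. -->
  <link rel="alternate" hreflang="en-gb" href="https://example.co.uk/widgets" />
  <link rel="alternate" hreflang="en-us" href="https://example.com/widgets.html" />
  <link rel="alternate" hreflang="zh-cn" href="https://example.cn/widgets.asp" />
</head>
```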
4 out of the 5 sites are managed by us and already have the tag implemented, which makes things easier, but the 5th is managed in Asia and we fear the amount of manual work required will put them off implementing it. The site is due to launch at the end of the month, and we need to sort this issue out before it goes live so that we are not penalised for duplicate content.
Is there an easy way to go about this, or is manual addition the only option?
Has anyone had a similar experience?
Your advice will be greatly appreciated.
Many thanks,
Emeka.
-
Unfortunately yes, the process needs to be rerun with the tool each time new pages are added.
-
Thanks Gianluca,
Have you had experience using the tool above? Presumably each time a new page is added to the site the tool would have to be run again?
I agree that an in-house solution will be best but given the time limit we are open to ideas.
I appreciate your response.
Emeka.
-
When it comes to massive sites and hreflang annotations, the ideal solution is implementing hreflang using the sitemap.xml method.
It is explained here by Google: https://support.google.com/webmasters/answer/2620865?hl=en.
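For reference, a sitemap entry using this method looks roughly like the following (placeholder URLs): each `<url>` element lists every language variant of that page as an `xhtml:link` alternate, including the page itself, and this is repeated for every variant:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.co.uk/widgets</loc>
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://example.co.uk/widgets"/>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://example.com/widgets.html"/>
    <xhtml:link rel="alternate" hreflang="zh-cn"
                href="https://example.cn/widgets.asp"/>
  </url>
  <!-- …and a matching <url> block for each of the other variants -->
</urlset>
```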
A tool that makes it easier to implement hreflang in a sitemap file is the one The Mediaflow created:
http://www.themediaflow.com/tool_hreflang.php.
Right now, that is the only tool I know of for that kind of task, so you could also consider creating an in-house solution, if you have internal developers who can be dedicated to this.
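If you do go in-house, the generator itself is small. Here is a minimal sketch: it assumes you can export, from each site's CMS, a mapping of which URL on each site corresponds to which page (the domains, hreflang codes, and `PAGES` structure below are illustrative assumptions, not your real sites):

```python
# Sketch of an in-house hreflang sitemap generator.
# Assumption: each entry in PAGES maps hreflang codes to the URL of
# that page variant on each of the five sites (example domains only).
from xml.sax.saxutils import escape

PAGES = [
    {
        "en-gb": "https://example.co.uk/widgets",
        "en-us": "https://example.com/widgets.html",
        "zh-cn": "https://example.cn/widgets.asp",
    },
]

def build_sitemap(pages):
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
        '        xmlns:xhtml="http://www.w3.org/1999/xhtml">',
    ]
    for variants in pages:
        # Every variant gets its own <url> block, and each block must
        # list ALL variants (including itself) as alternates.
        for loc in variants.values():
            lines.append("  <url>")
            lines.append(f"    <loc>{escape(loc)}</loc>")
            for lang, href in variants.items():
                lines.append(
                    f'    <xhtml:link rel="alternate" hreflang="{lang}" '
                    f'href="{escape(href)}"/>'
                )
            lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap(PAGES))
```

The advantage over a one-off tool is that this can run as a scheduled job against the CMS export, so new pages are picked up automatically instead of requiring a manual rerun.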