International SEO and duplicate content: what should I do when hreflangs are not enough?
-
Hi,
A follow-up question from another one I asked a couple of months ago:
My hreflangs have now been in place for almost 2 months. Google recognises them well and GSC is clean (no hreflang errors).
Though I've seen some positive changes, I'm still quite far from solving the duplicate content issue completely, and some entire sub-folders remain hidden from the SERPs.
I believe this happens for two reasons:
1. Fully mirrored content - as per the link to my previous question above, some parts of the site I'm working on are 100% identical. There's quite a "gravity issue" here, as there is nothing I can do to fix the site architecture or to get bespoke content in place.
2. Sub-folder "authority" - I'm guessing that Google prefers some sub-folders over others due to their legacy traffic/history. Meaning that even with hreflangs in place, the older sub-folder ranks over the right one because Google believes it provides better results to its users.
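For reference, a mirrored cluster like the ones described would carry hreflang annotations along these lines (site.com and the locale paths are placeholders, not the real site):

```html
<!-- On every page in the cluster, e.g. on site.com/be/fr/page:
     each locale version references all the others plus itself -->
<link rel="alternate" hreflang="fr-FR" href="https://site.com/fr/page" />
<link rel="alternate" hreflang="fr-BE" href="https://site.com/be/fr/page" />
<link rel="alternate" hreflang="nl-NL" href="https://site.com/nl/page" />
<link rel="alternate" hreflang="nl-BE" href="https://site.com/be/nl/page" />
<link rel="alternate" hreflang="x-default" href="https://site.com/page" />
```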
Two questions from these reasons:
1. Is the latter correct? Am I guessing correctly re sub-folder "authority" (if such a thing exists), or am I simply wrong?
2. Can I solve this using canonical tags?
Instead of trying to fix and "promote" the hidden sub-folders, I'm thinking of reinforcing the results I'm already getting from the stronger sub-folders.
I.e. if a user based in Belgium Googles something related to my site, the site.com/fr/ sub-folder shows up instead of the site.com/be/fr/ sub-sub-folder.
Or if someone based in Belgium is searching in Dutch, they get site.com/nl/ results instead of the site.com/be/nl/ sub-sub-folder.
Therefore, I could canonicalise /be/fr/ to /fr/ and do something similar for that second one.
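Mechanically, that would mean putting something like this in the head of every page under /be/fr/ (the URL is a placeholder):

```html
<!-- On site.com/be/fr/some-page: point the canonical at the /fr/ twin -->
<link rel="canonical" href="https://site.com/fr/some-page" />
<!-- Caveat: a cross-locale canonical asks Google to drop /be/fr/ from
     its index entirely, which sits in tension with any hreflang
     annotations on the same page -->
```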
I'd prefer traffic coming to the right part of the site for tracking and analytics reasons. However, instead of trying to move mountains by changing Google's behaviour (if I even could?), I'm thinking of encouraging the current flow (also because it's not completely wrong, as it brings traffic to pages featuring the correct language no matter what).
That second question is the main reason I'm asking for the Moz community's advice: am I going to damage the site badly by using canonical tags that way?
Thank you so much!
G -
Apologies for the delay coming back to you - Christmas didn't help.
And thanks for your answer; I will give this specific use of canonicals a shot, starting with small subsets of the site, and monitor the impact on my rankings first.
On top of its impact on the site, another question is whether it's worth the effort at all.
But I guess I'll only find out by trying. -
1. Is the latter correct? Am I guessing correctly re "sub-folders" authority (if such thing exists) or am I simply wrong?
Your two points are both valid. I don't want to say "correct" as in that is the cause for sure, but in my experience the age of content does play a role in which duplicate Google picks.
2. Can I solve this using canonical tags?
Canonicals can go wrong with hreflang, but it isn't a bad idea if you get it right. However, you know your content and your users better than we do.
Another possible solution to help everything along is to detect the user's location and ASK (don't redirect on IP alone) whether they prefer to see that location's content. This will encourage the sharing of all of your content over time.
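A minimal server-side sketch of that "ask, don't redirect" idea might look like this (the country-to-folder mapping and function name are assumptions for illustration, and the IP-to-country lookup is left out):

```python
# Suggest a localised sub-folder via a dismissible banner instead of
# forcing a redirect based on IP alone.
SUGGESTIONS = {
    # (detected country, browser language) -> sub-folder to suggest
    ("BE", "fr"): "/be/fr/",
    ("BE", "nl"): "/be/nl/",
    ("FR", "fr"): "/fr/",
    ("NL", "nl"): "/nl/",
}

def banner_suggestion(country, browser_lang, current_path):
    """Return the folder to offer in a banner, or None if the visitor
    is already in the right place (or there is no match)."""
    target = SUGGESTIONS.get((country, browser_lang))
    if target is None or current_path.startswith(target):
        return None
    return target
```

The point is that the function only returns a suggestion; the requested page is still served as-is, so Googlebot (which mostly crawls from the US) sees the same content as every other visitor.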
But to be completely realistic, nothing is going to work perfectly if you are trying to geo-target without actual geo-targeted content. Sometimes you just need to tell the business owners who made this decision that opening a shop in another country and trying to act like a local business with zero changes to the content just isn't going to work for every business in every country.
-
Great, thanks for your reply!
How should I use canonical tags though?
I assume that blindly canonicalising parts of the site would be pretty silly.
As in, I've pulled analytics comparing the volume of page views for an entire sub-folder against the sub-folder it could be canonicalised to. E.g.:
site.com/fr/ gets 100k visits
site.com/be/fr/ gets 1k visits
Therefore /be/fr/ should be canonicalised, as it receives very low traffic (1% of /fr/).
site.com/de/ gets 100k visits
site.com/ch/de/ gets 50k visits
Therefore /ch/de/ should not be canonicalised, as it receives a fair bit of traffic (50% of /de/).
Or does it not matter, and should both sub-folders be canonicalised no matter what?
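As a sanity check on that heuristic, the threshold logic could be sketched like this (the 10% cutoff is purely an assumption for illustration, not an established rule):

```python
def should_canonicalise(local_visits, parent_visits, threshold=0.10):
    """Return True when the local sub-folder draws so little traffic
    relative to its parent that consolidating it via a canonical looks
    reasonable under the (assumed) threshold."""
    if parent_visits == 0:
        return False  # no baseline to compare against
    return local_visits / parent_visits < threshold

# The two cases from the post:
# /be/fr/ at 1k vs /fr/ at 100k  -> 1% of traffic  -> candidate
# /ch/de/ at 50k vs /de/ at 100k -> 50% of traffic -> keep it
```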
-
Hi - Pages have authority, and this forms part of the domain's authority. And yes, use canonical tags to avoid being penalised for duplicate content.