Duplicate content & canonicals
-
Hi,
Working on a website for a company that operates in several European countries.
The setup is like this:
www.website.eu/nl
www.website.eu/be
www.website.eu/fr
...You see that every country has its own subdirectory, but NL & BE share the same language, Dutch...
The copywriter wrote some unique content for NL and for BE, but it isn't possible to write unique copy for every product detail page, because those pages contain pretty technical stuff.
Now we want to add canonical tags to those identical product pages. Do we point the canonical on the /be products to the /nl products, or vice versa?
Other question regarding SEOmoz: if we add canonical tags to X pages, do they still appear in the Crawl Errors as "duplicate page content", or do we have to do our own math and just take "duplicate page content" minus "rel canonical"?
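For reference, the canonical tag being discussed is a single link element in the head of the duplicate page. If, say, the /nl version were chosen as canonical, the /be page would carry something like this (the product path here is invented purely for illustration):

```html
<!-- in the <head> of www.website.eu/be/products/widget-x (hypothetical URL) -->
<link rel="canonical" href="http://www.website.eu/nl/products/widget-x" />
```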
-
Hey Joris,
As of now our crawler will most likely see it as duplicate content, because technically it still is duplicate content to a crawler bot; it won't know your intentions or target audience for each subfolder. The only way to stop our crawler from seeing it as duplicate is to block rogerbot from that subfolder with robots.txt or meta robots. Then there is putting up rel canonicals, which is the best way.
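For illustration, blocking rogerbot from one subfolder in robots.txt would look like this (using /be/ as the example path):

```
User-agent: rogerbot
Disallow: /be/
```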
Hope this sheds some light on the duplicate content issues.
Best,
Nick
SEOmoz -
Thanks Robert!
-
Will do!
-
Now, that was a good question. Why not send a quick email to [email protected] and just ask if there is a way to circumvent it? Let me know, please.
-
Hi Robert,
Thx for your quick answer, I will make sure that in Google Webmaster Tools we say that the /be is for Belgium and the /nl for The Netherlands, but the duplicate content will still show up in our reports in SEOmoz, no?
-
First question: have you thought of using a ccTLD instead of the subdirectory? Rand speaks to the .fr issue in his WBF mentioned by iBiz Leverage.
As to using canonicals to avoid duplicate content: you shouldn't have a duplicate content issue, even with the shared language, so long as you set your country target for each subfolder. But read or watch the WBF by Rand, as it is full of info on this subject, domain authority, etc.
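Another option worth mentioning here: rel="alternate" hreflang annotations tell Google explicitly which regional audience each URL targets, even when the language is shared. A minimal sketch using the URL structure from the question (the product path is invented for illustration), placed in the head of both pages:

```html
<!-- on both www.website.eu/nl/product-x and www.website.eu/be/product-x -->
<link rel="alternate" hreflang="nl-NL" href="http://www.website.eu/nl/product-x" />
<link rel="alternate" hreflang="nl-BE" href="http://www.website.eu/be/product-x" />
```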
-
I have the same problem and found this URL: http://www.youtube.com/watch?v=Ets7nHOV1Yo
Here is also another link from SEOmoz; I think this one is the most helpful: http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Hope this helps.
Related Questions
-
Canonical tag on webstore products to avoid Duplicate Page Content?
Hi, I would like an opinion on how we are planning to solve the Duplicate Page Content issues that Moz Pro is showing us as a High Priority Issue. Mainly the problem is with products that have very few differences between them, e.g. pink bike model X and red bike model X. So we decided to implement a canonical tag on these products, and the pink bike model X will now have a canonical pointing to the red bike model X. Hopefully we will then rank higher with our red bike model X, and our pink bike model X will disappear from the index. Am I right? Is it good practice, given that we will lose the long-tail indexing? I checked each canonical in the Search Console, and we have extremely few searches for "pink bike model X"; most searches are for "bike model X". Thank you in advance for your opinion. Isabelle
Moz Pro | | isabelledylag0 -
How do I fix duplicate title issues?
I have a subdomain that isn't even on our own site, but it's resulting in a lot of duplicate content errors in Moz, as shown here: http://cl.ly/1R081v0K0e2N. Would this affect our ranking, or is it just errors within Moz? What measures could I take to make sure that Moz or Google doesn't associate our site with these errors? Would I have to noindex in the .htaccess file for the subdomain?
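On the noindex question: if you control the subdomain's server config, one option is sending an X-Robots-Tag header from .htaccess rather than editing every page. A minimal sketch, assuming Apache with mod_headers enabled (both assumptions worth checking for your host):

```apache
# .htaccess on the subdomain — ask crawlers not to index anything served here
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```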
Moz Pro | | MMAffiliate0 -
1500 Domains... Where to begin? & Web Structure Question.
So, as the title says, I am stuck. I recently have been brought on as the SEO guru for a small-to-mid-size company with the task of rebuilding their web presence. Their website is in pretty unfortunate condition. The more research I do, the farther down the rabbit hole of chaos I go. Essentially, the previous CEO was doing all the SEO work. He purchased 1,500 domains, all keyword-specific, installed WordPress on roughly 1,000, and then began pumping out content. Of those 1,000, roughly 300 have about 600-2,000 characters' worth of content that is absolute fluff. From there the linking began. Now, the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has their main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web. My advice is to cut those links ASAP and remove the previous work. At the same time, I also don't want them to lose rank. So I guess I am asking a whole slew of questions... Am I right in thinking that we have to build a bridge before we burn a bridge? Is it worth fixing up some of those other domains with original content to try and bolster what we already have? Would it be better to combine everything into one website, or try to have different domains represent different things? (For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof using different domains.) Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, and I am getting the structure of the entire link web. Lastly, any thoughts you all have would be greatly appreciated. I realistically have minimal experience in this realm. I am a major noob. I understand SEO in theory, sorta. So I'm getting there!
Moz Pro | | HashtagHustler0 -
Is it Possible to Subscribe to an SEOmoz Q&A Category via RSS?
I'm interested in tracking new questions in a specific category in SEOmoz's Q&A (the Reputation Management category), and I was thinking RSS would be the easiest way to do this. Is it possible? Or is there another way to get new questions asked in a category into your email or RSS reader?
Moz Pro | | brianspatterson0 -
Wrongly flagged duplicate page content
I found that some pages on my website are flagged as "duplicate page content" when they are not; the content is different on each page. I wonder why? Is it an issue with SEOmoz?
Moz Pro | | Amadeus_eBC0 -
Excel tips or tricks for duplicate content madness?
Dearest SEO Friends,
I'm working on a site that has over 2,400 instances of duplicate content (yikes!). I'm hoping somebody could offer some Excel tips or tricks for managing my SEOmoz crawl diagnostics summary data file in a meaningful way, because right now this spreadsheet is not really helpful. Here's a hypothetical situation to describe why. Say we had three columns of duplicate content, displayed like this:

Column A | Column B | Column C
URL A | URL B | URL C

In a perfect world, this is easy to understand: I want URL A to be the canonical. But unfortunately, the way my spreadsheet is populated, this ends up happening:

Column A | Column B | Column C
URL A | URL B | URL C
URL B | URL A | URL C
URL C | URL A | URL B

Essentially all of these URLs would end up being called a canonical, thus rendering the tag ineffective. On a site with few errors, this has never been a problem, because I can just spot-check my steps. But the site I'm working on has thousands of instances, making it really hard to identify or even scale these patterns accurately. This is particularly problematic as some of these URLs are identified as duplicates 50+ times, so my spreadsheet has well over 100K cells!!! Madness!!!

Obviously, I can't go through it manually. It would take me years to ensure the accuracy, and I'm assuming that's not really a scalable goal. Here's what I would love, but I'm not getting my hopes up. Does anyone know of a formulaic way that Excel could identify row matches and think: "oh! these are all the same rows of data, just mismatched. I'll kill off the duplicate rows, so only one truly unique row of data exists for this particular set"? Or some other workaround that could help me with my duplicate content madness?

Much appreciated, you Excel Gurus you!
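If Excel formulas don't pan out, a few lines of Python can collapse those reshuffled rows by treating each row as an unordered set of URLs. A minimal sketch (the row data below is made up to mirror the hypothetical example above):

```python
def dedupe_groups(rows):
    """Keep one row per duplicate-content group.

    Rows that list the same URLs in a different order are treated
    as the same group, and only the first one seen is kept.
    """
    seen = set()
    unique = []
    for row in rows:
        key = frozenset(row)  # order-insensitive fingerprint of the row
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [
    ["URL A", "URL B", "URL C"],
    ["URL B", "URL A", "URL C"],
    ["URL C", "URL A", "URL B"],
]
print(dedupe_groups(rows))  # only the first row survives
```

In a real run you would load the crawl export with the csv module instead of hard-coding the rows, then write the deduplicated rows back out.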
Moz Pro | | FMLLC0 -
How to resolve Duplicate Content crawl errors for Magento Login Page
I am using the Magento shopping cart, and 99% of my duplicate content errors come from the login page. The URL looks like: http://www.site.com/customer/account/login/referer/aHR0cDovL3d3dy5tbW1zcGVjaW9zYS5jb20vcmV2aWV3L3Byb2R1Y3QvbGlzdC9pZC8xOTYvY2F0ZWdvcnkvNC8jcmV2aWV3LWZvcm0%2C/ Or, the same url but with the long string different from the one above. This link is available at the top of every page in my site, but I have made sure to add "rel=nofollow" as an attribute to the link in every case (it is done easily by modifying the header links template). Is there something else I should be doing? Do I need to try to add canonical to the login page? If so, does anyone know how to do it using XML?
Moz Pro | | kdl01 -
A Simple(ish) Q&A Improvement
Hey Roger, In the posting etiquette it states that users should 'find out if someone has already posted your question before adding it' but we all know that is never really going to happen. So, we tend to see the same questions asked again and again when there are already top class answers in the Q&A. So, could you not implement some kind of suggested answer system? So, user posts a question, you analyse it, list potential answers that may provide the answer based on the content of the question itself. If nothing matches, the user can then go on to post the question but in many cases, they would get a top notch answer instantly. I have seen this on other sites and it would certainly cut down on the amount of duplicated questions and the suggested answers could all be hand picked to some extent (good answers, lots of thumbs up etc). So, the Q&A would have less duplication and users would get directed to the best possible answers for a given question in a shorter time frame. Good for those that do the answering, good for the people who have questions - everyone is happy. Just a thought, my tuppence, 5 cents, etc. 🙂 Marcus
Moz Pro | | Marcus_Miller2