Canonical questions
-
Hi,
We are working on a site that sells lots of variations of a certain type of product (car accessories).
So let's say there are 5 products, but each product will need a page for each car model, so we will potentially have a lot of variations/pages. As there are a lot of car models, these pages will have pretty much the same content apart from the heading and model details. So the structure will be something like this:
Product 1 (landing page)
- Audi (model selection page)
---Audi A1 (Model detail page)
---Audi A2 (Model detail page)
---Audi A3 (Model detail page)
- BMW (model selection page)
---BMW 1 Series (Model detail page)
---BMW 3 Series (Model detail page)
Product 2 (landing page)
- Audi (model selection page)
---Audi A1 (Model detail page)
---Audi A2 (Model detail page)
---Audi A3 (Model detail page)
- BMW (model selection page)
etc
etc
The structure is like this as we will be targeting each landing page for AdWords campaigns.
As all of these pages could look very similar to search engines, will simply setting up each one with a canonical tag be enough? Is there anything else we should do to ensure Google doesn't penalise us for duplicate content?
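For context on what that would look like in markup, a canonical is a single link element in each page's head. The URLs below are placeholders for whatever structure you end up with, and pointing every model-detail page at its product landing page is just one possible setup, not the only one:

```html
<!-- On a model detail page, e.g. /product-1/audi/a3/ (hypothetical URL), -->
<!-- declaring the product landing page as the canonical version: -->
<link rel="canonical" href="https://www.example.com/product-1/" />
```

One trade-off worth noting: canonicalised pages are consolidated into the target, so model-detail pages set up this way would generally not rank on their own.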
Any thoughts or suggestions most welcome.
Thanks!
-
No problem. Do share screenshots of the product pages and the URLs (once available) here, and I'll be able to help you with this. A fix using canonical tags or meta robots is generally not time-consuming to implement, so it can be done at the last moment (before going live) as well. This can be parked for now.
-
Thanks for that,
It could be an issue creating 'unique' content for every page, as there will potentially be A LOT of pages (one for each major car make and model), but you might be right. I'll have a think and a chat with the dev team.
Thanks again.
-
Hi David,
Thanks for sharing a couple of instances to help me understand the point here. Well, I don't think there is any need to block these pages from indexing. You're concerned just because you don't have much content to show on these pages and the template is similar, so Google might consider them duplicate pages, right?
To resolve this issue, and also to make these pages stronger from an organic visibility perspective, you would need to add on-page content and other "cool" features to make them powerful anyway. But blocking them from bots won't be a good solution, I believe.
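To be clear about the option being advised against here: "blocking them from bots" would usually mean a meta robots noindex tag like the sketch below, which removes the page from Google's index entirely (it goes in the page's head; shown only to illustrate the alternative):

```html
<!-- Removes the page from the index; links on it can still be followed. -->
<meta name="robots" content="noindex, follow">
```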
By the way, if sharing the URLs of the pages is not possible as the site is in the development phase, could you please share screenshots of the pages here? I would be able to comment on how this should be handled after having a look.
-
Thanks Nitin,
The site is in development so unfortunately I can't share a URL, but I found a site that is not a million miles away from what we are doing; see below. My concern is that the bulk of the content on each page will be the same. Each page will be structured something like this:
Page Title
Car model detail (let's say Audi A4)
Generic product information for 4 product types:
Product 1
Product 2
etc
Something like this:
http://www.carscovers.co.uk/AUDI-A4-ALLROAD-CAR-COVER-2008-ONWARDS.html
http://www.carscovers.co.uk/BMW-1-SERIES-COUPE-CABRIOLET-CAR-COVER-2004-ONWARDS.html
As you will see, the main content on each of the pages above is the same (because the actual product is the same).
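If the model pages need to stay indexable, one option (an illustration, not a definitive recommendation) is a self-referencing canonical on each page combined with a unique title and heading, so each page declares itself the preferred version while differentiating on the model-specific elements:

```html
<!-- Hypothetical head section for the Audi A4 page linked above -->
<head>
  <title>Audi A4 Allroad Car Cover (2008 Onwards)</title>
  <link rel="canonical" href="http://www.carscovers.co.uk/AUDI-A4-ALLROAD-CAR-COVER-2008-ONWARDS.html" />
</head>
```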
Does this help describe the potential issue?
-
Hi David,
If the heading and other details are different on these pages, why would you want to set a canonical or block these pages from indexing? I don't believe they would be candidates for a duplicate content penalty.
Could you please share some sample URLs to help me understand the issue you're describing? I'll try my best to guide you on handling this neatly from an SEO perspective.
-
You're welcome! Enjoy the rest of your day.
-
That does help, not sure how I missed that. Thanks Benjamin.
-
Hi David,
You might find this helpful: https://mza.seotoolninja.com/learn/seo/duplicate-content
Other than that, someone else here may be able to answer your question in more detail.