Advice on Duplicate Page Content
-
We have many pages on our website, all built on the same template (we use a CMS); at the code level they are about 90% identical. However, the page content, title, meta description, and image are different for each of them.
For example -
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages.
Does Google look at them all as duplicate page content? If yes, how do we deal with this?
-
EGOL, Everett,
Thank you both for your very useful suggestions. Sounds like we should do something similar with our PDF documents to present them as the actual/canonical content on the page. And we'll look at our CMS to see how we might implement the unlinked page name in the breadcrumb. We have already done some work adding structured data with schemas (including aggregate ratings), so that is hopefully yielding results already.
However, after an encouraging traffic spike that seemed to indicate we were on the right track, we saw a very worrisome dip last month, which led to a lot of worried hand-wringing about Panda.
So these suggestions are very helpful; thanks again, and we'll try them out!
-
Thank you, Everett,
Nice to see you posting in Q&A.
Look forward to seeing you regularly.
-
Hello Sudhir,
Those two pages would not be seen as duplicates. Google is very capable of separating the template from the content.
On a side note, you should look into getting the name of the page/game into the breadcrumb, though it doesn't have to be linked like the previous two pages in the path. For example:
You are here: Home --> Common --> Find Easter Eggs
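If the CMS makes the visible breadcrumb hard to change, the same trail can also be expressed with Schema.org BreadcrumbList markup. A minimal sketch, assuming the example path above (the trailing ListItem for the current page can omit "item" since it isn't linked):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "http://www.jumpstart.com/" },
    { "@type": "ListItem", "position": 2, "name": "Common",
      "item": "http://www.jumpstart.com/common" },
    { "@type": "ListItem", "position": 3, "name": "Find Easter Eggs" }
  ]
}
</script>
```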
Allowing visitors to review and rate the games would provide useful, keyword-rich, natural content on an otherwise content-sparse page. Once reviews/ratings are implemented you could also use Schema.org markup to enhance your search engine results by showing star ratings next to each game.
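Once ratings are being collected, the markup could look something like this sketch. The game name is taken from the example page above, but the rating figures here are invented placeholders, not real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Game",
  "name": "Find Easter Eggs",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.2",
    "ratingCount": "87"
  }
}
</script>
```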
Good luck!
-
Google knows how to separate the template of the site from the content. So you have nothing to worry about if most of the code on your pages is the same code that is used on every other page.
I looked at your two sample pages and saw a few things that would concern me...
This page had very little content. If you have lots of pages with such a tiny amount of content you could have Panda problems.
http://www.jumpstart.com/common/find-easter-eggs
You also have pages like this:
http://www.jumpstart.com/common/recognize-the-rs-view
These have very little content.
I have a site with lots of printable content that is mainly images placed in .pdf documents, to control the scale of the printing and the look of the printed page. The pages used to present them to visitors and the PDF documents were all thin content, and my site had a Panda problem. That caused the rankings of every page on the site to fall and really damaged my traffic. I solved it by noindexing the HTML pages and applying rel=canonical to the PDF files using .htaccess.
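For reference, here is a minimal sketch of that .htaccess approach. The file names and domain are hypothetical placeholders, and it assumes Apache with the mod_headers module enabled; it is not my exact configuration:

```apache
# Hypothetical example: file names and domain are placeholders.
# Requires Apache's mod_headers module.

# Mark the thin HTML presentation page as noindex via an HTTP header:
<Files "easter-coloring-page.html">
  Header set X-Robots-Tag "noindex"
</Files>

# Send a rel=canonical HTTP header for the PDF, naming the PDF itself
# as the canonical version of the content (PDFs cannot carry a <link>
# tag, so the header is the only way to set a canonical on them):
<Files "easter-coloring-page.pdf">
  Header set Link '<http://www.example.com/printables/easter-coloring-page.pdf>; rel="canonical"'
</Files>
```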
I can't say whether this will happen to you, but I would be uncomfortable if I had a site with so little content on its pages.