Base copy on 1 page, then adding a bit more for another page - potential duplicate content. What to do?
-
Hi all,
We're creating a section for a client that is based on road trips - for example, New York to Toronto. We have a 3-day trip, a 5-day trip, a 7-day trip and a 10-day trip. The 3-day trip is the base; for the 5-day trip we add another couple of stops, for the 7-day trip we add a couple more, and the 10-day trip might have two or three times the number of stops of the initial 3-day trip.
However, the base content is similar - you start in New York, you finish in Toronto, and you likely go through Niagara on every trip. It's not exact duplicate content, but it is similar content, and we're not sure how best to handle it.
The thoughts we have are:
1) Use canonical tags to point the 3-, 5- and 7-day trips to the 10-day trip.
2) It's not exactly duplicate content, so just go with the content as it is.
We don't want to get hit by any penalty for duplicate content, so we just want to work out what you guys think is the best way to go about this.
Thanks in advance!
-
I agree with Ryan, doing it all on one page with the different number of days/itineraries sectioned out is the way I'd go.
First, a page actually can rank well for more than one search term, especially when they are so closely related.
And second, what is the point of optimizing different pages for different numbers of days and then canonicalizing them to the ten day trip? The canonical indicates that only the ten-day page should be indexed, so who cares whether the shorter trip pages are optimized or not? They won't be findable in the SERPs.
Lastly, there is no penalty as such for duplicate content. Google just decides which page is the most useful to show and the others drop out of the index. If your pages are very similar, Google may well make the decision for you and drop some of them out of the index.
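For reference, option 1 from the original question would look something like this in the head of each shorter-trip page (the URLs here are invented for illustration):

```html
<!-- On the hypothetical 3-day trip page -->
<!-- Tells Google the 10-day page is the version to index, -->
<!-- so this 3-day page would drop out of the search results -->
<link rel="canonical" href="https://example.com/road-trips/new-york-to-toronto-10-day/" />
```

As the answer above notes, that concentrates all indexing on the 10-day page, which is precisely why separately optimising the shorter pages would be wasted effort.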
-
I see. In my experience that's too small a difference to justify creating multiple pages. I'd get better rankings by driving links, reviews, and engagement to the one page with multiple day-length purchase options. For content marketing to work well, each page needs to be more differentiated. Cheers!
-
Thanks Ryan!
It's more for content marketing than for selling anything - we're going to have multiple 3-day trips, multiple 5-day trips, etc. Our thought process was that with separate pages we could optimise around "new york to toronto 3 day trip", "new york to toronto 5 day trip" and so on.
Cheers!
-
To me this seems like something that would work better on one page, just from the potential of upsells and CRO. Plus you'd have more content and strength potential dedicated to the one page regardless of trip duration which would be better for search. I'd cluster something like this. Cheers!
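As a rough sketch of the one-page approach described above, with one section per duration (the headings and anchor ids are purely illustrative, not a prescription):

```html
<h1>New York to Toronto Road Trip Itineraries</h1>

<h2 id="itinerary-3-day">3-Day Itinerary</h2>
<!-- the base route: New York, Niagara, Toronto -->

<h2 id="itinerary-5-day">5-Day Itinerary</h2>
<!-- the base route plus a couple of extra stops -->

<h2 id="itinerary-7-day">7-Day Itinerary</h2>

<h2 id="itinerary-10-day">10-Day Itinerary</h2>
```

Each duration-specific heading can still match queries like "new york to toronto 5 day trip", while all links, reviews and engagement accrue to a single URL.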
Related Questions
-
How to avoid duplicate content
Hi there, Our client has an ecommerce website, and their products also show on an aggregator website (i.e. a comparison website where multiple vendors list their products). The aggregator website shows the same photos, titles and product descriptions. Now that we're building their new website, how can we avoid this duplicate content? Or does Google even care in this case? I have read that we could show more product information on their ecommerce website and less detail on the aggregator's website. But is there another or better solution? Many thanks in advance for any input!
Intermediate & Advanced SEO | Gabriele_Layoutweb
-
Duplicate content based on filters
Hi Community, There have probably been a few answers to this and I have more or less made up my mind about it, but I would like to pose the question, or ask that you post a link to the correct article for this, please. I have a travel site with multiple accommodations (for example); obviously there are many filters to help find exactly what you want - you can sort by region, city, rating, price, type of accommodation (hotel, guest house, etc.). This all leads to one inevitable conclusion: many of the results would be the same. My question is, how would you handle this? Via a rel canonical to the main categories (such as region or town), making the category page the indexed version, or by nofollowing all the sub-category pages, thereby not allowing any search to reach deeper in? Thanks for the time and effort.
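If the rel canonical route is chosen, each filtered URL would typically point back at the clean category page; a sketch with made-up URLs:

```html
<!-- On a filtered URL such as /accommodation?region=algarve&rating=4 (hypothetical) -->
<link rel="canonical" href="https://example.com/accommodation/algarve/" />
```

This keeps the filter combinations crawlable while telling Google that only the category page should be indexed.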
Intermediate & Advanced SEO | ProsperoDigital
-
Duplicated privacy policy pages
I work for a small web agency and I noticed that many of the sites we build have been using the same privacy policy. Obviously it can be a bit of a nightmare to write a unique privacy policy for each client, so is Google likely to class this as duplicate content and issue a penalty? They must realise that privacy policies are likely to be the same or very similar, as most legal writing tends to be! I can block the content in robots.txt or meta-noindex it if necessary, but I just wanted to get some feedback to see if this is needed!
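If you do decide to keep the shared policy pages out of the index, a per-page robots meta tag is the usual lightweight option (a sketch, not a statement that it's required):

```html
<!-- In the <head> of each client's privacy policy page -->
<!-- "noindex" keeps the page out of results; "follow" still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```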
Intermediate & Advanced SEO | Jamie.Stevens
-
Noindexing Duplicate (non-unique) Content
When "noindex" is added to a page, does this ensure Google does not count the page in its analysis of the unique-vs-duplicate content ratio on a website? Example: I have a real estate business and I have noindex on MLS pages. However, is there a chance that even though Google does not index these pages, Google will still see them and think, "ah, these are duplicate MLS pages; we're going to let them drag down the value of the entire site and lower the ranking of even the unique pages"? I'd like to just use "noindex, follow" on those MLS pages, but would it be safer to add the pages to robots.txt as well? In theory that should increase the likelihood Google will not see such MLS pages as duplicate content on my website. On another note: I had these MLS pages indexed and added "noindex, follow" 3-4 weeks ago. However, they are all still indexed, with no sign yet of Google noindexing them.
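One caution on combining the two approaches: if a page is disallowed in robots.txt, Google can no longer crawl it, which means it never sees the "noindex" tag on it - so already-indexed URLs can linger in the index. A sketch of the two directives (the /mls/ path is hypothetical):

```text
# robots.txt - blocks crawling, but does not by itself remove already-indexed URLs
User-agent: *
Disallow: /mls/
```

whereas the on-page tag that actually deindexes the pages only works while they remain crawlable:

```html
<meta name="robots" content="noindex, follow">
```

So the usual advice is to leave such pages out of robots.txt at least until they have dropped out of the index.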
Intermediate & Advanced SEO | khi5
-
Magento Duplicate Content Recovery
Hi, we switched platforms to Magento last year. Since then our SERP rankings have declined considerably (with no sudden drop on any Panda/Penguin date lines). After investigating, it appeared we had neglected to apply "noindex, follow" to our filter pages, and our total indexed pages rose sevenfold in a matter of weeks. We have since fixed the noindex issue, and the pages indexed are now below what we had before the switch to Magento. We've seen some positive results in the last week. Any ideas when/if our rankings will return? Thanks!
Intermediate & Advanced SEO | Jonnygeeuk
-
Duplicate content for swatches
My site is showing a lot of duplicate content in SEOmoz. I have discovered it is because the site has a lot of swatches (colors for laminate) within iframes. Those iframes all have the same content except for the actual swatch image and the title of the swatch. For example, these are two of the links that are showing up with duplicate content: http://www.formica.com/en/home/dna.aspx?color=3691&std=1&prl=PRL_LAMINATE&mc=0&sp=0&ots=&fns=&grs= http://www.formica.com/en/home/dna.aspx?color=204&std=1&prl=PRL_LAMINATE&mc=0&sp=0&ots=&fns=&grs= I do want each individual swatch to show up in search results, and they currently do if you search for the exact swatch name. Is the fact that they all have duplicate content affecting my individual rankings and my domain authority? What can I do about it? I can't really afford to put unique content on each swatch page, so is there another way to get around it? Thanks!
Intermediate & Advanced SEO | AlightAnalytics
-
Load balancing - duplicate content?
Our site switches between www1 and www2 depending on the server load, so (the way I understand it at least) we have two versions of the site. My question is whether the search engines will consider this as duplicate content, and if so, what sort of impact can this have on our SEO efforts? I don't think we've been penalised, (we're still ranking) but our rankings probably aren't as strong as they should be. The SERPs show a mixture of www1 and www2 content when I do a branded search. Also, when I try to use any SEO tools that involve a site crawl I usually encounter problems. Any help is much appreciated!
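One common fix for mirrored hosts like this is a canonical tag that always points at a single preferred hostname, regardless of which server delivers the page (the hostname below is invented for illustration):

```html
<!-- Served identically from www1 and www2, always naming the preferred host -->
<link rel="canonical" href="https://www.example.com/current-page/" />
```

A 301 redirect would also consolidate signals, but it would defeat the load balancing, so the cross-host canonical (or putting both servers behind one public hostname) is usually the gentler option.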
Intermediate & Advanced SEO | ChrisHillfd
-
Need help/insights. Site SEO = OK, authority backlinks = OK, ranking on page #1. How to reach position #1?
Hi there! Some doubts are confusing me and I need some assistance from you to get on the right track. I'll explain my situation, and I want to hear what you really recommend for medium/long-term, lasting results. 1 - I have a PR2 (.com.br) domain; 2 - I'm targeting low/medium-competition micro-niche keywords; 3 - I have all the pages I want indexed (the site is well constructed for SEO, with internal link building); 4 - For a keyword with average competition, I start out ranking on page #3 of the SERPs; for a few low-competition keywords I start on page #1; 5 - I do a little whitehat link building - 1 or 2 backlinks on authority sites - and about 15 days later I reach page #1, generally at position 9/10. And then I get stuck 🙂 There are no more authority sites where I can get backlinks... I post on the company Twitter/Facebook pages, but those links are nofollow, so I don't really know whether they help (I've never seen a SERP result from them). I did some "blackhat" stuff to see if it really works: I can say for sure the "profile backlinks" you can buy from some sites don't work (at least for me). I can't see them in Webmaster Tools, and my rankings haven't changed since I bought a pack of 100 links to test (the links are live; I checked them one by one). Maybe the problem is the domains: my site is .com.br and I was buying .com profile links. I guess Google considers backlinks from .com.br domains more valuable for my site. Back to whitehat: I wrote some articles and posted them the right way, on .com.br article sites of course, got them indexed, and can see the backlinks in Webmaster Tools, but there has been no change in the SERPs (maybe this is a long-term result and I'm just not seeing it yet). I'm really itching to do some blackhat stuff, but I don't want to lose what I've already achieved... I've heard a lot about ScrapeBox but don't feel comfortable spamming a lot of blogs.
I really want lasting, long-term results (my sites are totally whitehat/corporate sites). Can you expert guys give me some pointers on where to go next to improve my SERPs? I've never reached the top spot and want to rank there at least once to understand how it can be done... I'm now thinking of paying someone to rewrite 20 copies of an article and put them up on some sites, to see if that improves anything. But I'm still not confident, because a good writer will charge around $100 to do it in my language; maybe I can do better things with 100 bucks. I think I've followed the right path: internal SEO -> get indexed -> backlinks from authorities -> article backlinks to my site (is that the right order, or not?) -> (what next?). I know SEO is hard, never-ending work, but what I'm trying to get clear in my head is the order of the work (if a right order really exists). Every word will be appreciated. What do you suggest I try now? (Please give me a hint for seeing SERP results 🙂 - if I feel that something worked, no matter what it costs, I'll pay for the work happily.) Sorry if I'm a little confusing; English isn't my first language. Thanks.
Intermediate & Advanced SEO | azaiats2