How can a website have multiple pages of duplicate content and still rank?
-
Can you have a website with multiple pages of exactly the same copy (one for each location of a franchise business) and still be able to rank for each individual franchise location? Is that possible?
-
No, it's not possible.
Let's say you have http://foo.com/a and http://bar.com/b and both have exactly the same content: Google will choose one, index it, and forget about the other.
Now let's say foo.com and bar.com each have the same 1,000 pages. Google will analyze each page one by one and choose one version for the index, forgetting the other. Depending on a variety of factors (like the domain itself and its backlink profile) it will pick some pages from foo.com and some from bar.com, but you will still have only 1,000 pages in the index.
How much of each page's content you need to change to differentiate it is unknown, since no one knows the code of Google's algorithm.
You can only guess and test: change something, then monitor how the indexing of the edited pages changes.
-
Hi Steve,
It is pretty well known that duplicate content is not a "good thing," but that doesn't mean it will hurt you directly. Google will rarely penalize a site just for duplicate content (REF: Matt Cutts).
It does, however, make ranking a specific page harder when you're competing with yourself. Most SEO tools will also warn you about each case of duplicate content they find, which can be annoying.
In the best-case scenario you may rank multiple pages for longer-tail keywords, for example something like:
Main keyword: Gourmet Shopping
Tails: Dayton Ohio, Tampa Florida, Los Angeles California, Topeka Kansas
In that case you would likely rank one of those pages over the others for "Gourmet Shopping," but if there is enough location-specific content you may find the other pages ranking well in those city/state type searches.
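To illustrate (the franchise name, city, and URL below are made up), a location page targeting one of those tails would carry its own localized title, heading, and copy:

```html
<!-- Hypothetical location page: /locations/dayton-oh/ -->
<head>
  <title>Gourmet Shopping in Dayton, Ohio | Example Franchise</title>
  <meta name="description" content="Store hours, address, and gourmet picks from our Dayton, OH location.">
</head>
<body>
  <h1>Gourmet Shopping in Dayton, Ohio</h1>
  <!-- Location-specific copy: address, hours, staff, local events -->
</body>
```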
Hope that makes sense,
Don
-
If you have unique content on the actual franchise page (location details etc.), the underlying pages can be duplicate content, but use canonicals on them to point to the original content.
So you need at least one page per franchise to be unique (and use the proper markup if applicable).
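As a rough sketch (the URLs here are hypothetical), each duplicate page would carry a canonical tag in its <head> pointing at the original:

```html
<!-- On the duplicate page, e.g. http://example.com/locations/dayton/services/ -->
<link rel="canonical" href="http://example.com/services/">
```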
Good luck
-
Is it impossible to write unique content for these pages? Even a small amount of unique content on top of the exact same copy would be beneficial.
Related Questions
-
Does having Multiple Similar Topic pages hurt my ranking?
Hi, we have an ecommerce store and currently have topic pages set up for each category/location combination. Each topic page lists the relevant products available for sale, so for example:
Most Popular Birthday Party products in UK
Most Popular Birthday Party products in London
Most Popular Birthday Party products in Manchester
We are now looking at ways of capitalising on longtail keyword searches, and a potential solution is to expand the number of topic page/location combinations, so for example:
Most Popular Birthday Party products in UK
Cheapest Birthday Party products in UK
Birthday Party products for small groups in UK
Birthday Party products for large groups in UK
Children's Birthday Party products in UK
etc. In general, would it be positive or negative for our site to give each longtail keyword its own dedicated topic page (given that our crawl budget is not necessarily high), or should we just try to add the longtail keywords to the original topic page and make that one rank better? Thanks
Intermediate & Advanced SEO | cmavroudisyahoocom
-
How will canonicalizing an https page affect the SERP-ranked http version of that page?
Hey guys, Until recently, my site has been serving traffic over both http and https depending on the user request. Because I only want to serve traffic over https, I've begun redirecting http traffic to https. Reviewing my SEO performance in Moz, I see that for some search terms, an http page shows up on the SERP, and for other search terms, an https page shows. (There aren't really any duplicate pages, just the same pages being served on either http or https.) My question is about canonical tags in this context. Suppose I canonicalize the https version of a page which is already ranked on the SERP as http. Will the link juice from the SERP-ranked http version of that page immediately flow to the now-canonical https version? Will the https version of the page immediately replace the http version on the SERP, with the same ranking? Thank you for your time!
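For reference, the kind of canonical described here is a link element on the http version of a page pointing to its https counterpart, roughly like this (placeholder URL):

```html
<!-- Served on http://www.example.com/some-page/ -->
<link rel="canonical" href="https://www.example.com/some-page/">
```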
Intermediate & Advanced SEO | JGRLLC
-
I have implemented rel="next" and rel="prev" but Google Console is picking up pages as duplicates. Can anyone tell me what is going on?
I have implemented rel="next" and rel="prev" across our site, but Google Console is picking the pages up as duplications, and individual paginated pages also show up in search results. Here is an example:
http://www.empowher.com/mental-health/content/sizeismweightism-how-cope-it-and-how-it-affects-mental-health
http://www.empowher.com/mental-health/content/sizeismweightism-how-cope-it-and-how-it-affects-mental-health?page=0,3
The second link shows up as a duplicate. What can I do to fix this issue?
Intermediate & Advanced SEO | akih
-
4 websites with same content?
I have 4 websites (1 main, 3 duplicates) with the same content. Now I want to change the content on the duplicate websites, while the main website keeps its current content. Is there any problem with my thinking?
Intermediate & Advanced SEO | marknorman
-
"No index" page still shows in search results and paginated pages shows page 2 in results
I have "no index, follow" on some pages, which I set 2 weeks ago. Today I see one of these pages showing in Google Search Results. I am using rel=next prev on pages, yet Page 2 of a string of pages showed up in results before Page 1. What could be the issue?
Intermediate & Advanced SEO | khi5
-
Want to merge high ranking niche websites into a new mega site, but don't want to lose authority from old top level pages
I have a few older websites that SERP well, and I am considering merging some or all of them into a new related website that I will be launching regardless. My old websites display real estate listings and not much else. Each website is devoted to showing homes for sale in a specific neighborhood. The domains are all in the form of Neighborhood1CityHomes.com, Neighborhood2CityHomes.com, etc. These sites SERP well for searches like "Neighborhood1 City homes for sale" and also "Neighborhood1 City real estate" where some or all of the query is in the domain name. Google simply points to the top of the domain, although each site has a few interior pages that are rarely used. There is next to zero backlinking to the old domains, but each links to the others with anchor text like "Neighborhood1 Cityname real estate". That's pretty much the extent of the link profile.
The new website will be a more comprehensive search portal where many neighborhoods and cities can be searched. The domain name is a nonsense-word .com not related to actual keywords. The structure will be like newdomain.com/cityname/neighborhood-name/, where the neighborhood real estate listings would replace the old websites, and I'd 301 the old sites to the appropriate internal directories of the new site. The content on the old websites is all on the home page of each, at least the content for searches that matter to me and rank well, and I read an article suggesting that Google assigns additional authority to top-level pages (can I link to that here?). I'd be 301-ing each old domain from a top-level page to a 3rd-level interior page like www.newdomain.com/cityname/neighborhood1/.
The new site is better than the old sites by a wide margin, especially on mobile, but I don't want to lose all my top positions for some tough phrases. I'm not running analytics on the old sites in question, but each of the old sites has an extensive past history with AdWords (which I don't run any more), so in theory Google knows these old sites are good quality.
Intermediate & Advanced SEO | Gogogomez
-
I try to apply best duplicate content practices, but my rankings drop!
Hey, an audit of a client's site revealed that, due to their shopping cart, all of their product pages were being duplicated: http://www.domain.com.au/digital-inverter-generator-3300w/ and http://www.domain.com.au/shop/digital-inverter-generator-3300w/
The easiest solution was to just block all /shop/ pages in Google Webmaster Tools (redirects were not an easy option). This was about 3 months ago, and in months 1 and 2 we undertook some great marketing (soft social bookmarking, updating the page content, Flickr profiles with product images, product manuals on SlideShare, etc.). Rankings went up and so did traffic.
In month 3, the changes in robots.txt finally hit and rankings have decreased quite steadily over the last 3 weeks. I'm so tempted to take off the robots restriction on the duplicate content... I know I shouldn't, but it was working so well without it. Ideas, suggestions?
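One alternative that's often suggested for this kind of setup (sketched below using the /shop/ URLs from the question) is to unblock the duplicates and instead give each /shop/ page a canonical tag pointing at the preferred URL, so the duplicates consolidate rather than being hidden from crawling:

```html
<!-- On http://www.domain.com.au/shop/digital-inverter-generator-3300w/ -->
<link rel="canonical" href="http://www.domain.com.au/digital-inverter-generator-3300w/">
```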
Intermediate & Advanced SEO | LukeyJamo
-
Does having multiple links to the same page influence the link juice this page is able to pass?
Say you have a page and it has 4 outgoing links to the same internal page. In the original PageRank algorithm, if these were links to a page outside your own domain, the link juice this page is able to pass would be divided by 4. The thing is, I'm not sure whether this is also the case when the outgoing link points to a page on your own domain. I would say that outgoing links (whatever the destination) use some of your link juice, so it would be better to have 1 outgoing link instead of 4 to the same destination, and then the destination will profit more from that link. What are your thoughts?
Intermediate & Advanced SEO | TjeerdvZ