The Bible and Duplicate Content
-
We have our complete set of scriptures online, including the Bible, at http://lds.org/scriptures. Users can browse to any of the volumes of scripture. We've improved the user experience by allowing links to specific verses in context: the page scrolls to and highlights the linked verse. However, this creates a significant amount of duplicate content. For example, these links:
http://lds.org/scriptures/nt/james/1.5
http://lds.org/scriptures/nt/james/1.5-10
http://lds.org/scriptures/nt/james/1
All of those link to the same chapter in the book of James, yet the first two highlight verse 5 and verses 5-10 respectively. This is a good user experience because, in other sections of our site and on blogs throughout the world, webmasters link to specific verses so the reader can see the verse in the context of the rest of the chapter.
Another Bible site has separate HTML pages for each individual verse and tends to outrank us for long-tail chapter/verse queries because of this (and possibly some other reasons). However, our tests indicated that users prefer our current version.
We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap!
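For reference, the sitemap entries would follow this pattern (the URLs below are just a sketch of the format, not the actual file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one entry per chapter -->
  <url>
    <loc>http://lds.org/scriptures/nt/james/1?lang=eng</loc>
  </url>
  <!-- plus one entry per chapter/verse -->
  <url>
    <loc>http://lds.org/scriptures/nt/james/1.5?lang=eng</loc>
  </url>
</urlset>
```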
So the question is: is the sitemap a good idea, given that we can't revert to putting each chapter/verse on its own unique page? We are also going to recommend creating a unique title for each verse and passing a portion of the verse text into the meta description. Will this be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective.
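To illustrate, the head of a verse page would carry something like this (the wording here is illustrative, not final):

```html
<head>
  <!-- unique title per verse (illustrative wording) -->
  <title>James 1:5 - If any of you lack wisdom - New Testament</title>
  <!-- meta description seeded with a portion of the verse text -->
  <meta name="description"
        content="James 1:5 - If any of you lack wisdom, let him ask of God, that giveth to all men liberally, and upbraideth not.">
</head>
```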
Thanks all for taking the time!
-
Dave,
Thanks for the clarification. You're definitely in a rare circumstance compared to most websites.
In reality, since it's the Bible, there is going to be a duplicate content issue regardless, given how many sites publish the same content now and how many more will in the future. Everything from Eternalministries.org to KingJamesBibleOnline.org to concordance.biblebrowser.com is offering this content.
If you can find a way to offer your content in a unique way, and within your own site, offer different versions of it (individual verses compared to entire chapters), then ideally yes, you'd want it all indexed.
How you do that without adding your own unique text above or below each page's direct biblical content is the issue though.
Given this challenge, this is why I offered the concept of not indexing variations. Even if you weren't hit by the Panda update, any time Google has to evaluate multiple pages across sites where the content is identical or mostly identical, someone's content is going to suffer to one degree or another. And any time the conflict is within a single site, some versions are going to be given less ranking value than others.
So unfortunately it's not a simple, straightforward situation where duplication avoidance can be guaranteed to provide maximum reach, nor is there a simple way to boost multiple versions so that they're all guaranteed to be found, let alone show up above "competitor" sites.
This is why I initially offered what are essentially SEO best practices for addressing duplicate content.
If you don't want to lose the traffic that currently comes in by multiple means, the only other way to bolster what you've got is to focus on high-quality, long-term link building and social media.
The link building would need to focus on obtaining high-quality links pointing to deep content (specific chapter pages and specific verse pages), where the anchor text varies between chapter- or verse-specific words, broader Bible-related phrases, and the LDS brand.
On the other hand, by implementing canonical tags, you will definitely reduce at least a number of visits that currently come in by variation URLs. Will that be compensated for by an equal or greater number of visits to the new "preferred" URL? In this rather unique situation there's no way to truly know. It is a risk.
Which brings me back to the concept that you'd potentially be better off finding ways to add truly unique content around the biblical entries. It's the only on-site method I can think of that would allow you to continue to have multiple paths indexed. Combined with unique page Titles, chapter/verse targeted links and social media, it could very well make the difference.
With over 1,100 chapters and 31,000 verses, that's a lot of footwork. Then again, it's a labor of love, and every journey is made up of thousands of steps.
-
So you're saying it would not be a good idea to try to get every verse URL listed in Google? Perhaps we could try adding a canonical tag pointing to the chapter only? For example, browsing the site you can't actually navigate to http://lds.org/scriptures/nt/james/1.5?lang=eng; you can only navigate to /james/1?lang=eng. However, the other URLs exist when someone links externally to a specific chapter and verse, and the code on the page highlights the desired verse. In our example the entire chapter exists on its own URL and the content is unique.
Your suggestion may work if we just canonicalize all those "verse" URLs like /james/1.5?lang=eng and /james/1.5-10?lang=eng to /james/1?lang=eng. Some of the more popular verses with great page authority could actually help prop up the rest of the content on the page.
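In other words, every verse-variation URL would carry a single tag in its head pointing back at the chapter; a sketch of what we have in mind:

```html
<!-- On http://lds.org/scriptures/nt/james/1.5?lang=eng
     and http://lds.org/scriptures/nt/james/1.5-10?lang=eng -->
<link rel="canonical" href="http://lds.org/scriptures/nt/james/1?lang=eng">
```

The chapter page itself would carry no canonical tag (or a self-referencing one), so all the variation URLs consolidate onto it.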
My concern, though, is that much of the scripture-related traffic comes through queries for an exact chapter/verse reference, so I can see where having individual pages for each passage could be valuable for rankings. But that user experience is poor when someone wants to see a range of passages, like chapter 5, verses 1-4. So we are looking for the best way to get our URLs indexed and ranked for the individual passages or ranges of passages that are popular on search engines.
I can tell you that this section was not hit by the Panda update. The content is not "thin," as it could be if we put each verse on its own page.
The ?lang=eng parameter is how we handle language versions; we have the scriptures online in several languages. I'm sure there are better ways to handle that as well. Given the size of the organization, we're certainly trying to get the low-hanging fruit out of the way first.
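One option we've looked at for the language versions is rel-alternate hreflang annotations on each chapter page; a sketch, assuming an English and a Spanish version of the same chapter (the ?lang=spa value here is hypothetical):

```html
<!-- On each language version of a chapter page, declare all alternates -->
<link rel="alternate" hreflang="en" href="http://lds.org/scriptures/nt/james/1?lang=eng">
<link rel="alternate" hreflang="es" href="http://lds.org/scriptures/nt/james/1?lang=spa">
```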
-
Dave,
You're facing a difficult challenge: satisfying the needs of SEO versus user experience. In light of everything Google has done, going back to the May Day update last year and right through the Panda/Farmer update, duplicate content, as well as "thin" content, is more of a concern than ever.
Just having unique titles on each page is not enough; what matters is the entire weight of a page's uniqueness.
Since you're not intending to go to individual pages for each verse, and you've got multiple methods of reaching the same content, only one method should be designated as the primary, search-engine-preferred method. All others should be blocked from being indexed.
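For the non-preferred paths, that blocking can be done with a robots meta tag in the page head (or the equivalent X-Robots-Tag HTTP header); a minimal sketch:

```html
<!-- On every URL variation that is NOT the preferred version -->
<meta name="robots" content="noindex, follow">
```

Using "follow" rather than "nofollow" keeps the links on those pages crawlable even though the pages themselves stay out of the index.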
From there, users who bookmark your site can still explore the other methods of finding content if they find them helpful.
Unfortunately, this does, of course, mean that you're going to end up with far fewer pages indexed. However, every page that is indexed will become stronger in its individual rankings, and that in turn will boost the pages above it, and the entire site, over time.
And here's another issue: when I go to any of the URLs you posted above, your site automatically tacks on "?lang=eng" via a 301 redirect. This means any inbound links pointing to the non-appended URLs are not providing maximum value to your site, since they point to pages designated as permanently moved.
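I don't know your server stack, but assuming Apache with mod_rewrite, the rule producing that behavior presumably looks something like the following; the fix is less about the rule itself and more about getting inbound and internal links to point directly at the final ?lang=eng URLs so visitors skip the redirect hop:

```apache
# Hypothetical reconstruction of the observed behavior: 301 any
# scripture URL that lacks a lang parameter to the ?lang=eng version.
RewriteEngine On
RewriteCond %{QUERY_STRING} !(^|&)lang= [NC]
RewriteRule ^scriptures/(.*)$ /scriptures/$1?lang=eng [R=301,L,QSA]
```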