Duplicate biographies across several domains?
-
Hey Gang,
We've built a niche-specific, manually edited legal community that is full of unique content. (While it's ultimately a directory, and that is often viewed as a bad word these days, ours is a curated source that doesn't allow anyone and everyone to join.) We feel comfortable that it passes the sniff test post-Panda/Penguin, etc., and it's doing rather well to date.
The question we have is: do we really need to create unique biographies for each of our legal members? Some of our competitors simply take the bio information that a lawyer has on their own website and copy it to their site. The competitors I'm talking about are LARGE, well-respected, extremely successful outfits, like FindLaw.
Here's an example:
Both pages rank, the FindLaw code places no restrictions on the bio content, and it's an obvious exact match.
I get that duplicate content is primarily a concern within one's own domain, across the pages of a single site, where rel=canonical and the like apply. But what about two different domains that need to supply factual information that can't be altered? Is it anything we should worry about? Are there any tags we should insert in our code with regard to the bios?
Thanks!!!
Wayne
-
Thanks, Oleg! I was thinking along the same lines, but it's a bit difficult to rewrite a person's biography, as it is such specific information, e.g., the schools they graduated from, where they worked, their experience, etc. And since that's the only information on the page, it makes it a bit tough, indeed!
-
You will have a much easier time ranking if you provide unique biographies instead of scraping something that already appears on several websites. Like you said, those sites are huge and have a lot of authority, so they will rank and you won't.
However, if the scraped bio is just a portion of the page and the rest of the page is unique, valuable information, that will help offset the copied content. Next time a Panda refresh comes around, you'll know whether you got hit with a duplicate content penalty.
-Oleg
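For reference, if the bios genuinely cannot be altered, one commonly cited option is the cross-domain canonical tag that Google supports, which points the duplicate at the original. A sketch; the URL here is a hypothetical placeholder, not a real firm's page:

```html
<!-- Placed in the <head> of the directory's bio page.
     The href is a made-up example pointing at the lawyer's
     original bio on their own site. -->
<link rel="canonical" href="https://www.example-law-firm.com/attorneys/jane-doe" />
```

Note the trade-off: this concedes ranking for that page to the original, which may or may not be what a directory wants.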
Related Questions
-
Consolidating a Large Site with Duplicate Content
I will be restructuring a large website for an OEM. They provide products and services for multiple industries, and the product/service offering is identical across all industries. I was looking at the site structure and ran a crawl test, and learned they have a LOT of duplicate content out there because of the way they set up their website.

They have a page in the navigation for "solution", i.e., which industry you are in. Once that is selected, you are taken to a landing page and, from there, given many options to explore products, read blogs, learn about the business, and contact them. The main navigation is removed. The URL structure is set up with folders, so no matter what you select after you choose your industry, the URL will be "domain.com/industry/next-page". The product offerings, blogs, and contact pages do not vary by industry, so the content found on "domain.com/industry-1/product-1" is identical to the content found on "domain.com/industry-2/product-1", and so on.

This is a large site with a fair amount of traffic because it's a pretty substantial OEM. Most of their content, however, is competing with itself because most of the pages on the website share duplicate content. I won't begin my work until I can dive into their GA and have more in-depth conversations with them about what activity they're tracking and why they set up the website this way. However, I don't know how strategic they were in this setup, and I don't think they were aware that they had duplicate content. My first thought is to consolidate the site structure so we don't spread the link equity of the "product-1" content, direct all industries to one page, and track conversion paths a different way.
However, I’ve never dealt with a restructuring of this magnitude and don’t want to risk hurting their domain authority, missing redirect or URL-mapping opportunities, or breaking what is still a well-performing site (many of whose pages have high page authority and search visibility), even though multiple pages share the same content. Has anyone dealt with this before, and do you have any recommendations for tackling something like this?
On-Page Optimization | cassy_rich
-
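For a consolidation like the one described above, the first artifact is usually a URL-to-URL 301 redirect map. A minimal sketch, assuming the "domain.com/industry/page" folder structure from the question and a made-up "/products/" canonical location:

```python
# Sketch: collapse duplicate industry-prefixed URLs into one canonical
# URL per product, producing a 301 redirect map. The "/products/" target
# is a hypothetical choice, not something prescribed by the question.

def build_redirect_map(urls):
    redirects = {}
    for url in urls:
        parts = url.strip("/").split("/")
        # e.g. ["industry-1", "product-1"] -> canonical "/products/product-1"
        if len(parts) == 2:
            industry, page = parts
            redirects[f"/{industry}/{page}"] = f"/products/{page}"
    return redirects

crawled = [
    "/industry-1/product-1",
    "/industry-2/product-1",
    "/industry-1/product-2",
]
print(build_redirect_map(crawled))
```

In practice the crawled list would come from a crawl export, and each mapping would become a 301 rule, so all industry variants consolidate their link equity onto one page.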
Locating Duplicate Pages
Hi, our website consists of approximately 15,000 pages; however, according to our Google Webmaster Tools account, Google has around 26,000 pages for us in its index. I have run the site through half a dozen sitemap generators, and they all discover only the 15,000 pages that we know about. I have also gone through the site thoroughly to find any sections where we might be inadvertently generating duplicate pages, without success.

It has been over six months since we made any structural changes (at which point we set up 301s to the new locations), so I'd like to think that the majority of these old pages have been removed from Google's index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.

I'm fairly certain it's nothing to worry about, but for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately, there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, but this only returns the first 1,000 results, which all check out fine. Does anybody know of any methods or tools we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages that haven't fallen out of the index yet and aren't going to cause us a problem? Thanks, guys!
On-Page Optimization | ChrisHolgate
-
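On the question above: once an exported list of indexed or crawled URLs can be obtained from any source, comparing it against the sitemap is just a set difference. A hedged sketch; the URLs and the normalization rule are illustrative, and real exports would be read from files:

```python
# Sketch: find URLs that appear in an exported index/crawl list but not
# in the sitemap. Inputs here are inline lists; in practice they would
# be read from exported files.

def urls_not_in_sitemap(indexed_urls, sitemap_urls):
    # Normalize trailing slashes and case so /Page and /page/ compare equal
    def norm(u):
        return u.rstrip("/").lower()
    known = {norm(u) for u in sitemap_urls}
    return sorted(u for u in indexed_urls if norm(u) not in known)

sitemap = ["https://example.com/", "https://example.com/about"]
indexed = ["https://example.com/", "https://example.com/about/",
           "https://example.com/old-page"]
print(urls_not_in_sitemap(indexed, sitemap))
```

The leftover URLs are the candidates to check by hand: old pages awaiting removal versus unintended duplicates.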
WordPress - duplicate content
I'm using WordPress for my website. However, whenever I use the post section for news, I get a report back from SEOmoz saying there's duplicate content: WordPress also lists the posts on the Category and Archive pages. Does anyone know if Google sees this as duplicate content and, if so, how to stop it? Thanks
On-Page Optimization | AAttias
-
Issue: Duplicate Page Content (index.htm)
I get an "Issue: Duplicate Page Content" error for the following pages in the SEOmoz Crawl Diagnostics, but these pages are the same one! Duhhhh... Is there a way to hide this false error? http://www.stdtime.com/ http://www.stdtime.com/index.htm BTW, I also get "Issue: Duplicate Page Title" for this page. Another false error...
On-Page Optimization | raywhite
-
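Rather than hiding the report, the usual fix for the underlying issue above is to 301-redirect the index.htm variant to the root so only one URL exists. A sketch of an .htaccess rule, assuming the site runs on Apache with mod_rewrite enabled:

```apache
# Redirect direct requests for /index.htm to the root URL (301)
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.htm[\s?] [NC]
RewriteRule ^index\.htm$ / [R=301,L]
```

The condition on THE_REQUEST ensures only requests typed as /index.htm are redirected, avoiding a loop when Apache internally serves index.htm for /.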
Suggestions to avoid duplicate content
Hi, we have about 6,500 products, almost all with descriptions. SEOmoz is showing about 2,500 of them as duplicate content. The reason is that only one or two words differ between products. For example, we have 500 award certificates. All are the same size and have the same description, but one is for swimming, one for baseball, one for reading, etc. Apparently a one-word difference is not enough to differentiate them. We have the same issue with our trophies: they are identical except for the figures. Does anyone have good tips on how to change the content to avoid this issue without making up content for 2,500 items? Thanks! Neil, trophycentral.com
On-Page Optimization | trophycentraltrophiesandawards
-
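As a rough way to see how similar two descriptions look to an automated check like the one above, a string-similarity ratio can be computed. This is a generic sketch with made-up descriptions and an arbitrary threshold, not how SEOmoz or Google actually score pages:

```python
# Sketch: flag product-description pairs that are near-duplicates using
# difflib's similarity ratio. The threshold and sample data are illustrative.
from difflib import SequenceMatcher

def near_duplicates(descriptions, threshold=0.8):
    pairs = []
    items = list(descriptions.items())
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            (name_a, text_a), (name_b, text_b) = items[i], items[j]
            ratio = SequenceMatcher(None, text_a, text_b).ratio()
            if ratio >= threshold:
                pairs.append((name_a, name_b, round(ratio, 2)))
    return pairs

products = {
    "cert-swimming": "Award certificate, 8x10, full color, for swimming.",
    "cert-baseball": "Award certificate, 8x10, full color, for baseball.",
    "trophy-large":  "Large column trophy with marble base and figure.",
}
print(near_duplicates(products))
```

A report like this helps prioritize which description groups need genuinely distinct copy first.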
Duplicate Page Titles
I have over 200 duplicate page titles on a site I am working on. Does putting a date at the end of some of them make the titles unique enough?
On-Page Optimization | SavingSense
-
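A date can help only if the result is genuinely distinct; a quick hedged sketch for first finding which titles collide, with made-up titles, so each group can be rewritten with a distinguishing detail:

```python
# Sketch: list page titles that appear more than once, so each group can
# be rewritten with a genuinely distinguishing detail (not just a date).
from collections import Counter

def duplicate_titles(titles):
    counts = Counter(t.strip().lower() for t in titles)
    return sorted(t for t, n in counts.items() if n > 1)

pages = [
    "Our Services",
    "our services",       # case-only variant still collides
    "Contact Us",
    "Blog | March 2013",
]
print(duplicate_titles(pages))
```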
What if it is a domain name
The suggestion is that my keyword, "food truck", be in the URL of gourmetstreets.com. But what if my URL, which is the domain name of the site, does not contain the keyword? I cannot change the domain name; or can (or should) I? Thanks
On-Page Optimization | richwebstudio
-
Crawl Diagnostics - Duplicate Content and Duplicate Page Title Errors
I am getting a lot of duplicate content and duplicate page title errors from my crawl analysis. I'm using Volusion, and it looks like the photo gallery is causing the duplicate content errors. Both counts are sitting at 231, which shows I have done something wrong... Example URLs: Duplicate Page Content: http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601 Duplicate Page Title: http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601 Would anyone know how to properly disallow this? Would it be as simple as a robots.txt entry, or something a little more involved within Volusion? Any help is appreciated. Cheers, Geoff B. (a.k.a. newbie).
On-Page Optimization | GeoffBatterham
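For the PhotoGallery.asp URL pattern shown above, a robots.txt entry is indeed the simplest version of that approach; a sketch:

```
User-agent: *
Disallow: /PhotoGallery.asp
```

This blocks PhotoGallery.asp and all its query-string variants from being crawled. One caveat: robots.txt stops crawling, not indexing, so a noindex robots meta tag in the gallery template is the more thorough option if the platform allows editing it.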