Opinions on Boilerplate Content
-
Howdy,
Ideally, every page's title, description, and content would be unique. But when a site is very, very large, that becomes impossible. I don't believe our site can avoid boilerplate content for title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use this information as they please.
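For what it's worth, here's a minimal sketch of the kind of structured data I mean, using schema.org's LocalBusiness vocabulary (shown in JSON-LD form; the business details are placeholders, not from any real page):

```python
import json

# Hypothetical local-business page data using the schema.org LocalBusiness type.
# Every name and address below is a made-up placeholder.
page = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Phoenix",
        "addressRegion": "AZ",
    },
}

# This JSON would be embedded in the page inside a script tag of
# type "application/ld+json" so Google can parse it.
print(json.dumps(page, indent=2))
```

The titles and descriptions may be boilerplate, but the structured data at least tells Google unambiguously which business and which locality each page is about.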
What I am curious about is boilerplate content repeated throughout the site for the purpose of helping the user, as well as to tell Google what the page is about (rankings).
For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) offer the exact same paragraph on each page. The information is helpful to the user, but it's definitely duplicate content. All they've changed is the city name.
I'm curious, what's making this obvious duplicate content issue okay? The additional unique content throughout (in the form of different businesses), the small yet obvious differences in on-site content (title tags clearly represent different locations), or just the fact that the site is HUGELY authoritative and gets away with it?
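To make the "all they've changed is the city name" point concrete, here's a rough sketch of how near-duplicate two such paragraphs look under a simple word-shingle comparison (the paragraphs are made up for illustration, not quoted from any site):

```python
def shingles(text, k=3):
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

phoenix = ("Find top-rated plumbers in Phoenix. Our directory lists "
           "trusted local pros in Phoenix ready to help.")
tucson = ("Find top-rated plumbers in Tucson. Our directory lists "
          "trusted local pros in Tucson ready to help.")

# Swapping only the city name still leaves most shingles shared,
# which is exactly the signal a near-duplicate detector keys on.
print(round(jaccard(phoenix, tucson), 2))
```

If crude shingling like this can flag the overlap, Google's systems certainly can, which is why I'm surprised these pages rank so well.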
I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether or not it's a passable practice for large, but new sites.
Thanks!
-
The SEO of the site is probably fine. The problem with the site is that it takes one page of content and smears it across dozens of thin-content, duplicate-content, cookie-cutter pages. The SEO is lipstick on a pig.
-
Thanks again for the response, EGOL. It is appreciated.
Can you point to any examples of large-scale sites like this with better SEO for these pages? I mean, any site that targets every city, neighborhood, park, etc. with content like this should theoretically run into duplicate content and display thin result pages quite often.
And even so, these pages are helpful. I Google "restaurant + small cities near me" and Yelp pages come up, which benefit me.
Yelp is one of the biggest review sites on the web and their filtered search result pages are indexed and ranking ultra high all over the place. What are they doing so special?
This page and this page both offer nearly the same exact results, just shuffled a bit. Beyond simply being too big to get slapped, why is it okay when Yelp does this?
-
I agree. It is on a very thin line. I believe that Google's Panda algo will eventually hit it. I look at lots of sites that people say lost traffic. This one has a similar design and content style.
-
That's interesting. It seems to have been around for quite a while and ranks well. Of all the similar sites I've seen, though, Houzz seems to walk the thinnest line on bad SEO. Their filter creates nearly identical pages, all of which get indexed; they have no canonicals for any of them and virtually the same on-page SEO as well. Not to mention the same blurbs across millions of pages, etc.
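On the canonical point: one common approach for filtered result pages (a sketch, assuming the filter state lives in query parameters; the parameter names here are hypothetical) is to normalize every filtered URL down to a single canonical before emitting the link tag:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical set of filter-only parameters that should not create
# separately indexable pages.
FILTER_PARAMS = {"sort", "price", "rating"}

def canonical_url(url):
    """Drop filter-only query parameters so all filtered views of a
    result set share one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://example.com/pros/phoenix?sort=rating&page=2"))
# prints https://example.com/pros/phoenix?page=2
```

With something like this, the shuffled-filter pages Houzz indexes would all point back to one canonical version instead of competing with each other.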
It's weird to me though that a reasonably targeted blurb is such bad business when the rest of the site is so helpful to users. One would think Google would allow it since the blurbs apply to each page and the "results" are the real meat and potatoes of the site.
-
This site has lots of duplicate content from page to page and lots of thin content on a repeating template. It will be hit by Panda.
-
EGOL,
I think you're making unfair assumptions about our site. Each page visible to Google will have helpful information and content. The ones that don't will not be "published" for Google or our users.
I assure you, the site will be worthwhile and helpful to the end user, especially as time progresses. In fact, if you read above, I am asking specifically about adding content that is genuinely helpful to the user while trying to avoid duplicate content issues from repeating it across the site.
I am not trying to shortcut anything, I'm curious why some sites are able to seemingly circumvent SEO tenets and was hoping for a helpful discussion.
And again, I'll reiterate: I am not interested in boilerplate content as a shortcut. It would be in addition to existing useful content. The boilerplate content on similar pages would also be beneficial to the end user. Using the examples above, I believe the small blurbs _can_ be helpful to the user. Do you agree?
Thanks for the response.
-
The problem that you face is that you are trying to make a website with millions of pages for which you do not have adequate content. You are trying to take shortcuts by using a cookie cutter instead of doing the work to make a worthy and unique website.
If you continue with your current business plan, I believe that Google will not treat your site very well. These sites used to work in Google over ten years ago and at that time they were ingenious. Today they are spam.
-
The paragraph of helpful content is identical (beyond a city being swapped out), but it still helps them rank. If you tailor a search with one of their cities and a cousin keyword within the text, they usually pop up on the front page. That's what I'm asking about. Why is Google ignoring this obvious DC?
I'm assuming the business listings are making the page unique enough to override the duplicate paragraph + the site is huge and has TONS of authority.
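A crude way to sanity-check that assumption on a page (a sketch with made-up text, not any site's actual copy) is to measure what share of the page's words come from the repeated blurb versus the unique listings:

```python
def boilerplate_ratio(page_text, template_text):
    """Rough share of a page's words that also appear in the repeated
    template blurb. Lower is better: more of the page is unique."""
    tpl = set(template_text.lower().split())
    words = page_text.lower().split()
    if not words:
        return 0.0
    shared = sum(1 for w in words if w in tpl)
    return shared / len(words)

# Hypothetical blurb and page; the listings are the unique portion.
blurb = "Find trusted local pros in your city with our free directory."
page = (blurb + " Joe's Plumbing 4.8 stars. Desert Drains 4.6 stars. "
        "Valley Pipeworks 4.5 stars.")

print(round(boilerplate_ratio(page, blurb), 2))
```

If the listings keep that ratio low on most pages, that would support the idea that the unique results are doing the heavy lifting and the duplicated blurb is being tolerated.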
-
They're not identical, and I notice many directories are set up like this. Two individual users with different interests would find unique information in both of these samples. The only issue is how your competition has set up their pages. For instance, if someone is targeting just Phoenix and really goes to town with unique information and links, that page may rank better because they may be viewed as more of an authority on the subject.