How to deal with very similar (thin) content by design?
-
Hello all,
I run a website that lists direct contact details (telephone and email) for organisations.
I have hundreds of similar pages that are very thin on content (by design). Each page has only a couple of lines of somewhat unique content.
People find the site useful because it simply tells them which number to dial to speak to a real person at any given organisation. They can't easily find this information elsewhere, and I believe the site satisfies search intent.
Am I at risk of being flagged for duplicate or low-quality content?
Should I add more text simply to make each page 'unique', even though it adds no value to users? That doesn't seem right either!
Looking forward to hearing where you all stand on this,
Many thanks,
-
Personally, I would find a way of thinking laterally about generating new and unique content for your existing and potential users. "Poetry is when you make new things familiar and familiar things new," according to marketing guru Rory Sutherland. That rings true for content.
-
I'd definitely consider adding a review section to each page for the quality of the information, along the lines of what "who called me" sites do. That way you're getting user-generated content about how quickly people were able to get hold of someone and whether it helped resolve their issue. That would add value and help differentiate the pages. Cheers!
-
Related Questions
-
Weight of content further down a page
Hi, a client is trying to justify a design decision by saying he needs links to all his sub-pages on the top-level category page, or Google won't index them. However, the links are available on the sub-category pages, and the sub-categories are linked to from the top-level page, so I have argued that as long as Google can crawl the links through the pages, they will be indexed and won't be penalised. Am I correct? Additionally, the client has said those links need to be towards the top of the page because content further down the page carries less weight; I don't believe this is the case, but can you confirm? Thanks again, Craig.
-
Minimum amount of content for Ecommerce pages?
Hi guys, I'm currently optimizing my e-commerce store, where each category page has around 100 words of content on average. Based on this study by Backlinko, the more content the better: http://backlinko.com/wp-content/uploads/2016/01/02_Content-Total-Word-Count_line.png Would you say this holds true for e-commerce pages? For example, a page like this: http://www.theiconic.com.au/yoga-pants/ What benefits would you get from adding more content? Is it basically that more content leads to more long-tail opportunity and more organic traffic? Assuming the content is solid and not built just for SEO reasons. Cheers.
-
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site, but with this new "slow roll-out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet. We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill. The question is: if I mine the log files and only deindex stories that Google has sent no traffic to in the past year, could this be seen as trying to game the algorithm or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?
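Mining the log files for that signal can be scripted. Below is a minimal sketch, assuming a common combined-format access log; the regex, field layout, and function names are illustrative and would need adapting to your own log format:

```python
# Hypothetical sketch: find article URLs that received no Google organic
# referrals within a time window, based on a combined-format access log.
import re
from datetime import datetime, timedelta

# Matches lines like:
# 1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /news/a HTTP/1.1" 200 512 "https://www.google.com/"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "GET (?P<path>\S+) [^"]*" \d+ \d+ '
    r'"(?P<referrer>[^"]*)"'
)

def urls_with_google_traffic(log_lines, since):
    """Return the set of paths with at least one Google referral after `since`."""
    seen = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        ts = datetime.strptime(m.group("ts").split()[0], "%d/%b/%Y:%H:%M:%S")
        if ts >= since and "google." in m.group("referrer"):
            seen.add(m.group("path"))
    return seen

def candidates_for_noindex(all_article_paths, log_lines, days=365):
    """Articles with zero Google referrals inside the window are noindex candidates."""
    cutoff = datetime.now() - timedelta(days=days)
    active = urls_with_google_traffic(log_lines, cutoff)
    return sorted(set(all_article_paths) - active)
```

The output is only a candidate list; you would still review it by hand before noindexing anything.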
-
Scraped content ranking above the original source content in Google.
I need insights into how "scraped" content (an exact copy-pasted version) can rank above the original content in Google. Four original, in-depth articles published by my client (an online publisher) were republished by another company (which happens to be briefly mentioned in all four of those articles). We reckon the articles were republished at least a day or two after the originals (the exact gap is not known). We find that all four of the "copied" articles rank at the top of Google search results, whereas the original content, i.e., my client's website, does not show up even in the top 50 or 60 results. We have looked at numerous factors such as domain authority, page authority, inbound links to both the original source and the URLs of the copied pages, social metrics, etc. All of the metrics, as shown by tools like Moz, are better for the source website than for the re-publisher. We have also compared results in different geographies to see if any geographical bias was affecting results, the reason being that our client's website is hosted in the UK and the 're-publisher' is from another country, but we found the same results. We are also not aware of any manual actions taken against our client's website (at least based on messages in Search Console). Are there any other factors that could explain this serious anomaly, which seems to be a disincentive for somebody creating highly relevant original content? We recognize that our client has the option to submit a 'Scraper Content' form to Google, but we are less keen to go down that route and more keen to understand why this problem could arise in the first place. Please suggest.
-
Publishing pages with thin content, update later?
So I have about 285 pages I created with very, very thin content on each. Each is unique, and each serves its own purpose. My question is: do you think it is wise to publish all of these at once just to get them out there, and then update each as we go along? Each page is very laser-targeted, and I anticipate that a large handful will actually rank soon after publishing. Thanks! Tom
-
Frequent FAQs vs duplicate content
It would be helpful for our visitors if we were to include an expandable list of FAQs on most pages. Each section would have its own list of FAQs specific to that section, but all the pages in that section would have the same text. It occurred to me that Google might view this as a duplicate content issue. Each page *does* have a lot of unique text, but underneath we would have a lot of text repeated throughout the site. Should I be concerned? I guess I could always load the FAQs by AJAX after page load if this might penalize us.
-
Ajax Content Indexed
I used the following guide to implement endless scroll: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started Crawlers correctly read all the URLs, and the "site:" command shows me all the indexed URLs with #!key=value. I want only the first URL to be indexed; the other URLs should still be crawled, but not indexed, as if they had the robots meta tag "noindex, follow". How can I do this?
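One way to express that rule server-side is to emit a "noindex, follow" robots meta tag on every snapshot the crawler fetches except the base page. Under the AJAX crawling scheme in that guide, crawlers request each #!key=value URL as ?_escaped_fragment_=key=value, so the decision can key off that parameter. A hedged sketch (the URL shapes and function name are examples, not from the question):

```python
# Hypothetical sketch: choose a robots meta value for a crawler-requested URL
# under the AJAX crawling scheme, where #!key=value pages arrive as
# ?_escaped_fragment_=key=value requests.
from urllib.parse import urlparse, parse_qs

def robots_meta_for(url):
    """Index only the base page; mark every fragment page noindex, follow."""
    query = parse_qs(urlparse(url).query)
    # parse_qs drops empty values, so a bare ?_escaped_fragment_= (the base
    # page in the scheme) falls through to the indexable default.
    if "_escaped_fragment_" in query and query["_escaped_fragment_"][0]:
        return "noindex, follow"
    return "index, follow"
```

The returned value would then be rendered into `<meta name="robots" content="...">` in the HTML snapshot (or sent as an X-Robots-Tag header).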
-
I have a duplicate content problem
The guy who built the website for my business, Premier Martial Arts Austin, disappeared and didn't set things up so that every URL begins with www., so I now have a duplicate content problem and don't want to be penalized for it. I tried to set the preferred domain in Webmaster Tools, but I can't get it to verify that I'm the website owner. Any idea what to do?
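The standard fix, independent of Webmaster Tools verification, is a site-wide 301 redirect from the bare domain to the www host (or vice versa). That is usually done in the web server config, but as an illustration, here is a minimal sketch as WSGI middleware; the host names and the https scheme are assumptions:

```python
# Hypothetical sketch: 301-redirect every request on a non-canonical host to
# the canonical one, preserving path and query string.
class CanonicalHostMiddleware:
    def __init__(self, app, canonical_host):
        self.app = app
        self.canonical_host = canonical_host

    def __call__(self, environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host and host != self.canonical_host:
            path = environ.get("PATH_INFO", "/")
            query = environ.get("QUERY_STRING", "")
            location = f"https://{self.canonical_host}{path}"
            if query:
                location += "?" + query
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        # Already on the canonical host: serve the page normally.
        return self.app(environ, start_response)
```

A server-level rewrite (Apache, Nginx) achieves the same thing without touching application code, and either way the 301 consolidates the duplicate URLs onto one host.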