User generated content (Comments) - What impact do they have?
-
Hello MOZ stars!
I have a question regarding user comments on article pages. I know that user generated content is good for SEO, but how much impact does it really have?
For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40 comments.

My question:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why!

If anything is unclear or you want certain information, don't hesitate to ask and I'll try to specify.
Best regards,
Danne -
Not what you asked, but beyond SEO I would say comments do have an effect. I have heard advertisers say they were looking for sites with comments; their thinking was that they wanted popular sites with followers, and that is how they judged it.
-
I do think that negative comments hurt UX and, eventually, the bottom line. No one wants to work with a company that has a ton of negative feedback. But that is exactly why user generated content is so important to searchers: it is a candid review of a company or product. There can be middle-of-the-road reviews, like a 3-star rating because customer service was great but the product stinks. I think those kinds of comments and reviews are necessary and, overall, good for UX.
In my opinion as a consumer, I want to see the bad comments. I always use the example of shoes and clothes. I don't want to find out when I get a pair of shoes in the mail that the sizes run a little small. If I see that in the comments or reviews ahead of time I will know to buy a size bigger and save myself the trouble of returning the product. These kinds of "negative" reviews are useful to a searcher and I wouldn't remove them.
-
In addition to what David said, I would still consider leaving the comments option open (as long as it isn't over-used).
Another factor to consider (especially in Barry's case) is what kind of comments people post. Do they have a positive or a negative tone? Are they on-topic or not?
If you have a community like Moz's, where I see a lot of good comments that complement the posts and responses to each of them, I'd consider indexing the comments.
What do you think? David, Monica?
-
I also read that article. Barry seemed to think that the comments were hurting the site rather than helping it. Comments can get off topic, or stray away from the original article. If I remember correctly, Barry made the comments visible to users but, as a result, not crawlable by Google.
For return traffic, I think comments are great. After seeing the results Barry shared, though, I'm not sure it is still a good idea to have them included in the page crawl.
Here is the article where he spoke about this: https://www.seroundtable.com/google-panda-ser-poll-19675.html
IMO, I would leave the comments on the pages, but block them from being indexed, or load them with JavaScript if possible.
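To illustrate that last idea, here is a minimal sketch of loading comments client-side so they don't appear in the HTML served to crawlers. The endpoint (`/api/comments`), the container id (`comments`), and the comment field names are all hypothetical; your CMS or comment plugin will have its own.

```javascript
// Pure helper: turn an array of comment objects into HTML.
// Keeping this separate from the fetch makes it easy to test.
function renderComments(comments) {
  return comments
    .map(function (c) {
      return '<p class="comment"><strong>' + c.author + ':</strong> ' + c.text + '</p>';
    })
    .join('\n');
}

// In the browser, fetch and render the comments only after the page loads,
// so the initial HTML response contains no comment text:
//
// document.addEventListener('DOMContentLoaded', function () {
//   fetch('/api/comments?article=' + encodeURIComponent(location.pathname))
//     .then(function (res) { return res.json(); })
//     .then(function (comments) {
//       document.getElementById('comments').innerHTML = renderComments(comments);
//     });
// });
```

One caveat: Googlebot does render JavaScript in many cases, so this approach reduces rather than guarantees that the comments stay out of the index; if you need a hard block, serving the comments from a URL disallowed in robots.txt is the safer route.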
-
As I mentioned in my response, that is just one case.
But I must agree with Monica: you should place the value on searchers and user experience.
-
User generated content, in my opinion, is extremely useful. It is unique, it is informative most of the time, and it is valuable to future searchers. In this instance I would be more concerned about the value to searchers and to the user experience than about the SEO effects.
-
Hi Danne,
I remember reading a post about this from Barry Schwartz on seroundtable.com: https://www.seroundtable.com/google-panda-ser-hurt-comments-19652.html
Read it through; it describes the effect of user generated content (especially comments) quite well.
That said, this is one specific case; I am sure it is not a general rule.
Gr., Keszi