User comments with page content or as a separate page?
-
With the latest Google updates both cracking down on low-value pages and rewarding high-quality content, would it be more beneficial to include user-posted comments on the same page as the content, or on a separate page? A separate page with enough comments on it might seem worth creating, especially as extra pages add extra pagerank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
-
Actually, on second thoughts I think the view-all page solution with rel=canonical (http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html) might be the smarter choice.
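In that setup (a sketch only; the URLs here are hypothetical placeholders), each paginated comments page would point search engines at the consolidated view-all version:

```html
<!-- In the <head> of /article/comments/2 (hypothetical URL):
     tell search engines the view-all page is the preferred version,
     so ranking signals consolidate there instead of spreading
     across the paginated copies. -->
<link rel="canonical" href="https://example.com/article/view-all" />
```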
-
Hi Peter
That's actually a pretty good idea, I like it!
The only thing I'm not sure about: if we do paginate, the product description should stay at the top of every page, while only the comments below change. That would create duplicate content, and the paginated pages with the additional comments probably wouldn't rank well anyhow, I guess. So using rel=next/prev and rel=canonical might be the right choice, even if that means only the first page with comments will be able to rank?
-
After posting this topic, we found that including all of the comments on the same page helped with long-tail queries and the like. We haven't implemented pagination for the comments, though; I think the most we have on one page is around 120 reasonably lengthy comments. I would add pagination for anything longer than that. You could use the rel="next" and rel="prev" tags on those pages to ensure the engines group them together and know they are the same piece of content. I hope this helps! Let us know what you decide.
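The pagination tags mentioned above would look something like this (a sketch with hypothetical URLs, not markup from any site in this thread):

```html
<!-- In the <head> of page 2 of a paginated comment thread:
     rel="prev"/rel="next" declare the pages as one sequence,
     so engines can treat the comment pages as parts of a
     single piece of content rather than separate thin pages. -->
<link rel="prev" href="https://example.com/article/comments/1" />
<link rel="next" href="https://example.com/article/comments/3" />
```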
-
I'm wondering about the same thing. Would you actually limit the number of user comments on a page? And if so, would you place the surplus comments on an extra page?
-
You will want the comments on the same page as the actual content, for sure. The UGC on the main page will help keep it fresh, and it's another possible reason for people to link to it. Asking a user to browse to a second page would also make it that much less likely they would actually comment. Keeping it simple is best. It's the same idea as why you would want your blog on the same subdomain as your main site, as in the newest Whiteboard Friday.
-
Comments on a separate page is a PITA.
**"...especially as extra pages add extra pagerank..."** Extra pages have nothing to do with adding PageRank. In fact, the more pages you have, the less PageRank any single page on your site has.
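That dilution effect follows directly from how the original PageRank model works: the total score is shared across a site's pages, so adding pages shrinks each page's slice. A toy sketch (simplified power iteration, not Google's production algorithm) makes the point concrete:

```python
# Toy PageRank via power iteration, to illustrate that adding pages
# dilutes the score of each individual page (not Google's actual system).

def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Base probability of a random jump, plus shares from inlinks.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        ranks = new
    return ranks

def full_site(n):
    """A site of n pages where every page links to every other page."""
    pages = [f"page{i}" for i in range(n)]
    return {p: [q for q in pages if q != p] for p in pages}

small = pagerank(full_site(5))
large = pagerank(full_site(50))
# Each page in the 50-page site holds a smaller share of the same
# total than each page in the 5-page site.
print(small["page0"], large["page0"])  # 0.2 vs 0.02
```

The total across all pages stays at 1.0 in both cases; only the per-page share changes, which is why splitting comments onto extra pages doesn't manufacture PageRank.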