Your advice regarding thin content would be really appreciated
-
Hi guys,
I have embarked on creating a new site. It is being built from scratch and is very custom.
Basically the site allows people to review certain products and services.
If each review completed by users is seen as a separate page by Google, is this considered deceptive, or likely to attract a thin content penalty?
Basically 1 product may have hundreds of reviews naturally over time. Some may be really short and some may be longer.
The reason I would like the user reviews to be seen as separate pages is that I want Google to understand that people are regularly interacting with the main content page.
Any advice in this area would be really appreciated.
-
"You could put up a separate review page for every review that page gets, but I would still choose putting it on the same page to take full advantage of it."
Yep, this is exactly what I want to do. Laura's idea sounds amazing, and I need to do some more research on how to design a page to act this way.
Thanks heaps guys!
-
Actually, the way Amazon allows users to review products is very similar to the way I want mine set up.
So there is a main product page (like Amazon's) with reviews below it. What is this 'overflow' you speak of? Can you kindly provide any direction?
-
So if I understood this correctly:
- You want to assign a new page per review
- You fear that it's thin content
If so, then it is thin. Very thin.
Google will know if you update your page, much like how pages get new comments.
You could put up a separate review page for every review that page gets, but I would still choose putting it on the same page to take full advantage of it.
Laura's suggestion of doing overflows like Amazon is also a good one. Pretty brilliant, actually. I would've forgotten about that.
-
Hi Irdeto -
I'm not sure I understand how putting each review on a separate page would make Google think that there is more interaction on the product page (am I getting that right?)
Having each review as a separate page is potentially indeed very thin, and it also sounds like a horrible user experience. Review text adds more rich text to the product page, which is great. Can you limit the reviews on a page to x reviews and then come up with a system for the overflow (potentially a dedicated product reviews page, vs. the main product page, like Amazon's)?
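To make the overflow idea concrete, it might look something like this in markup (a minimal sketch; the URLs, counts, and class names are all invented for illustration):

```html
<!-- Main product page: only the first few reviews render inline
     as regular page content, so no single thin page exists per review -->
<section class="reviews">
  <h2>Customer reviews</h2>
  <!-- first 5 reviews rendered here -->
  <a href="/products/example-widget/reviews">See all reviews</a>
</section>

<!-- /products/example-widget/reviews: a separate reviews page
     (like Amazon's) carries the rest of the reviews -->
```

The main product page still gains the rich review text, while the overflow page aggregates many reviews together instead of splitting them into hundreds of thin URLs.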
Related Questions
-
Keyword rank drop, any advice?
My search visibility dropped from around 13% a few weeks ago to 8.29%. I know that Google launched a bunch of updates in the past few weeks to ignore spam links, and I'm pretty sure that was the reason for the drop - some of the links to my site date back over 10 years and those links were garbage. Confusingly, at the same time, my Domain Authority went up by 1 to 32, then back down a week later. How can I restore my previous rank in the short term? We're designing a new site at the moment with vastly improved page speed, but I'm not sure what effect that will have yet (thespacecollective.com).
Intermediate & Advanced SEO | moon-boots0
-
Very Old Pages Creeping Up - Advice
We currently have very old pages dating back 5+ years appearing in Moz all of a sudden. We don't necessarily get traffic from these links anymore, and I doubt they still hold any weight. Currently they take you to a 404 page; would there be any worth in redirecting these links?
Intermediate & Advanced SEO | JH_OffLimits0
-
Need advice on redirects
Hi, I have new web addresses for my subpages. None of them have external links. Should I do redirects to the new pages, or just leave the old pages as 404s and let Google crawl and rank the new pages? I am asking because my current pages don't have a good ranking and I am thinking starting with a clean URL is better. Thank you,
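If you do redirect, it's one line per page on most servers; a hypothetical Apache sketch (the paths here are invented):

```apache
# .htaccess - permanent (301) redirect from each old subpage URL to its new address
Redirect 301 /old-subpage /new-subpage
```

On nginx the equivalent is `rewrite ^/old-subpage$ /new-subpage permanent;`.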
Intermediate & Advanced SEO | seoanalytics1
-
Duplicate content. Competing for rank.
Scenario: An automotive dealer lists cars for sale on their website. The descriptions are very good and in-depth at 1,200 words per car. However, chunks of the copy are copied from car review websites and weaved into their original copy.
Q1: This is flagged in Copyscape - how much of an issue is this for Google?
Q2: The same stock with the same copy is fed into a popular car listing website - the dealer's website and the classifieds website often rank in the top two positions (sometimes the dealer on top, other times the classifieds site). Is this a good or a bad thing? Are you risking being seen as duplicating/scraping content? Thank you.
Intermediate & Advanced SEO | Bee1590
-
Duplicate content with URLs
Hi all, Do you think it is possible to have duplicate content issues because we serve a single image from 5 different URLs? In the page's HTML, just one URL is provided. Is that enough so that Google doesn't see the other URLs, or not? Example, in this article: http://www.parismatch.com/People/Kim-Kardashian-sa-securite-n-a-pas-de-prix-1092112 The same image is available on:
http://cdn-parismatch.ladmedia.fr/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg
http://resize-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg
http://resize1-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg
http://resize2-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg
http://resize3-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg
Thank you very much for your help. Julien
Intermediate & Advanced SEO | Julien.Ferras0
-
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site but with this new "slow roll out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet. We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet being articles based on press releases. I'm considering noindexing everything older than a year just in case, however, that seems a bit of overkill. The question is, if I mine the logfiles and only deindex stuff that Google sends no further traffic to after a year could this be seen as trying to game the algo or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty or does the page need to be physically gone?
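The log-mining step described above can be sketched in a few lines (a rough sketch, assuming combined-log-format access logs and hypothetical /news/ paths; adjust the regex to your server's log format):

```python
import re

# Matches the request path and referrer fields of a combined-log-format line.
LOG_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" \d{3} \d+ "(?P<referrer>[^"]*)"')

def google_orphans(log_lines, all_article_paths):
    """Return article paths that received no Google-referred visits in the log."""
    referred = set()
    for line in log_lines:
        m = LOG_RE.search(line)
        # Count a path as "alive" if any visit to it was referred by Google
        if m and "google." in m.group("referrer"):
            referred.add(m.group("path"))
    return sorted(set(all_article_paths) - referred)
```

Whatever this returns after a year of logs would be your noindex candidates. As for the last question, the prevailing view is that noindexing is enough and the page doesn't need to be physically gone, since Panda works off what's in the index.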
Intermediate & Advanced SEO | AlfredPennyworth0
-
How should I exclude content?
I have category pages on an e-commerce site that are showing up as duplicate pages. At the top of each page are Register and Login links, and when selected they come up as category/login and category/register. I have 3 options to attempt to fix this and was wondering what you think is best:
1. Use robots.txt to exclude them. There are hundreds of categories, so it could become large.
2. Use canonical tags.
3. Force Login and Register to go to their own pages.
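For reference, option 1 might look like this (a sketch; the paths are hypothetical and assume the duplicates live at /<category>/login and /<category>/register):

```text
# robots.txt (option 1) - one wildcard rule per variant covers every
# category, so the file stays small even with hundreds of categories
User-agent: *
Disallow: /*/login
Disallow: /*/register
```

Option 2 would instead add a `<link rel="canonical" href="...">` pointing at the category page in the head of each login/register variant, which keeps them crawlable but consolidates them.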
Intermediate & Advanced SEO | EcommerceSite0
-
Diagnosing duplicate content issues
We recently made some updates to our site, one of which involved launching a bunch of new pages. Shortly afterwards we saw a significant drop in organic traffic. Some of the new pages list similar content as previously existed on our site, but in different orders. So our question is, what's the best way to diagnose whether this was the cause of our ranking drop? My current thought is to block the new directories via robots.txt for a couple days and see if traffic improves. Is this a good approach? Any other suggestions?
Intermediate & Advanced SEO | jamesti0