"Noindex, follow" for thin pages?
-
Hey there Mozzers,
I have a question regarding thin pages. Unfortunately, we have thin pages on our site, almost empty to be honest. My idea is to ask the dev team to apply "noindex, follow" to these pages. What do you think?
Has someone faced this situation before?
I'd appreciate your input!
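For context, the directive in question is a single robots meta tag in the page's head. This is a generic sketch, not markup from any actual site:

```html
<!-- Placed in the <head> of a thin page: asks search engines not to index
     the page itself, while still crawling and passing the links on it -->
<meta name="robots" content="noindex, follow">
```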
-
+1 to EGOL and Gianluca. We need more information about those pages.
In any case, if we are talking about thin content that is nonetheless quality content, and it isn't duplicated or written purely for SEO, I would not use noindex on it.
If we are talking about empty or almost-empty pages, it may be better to use noindex, or perhaps better still to delete them and 301-redirect them.
I would also reduce the internal linking to those pages, or move those internal links lower on the page or to places with less visibility. Just that.
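For illustration, here is roughly how the two options above (noindex vs. delete-and-301) could look in an Apache .htaccess file. The file names and URLs are made up, and the header line assumes mod_headers is enabled:

```apache
# Hypothetical .htaccess sketch; URLs and file names are made up.

# Option A: keep the page live but out of the index, while its links
# still get crawled (equivalent to the noindex,follow robots meta tag).
# Requires mod_headers.
<Files "almost-empty-page.html">
    Header set X-Robots-Tag "noindex, follow"
</Files>

# Option B: the page has no value at all, so delete it and
# 301-redirect it to the closest relevant page. Requires mod_alias.
Redirect 301 /almost-empty-page.html /relevant-category/
```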
Greetings!
-
EGOL was right to ask for more information, also for one precise reason: on some websites a "thin page" may be the best thing that site can offer a visitor, because that page answers exactly what the user needs from it.
That is why Googlers so often say that thin content per se is not a problem.
It's a problem if it is due to some technical issue or to bad on-page SEO (e.g. a page with a photo but no caption or written description of the photo).
So, to better answer your question, we need to know more about the nature of those thin pages you are talking about.
P.S.: using "noindex, follow" is no longer suggested by Googlers. In fact, a few months ago John Mueller said that if Google sees a page with "noindex, follow" for a long time, it will start treating the "follow" as a "nofollow", so the original reason for using it won't be satisfied.
-
If you want good responses to this question, then post more about these pages (current content, how many, current traffic, current rankings, recent problems, purpose of pages, etc.) and more about your site (current content, how many, current traffic, current rankings, recent problems, etc.).
Questions with little information are often ignored by people who might know a lot about the subject: they don't want to guess, they don't want to think through and write about every possible case, and they don't want to put effort into a question when the poster didn't put much effort into explaining it.
Also, who are you? Owner? Employee? SEO? Are you the guy who put these pages up and didn't put any content on them? The guy who paid for the skinny content that is currently up there and needs to have input on yanking them down or paying for proper content?