Can adding "noindex" help with quality penalizations?
-
Hello Moz fellows,
I have another question about content quality and Panda-related penalization.
I was wondering this: if an entire section of my site has been penalized due to thin content, can adding "noindex,follow" to all pages in that section help de-penalize the rest of the site in the short term, while we work to improve those penalized pages (which is going to take a long time)? Could that be considered a "short-term solution" to improve the site's overall quality scoring in Google's index, with the "noindex" tag removed once the improved pages are ready?
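For clarity, the tag I am referring to would go in the head of every page in the penalized section; a minimal sketch (the page itself is just a placeholder):

```html
<!-- Placed in the <head> of each thin-content page in the section -->
<meta name="robots" content="noindex,follow">
```

As I understand it, "noindex" asks Google to drop the page from its index, while "follow" still lets crawlers follow the page's links.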
I am eager to know your thoughts on this possible strategy.
Thank you in advance to everyone!
-
Thank you for your posting, but I have done some further research on all this, and I tend to disagree with what you state.
It is now my understanding that if you remove a page from the index, its content is no longer considered by Google, because it is literally "out of the index". Therefore, if a specific page, or a specific section of the site, that could have caused a site-wide "content" penalty is removed from the index, those pages should no longer affect any algorithmic assessment of the site's quality from a content standpoint, and the alleged "content-related penalty" should be lifted.
Can anyone else confirm that?
-
Hi Fabrizio,
I agree with Andy's response up above. Noindexing is not as good as removing the content from the website altogether, but it can still work, as long as there are no links or sitemaps that lead Google back to the low-quality content.
Noindexing the pages won't be a permanent solution, only a temporary one that might help you in the meantime.
-
I am sorry, but I haven't yet received a definitive answer to my last inquiry above...
-
Thank you Andy for your reply.
While I was waiting for an answer here, I did some further research, and it looks like this can be a good strategy for coping with Panda-related penalties, at least until the "bad content" is updated and improved:
https://mza.seotoolninja.com/community/q/noindex-vs-page-removal-panda-recovery
Your thoughts?
Thank you again!
-
Hi Fabrizio,
Yes, and no.
I have seen this work in the past, and I have also seen it make no difference. My feeling these days is that noindexing doesn't solve the issue, even while the content is being worked on, as I have seen more occurrences of it not working.
How big a problem are you trying to deal with? I helped a company with 37k pages recover from Panda a while ago, but we had to do some pretty hefty trimming of the site in order to get it back into good shape. Their issue was that thousands of similar pages each carried big pieces of the same content, so we cut out a lot of the problem areas and reshaped the site into something that made a bit more sense.
-Andy
Related Questions
-
980 links from 75 domains and graded "A" on Moz Page Grader -- still not ranking for our term. Thoughts?
A few additional interesting details: a blog post we wrote with the same keyword ranks 8th, but this page does not crack the top 20. Crazy competitive term -- the top SERP results are from HBR, Entrepreneur and Inc. We use Instapage as our landing-page builder -- could this affect our rankings? The URL is not a subdomain. Pretty stumped over here. Thanks y'all!
Intermediate & Advanced SEO | | lbernes220 -
B2B site targeting 20,000 companies with 20,000 dedicated "target company pages" on own website.
An energy company I'm working with has decided to target 20,000-odd companies on their own B2B website by producing a new dedicated page per target company, each page including unique copy and a sales proposition (20,000-odd new pages to optimize! Yikes!). I've never come across such an approach before... what might be the SEO pitfalls (other than that it's a helluva number of pages to optimize)? Any thoughts would be very welcome.
Intermediate & Advanced SEO | | McTaggart0 -
"Leeching" backlinks...yes or no?
A lot of websites, by virtue of practicality, will link to wikipedia articles to explain certain concepts. Would it be worthwhile to reach out to those websites and ask them to change the link to a different resource if that resource is a much better alternative than the wikipedia article? And how would you approach this? Thanks!
Intermediate & Advanced SEO | | mack-ayache0 -
Can you spot the differences?
Well, I have been scratching my head on this for days, so I will throw the ball to you in the hope that someone more experienced than me can help. The scenario is: e-commerce -> brand page -> SERP -> a comparison between how two pages rank, one from my website and one from a competitor's. The brand is Michelin; the keyword is "pneumatici michelin" (the Italian equivalent of "michelin tires"). I am not looking at the first SERP page, where competition is surely much fiercer. I am looking at position 11: http://www.cambio-gomme.it/marchi/michelin/ And my page (not in the first 50): http://www.gomme-auto.it/pneumatici/michelin
My page: Moz Page Grade (for "pneumatici michelin"): A; external backlinks to the page: 1; Domain Authority: 29; Page Authority: 24. On-page SEO: keyword density: 0.87%; internal links: 145; external links: 3; page size: 108 KB; HTML size: 24 KB; words on page: 2077; linked words: 408; non-linked words: 1669; time to first byte: 0.419 s.
Competitor page: Moz Page Grade (for "pneumatici michelin"): A; external backlinks to the page: 0; Domain Authority: 26; Page Authority: 13. On-page SEO: keyword density: 0.75%; internal links: 70; external links: 1; page size: 31 KB; HTML size: 9 KB; words on page: 1521; linked words: 168; non-linked words: 1353; time to first byte: 0.373 s.
Domain age is very similar; both websites launched close to each other in 2012. Ideas? Suggestions on other metrics to compare?
Intermediate & Advanced SEO | | max.favilli0 -
When is it recommended to use a self-referencing rel="canonical"?
In what type of situation is it best practice to use a self-referencing rel="canonical" tag? Are there particular things to be cautious of when using one? I see this practice used mainly on larger websites, but I can't find any information that really explains when it is a good idea for SEO purposes. Appreciate all feedback. Thank you in advance.
Intermediate & Advanced SEO | | SEO_Promenade0 -
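For reference, a self-referencing rel="canonical" is simply a canonical tag whose href is the page's own preferred URL; a minimal sketch with a hypothetical URL:

```html
<!-- In the <head> of https://www.example.com/widgets/ -->
<link rel="canonical" href="https://www.example.com/widgets/">
```

The usual rationale is that duplicate variants of the page (tracking parameters, uppercase paths, trailing-slash differences) then all serve markup declaring the one clean URL as canonical.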
Robots.txt help
Hi, we have a blog that is killing our SEO. We need to disallow the following:
Disallow: /Blog/?tag*
Disallow: /Blog/?page*
Disallow: /Blog/category/*
Disallow: /Blog/author/*
Disallow: /Blog/archive/*
Disallow: /Blog/Account/.
Disallow: /Blog/search*
Disallow: /Blog/search.aspx
Disallow: /Blog/error404.aspx
Disallow: /Blog/archive*
Disallow: /Blog/archive.aspx
Disallow: /Blog/sitemap.axd
Disallow: /Blog/post.aspx
But we want to allow everything below /Blog/Post. The disallow list seems to keep growing as we find issues, so rather than adding every problem area to our robots.txt, is there a way to simply allow /Blog/Post and ignore the rest? How do we do that in robots.txt? Thanks
Intermediate & Advanced SEO | | Studio330 -
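A possible sketch of what the poster above is asking for, assuming all post URLs live under /Blog/Post: Google resolves Allow/Disallow conflicts by the most specific (longest) matching rule, so a broad disallow can be paired with a narrower allow:

```text
User-agent: *
# Block the entire blog section by default
Disallow: /Blog/
# Re-open the post URLs; for Googlebot the longer (more specific) rule wins
Allow: /Blog/Post
```

Note that not every crawler supports the Allow directive or longest-match precedence, so this is safest when Googlebot is the main concern.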
To noindex or not to noindex
Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all these pages look very similar. For this reason we have implemented a noindex function based on certain rules: basically, only highly ranked websites are allowed to be indexed, and all other URLs are tagged noindex (for example https://en.greatfire.org/www.imdb.com). However, we are not sure this is a good strategy, so we are asking: what should a website with a lot of similar content do? 1. Don't noindex anything; let Google decide what's worth indexing and what's not. 2. Noindex most content, but allow some popular pages to be indexed. This is our current approach; if you recommend this one, we would like to know what we can do to improve it. 3. Noindex all the similar content; in our case, only let overview pages, blog posts, etc. with unique content be indexed. Another factor in our case is that our website is multilingual: all pages are available (and equally indexed) in Chinese and English. Should that affect our strategy? References: https://zh.greatfire.org https://en.greatfire.org https://www.google.com/search?q=site%3Agreatfire.org
Intermediate & Advanced SEO | | GreatFire.org0 -
Pagination: rel="next" rel="prev" in the head?
With Google releasing that instructional on proper pagination, I finally hunkered down and put in a site change request. I wanted rel="next" and rel="prev" implemented... and it took two weeks for the guy to get it done. Brutal and painful. When I looked at the source, it turned out he had put them in the body, above the pagination links, which is not what I wanted. I wanted them in the head. Before I respond to get it properly implemented, I want a few opinions: is it okay to have the rel="next" in the body? Or is it pretty much mandatory to put it in the head? (Normally, if I had full control over this site, I would just do it myself in 2 minutes... unfortunately I don't have that luxury here.)
Intermediate & Advanced SEO | | BeTheBoss1
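For reference, Google's instructional placed these tags in the document head (the link element is only valid there per the HTML spec); a minimal sketch for page 2 of a hypothetical paginated series:

```html
<!-- In the <head> of https://www.example.com/articles?page=2 -->
<link rel="prev" href="https://www.example.com/articles?page=1">
<link rel="next" href="https://www.example.com/articles?page=3">
```

The first page of the series would carry only rel="next", and the last page only rel="prev".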