Can adding "noindex" help with quality penalizations?
-
Hello Moz fellows,
I have another question about content quality and Panda related penalization.
I was wondering this: if an entire section of my site has been penalized due to thin content, can adding "noindex,follow" to all pages in that section help de-penalize the rest of the site in the short term, while we work on improving those penalized pages (which is going to take a long time)? In other words, could this serve as a short-term solution to improve the site's overall quality scoring in Google's index, with the "noindex" tag removed once the improved pages are ready?
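For context, the tag in question would go in the head of every page in the affected section, something like this (a generic example, not your actual markup):

```html
<!-- Tells crawlers not to index this page, but still follow its links -->
<meta name="robots" content="noindex,follow">
```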
I am eager to know your thoughts on this possible strategy.
Thank you in advance to everyone!
-
Thank you for your post, but I have done some further research on all this, and I tend to disagree with what you state.
It is now my understanding that if you remove a page from the index, its content is no longer considered by Google, because it is literally "out of the index". Therefore, if a specific page or section that may have caused a site-wide content penalty is removed from the index, those pages should no longer affect any algorithmic assessment of the site's content quality, and the alleged content-related penalty should be lifted.
Anyone else can confirm that?
-
Hi Fabrizio,
I agree with Andy's response above. Noindexing is not as good as removing the content from the website altogether, but it can still work as long as there are no links or sitemaps that lead Google back to the low-quality content.
Noindexing the pages won't be a permanent solution, only a temporary one that might help you in the meantime.
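If you do go this route, it's worth verifying that the noindex directive is actually present on every page in the section before assuming Google sees it. A minimal sketch of a checker using only the Python standard library (the page HTML would come from fetching each URL in the section; the function name is my own, not from any Moz tool):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]


def has_noindex(html: str) -> bool:
    """Return True if the page's robots meta tag includes a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

You would feed each fetched page of the penalized section through `has_noindex` and flag any page where it returns False. Note this only checks the meta tag; a noindex can also be sent as an `X-Robots-Tag` HTTP header, which this sketch doesn't cover.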
-
I am sorry, but I still haven't received a definite answer to my last question above...
-
Thank you Andy for your reply.
While I was waiting for an answer here, I did some further research, and it looks like this can be a good strategy for coping with Panda-related penalties, at least until the "bad content" is updated and improved:
https://mza.seotoolninja.com/community/q/noindex-vs-page-removal-panda-recovery
Your thoughts?
Thank you again!
-
Hi Fabrizio,
Yes, and no.
I have seen this work in the past, and I have also seen it make no difference. My feeling these days is that noindexing doesn't solve the issue, even while the content is being worked on, as I have seen more cases of it not working.
How big a problem are you trying to deal with? I did help a company with 37k pages recover from Panda a while ago, but we had to do some pretty hefty trimming of the site in order to get it back into good shape again. Their issue was that thousands of similar pages all shared big pieces of the same content, so we cut out a lot of the problem areas and pulled the site into something that made a bit more sense.
-Andy