Noindex vs. page removal - Panda recovery
-
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing pages is different from Google Panda's perspective. Does noindexing pages have less value when removing poor-quality content than physically removing them, i.e. either 301ing or 404ing the page being removed and removing the internal links to it?
I presume that removing pages has a positive impact on the amount of link juice that gets to some of the remaining pages deeper into the site, but I also presume this doesn't have any direct impact on the Panda algorithm?
Thanks very much in advance for your thoughts, and for any corrections to my assumptions.
-
I think it can get pretty complicated, but a couple of observations:
(1) In my experience, NOINDEX does work - indexation is what Google cares about primarily. Eventually, you do need to trim the crawl paths, XML sitemaps, etc., but often it's best to wait until the content is de-indexed.
(2) From an SEO perspective (temporarily ignoring Panda), a 301 consolidates link juice - so, if a page has incoming links or traffic, that's generally the best way to go. If the page really has no value at all for search, either a 404 or NOINDEX should be ok (strictly from an SEO perspective). If the page is part of a path, then NOINDEX,FOLLOW could preserve the flow of link juice, whereas a 404 might cut it off (not to that page, but to the rest of the site and deeper pages).
(3) From a user perspective, 301, 404, and NOINDEX are very different. A 301 is a good alternative to pass someone to a more relevant or more current page (and replace an expired one), for example. If the page really has no value at all, then I think a 404 is better than NOINDEX, just in principle. A NOINDEX leaves the page lingering around, and sometimes it's better to trim your content completely.
So, the trick is balancing (2) and (3), and that's often not a one-size-fits-all solution. In other words, some groups of pages may have different needs than others.
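For anyone implementing the NOINDEX,FOLLOW variant mentioned above, it's just a meta tag in the page's `<head>`; a minimal sketch:

```html
<!-- Keep this page out of Google's index, but let link equity
     continue to flow through its outgoing links -->
<meta name="robots" content="noindex,follow">
```

The same directive can also be sent as an `X-Robots-Tag: noindex, follow` HTTP response header, which is useful for non-HTML resources like PDFs.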
-
Agreed - my experience is that NOINDEX definitely can have a positive impact on index dilution and even Panda-level problems. Google is mostly interested in index removal.
Of course, you still need to fix the internal link structures that might be causing bad URLs to keep appearing. Even a 404 doesn't remove a crawl path, and tons of them can cause crawler fatigue.
-
I disagree with everyone. The reason Panda hit you is that you were ranking for low-quality pages that you were telling Google you wanted indexed and ranked.
When you
a) remove them from sitemap.xmls
b) block them in robots.txt
c) noindex,follow or noindex,nofollow them in metas
you are removing them from Google's index and from the equation of good-quality vs. low-quality pages indexed on your site.
That is good enough. You can still have them return a 200 and be live on your site AND be included in your user navigation.
One example is user-generated pages, where users sign up and get their own URL, e.g. www.mysite.com/tom-jones. Those pages can be live but should not be indexed, because they usually have no content other than a name.
As long as you are telling Google "don't index these; I don't want them considered in the set of pages eligible to show up in the index," you are fine keeping these pages live!
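As a rough sketch of options (a)-(c) above, assuming a hypothetical site where user-profile pages live under a /users/ path. One caveat worth noting: if a URL is blocked in robots.txt, Google can't crawl it to see a noindex tag on the page, so (b) and (c) generally work as alternatives rather than in combination.

```text
# robots.txt -- option (b): stop crawling of thin user-profile pages
User-agent: *
Disallow: /users/

# Option (c) is set per page instead, in the HTML <head>:
#   <meta name="robots" content="noindex,follow">
```

Option (a) is simply making sure none of these URLs appear in the sitemap.xml files you submit in Webmaster Tools.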
-
Thanks guys
-
I would agree noindex is not as good as removing the content but it still can work as long as there are no links or sitemaps that lead Google back to the low quality content.
I worked on a site that was badly affected by Panda in 2011. I had some success by noindexing genuine duplicates (pages that looked really alike but did need to be there) and removing low-quality pages that were old and archived. I was left with about 60 genuine pages that needed to be indexed and rank well, so I had to pay a copywriter to rewrite all of them (originally we had the same affiliate copy as lots of other sites). It took about 3 months for Google to lift, or at least reduce, the penalty and for our rankings to return to the top 10.
Tom is right that just noindexing is not enough. If pages are low quality or duplicates, keep them out of sitemaps and navigation so you don't link to them either. You'll also need redirects in case anyone else links to them. In my experience, Google will eventually drop them from the index, but it doesn't happen overnight.
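For the redirects mentioned above, on an Apache server that might look like this (the paths here are made up for illustration):

```apache
# .htaccess -- permanently redirect removed low-quality URLs
# to the closest surviving page, preserving inbound link value
Redirect 301 /old-duplicate-page/ /main-hub-page/
RedirectMatch 301 ^/archive/2009/.*$ /blog/
```

A 301 (permanent) is what you want here rather than a 302, since it tells Google the move is final and consolidates the old page's links.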
Good luck!
-
Thanks Tom
Understand your points. The idea behind noindexing is that you're telling Google not to take any notice of the page.
I guess the question is whether that works:
- Not at all
- A little bit
- A lot
- Is as good as removing the content
I believe it's definitely not as good as actually removing the content, but not sure about the other three possibilities.
We did notice a small improvement in rankings when we noindexed a large portion of the site and actually took several hundred other pages down. It's hard to say which of those two things caused the improvement.
We've heard of it working for others, which is why I'm asking...
Appreciate your quick response
Phil
-
I don't see how noindexing pages would help with regards to a Panda recovery if you're already penalised.
Once the penalty is in place, my understanding is that it will remain until all offending pages have been removed or changed to unique content. Therefore, noindexing would not work - particularly if the page is still accessible via an HTML/XML sitemap or the site navigation. Even then, I would presume that Google has the URL logged, and if the page remained as-is, any penalty removal would not be forthcoming.
Noindexing pages that have duplicate content but haven't been penalised yet would probably prevent (or rather postpone) a penalty - although I'd still rather avoid the issue outright where possible. Once a penalty is in place, however, I'm pretty sure it will remain until the content is removed, even if noindexed.