Panda Recovery - What is the best way to shrink your index and make Google aware?
-
We have been hit hard by Panda and assume that the reason is our large index, with some pages holding thin/duplicate content.
We have reduced our index size by 95% and have done significant content development on the remaining 5% of pages.
For the old, removed pages, we have installed 410 responses ("Page does not exist any longer") and made sure that they are removed from the sitemap submitted to Google. However, after more than a month we still see Google's spider returning to the same pages, and Webmaster Tools shows no indication that Google is shrinking our index.
Are there more effective and automated ways to make Google aware of a smaller index size, in the hope of Panda recovery? Potentially using the robots.txt file, the GWT URL removal tool, etc.?
Thanks /sp80
-
Hi. I would be curious to know if anyone else has experienced something similar and recovered from Panda. How long did it take you? Did you manually remove the pages, set up 410s or 404s, or create 301s?
I've been working on a site for some time now which has lost a great deal of traffic since July 2013. Over the past two months we have been manually removing URLs from the index. The index has been cut in half, but it's still not at what it was pre-penalty. There are about 20,000 more pages to evaluate for removal before the index reaches the level it was at before the massive traffic drop.
Any recovery or insight would be helpful.
-
Hi Sp80 (and group),
It's been about six months since you posted your Panda recovery question. I'm curious if you implemented Kerry22's suggestions, and what results you've seen. I hope it's worked out for you.
We're also dealing with removing thousands of pages of thin content (through 410s, keeping links up and sitemaps, as per Kerry's suggestion). This was a very helpful discussion to read.
Thanks,
Tom
-
Hi kerry,
Your post gives me some hope. I was hit by Panda in Feb. 2011 and lost 85% of my Google traffic. I made many changes to my site -- page deletions, redirects, added content, etc. -- and got a bump of 25% in September 2011, but lost that and more afterward.
We have an e-commerce gift site with 6000 pages. Is your site an e-commerce site?
I have not found a recovery story from any sites like mine that were hit with that large a drop.
I hope your recovery story relates to my situation.
-
Did Google process the 301s? In other words, are the old pages still in the index or not? If they processed the 301s eventually, you generally should be ok. If the old URLs seem stranded, then you might be best setting up the XML sitemap with those old URLs to just kick Google a little. I don't think I'd switch signals and move from a 301 to 404, unless the old pages are low quality, had bad links, etc.
Unfortunately, these things are very situational, so it can be hard to speak in generalities.
-
Hi Dr. Pete,
I know this is a late entry into this thread, but what if we did all our content cutting the wrong way over the past year? Is there something we could or should do now to correct for this? Our site was hit by Panda back in March 2012, and since then we've cut content several times. But we didn't use the good process you advocate - here's what we did when we cut pages:
1. We set up permanent 301 redirects for all of them immediately
2. Simultaneously, we always removed all links pointing to cut pages (we wanted to make sure users didn't get redirected all the time).

This is a far cry from what you recommend and what Kerry22 did to recover successfully. If you have some advice on the following questions, I'd definitely appreciate it:
- Is it possible Google still thinks we have this content on our site or intend to bring it back, and as a result we continue to suffer?
- If that is a possibility, then what can we do now (if anything) to correct the damage we did?
We're thinking about removing all of those 301s now, letting all cut content return 404s and making a separate sitemap of cut content to submit it to Google. Do you think it's too late or otherwise inadvisable for us to do this kind of thing?
Thanks in advance,
Eric
-
It might be worth exploring NOINDEX'ing the useful pages and 410'ing the non-useful ones, if only because sometimes a mix of signals is more palatable to Google. Any time you remove a swath of content with one method, it can trigger alarm bells. I'll be honest, though - these situations are almost always tricky and you almost always have to measure and adjust. I've never found a method that's right for all situations.
-
Thanks Pete,
I appreciate your input. In addition to the extra sitemap containing the known Google-indexed URLs we want deindexed, we have also reopened some crawl paths to these pages to see if that speeds things up.
This is an undertaking carried out across 30 international properties, so we will be able to experiment with different measures on certain domains and see how they affect de-indexing speed, as we are tracking the numbers reported by Google daily.
I agree about the bad user experience of 410s as a dead end. We are mostly de-indexing as a means of recovering from Panda, but the content pages we are trying to deindex are actually still useful to users - just thin and partially duplicative in content. We have decided to still display the content when such a page is reached, but return a status code of 410. Alternatively, it seems we could just set the robots tag to noindex, but my feeling is that the 410 approach will lead to faster deindexing - would you agree?
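To illustrate the setup: a minimal sketch of the idea as a bare WSGI handler (the app structure and paths are hypothetical, not our actual stack) - the body is still rendered for human visitors, but the status line tells crawlers the URL is permanently gone.

```python
# Sketch: keep serving the old page's content to visitors, but answer
# "410 Gone" so crawlers drop the URL from the index.
# Paths would be loaded from a file or database in practice.
GONE_PATHS = {"/old-product-123", "/duplicate-category/page-7"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    body = b"<html><body>Content still shown to users.</body></html>"
    if path in GONE_PATHS:
        # 410 signals permanent removal more strongly than a 404,
        # even though the response body still contains the page.
        start_response("410 Gone", [("Content-Type", "text/html")])
    else:
        start_response("200 OK", [("Content-Type", "text/html")])
    return [body]
```

Any WSGI-compatible server can host this; a visitor sees the normal page while Googlebot records the 410.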
Also, if you have any expertise to share on how to compile a more comprehensive list of URLs indexed by Google for a particular domain, other than scraping the web interface using the site:domain.com query approach (which only returns a small subset of the stated total number of indexed pages), please let me know.
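Since no single source seems complete, one workaround is to merge the partial lists you do have (site: scrapes, server logs, old sitemaps) into a single deduplicated inventory. A rough sketch - the helper name and its trivial normalization rule are assumptions, not established practice:

```python
def merge_url_sources(*sources):
    """Merge partial URL lists (site: scrapes, crawl logs, old sitemaps)
    into one deduplicated inventory; no single source is complete."""
    seen = {}
    for source in sources:
        for url in source:
            # Normalize trivially so /page and /page/ don't double-count.
            key = url.rstrip("/").lower()
            seen.setdefault(key, url)  # keep the first spelling we saw
    return sorted(seen.values())
```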
Thanks again /Thomas
-
If you want to completely remove these pages, I think Kerry22 is spot on. A 410 is about the fastest method we know of, and her points about leaving the crawl paths open are very important. I completely agree with leaving them in a stand-alone sitemap - that's good advice.
Saw your other answer, so I assume you don't want to 301 or canonical these pages. The only caveat I'd add is user value. Even if the pages have no links, make sure people aren't trying to visit them.
This can take time, especially at large scale, and a massive removal can look odd to Google. This doesn't generally result in a penalty or major problems, but it can cause short-term issues as Google re-evaluates the site.
The only option to speed it up is this: if the pages have a consistent URL parameter or folder structure, you may be able to do a mass removal in Google Webmaster Tools. This can be faster, but it's constrained to similar-looking URLs - in other words, there has to be a pattern. The benefit is that you can make the GWT request on top of the 410s, so that can sometimes help. Any massive change takes time, though, and often requires some course correction, I find.
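To check whether such a pattern exists in a large removal list, a few lines of scripting can bucket the URLs by their first path segment (a hypothetical helper, not part of any GWT API):

```python
from collections import Counter
from urllib.parse import urlparse

def removal_patterns(urls, min_count=2):
    """Bucket URLs by first path segment; big buckets are candidates
    for a single directory-based removal request in GWT."""
    counts = Counter()
    for url in urls:
        segments = [s for s in urlparse(url).path.split("/") if s]
        if segments:
            counts["/" + segments[0] + "/"] += 1
    # Only return prefixes that cover enough URLs to be worth a request.
    return [(p, n) for p, n in counts.most_common() if n >= min_count]
```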
-
I think the second sitemap will be fine. I wouldn't add a page with just links, as that is the type of page Panda doesn't like.
Regarding sets of pages - we started by going into the search results and found a lot of content that shouldn't have been indexed.
We then looked manually at the content on subsets of pages and found pages that were thin and very similar to others (at the product level), and either made them more unique or removed them. Tools like this can also help identify similar pages across products/categories: http://www.copyscape.com/compare.php
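For a rough in-house version of that kind of comparison, the standard library's difflib can score page-text pairs for near-duplication (the 0.8 threshold is illustrative, not a tuned value):

```python
import difflib

def find_near_duplicates(pages, threshold=0.8):
    """pages: dict of url -> extracted body text. Returns URL pairs whose
    text is suspiciously similar - candidates to rewrite or remove."""
    urls = sorted(pages)
    pairs = []
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            ratio = difflib.SequenceMatcher(None, pages[a], pages[b]).ratio()
            if ratio >= threshold:
                pairs.append((a, b))
    return pairs
```

This is quadratic in the number of pages, so at product-catalog scale you would compare only within a category.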
It's only been 2 weeks, but it looks like we have pretty much 80% recovered and are still improving. We're still watching the numbers, and over Christmas and New Year traffic is obviously quiet. I think 100% recovery depends on too many variables: whether you continue link building while fixing the site, losing links by removing pages, adding more pages, competitors gaining authority/rankings, etc.
-
Hey Kerry,
We added a significant number of pages in April, which is also when our sites started seeing a decrease in rankings - so the timing adds up.
The drops starting in June have no clear root cause for us - we started our de-indexation process at the start of December.
We are thinking of driving deindexation exclusively through a second Google sitemap, as anything else would require a very artificial landing page with a high number of links at this point. Would you be concerned about relying exclusively on a sitemap rather than keeping the unwanted pages linked from your linking structure?
Further, I am interested in how you determined the set of pages that you knew were part of the Google index and needed to be delisted. It appears the best way is to scrape the Google search results returned for the domain and build up a list that way.
Did you recover completely to pre-Panda levels?
Best /Thomas
-
Hi
No problem, I am happy to help!
Yes, the graph declined sloooowly - but only once we started removing pages. This is half the problem: you have to wait for Google to find the changes. The waiting is frustrating, as you don't know if what you've done is right, but the things I listed will help speed it up. We literally had to wait until none of the pages could be found in the index.
I see a big increase in your indexation from April to May 2012. When did you get hit, and what happened over that month - did you add a lot of new pages/products? Are those drops in indexation from June to Dec 2012 from you removing pages, or did the drop just start to 'happen' and then you got hit?
-
Kerry,
Thank you for your amazing response to my deindexing question. It was incredibly well written and very easy to follow. Very happy to hear you were able to recover.
You make a really good point about allowing Google to still reach the pages. When we reviewed our site structure we also changed our linking structure, so while all the pages we no longer want in the index return a 410, they certainly aren't all discoverable. Our assumption was that Google will revisit them sooner or later, given that they are part of the index, but I can definitely imagine that things would be sped up by compiling a dedicated sitemap.
A big question I have for you is how the index status graph in GWT adjusted for you over time. We started our restructuring at the start of January and we can't see a difference yet: http://imgur.com/eKBJ0
Did your graph decline step by step?
Thanks again
-
Hi
We just recovered from Panda - it took us 6 months. The best way to do this is to 410 or 404 your pages, but don't remove the links. If you remove the links to those pages, Google won't be able to find them and know that you have removed them.
Here are the steps you need to follow to get the changes indexed:
1. Remove the pages but leave the links to them on your site (we left these discreetly at the bottom of the pages they were on, so users wouldn't find them easily, but Google would). You will see Google slowly start to pick up the number of 404s/410s in Webmaster Tools - don't worry about so many 410s being picked up; it won't hurt you. Don't nofollow the links, remove the links, or block the pages with robots.txt. You want Google to find your changes.
2. Revise your sitemaps - take the 410 pages out of the original sitemap and add them to a new separate sitemap and submit this in Webmaster Tools. Then you can see the true indexation rates of your current pages (gives you a good idea of how many are indexed vs not and if you still have issues). You can then also track the deindexation of your 410s separately - see how fast they are being deindexed - be patient, it takes time. We only recovered once they were all deindexed.
3. Our decision to use sitemaps as well as internal links was due to the fact that some deep pages are only crawled periodically and we wanted Google to find the changes quickly. This is useful: http://www.seomoz.org/blog/logic-meet-google-crawling-to-deindex
4. Then wait. If all your pages are removed and you are still affected by Panda, start looking for more duplicate content, and look with an objective eye at the pages that still exist. You may be surprised by what you find. The process took us 6 months because we had to wait for Google to pick up our changes, and then revise, tweak, look for more to do, etc.
I will write a case study soon, but in the meantime hope this helps you! I know how frustrating it is.
PS. If you are losing link value from 410s, 410 first, recover from Panda, and then 301 the select pages that have links to get the link juice back. It will be faster that way.
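The stand-alone sitemap in step 2 is easy to generate; here is a minimal sketch using the standard library, assuming you have the removed URLs as a plain list (function name is hypothetical):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_removed_sitemap(urls):
    """Build a separate sitemap for the 410'd URLs so Google recrawls
    them quickly and their deindexation can be tracked on its own."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        urlset, encoding="unicode")
```

Submit the resulting file in Webmaster Tools alongside, not instead of, the sitemap of pages you are keeping.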
-
Google has already been recrawling those pages over the last few months and keeps returning to the pages that return a 410. We have very explicit logging configured.
The Google URL removal tool is not an option due to the manual nature of the submission process.
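As a sketch of the kind of check such logging enables (the regex assumes combined log format and a Googlebot user-agent string - both assumptions that would need adjusting to the real setup):

```python
import re
from collections import Counter

# Hypothetical matcher for combined-log-format lines fetched by Googlebot.
LOG_RE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*Googlebot')

def googlebot_410_hits(log_lines):
    """Count Googlebot fetches per URL that answered 410, confirming the
    removal signal is actually being seen by the crawler."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("status") == "410":
            hits[m.group("path")] += 1
    return hits
```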
-
I think you need to wait for Google to recrawl these pages... however, you can use the Google URL removal tool in Webmaster Tools...
-
Thanks,
To be clear - my question is not looking for recovery proposals but for implementation advice around shrinking the Google index size. We are talking about a scale of tens of thousands of pages. /Thomas
-
What about this approach - I am assuming that you know the exact date when the rankings fell.
You need to compare the Google traffic for each page and find the pages that suffered the most. Either remove them [exactly what you are doing already] or completely rewrite them, adding nice images, videos, etc. - in short, make them more interactive.
Now locate the pages that were not affected as much. You need to make slight changes to them. Do not remove these pages.
Now locate the pages that were not affected at all. If those pages are content-heavy, you should produce more pages with similarly well-written content.
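That triage could be scripted along these lines (a hypothetical helper; the thresholds are illustrative, and before/after are per-page visit counts for comparable periods around the known drop date):

```python
def triage_pages(before, after, heavy_drop=0.5, slight_drop=0.1):
    """Bucket pages by how much Google traffic they lost after the hit."""
    buckets = {"rewrite_or_remove": [], "tweak": [], "leave_and_expand": []}
    for url, old in before.items():
        new = after.get(url, 0)
        loss = 1 - (new / old) if old else 0.0
        if loss >= heavy_drop:
            buckets["rewrite_or_remove"].append(url)  # suffered the most
        elif loss >= slight_drop:
            buckets["tweak"].append(url)              # slightly affected
        else:
            buckets["leave_and_expand"].append(url)   # unaffected
    return buckets
```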
Hope that helps.
-
Correct, it is intentional. The removed links have no link juice. The hope, though, is that an explicit 410 is a clearer signal for Google to remove the pages from the index.
I have been reading warnings about implementing a significant volume of 301s, as it could be considered unnatural.
-
Just curious - is there any reason you did a 410 instead of a 301? I think most webmasters would set up 301 redirects to the most relevant remaining page for each of the pages that you removed. With a 410, you're effectively dropping any backlinks that might have existed to those pages.