Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
-
I'm asking people who have recovered from Panda to share what criteria they used, especially on sites that are not large-scale ecommerce sites.
Blog site hit by Panda 3.5. The blog has approximately 250 posts. Some of the posts are the most thorough on their subject and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building and with minimal new content.)
Bounce rate is 80% and average time on page is 2:00 min. (Even my most productive pages tend to have very high bounce rates BUT those pages maintain time on page in the 4 to 12 minute range.)
The Panda discussions I've read on these boards seem to focus on e-commerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning instead of just combining the blue model, white model, and red model all on one page like most of the ecommerce sites we've discussed.
So, I'm asking people who have recovered from Panda to share what criteria they used to decide whether to combine a page, prune a page, etc.
After I combine any series articles into one long post (driving the time on page to nice levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.)
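For illustration, a minimal .htaccess sketch of the redirect side of that combining plan, assuming an Apache host; the URLs here are hypothetical placeholders, not the blog's real slugs:

```apache
# Each retired part of a series 301s to the combined long-form post,
# so existing links and bookmarks keep their value (mod_alias syntax).
Redirect 301 /series-part-1 /complete-series-guide
Redirect 301 /series-part-2 /complete-series-guide
Redirect 301 /series-part-3 /complete-series-guide
```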
How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or try some of both?
If I post unique and informative video content (hosted on site using Wistia), what should I expect for a range of the decrease in bounce rate?
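One measurement caveat worth adding here: in classic Google Analytics, any event counts as an interaction, so simply firing an event when the video starts playing stops those visits from being counted as bounces at all. A minimal sketch, assuming the classic async (_gaq) snippet is already on the page; the hook that calls this from the video embed is left hypothetical:

```javascript
// Report a video play to Google Analytics. Because this is an
// interaction event, the visit no longer counts as a bounce,
// which by itself will lower the reported bounce rate.
function onVideoPlay(videoTitle) {
  _gaq.push(['_trackEvent', 'Videos', 'Play', videoTitle]);
}
```

So part of any bounce-rate drop after adding video will be a measurement artifact of event tracking, not just better engagement.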
Thanks for reading this long post.
-
Alan: Thanks for sharing your experience in such detail.
-
After almost 2 years of Panda destruction and constant work on my site with no recovery whatsoever, I don't know if I have anything useful to contribute yet, so take this as some input.
Large site with over 2.2 million pages.
Deleted around 1.5 million pages.
Removed all duplicate titles (removed or fixed).
Removed all duplicate descriptions (removed or fixed).
Removed all problem pages (extra short, damaged content, empty).
Removed all duplicate body content pages.
Prevent addition of any new duplicates and, if any slip past, fix within 24 hours (a rough detection sketch follows below).
Also checked incoming links and discovered some sites with problems pointing in; fixed or had these removed.
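For illustration of the duplicate-detection step only (this is not Alan's actual tooling): a minimal Python sketch that groups crawled URLs by title, description, and a hash of the body text. The CSV name and column layout are hypothetical; adapt them to your crawler's export.

```python
import csv
import hashlib
from collections import defaultdict

def find_duplicates(crawl_csv):
    """Group crawled URLs by title, description, and body hash.

    Expects a CSV with columns: url, title, description, body
    (hypothetical layout -- adapt to your crawler's export).
    """
    groups = {"title": defaultdict(list),
              "description": defaultdict(list),
              "body": defaultdict(list)}
    with open(crawl_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            groups["title"][row["title"].strip().lower()].append(row["url"])
            groups["description"][row["description"].strip().lower()].append(row["url"])
            # Hash the body so millions of pages fit comfortably in memory.
            body_hash = hashlib.md5(row["body"].encode("utf-8")).hexdigest()
            groups["body"][body_hash].append(row["url"])
    for label, table in groups.items():
        for urls in table.values():
            if len(urls) > 1:
                print("duplicate %s: %s" % (label, ", ".join(urls)))

find_duplicates("crawl_export.csv")
```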
RESULT after almost 2 years? Zero improvement.
Almost ready to slash wrists, but about to try subdomaining first.
It would be funny if it weren't so sad.
Related Questions
-
Google Page Speed Score 91, But 5-8 Seconds to Download URL
Greetings Moz Community:

In Google Analytics, under "Behavior" > "Site Speed", our home page has a page speed score of 91, which I assume is pretty fast. However, the "Average Page Load Time" varies between 5 and 8 seconds, which seems very slow. My developers have made major efforts to optimize the home page URL (www.nyc-officespace-leader.com) for speed. The page has a carousel, which I assume may be slowing it down.

Is the download speed of this page detrimental to SEO, or is the favorable page speed score good enough? I am particularly concerned because the most competitive phrases are ranked on the home page. As it stands, I am having a lot of difficulty ranking in the top ten for these pages, and my concern is that the slow download speed of the home page could be holding back rankings for these terms.

If necessary, I can always redesign the home page and remove the carousel, or reduce the number of listings in the carousel to speed it up. Is this worth investing effort in, or is the speed good enough?

Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
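Before redesigning, one low-risk experiment is to stop the carousel's JavaScript from blocking the first render. A minimal sketch, assuming a self-hosted script; the file name is hypothetical:

```html
<!-- "defer" downloads the script in parallel with parsing but runs it
     only after the document is parsed, so it stops blocking first paint. -->
<script src="/js/carousel.js" defer></script>
```

Note this improves render blocking; the GA "Average Page Load Time" metric may still be dominated by the total image weight of the carousel slides.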
2.3 million 404s in GWT - learn to live with 'em?
So I'm working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I've ever worked on; heck, every other site I've ever worked on combined would be a rounding error compared to this.

Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain.

The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That number has since grown progressively higher; now we have 2.3 million 404s in GWT. Based on what I've been able to determine, links on this site relative to the data feed break for one of two reasons: the page just doesn't exist anymore (i.e. it wasn't found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (the page still exists, just under a different link).

With other sites I've worked on, 404s aren't that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn't an option due to the potential bloat in the htaccess file. Based on what I've read here and here, 404s in and of themselves don't really hurt the site's indexation or ranking. And the more I consider it, the really big sites - the Amazons and eBays of the world - have to contend with broken links all the time due to product pages coming and going. Bottom line: if we really want to refresh the data on the site on a regular basis - and I believe that is priority one if we want the bot to come back more frequently - we'll just have to put up with broken links on the site on a more regular basis.

So here's where my thought process is leading:

- Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well; hopefully this will help the site stay current in the index.
- Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on a more or less ongoing basis.
- Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.

We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates.

Thoughts? If you think I'm off base, please set me straight. 🙂
Intermediate & Advanced SEO | ufmedia
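On the htaccess-bloat point: Apache can also look redirects up from an external map instead of holding millions of rules in a config file. A minimal sketch, assuming Apache with access to the server or virtual-host config (RewriteMap is not permitted in .htaccess); the paths are hypothetical:

```apache
# Server / virtual-host config, not .htaccess.
RewriteEngine On

# Key-value table of old-path -> new-URL pairs; the dbm format keeps
# lookups fast no matter how many million entries the table holds.
RewriteMap redirectmap "dbm:/etc/apache2/redirects.dbm"

# 301 only when the requested path has an entry in the map;
# everything else falls through to a normal 404.
RewriteCond "${redirectmap:$1|NONE}" "!=NONE"
RewriteRule "^/(.+)$" "${redirectmap:$1}" [R=301,L]
```

The dbm file can be regenerated from a plain-text list with Apache's bundled httxt2dbm tool each time the data feed is refreshed.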
Noindex, rel=canonical, or no worries?
Hello, SEO pros, We need your help with a case ↓

Introduction: Our website allows individual contractors to create a webpage where they can show what services they offer, write something about themselves, and show their previous projects in pictures. All the professions and services, assigned accordingly, are already in our system, so users need to pick a profession and mark all the services they provide, or suggest ones we missed. We have created unique URLs for all the professions and services, and we have an internal search field with autocomplete to direct users to the right page.

Example:
PROFESSION: Carpenter (URL: /carpenters)
SERVICES:
Decking (URL: /carpenters/decking)
Kitchens (URL: /carpenters/kitchens)
Flooring and staircases (URL: /carpenters/flooring-and-staircases)
Door trimming (URL: /carpenters/door-trimming)
Lock fitting (URL: /carpenters/lock-fitting)

Problem: We want to be found in Google search for all the services and give searchers a list of all the carpenters in our database who provide the service they want. We show 15 contractors per page and rank them by client recommendations. Our concern is that our results pages may be flagged as duplicates, since some of them show the same list of carpenters: if all of the best 15 carpenters offer door trimming and lock fitting, the same 15 appear on /carpenters, /carpenters/lock-fitting, and /carpenters/door-trimming. We don't want to be marked as spammers and lose domain trust, but we believe we offer quality content, since we give searchers what they want to find: contractors who offer the service they need.

Solution?
1. Noindex all service pages to avoid duplicate content being indexed by Google, OR
2. Add a rel=canonical tag on service pages pointing to the profession page (e.g. on /carpenters/lock-fitting, rel=canonical to /carpenters), OR
3. No worries: allow Google to index all the profession and service pages. The benefit of indexing them all (around 2,500 additional pages with different keywords) may be greater than tagging service pages with noindex or rel=canonical and losing the opportunity to get more traffic from service titles.

We need the solution that would be best for our organic traffic 🙂 Many thanks for your precious time.
Intermediate & Advanced SEO | osvaldas
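For reference, the two tagging options being weighed look like this in a service page's head (shown for the lock-fitting example; the domain is a placeholder):

```html
<!-- Option 1: keep /carpenters/lock-fitting out of the index
     while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: consolidate the page's signals into the profession page. -->
<link rel="canonical" href="http://www.example.com/carpenters">
```

Note that rel=canonical tells Google the pages are equivalent, so option 2 effectively gives up rankings for the service-specific keywords, which is exactly the trade-off the question describes.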
Does this make sense to recover from Panda?
Hello guys, our website was pandalized on 9/27/2012 and we haven't been able to recover since then. I've fixed as much as possible when it comes to poor content, and we have been getting high-quality links consistently for the past 3-4 months. Our blog had some duplicate content issues due to categories, tags, feeds, etc.; I solved those problems before the past 2 refreshes, without success. I'm now considering moving the blog to a subdomain and letting it grow on its own; more than PageRank, I'm interested in recovering from Panda. What do you think about that?
Intermediate & Advanced SEO | DaveMri
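If the blog does move, each old URL should 301 to its new home so the existing links carry over. A minimal .htaccess sketch, assuming Apache, a /blog/ directory today, and a hypothetical domain:

```apache
# .htaccess at the site root: send every /blog/ URL,
# path intact, to the new subdomain.
RewriteEngine On
RewriteRule ^blog/(.*)$ http://blog.example.com/$1 [R=301,L]
```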
Cross Domain Rel Canonical for Affiliates?
Hi We use the cross-domain rel canonical for duplicate content between our own websites, but what about affiliate sites that want our XML feed (descriptions of our products)? We don't mind being credited, but would this present a danger for us? Who controls the use of that cross-domain rel canonical: us in our feed, or them? Is there another way around it?
Intermediate & Advanced SEO | xoffie
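On the control question: rel=canonical lives in the rendered page, so whoever renders the page controls it; a feed can only supply the target URL and ask affiliates to use it. A sketch of both halves, with hypothetical field and domain names:

```xml
<!-- In the XML product feed: expose the canonical URL as a field
     affiliates are asked to render in their page's head. -->
<product>
  <name>Example widget</name>
  <description>Our original product description...</description>
  <canonical_url>http://www.oursite.com/products/example-widget</canonical_url>
</product>
```

On the affiliate's page that value would become `<link rel="canonical" href="http://www.oursite.com/products/example-widget">`, but only if the affiliate actually implements it, which is why the control ultimately sits with them.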
Panda 2.5
I'm sure we have all read about the latest round of Google's algorithm changes, also known as the "Panda 2.5" update. This latest update seems to have hit some pretty large press release sites, including PR Newswire and Businesswire (both of these have great PageRank and domain authority, making them a useful tool for SEOs with regard to inbound links). Ultimately this update has directly affected their traffic, keyword rankings, and number of indexed pages in Google. But what will this do to our smaller sites that benefit from these links? Will these Panda updates continue to target these content farms and lower their domain authority? Will that extrapolate out and affect the domain authority of our sites? What are your thoughts, for those of us that utilize these services: should we re-evaluate our process? I look forward to a great discussion. Regards - Kyle
Intermediate & Advanced SEO | kchandler
Steps you can take to ensure your content is indexed and registered to your site before a scraper gets to it?
Hi, A client's site has significant amounts of original content that has blatantly been copied and pasted into various competitor and article sites. I'm working with the client to rejig lots of this content and to publish new content. What steps would you recommend taking when the new, updated site is launched to ensure Google clearly attributes the content to the client's site first? One thing I will be doing is submitting a new XML + HTML sitemap. Thank you
Intermediate & Advanced SEO | Qasim_IMG
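Alongside resubmitting sitemaps, you can ping Google the moment new content is published so the original is crawled before scrapers repost it. A minimal Python sketch using Google's public sitemap ping endpoint; the sitemap URL is a placeholder:

```python
import urllib.parse
import urllib.request

def ping_google(sitemap_url):
    """Ask Google to re-fetch the sitemap right after publishing."""
    ping = ("http://www.google.com/ping?sitemap="
            + urllib.parse.quote(sitemap_url, safe=""))
    with urllib.request.urlopen(ping) as resp:
        return resp.status  # 200 means the ping was received

print(ping_google("http://www.example.com/sitemap.xml"))
```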
Should I be using rel canonical here?
I am reorganizing the data on my informational site into a drilldown menu. So, here's an example: on the home page are several different items. Let's say you clicked on "Back Problems". Then you would get a menu that says: Disc problems, Pain relief, Paralysis issues, See all back articles. Each of those pages will have a list of articles that fit. Some articles will appear on more than one page. Should I be worried about these pages being partial duplicates of each other? Should I use rel=canonical to make the root page for each section the one that is indexed? I'm thinking no, because I think it would be good to have all of these pages indexed. But then, that's why I'm asking!
Intermediate & Advanced SEO | MarieHaynes