Panda 2.2 Full Recovery In Action
-
I have had several new clients come to me after Panda and Panda 2, which has meant a lot of audits. The client who had the worst problems, and who has since corrected the worst issues based on my audit, just bounced back in an epic way. While it could be a short-term thing, I don't believe that's the case: it's just too big of a jump back. Full recovery.
I'm curious to find out if anyone is seeing a similar recovery on their sites.
FYI, the biggest problems (most of which have now been resolved) included:
- Content organization - it was a mess of a site
- Extreme over-use of ads on the page and in the content
- Topical focus - there was so much going on across every page of the site that it confused Google
- Major site speed issues
-
They were running several ad networks feeding ads everywhere, and the system was choking on it. Then there's the fact that it's a Drupal site, set up with the default native taxonomy, causing serious data-extraction bottlenecks.
Both issues had to be addressed. It's still slow, but much better.
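To give a sense of the kind of fix a data-extraction bottleneck like this calls for: when every page build repeats the same expensive taxonomy lookups, caching the results slashes backend load. This is a hypothetical, simplified sketch (not the actual Drupal code; the function and numbers are invented for illustration):

```python
from functools import lru_cache

# Counter showing how often the "database" is actually hit;
# the query itself is simulated.
backend_hits = 0

@lru_cache(maxsize=None)
def taxonomy_terms(node_id: int) -> tuple:
    """Fetch the taxonomy terms for a node (simulated expensive query)."""
    global backend_hits
    backend_hits += 1
    return (f"term-{node_id % 5}",)  # placeholder result

# Simulate rendering 1,000 pages that all need taxonomy data,
# drawn from only 50 distinct nodes.
for node in range(1000):
    taxonomy_terms(node % 50)

print(backend_hits)  # 50 backend queries instead of 1,000
```

The same idea applies whether the cache lives in the application, in memcached, or in a Drupal caching module; the point is that repeated identical lookups should only cost one query.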
-
Hi Alan,
Great to hear. Could you expand on your experience with the site speed and how you improved it? Was it server-side, due to site content, or the location of the server?
-
It's actually hit a higher level of traffic than pre-May-Day. Hard to tell on that chart, but amazing if it holds.
-
Great news, Alan. Keep us posted if this holds. Looks like traffic is just about back to normal.
Related Questions
-
Panda Cleanup - Removing Old Blog Posts, Let Them 404 or 301 to Main Blog Page?
tl;dr: we're removing old blog posts that may be affected by Panda; should we let them 404 or 301 them to the main blog page?

We have been managing a corporate blog since 2011. The content is OK, but we've recently hired a new blogger who is doing an outstanding job, creating content that is very useful to site visitors and just on a higher level than what we've had previously. The old posts mostly have no comments and don't get much user engagement.

I know Google recommends creating great new content rather than removing old content due to Panda concerns, but I'm confident we're doing the former, and I still want to purge the old stuff that's not doing anyone any good. So let's just pretend we're being dinged by Panda for having a large amount of content that doesn't get much user engagement (not sure if that's actually the case; rankings remain good, though we have been passed on a couple of key rankings recently).

I've gone through Analytics and noted any blog posts that have generated at least 1 lead or had at least 20 unique visits all time. I think that's a pretty low barrier, and everything else really can be safely removed.

So for the remaining posts (I'm guessing there are hundreds of them, but I haven't compiled the specific list yet), should we just let them 404, or do we 301 redirect them to the main blog page? The underlying question is: if our primary purpose is cleaning things up for Panda specifically, does placing a 301 make sense, or would Google see those "low quality" pages being redirected to a new place and pass on some of that "low quality" signal to the new page? Is it better for that content just to go away completely (404)?
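The pruning rule described here (keep anything with at least 1 lead or at least 20 unique visits all-time) is straightforward to script against an Analytics export. A hypothetical sketch; the record fields are assumptions, not the real export format:

```python
# Hypothetical per-post records pulled from an Analytics export.
posts = [
    {"url": "/blog/post-a", "leads": 3, "unique_visits": 12},
    {"url": "/blog/post-b", "leads": 0, "unique_visits": 45},
    {"url": "/blog/post-c", "leads": 0, "unique_visits": 4},
]

def keep(post):
    """Keep a post if it generated >= 1 lead or >= 20 unique visits."""
    return post["leads"] >= 1 or post["unique_visits"] >= 20

keepers = [p["url"] for p in posts if keep(p)]
removals = [p["url"] for p in posts if not keep(p)]

print(keepers)   # ['/blog/post-a', '/blog/post-b']
print(removals)  # ['/blog/post-c']
```

The `removals` list then becomes the candidate set for 404 or 301 treatment, whichever way the underlying question is answered.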
Technical SEO | eBoost-Consulting
-
Should summary pages have the rel canonical set to the full article?
My site has tons of summary pages, whether for a PDF download, a landing page, or an article. Each summary page explains the asset and contains a link to the actual asset. My question is: if the summary page is just a summary of an article with a "click here to read the full article" button, should I set the rel canonical on the summary page to point to the full article? Thanks,
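One way to audit what a summary page currently declares is to pull the rel=canonical link out of its HTML. A minimal stdlib-only sketch, using invented example markup:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link" and attr_map.get("rel") == "canonical":
            self.canonical = attr_map.get("href")

# Hypothetical summary-page markup
html = '''<html><head>
<link rel="canonical" href="https://example.com/full-article">
</head><body>Summary...</body></html>'''

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/full-article
```

Running this across the summary URLs would show which ones already canonicalize to the full article and which self-canonicalize.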
Technical SEO | Autoboof
-
Noindexing a large number of product pages on a webshop to counter Panda
A Dutch webshop with 10,000 product pages is experiencing lower rankings and indexation. Problems started last October, a little while after the Panda and Penguin updates. One of the problems diagnosed is a lack of unique content: many of the product pages lack a description, and some are variants of each other (color, size, etc.). So a solution could be to write unique descriptions and use rel canonical to concentrate color/size variations onto one product page. There is, however, no capacity to do this on short notice.

So now I'm wondering if the following is effective. Exclude all product pages via noindex or robots.txt, in the same way you can with search pages. The only pages left for indexation are the homepage and 200-300 category pages. We then write unique content and work on the ranking of the category pages. When this works, the product pages are rewritten and slowly re-included, category by category.

My worry is the loss of ranking for the product pages, although their ranking is minimal currently. My second worry is the high number of links on category pages leading to product pages that will be excluded from Google. Thirdly, I am wondering if this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure. What do you think?
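One detail worth separating out: robots.txt blocks crawling but does not reliably remove pages from the index, whereas a meta robots noindex tag (on a page that remains crawlable) does. A noindex,follow variant also lets link equity keep flowing through the page. A hypothetical sketch of the per-page decision, where the description-length threshold is an arbitrary illustration, not a Google guideline:

```python
# Hypothetical product records.
products = [
    {"url": "/p/red-shirt", "description": "A" * 250},
    {"url": "/p/red-shirt-xl", "description": ""},  # size variant, no copy
]

def robots_meta(product, min_description_chars=100):
    """Return the meta robots tag a product page should carry until
    it has enough unique copy to be worth indexing."""
    if len(product["description"]) >= min_description_chars:
        return '<meta name="robots" content="index,follow">'
    return '<meta name="robots" content="noindex,follow">'

for p in products:
    print(p["url"], robots_meta(p))
```

As descriptions are rewritten category by category, the pages flip from noindex,follow back to index,follow without any template changes.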
Technical SEO | oeroek
-
2 sets of stats for same site
Somehow on OSE I managed to get two different sets of results for my page. The column on the left (PA 34) is for mysite.com/ and the second column is for www.mysite.com/. Note that these are the same site. Why do I have two different sets of results? (Note that some things are the same, such as Google +1s and FB likes.) I'm concerned I've done something wrong and could have a bigger beast with both sets of results merged together. Any help much appreciated. Chris
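The usual cause of split metrics like this is that both hosts resolve without a redirect, so tools treat them as two pages. The standard fix is a sitewide 301 to one canonical host. A hypothetical sketch of the mapping logic (the hostnames are placeholders):

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.mysite.com"  # pick one host and stick to it

def canonical_redirect(url):
    """Return the 301 target for a URL on a non-canonical host,
    or None if the URL is already on the canonical host."""
    parts = urlsplit(url)
    if parts.netloc == CANONICAL_HOST:
        return None
    return urlunsplit((parts.scheme, CANONICAL_HOST, parts.path,
                       parts.query, parts.fragment))

print(canonical_redirect("http://mysite.com/page"))
# http://www.mysite.com/page
print(canonical_redirect("http://www.mysite.com/page"))
# None
```

In practice this rule would live in the server config (e.g. a rewrite rule) rather than application code, but the mapping is the same.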
Technical SEO | cjkimber
-
Is it Panda, Penguin, or an ad penalty?
I'm trying to figure out why my Google traffic is going down. I see that back in Feb and then March 2011 it started to drop, which I assume was Penguin. I saw a gradual comeback in traffic until March 2012, when I assume the second drop was another Penguin update. The decline continued gradually until I saw a big drop in October 2012, which is dropping off completely in the past month today.

I recreated my website on WordPress, improving content and removing Google ads. I relaunched a few weeks ago and still see a big drop. Any idea what happened? I only got a message from Google about a large traffic drop in March 2012, and recently a 404 error increase when I launched the new site, which I fixed with 301s and by removing indexed media attachment pages that gave a 404.

One concern is I have no idea if I have a problem with Penguin. Could I have a problem with too many links coming from my blog or social network? What's an acceptable number of backlinks to not be spam? If you add pages in the blogroll, is this thought of as spam with Penguin?

Website: http://www.dashinfashion.com Thanks for your help!
Technical SEO | dashinfashion
-
Panda victim still looking for recovery and for help
I am an internet retailer hit by Panda, and I have made many changes to my site since first being hit in Feb 2011. I had a slight recovery last September but have since slipped back again. I have scoured the internet for Panda recoveries by internet retailers like me, but I have not seen any. If anyone knows of recoveries of a site like mine (wackyplanet.com; we are on a Yahoo store platform), I would appreciate any info, as I am looking for an SEO who has experience with Panda as it relates to sites like mine.
Technical SEO | bobforesi
-
Title Missing: Page is an Action
I have a page that has a missing title tag. I had a look at the page, and it looks like an action; this is part of the cart. http://mydomainxyz.com/billing/status/index.php?action=phpinfo Is this a problem? How do I deal with this?
Technical SEO | stefanok
-
De-indexing thin content & Panda: any advantage to immediate de-indexing?
We added the noindex, follow tag to our site about a week ago on several hundred URLs, and they are still in Google's index. I know de-indexing takes time, but I am wondering if having those URLs in the index will continue to "pandalize" the site. Would it be better to use the URL removal request? Or should we just wait for the noindex tags to remove the URLs from the index?
Technical SEO | nicole.healthline