Panda Recovery Question
-
Dear Friends,
One of my customers was hit by Panda. We worked on improving the thin content on several pages, and for the remaining pages we:
1. Set them to NOINDEX/FOLLOW
2. Removed them from sitemap.xml
3. Un-linked them from the site (no page on the site links to the poor content)
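(Step 1 can be spot-checked programmatically; here is a minimal stdlib-only sketch that pulls the directives out of a page's robots meta tag. The HTML string is a hypothetical example, and the regex assumes the `name` attribute comes before `content`.)

```python
import re

def robots_directives(html):
    """Return the set of directives in a page's robots meta tag, if any."""
    m = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    if not m:
        return set()
    return {d.strip().lower() for d in m.group(1).split(",")}

# Hypothetical page source for one of the thin pages:
html = '<head><meta name="robots" content="noindex, follow"></head>'
print(robots_directives(html))  # {'noindex', 'follow'} (set order may vary)
```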
Despite all this, we can't see any improvement. My question is: should I remove the poor-content pages entirely (serve a 404)?
What is your recommendation?
Thank you for your time
Claudio
-
Thank you
-
Ivan, Panda is a page-level user experience algorithm. Ask yourself: do content pages that produce high bounce rates, short average visit durations, or "pogosticking" (users click through to the page, then immediately hit the back button and return to the search results) REALLY qualify as quality pages? The answer is no, they don't.
I would urge you to visit the list of 23 questions I initially linked to above and ask these questions of your current content. Further, if you do visit this link, take a look at this quote from Amit Singhal on the same page:
"low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content."
Directly from the horse's mouth. Can't get any clearer than that.
-
Hi, is this really true: that any page with a bounce rate of 100% or an average visit of less than 30 seconds should be reviewed closely for complete removal from the site, and that even a small number of such pages can drag down an entire site algorithmically?
Thank you
-
Dear Casey,
Thank you for your prompt response. I want to share with you the URL http://goo.gl/4QBVjR. Please take a look; all of your feedback is welcome.
Thank you
Claudio
-
Absolutely! Think of your site as a book. It used to be (pre-Panda) that adding new pages to your site was the right move. More pages, even low-quality pages, allowed your site to better trigger long-tail keywords, which generated more traffic. That traffic may not have been super-targeted, though, and tended to generate very high bounce rates.
Now, post-Panda, it's clear that even a SMALL amount of low-quality, thin, or poor-user-experience content will drag down your entire domain. That's how Panda works: it evaluates quality page by page, but applies its effects site-wide. So pruning or removing that content is definitely an option you must give serious thought. Ask yourself: does your client's content answer a question, fulfill a need, or provide a unique viewpoint, all of which work together to provide a full quality user experience? If not, then either re-write it (usually a complete waste of time) or remove it completely from your site.
When Google pushed out Panda waaaaay back in 2011 they published a list of 23 questions that site owners should be asking themselves when auditing their site for content and user experience. Read this list and take a hard look at your site and content practices with an eye to understanding how Google may see your site.
Then, I'd suggest you go into Google Analytics under Behavior, choose Site Content, then All Pages, and sort that content by Bounce Rate. Any page with a bounce rate of 100% OR an average visit of less than 30 seconds should be reviewed closely for complete removal from your site. Even a small number of these pages can drag down an entire site algorithmically.
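That same filter can also be applied offline to an Analytics export. A rough sketch in Python (the field names here are hypothetical, not the exact GA export headers):

```python
def flag_for_review(pages, min_seconds=30):
    """Return pages whose engagement suggests reviewing them for removal:
    a 100% bounce rate OR an average time on page under min_seconds."""
    return [
        p for p in pages
        if p["bounce_rate"] >= 1.0 or p["avg_time_on_page"] < min_seconds
    ]

# Hypothetical rows from a Google Analytics export:
pages = [
    {"url": "/guide",  "bounce_rate": 0.45, "avg_time_on_page": 120},
    {"url": "/thin-1", "bounce_rate": 1.00, "avg_time_on_page": 8},
    {"url": "/thin-2", "bounce_rate": 0.70, "avg_time_on_page": 12},
]
print([p["url"] for p in flag_for_review(pages)])  # -> ['/thin-1', '/thin-2']
```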
Finally, if you do remove the pages from your site, I'd suggest a 410 GONE status code. These seem to be processed much faster than regular 404s and it's a clear sign to Google that these pages are NEVER coming back!
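To illustrate the idea, here is a toy sketch of serving 410 for pruned URLs (a minimal WSGI app with hypothetical paths; in practice you would configure this in your web server rather than in application code):

```python
# Hypothetical list of pruned thin-content URLs:
REMOVED_PATHS = {"/thin-page-1", "/thin-page-2"}

def app(environ, start_response):
    """Toy WSGI app: 410 Gone for pruned pages, 404 for other unknown URLs."""
    path = environ.get("PATH_INFO", "/")
    if path in REMOVED_PATHS:
        # 410 signals the page is intentionally, permanently gone.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```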
I hope this was helpful Claudio. Good luck with your client's site!
Related Questions
-
Panda, rankings and other non-sense issues
Hello everyone, I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the first Panda ever), then another couple of times since, and most recently in June 2016 (either Panda or Phantom, not clear yet). In other words, it looks like my website is very prone to "quality" updates by big G: http://www.virtualsheetmusic.com/

I'm still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We have recovered several times, but once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey area" of Panda, where we are "randomly" hit by it once in a while.

Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't make sense of it. Take, for example, this competitor of ours: http://8notes.com. They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them. Take, for example, "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc. They are always first.

As I said, they have a much smaller website than ours, with a much smaller offering, and their content quality is questionable (not curated by professional musicians, with sloppily done content as well as design), yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't care about canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor about adding text or user reviews to avoid "thin content" penalties. They really don't care about any of that, and yet they rank 1st.

So, to all the experts out there, my question is: why? What's the sense or logic behind that? And please don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin-content issues I see on that site, nothing can justify their positions in my opinion. Mostly, I can't find a reason why we are so heavily penalized by Panda and similar "quality" updates when they are released, while websites like 8notes.com rank 1st, making fun of the mighty Panda all year round. Thoughts???!!!
Intermediate & Advanced SEO | | fablau0 -
Can multiple geotargeting hreflang tags be set in one URL? International SEO question
Hi All, I have a question please. If I target www.onedirect.co.nl/en/ in English for Holland, Belgium, and Luxembourg, are the tags below correct?

English for Holland, Belgium, and Luxembourg:
<link rel="alternate" href="http://www.example.co.nl/en/" hreflang="en-nl" />
<link rel="alternate" href="http://www.example.co.nl/en/" hreflang="en-be" />
<link rel="alternate" href="http://www.example.co.nl/en/" hreflang="en-lu" />

AND targeting Holland and Belgium in Dutch (for the page www.onedirect.co.nl we can include these tags):
<link rel="alternate" href="http://www.example.co.nl" hreflang="nl-nl" />
<link rel="alternate" href="http://www.example.co.nl" hreflang="nl-be" />

Thanks a lot for your help!
Intermediate & Advanced SEO | | Onedirect_uk0 -
Penguin 2.1. Bad links removed - do I need to wait for the next Penguin update to see recovery?
Hi - I have read conflicting advice about this issue: after taking action and removing bad links following a Penguin 2.1 hit, will the site need to wait for the next Penguin update before the link clean-up has any effect? Or will the cleaning of the links be acknowledged and "rewarded" with a ranking improvement before that (assuming all bad links were cleared out)?
Intermediate & Advanced SEO | | StevieD0 -
2013 Panda Update Question
Hi everyone, I'm new here 🙂 So far I've had wonderful success SEO-wise, and none of the updates (Penguin or Panda) affected any of my sites, until this one.

For example, one site has 7 keywords I'm optimizing for. Out of those 7, all but 2 (and variations of the 2, one-word vs. long-tail) completely tanked. These keywords were all on page 2/3. One of the two survivors never budged from page 2 (it's a brand keyword, so I was sooo happy to finally get it to page 2). Now when I check rankings, the other terms show up in the 200-400 spots, but NOT for the URL I was optimizing for (a category page); instead, random products in the category rank.

The only thing I've done differently with the 2 keywords that are still doing well was focus: we did more link-building for those, but not an extreme amount, and never over-optimized. My question is: how did 2 survive while 5 are still floating up and down? Last night I saw one go up 122 spots, and today it's down 14. I'm really struggling with this. Thank you
Intermediate & Advanced SEO | | Freelancer130 -
Canonical tag vs 301 in this Panda situation - trying to wrap my brain around this!
Here's the situation. Let's say you have a development site that was created on a subdomain such as examplesite.webdesign.com. When the new site, examplesite.com, launches, the developer forgets to remove examplesite.webdesign.com from the index. As such, two copies of the site exist. Because the development site existed first, examplesite.com ends up being affected by Panda and drops out of the search results. As a result, only the development site is visible in Google searches. I've been trying to wrap my head around whether using canonical tags or 301 redirects would be best. On one hand, you could insert a canonical tag on each page of the subdomain to tell Google that the correct version to index is examplesite.com. On the other hand, you could do a 301 redirect from every page of the development site to examplesite.com. Now, here's where it gets complicated. Because the new site has been flagged as a Panda site, in either case will it need to see a Panda refresh in order to be included in the index?
Intermediate & Advanced SEO | | MarieHaynes0 -
Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
I have a site that has been affected by Panda, and I think I have finally found the problem. When I created this site in 2006, I bought content without checking it. Recently, when I went through the site, I found out that this content had many duplicates around the web. Not 100% exact, but close to it.

The first thing I did was ask my best writer to rewrite these topics, as they are a must on my site. She is a very experienced writer and will make the categories and subpages outstanding. The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place for the pages I determined to be bad. They haven't been de-indexed yet. I also recently separated the other languages and moved them to other domains (with 301s redirecting the old locations to the new). This means the site now has a /en/ directory in the URL which is no longer used.

With this in mind, I was thinking of relocating the NEW content and 301-ing the old (to preserve the juice for a while). For example: http://www.mysite.com/en/this-is-a-pandalized-page/ 301 to http://www.mysite.com/this-is-the-rewritten-page/

The benefits of doing this are:
1. Decreasing the number of directories in the URL
2. Getting rid of pages that are possibly causing trouble
3. Getting fresh pages added to the site

Now, the advice I am looking for is basically this: do you agree with the above, or don't you? If you don't, please be so kind as to include a reason with your answer. If you do, and have any additional information, or would like to discuss, please go ahead 🙂

Thanks, Giorgio

PS: Is it proven that Panda is now a continuously running update? Or is it still executed periodically?
Intermediate & Advanced SEO | | VisualSense1 -
Panda Update - Challenge!
I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content; after my meeting with the developer, I'm stumped, and I'd appreciate any ideas.

Here are a few details to give you some background. The site is a very nice-looking (2.0) website with good content. Basically, they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place. I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues.

The only things I can come up with are that it is either:
1. Something off-page related to links, or
2. Related to the font descriptions: maybe they are getting copied and pasted from other sites, so they don't look like unique content to Google.

If anyone has ideas or would like more info to help, please send me a message. I greatly appreciate any feedback. Thank you, friends! LHC
Intermediate & Advanced SEO | | lhc670