Panda recovery: is it possible?
-
Dear all,
To begin, English is not my native language, so I'm very sorry if I make some mistakes.
On 23 March 2012, Panda hit my website, a coupon site: Avec-Reduction (dot com for the URL). From that date, I lost more than 70% of my traffic.
The structure of the website was like an e-commerce site: categories -> merchant page -> coupon page. The content was too thin for Google; I agree with that.
So, in May, I launched a new version. Here are the most important modifications:
- A smaller header (100px less in height).
- A 2-column layout (the old website had 3 columns).
- I deleted the category menu listing all the categories, as well as the alphabetical menu.
- Fewer ads on the website (a few days ago I also deleted the 2 AdSense blocks).
- Coupons used to be promoted with merchant thumbnails in the home page listing. Now I have a few text-only top lists.
- I deleted all the category pages (one page per merchant category, listing all the merchants in that category). Now I have only one page for this. Same thing with the alphabetical pages. All the deleted pages have a 301 redirect. The 2 new pages (category page and alphabetical page) are set to noindex.
- I deleted all the promo code pages; all the coupons are now on the merchant page (301 redirects used).
- I created an anti-spam system for the code reviews (I had a lot of spam on these forms, even though I cleaned them every day or two). Now I have no spam.
- Visitors can now leave a rating and a review for each merchant. This functionality is new, so there are not many reviews yet.
- All merchant pages without promo codes have a noindex robots tag.
- Since July, I can customize the "title" of each promo code. At the same time, to have more content, I can add sales or big promotions for each merchant, not only promo codes.
- Affiliate links are built with JavaScript and open a new window (a redirect page set to noindex).
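To give an idea, here is a minimal sketch of that last point, the JavaScript affiliate links. The "/go/" path and the data-merchant attribute are names I invented for the example, not necessarily what my site uses; the point is that the affiliate URL never appears in the crawlable HTML, and the intermediate redirect page carries a noindex meta tag.

```javascript
// Sketch of the JS affiliate-link setup (illustrative names only).

// Build the URL of the internal redirect page. That page itself carries
// <meta name="robots" content="noindex"> so crawlers skip it.
function redirectUrlFor(merchantId) {
  return "/go/" + encodeURIComponent(merchantId);
}

// Attach a click handler instead of a plain affiliate href, so the
// affiliate destination is never present in the page source.
function attachAffiliateHandler(link) {
  link.addEventListener("click", function (event) {
    event.preventDefault();
    window.open(redirectUrlFor(link.dataset.merchant), "_blank");
  });
}
```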
Those are the most important changes to my website. I also have better page speed (halved since July) because I optimized my images, CSS, JS, etc.
At the end of July I had a health problem, and the website went without updates until the first days of October. Now the website is updated every day, but between July and October there was no Panda recovery.
I have no duplicate content, and I try to add as much content as I can. So I don't understand why Google Panda keeps hitting me. Some of my competitors use a lot of keyword stuffing (4, 5, 6, even 10 lines of it on each merchant page). Some of them have only affiliate merchants, or automatic scripts that put coupons on their websites, or a few near-identical websites...
Fewer than 30% of my merchants are affiliates, I validate all the coupons and promotions manually, and I personalize all my coupons... So I don't know what else to do.
I would appreciate any help. If you see problems on my website, or if you know tips for a Panda recovery, I would be very happy to hear them.
Many thanks.
Sincerely,
Florent
-
Dear Edward,
You are right. I have seen some US/UK coupon websites with information about the company or the website (payment methods, shipping methods...). I think it's a good way to get better content. In France, there are not many coupon websites with this type of information. The market in our country is significant, but smaller than in countries like the USA. So a lot of websites are made automatically; in a few days some webmasters spin up several coupon websites, and they make money because they have a powerful network.
I think I will use this option in the near future. With 2,600 merchants it will take a long time to add this information, but if it makes a Panda recovery possible, it's not hard work, just necessary.
Thanks for your help.
Sincerely,
F.
-
I've worked with a few coupon/promo code sites since the launch of Penguin, with some success and some failure. The biggest issues I've found across coupon sites are a lack of truly original content and very thin content, with pages frequently saying the same thing as other pages but slightly reworded. Duplicate content issues are common as well.
Ex: "Check out our coupon codes for [company/website]...[more filler text here]."
One strategy that seems to be fairly effective (and logical) for such sites is filling the retailer coupon pages with information relevant to the coupons (which obviously vary) as well as the company. Ex: company history, background, etc. -- content that's truly unique from page to page.
-
Dear eyepaq,
Many thanks for your reply, it's great.
As you say, I'm sure my changes are good for the future, even if the Panda filter hasn't lifted yet (sorry for having used the word "penalty"; in France we use the word "filtre", and it's difficult to write in another language :p).
"You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche."
I agree with that; it's the only point where I have no idea how to be better. I have looked at all the most important US coupon websites for inspiration, but they are too big for me. Technically they are better, their design is better, and I think they have many people working on the website every day. In France there are fewer competitors: 5 big ones, and all the others are very simple websites like mine.
"Panda is only about the content (site speed, affiliate format, links, etc. are not taken into account; for those you have other filters and penalties)."
I know, but better speed is good for visitors, and I think it's good to show Google that page speed matters to me.
"Spot the ones that are still up and try to improve your content to be better than them, or take a different approach to the same niche/area to earn one of the diversity spots. That way you will get back."
I will work on this :).
"First assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and that target more or less the same content as you do. If you can't beat them, change your content strategy and approach the same content with a different format and flow, in order to reach the top results as one of the sites in the diversity results."
OK, but it's very difficult. There are 2 websites in my niche with better traffic than mine. Why these 2? Only because they have a simple design and not many options on the merchant pages. These 2 websites have problems that I don't have, yet no Panda filter and better traffic. The reason? They are older than mine and they have a lot of links (one has more than 1 million links). So it's not very clean, but they rank well.
One last question: do you think it would be better for me to "encrypt" the coupon codes (in the HTML)? Why? Because Google can see that we all have the same codes. If I use an encoded version, perhaps it will look more like unique content? Do you think that's a good idea?
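To show what I mean, here is a hypothetical sketch of that "encrypted coupon" idea (only an illustration of the question, not something I know would help with Panda): keep the code Base64-encoded in the page markup and decode it client-side, so identical codes don't produce identical HTML. The sketch uses Node's Buffer; in a browser you would use btoa()/atob() instead.

```javascript
// Hypothetical sketch: store coupon codes Base64-encoded in the markup
// and decode them on the client. Uses Node's Buffer for the example;
// a browser would use btoa()/atob().

function encodeCoupon(code) {
  return Buffer.from(code, "utf8").toString("base64");
}

function decodeCoupon(encoded) {
  return Buffer.from(encoded, "base64").toString("utf8");
}
```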
Once again, many thanks for your post. You were very clear, and you gave me another way of looking at my problem :).
Sincerely,
Florent
-
Hi Florent,
All the changes you made are very good and will certainly help your site, but not necessarily with the Panda issue.
When talking about Panda you need to consider a few things:
- Panda is not a penalty; it's a filter (a very important difference).
- You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche.
- Panda is only about the content (site speed, affiliate format, links, etc. are not taken into account; for those you have other filters and penalties).
So, if you are 100% sure that Panda is to blame for your drop in rankings, you need to compare yourself with the competition first and see how you can be better than them.
Just put yourself in Google's shoes: if you have 10 sites in the same niche with more or less the same content, you want to keep 1 or 2, fill the rest of the results with diverse sites, and push everyone else down 50 spots or whatever.
If you are not one of those 1 or 2, then you are one of the ones that just got moved back, way back (down).
Spot the ones that are still up and try to improve your content to be better than them, or take a different approach to the same niche/area to earn one of the diversity spots. That way you will get back.
First assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and that target more or less the same content as you do. If you can't beat them, change your content strategy and approach the same content with a different format and flow, in order to reach the top results as one of the sites in the diversity results.
Hope it helps. Is that clear, or am I beating around the bush?
Cheers!