Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
-
I have a site that has been affected by Panda, and I think I have finally found the problem.
When I created this site back in 2006, I bought content without checking it. Recently, when I went through the site, I found out that this content has many duplicates around the web. Not 100% exact, but close to it.
The first thing I did was ask my best writer to rewrite these topics, as they are a must on my site. She is a very experienced writer, and she will make the categories and subpages outstanding.
The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place on the pages I determined to be bad. They haven't been de-indexed yet.
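For reference, that directive lives in each page's `<head>` as `<meta name="robots" content="noindex, follow">`. A quick way to sanity-check that a page actually carries it (a hypothetical Python sketch, not part of Giorgio's setup; `is_noindexed` is an illustrative helper name):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                # Split "NOINDEX, FOLLOW" into normalized tokens.
                content = attrs.get("content") or ""
                self.directives.extend(d.strip().lower() for d in content.split(","))

def is_noindexed(html):
    """True if the page's robots meta tag contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="NOINDEX, FOLLOW"></head></html>'
print(is_noindexed(page))  # True
```

Running something like this over the flagged URLs (fetching each page first) would confirm the tag is live before waiting on Google to de-index them.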
Another thing I recently did was separate out the other languages and move them to their own domains (with 301s redirecting the old locations to the new ones). This means the site now has an /en/ directory in its URLs that is no longer needed.
With this in mind, I was thinking of relocating the NEW content and 301ing the old URLs (to preserve the link juice for a while). For example:
http://www.mysite.com/en/this-is-a-pandalized-page/
301 to
http://www.mysite.com/this-is-the-rewritten-page/
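The old-to-new mapping that drives those 301s can be expressed as a simple rewrite function. A minimal sketch, assuming a hand-maintained slug map (`REWRITTEN_SLUGS` and `redirect_target` are hypothetical names, not anything from Giorgio's site):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical mapping from old /en/ slugs to their rewritten replacements;
# in practice this would come from the CMS or a redirect spreadsheet.
REWRITTEN_SLUGS = {
    "this-is-a-pandalized-page": "this-is-the-rewritten-page",
}

def redirect_target(old_url):
    """Return the 301 target for an old /en/ URL, or None if there is no mapping."""
    scheme, netloc, path, query, frag = urlsplit(old_url)
    prefix = "/en/"
    if not path.startswith(prefix):
        return None
    slug = path[len(prefix):].strip("/")
    new_slug = REWRITTEN_SLUGS.get(slug)
    if new_slug is None:
        return None
    return urlunsplit((scheme, netloc, f"/{new_slug}/", query, frag))

print(redirect_target("http://www.mysite.com/en/this-is-a-pandalized-page/"))
# http://www.mysite.com/this-is-the-rewritten-page/
```

The same map could then be exported as server redirect rules, so the 301s and the content migration stay in sync from one source of truth.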
The benefits of doing this are:
- decreasing the number of directories in the URL
- getting rid of pages that are possibly causing trouble
- getting fresh pages added to the site
Now, the advice I am looking for is basically this: Do you agree with the above, or don't you? If you don't, please be so kind as to include a reason with your answer. If you do, and have any additional information or would like to discuss, please go ahead.
Thanks,
Giorgio
PS: Is it confirmed that Panda now runs continuously, or is it still executed periodically?
-
The reason I know for a fact it's Panda is that the site lost its rankings (and thus about 60% of its traffic) at the end of February 2011.
Since then, I have managed to regain slightly better rankings by adding lots of content and rewriting some categories (ones with thin pages and not too many subpages) from the bottom up. However, I never realized I still had the content I just located, which is terrible in terms of quality and has duplicates all over the web. Like I said, this content dates back to 2006, when I didn't have a clue about SEO.
The content won't simply be rewritten based on what's there. I just told my writer to write about topic X and topic Y and make it very informative, so I will go from bad pages to really good ones.
Moving the new pages to new locations and getting rid of the other "infected" pages seems best in my opinion, despite the age of those pages and the occasional link pointing to them.
-
Panda still runs in installments, not continuously. Rewriting content sounds like a massive task; I hope it's worth it (would it be better to write new material instead?). Do you have any pagination on the site, or indexable search results? We're assuming here that you are certain the problem is caused by the Panda filter and not some other factor, that your page layout and ads are not the cause of the drop, and that the problem is not link-related. I see no problem with 301ing duplicate-content pages to a new, better content page. It sounds like something users might appreciate as well.