Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
-
I have a site that has been affected by Panda, and I think I have finally found the problem.
When I created this site in 2006, I bought content without checking it. Recently, when I went through the site, I found out that this content has many duplicates around the web. Not 100% exact, but close to it.
The first thing I did was ask my best writer to rewrite these topics, as they are a must for my site. She is very experienced and will make the categories and subpages outstanding.
The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place on the pages I determined to be bad. They haven't been de-indexed yet.
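For reference, the NOINDEX, FOLLOW directive described here is the standard robots meta tag, placed in each affected page's head:

```html
<!-- Asks search engines to drop this page from the index while still
     following (and passing equity through) its outgoing links -->
<meta name="robots" content="noindex, follow">
```

Note that the page must remain crawlable for search engines to see this tag and act on it.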
Another thing I recently did was separate the other languages and move them to other domains (with 301s redirecting the old locations to the new ones). This means the site still has an /en/ directory in its URLs which is no longer needed.
With this in mind, I was thinking of relocating the NEW content and 301ing the old URLs (to preserve the juice for a while). For example:
http://www.mysite.com/en/this-is-a-pandalized-page/
301 to
http://www.mysite.com/this-is-the-rewritten-page/
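Assuming the site runs on Apache, a redirect like the one in this example could be a one-line rule in .htaccess (the paths below are just the illustrative ones from above):

```apache
# Illustrative only: permanently (301) redirects the example /en/ URL
# to its rewritten replacement. Adjust paths to the site's real URLs.
Redirect 301 /en/this-is-a-pandalized-page/ http://www.mysite.com/this-is-the-rewritten-page/
```

For a whole /en/ directory of rewritten pages, a pattern-based RewriteRule would scale better than one line per URL.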
The benefits of doing this are:
- decreasing the number of directories in the URL
- getting rid of pages that are possibly causing trouble
- getting fresh pages added to the site
Now, the advice I am looking for is basically this: do you agree with the above, or don't you? If you don't, please be so kind as to include a reason with your answer. If you do, and you have any additional information or would like to discuss, please go ahead.
Thanks,
Giorgio
PS: Is it confirmed that Panda now runs continuously? Or is it still executed periodically?
-
The reason I know for a fact it's Panda is that the site lost its rankings (and with them about 60% of its traffic) at the end of February 2011.
Since then I have managed to regain slightly better rankings by adding loads of content and rewriting some categories (those with thin pages and not too many subpages) from the bottom up. However, I never realized I still had the content I have now located, which is terrible in terms of quality and has duplicates all over the web. Like I said, this content dates back to 2006, when I didn't have a clue about SEO.
It's not that the content will be rewritten based on what's there now. I just told my writer to write about topic X and topic Y and make it very informative, so I will go from bad pages to really good ones.
Moving the new pages to new locations and getting rid of the other "infected" pages seems best in my opinion, despite the age of those pages and the occasional link pointing to them.
-
Panda still runs in installments, not continuously. Rewriting content sounds like a massive task; I hope it's worth it (e.g., might it be better to write new material instead?). Do you have any pagination on the site, or indexable search results?

We're assuming here that you're certain the problem is caused by the Panda filter and not another factor, that your page layout and ads are not the cause of the drop, and that the problem is not link related.

I see no problem with 301ing pages that have duplicate content to a new, better content page. It sounds like something users might appreciate as well.
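If indexable internal search results do turn out to be part of the problem, one common fix is blocking them in robots.txt. The /search/ path below is hypothetical; it would need to match the site's real search URL pattern:

```
# Hypothetical robots.txt entry: keeps crawlers out of internal search results.
User-agent: *
Disallow: /search/
```

Keep in mind that Disallow only blocks crawling. Search-result pages that are already indexed are better handled with a noindex meta tag, since a disallowed page can't be recrawled for the tag to be seen.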