Large-Scale Penguin Cleanup - How to prioritize?
-
We are conducting a large-scale Penguin cleanup / link-cleaning exercise across 50+ properties, most of which have been on the market for 10+ years. There is a lot of link data to sift through, and we are wondering how we should prioritize the effort.
So far we have collected backlink data for all properties from Ahrefs, GWT, Majestic SEO and OSE, and consolidated the data using home-grown tools.
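For reference, our consolidation step boils down to something like this simplified sketch (the CSV field names and file names here are illustrative, not any tool's actual export schema):

```python
import csv

def load_backlinks(path):
    """Read one tool's CSV export into a set of (source, target, anchor) rows."""
    with open(path, newline="", encoding="utf-8") as f:
        return {(r["source_url"], r["target_url"], r["anchor"])
                for r in csv.DictReader(f)}

def consolidate(paths):
    """Union the rows from every export, deduplicating exact matches."""
    merged = set()
    for path in paths:
        merged |= load_backlinks(path)
    return sorted(merged)

# Illustrative file names, one export per data source
exports = ["ahrefs.csv", "gwt.csv", "majestic.csv", "ose.csv"]
```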
As a next step we are obviously going through the link-cleaning process. We are interested in getting feedback on how we are planning to prioritize the link removal work. Put another way, we want to vet whether the community agrees with what we consider the most harmful types of links for Penguin.
- Priority 1: Clean up site-wide links with money-words; if possible keep a single-page link
- Priority 2: Clean up or rename all links whose money-keyword anchor text appears in the top 10 of the anchor text distribution
- Priority 3: Clean up non-brand site-wide links; if possible keep a single-page link
- Priority 4: Clean up low-quality links (other niche or no link juice)
- Priority 5: Clean up multiple links from same IP C class
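For context, the anchor distribution check in Priority 2 is roughly this (a simplified sketch; the field name is illustrative):

```python
from collections import Counter

def top_anchors(backlinks, n=10):
    """Return the n most common anchor texts with their share of all links."""
    counts = Counter(link["anchor"].strip().lower() for link in backlinks)
    total = sum(counts.values())
    return [(anchor, count / total) for anchor, count in counts.most_common(n)]
```

Any money keyword landing in that top 10 gets flagged for Priority 2.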
Does this seem like a sound approach? Would you prioritize this list differently?
Thank you for any feedback /T
-
Your data sources are good (Ahrefs, GWT, OSE & Majestic), but I recommend including Bing Webmaster Tools as well. The data is free and you will find at least some links not shown in the other sources.
The link prioritization you shared is absolutely incorrect.
"Priority 1: Clean up site-wide links with money-words; if possible keep a single-page link"
While it is true that site-wide links are commonly manipulative, removing the site-wide link and keeping a single one does not necessarily make it less manipulative. You have only removed one of the elements that are often used to identify manipulative links.
"Priority 2: Clean up or rename all money keyword links for money keywords in the top 10 anchor link name distribution"
A manipulative link is still manipulative regardless of the anchor text used. Back in April 2012, Google used anchor text as a means to identify manipulative links. That was over 18 months ago, and Google's link identification process has evolved substantially since then.
"Priority 3: Clean up no-brand sitewide links; if possible keep a single-page link"
Same response as #1 & 2
"Priority 4: Clean up low-quality links (other niche or no link juice)"
See below
"Priority 5: Clean up multiple links from same IP C class"
The IP address should not be given any consideration whatsoever. You are using a concept that had validity years ago and is completely outdated.
bonegear.net IP address 66.7.211.83
vitopian.com IP address 64.37.49.163
There are no commonalities between the above two IP addresses, be it C block or otherwise, yet they are both hosted on the same server.
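The check itself is trivial, which is part of why it is such a weak signal; the "C class" comparison amounts to a one-liner (sketch):

```python
def same_c_block(ip_a, ip_b):
    """True if two IPv4 addresses fall in the same /24 (old 'C class') block."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

# The two addresses from the example above: different C blocks, same server
print(same_c_block("66.7.211.83", "64.37.49.163"))  # False
```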
You have identified the issue affecting your site (Step 1) and collected a solid list of your backlinks using multiple sources (Step 2). The backlink report is an excellent step which places you well above most site owners and SEOs in your situation.
Step 3 - Identify links from every linking domain.
a. Have an experienced, knowledgeable human visit each and every linking domain. Yes, that is a lot of work but it is what's necessary if you are going to accurately identify all of the manipulative links. Prior to beginning this step, be absolutely sure the person can accurately identify manipulative links with AT LEAST 95% accuracy, although 100% is strongly desired.
b. Document the effort. I have had 3 clients who approached me with a Penguin issue; we confirmed there was no manual action in place when we began the clean-up process, but before we finished, the sites incurred a manual penalty. Solid documentation of the clean-up effort is required by Google in case the Penguin issue morphs into a manual penalty. Also, it just makes sense: you mentioned 50+ web properties, so clearly others will be performing these tasks.
c. Audit the effort. A wise former boss once stated "You must inspect what you expect". Unless you carefully audit the work, the process will fail. Evaluators will mis-identify links. You will lose some quality links and manipulative links will be missed as well.
d. While you are on the site, capture the manipulative site's e-mail address and contact form URL (if any). This information is helpful when contacting site owners to request link removal.
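If you want to semi-automate step d, here is a rough standard-library sketch of pulling e-mail addresses and candidate contact-page URLs out of a page you have already fetched (the regexes are deliberately simplistic):

```python
import re

# Crude patterns for illustration only; real pages need a proper HTML parser
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
CONTACT_LINK_RE = re.compile(r'href="([^"]*contact[^"]*)"', re.IGNORECASE)

def extract_contacts(html):
    """Pull e-mail addresses and contact-page URLs out of fetched HTML."""
    emails = sorted(set(EMAIL_RE.findall(html)))
    contact_urls = sorted(set(CONTACT_LINK_RE.findall(html)))
    return emails, contact_urls
```

A human should still review every site; this only saves the copy-and-paste.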
Step 4 - Conduct a Webmaster Outreach Campaign. Each manipulative domain needs to be contacted in a comprehensive manner. In my experience, most SEOs and site owners do not put in the required level of effort.
a. Send a professional request to the site's WHOIS e-mail address.
b. After 3 business days if no response is received, send the same letter to the site's e-mail address found on the website.
c. After another 3 business days, if no response is received, submit the e-mail via the site's contact form. Take a screenshot of the submission (documentation is not strictly required for Penguin, but it is helpful for the process).
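If you are scripting your own tracker rather than using a tool, that cadence can be expressed as a tiny escalation table (a sketch only; the waits are approximated as calendar days rather than business days):

```python
from datetime import timedelta

# Outreach steps in order, with the wait (in days) before escalating
STEPS = [
    ("email_whois_address", 0),
    ("email_onsite_address", 3),
    ("submit_contact_form", 3),
]

def next_step(last_step, last_contacted, today):
    """Return the next outreach action due, 'wait', or None when done."""
    names = [name for name, _ in STEPS]
    if last_step is None:
        return STEPS[0][0]
    idx = names.index(last_step)
    if idx + 1 >= len(STEPS):
        return None
    name, wait = STEPS[idx + 1]
    return name if (today - last_contacted) >= timedelta(days=wait) else "wait"
```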
All of the manipulative link penalties (Penguin and manual) I have worked with have been cleaned up manually. With that said, we use Rmoov to manage the Webmaster Outreach process. It sends and maintains a copy of every e-mail sent. It even has a place to add the Contact Form URL. A big time saver.
If a site owner responds and removes the link, that's great. CHECK IT! If there are only a few links, manually confirm link removal. If there are many URLs, use Screaming Frog or another tool to confirm link removal.
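For those manual spot checks, even a few lines of script help; a sketch of confirming a link is actually gone (the href matching is deliberately crude):

```python
import re
import urllib.request

def still_links_to(html, your_domain):
    """True if the page HTML still contains an href pointing at your domain."""
    return re.search(rf'href="[^"]*{re.escape(your_domain)}', html) is not None

def confirm_removal(page_url, your_domain):
    """Fetch a linking page and report whether the link is actually gone."""
    with urllib.request.urlopen(page_url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return not still_links_to(html, your_domain)
```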
If a site owner refuses or requests money, you can often achieve link removal by having further respectful conversations.
If a site owner does not respond, you can use "extra measures". Call the phone number listed in WHOIS. Send a physical letter to the WHOIS address. Reach out to them on social media sites. Is it a .com domain with missing WHOIS information? You can report them on INTERNIC. Is it a spammy wordpress.com or blogspot site? You can report that as well.
When Matt Cutts introduced the Disavow Tool, he clearly said "...at the point where you have written to as many people as you can, multiple times, you have really tried hard to get in touch and you have only been able to get a fraction of those links down and there is still a small fraction of those links left, that's where you can use our Disavow Tool".
The above process satisfies that requirement. In my experience, anything much less than the above does not. The overwhelming majority of those tackling these penalties try to perform the minimal amount of work possible, which is why forums are flooded with complaints from people who have made numerous attempts to remove manipulative link penalties and failed.
Upon completion of the above, THEN upload a Disavow list of the links you could not remove after every reasonable human effort. In my experience you should have removed at least 20% of the linking DOMAINS (with rare exceptions).
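The disavow file format itself is simple: comment lines start with `#`, and each entry is either a full URL or a `domain:` line. A sketch of generating one from your leftover domains (the domain names and file name are placeholders):

```python
def write_disavow(domains, path="disavow.txt"):
    """Write a disavow file with one 'domain:' line per unremovable domain."""
    lines = ["# Domains we could not get removed after repeated outreach"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")
    return lines
```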
It can take up to 60 days thereafter, but if you truly cleaned up the links in a quality manner, then the Penguin issues should be fully resolved.
The top factors in determining whether you succeed or fail are:
1. Your determination to follow the above process thoroughly
2. The experience, training and focus of your team
You can resolve the issue in one round of effort and have the Penguin issue resolved within a few months....or you can be one of those site owners who thinks it is impossible and be struggling with the same issue a year later. If you are not 100% committed, RUN AWAY. By that I mean change domain names and start over.
Good Luck.
TLDR - Don't try to fool Google. Anchor text and site wide links are part of the MECHANISM used to identify manipulative links. Don't confuse the mechanism with the message. Google's clear message: EARN links, don't "build" links. Polishing up the old manipulative links is a complete waste of your time. AT BEST, you will enjoy limited success for a period of time until Google catches up. Many site owners and SEOs have already been there, and it is a painful process.
-
When you say "clean up" do you mean removing the links or disavowing them?
You will never be able to get them all removed, so in the end you will need to do a Disavow anyway. If your time frame is short, you may want to make Priority One a Disavow for each of the 50+ sites you are working with. Then you can proceed with attempting to get the links removed. I have not heard of any downside to having a link removed that already appears in your disavow file...
As for the order of the Priorities, you may want to shuffle them a bit depending on the different situations on the different websites. I suggest you read this Moz Blog article called It's Penguin-Hunting Season: How to Be the Predator and Not the Prey
...and then test a few of your sub-pages that used to rank well with the program used in that article, the Penguin Analysis Tool. I say sub-page because it needs a single keyword phrase you want to rank that particular page for so it can do the anchor text analysis, and that works better on focused sub-pages than on general homepages. $10 per website will let you fully evaluate two typical pages on each site and see which facet of the link profile is most valuable to attack first.
-
Have you read the post at http://moz.com/blog/ultimate-guide-to-google-penalty-removal? Matt Cutts even called it out on Twitter as a good post. That's where I'd first look for ideas.