Google Reconsideration Request - Most Efficient Process
-
Hi,
I'm working on a Google reconsideration request for a site with a longstanding penalty.
Here's what I did:
Round 1
- Downloaded a CSV of all the domains and all the pages linking to the site. Went through the lot manually and sorted them into three types: Disavow Domain, Disavow Page, Keep
- All low-quality domains were disavowed, all pages from places like blogspot with low-quality links on certain blogs were disavowed. Submitted disavow file, then sent a detailed reconsideration request including a link to the disavow file.
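For what it's worth, the classify-then-disavow step can be scripted once the manual sorting is done. Here's a minimal Python sketch, assuming the review produces (url, action) pairs using the three labels above; the sample data and helper name are illustrative, not any tool's actual export format:

```python
# Hypothetical sketch: turn a manually classified link list into disavow-file
# lines. Assumes each row is (url, action) with action being one of
# "Disavow Domain", "Disavow Page", or "Keep".
from urllib.parse import urlparse

def build_disavow(rows):
    """Return the text of a disavow file from (url, action) pairs."""
    domains, pages = set(), []
    for url, action in rows:
        if action == "Disavow Domain":
            domains.add(urlparse(url).netloc)  # whole domain, one entry
        elif action == "Disavow Page":
            pages.append(url)                  # single URL entry
        # "Keep" rows are simply skipped
    lines = ["# Disavow file generated from manual link review"]
    lines += sorted(f"domain:{d}" for d in domains)
    lines += sorted(pages)
    return "\n".join(lines)

sample = [
    ("http://spamblog.example.com/post-1", "Disavow Page"),
    ("http://lowquality.example.net/", "Disavow Domain"),
    ("http://prweb.example.org/release", "Keep"),
]
print(build_disavow(sample))
```

The `domain:` prefix is the format Google expects for domain-level entries; page-level entries are just the bare URL, one per line.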
The reconsideration request was not successful. Google gave two examples of links I should remove; bizarrely, both examples were already in my disavow file, which seemed a bit odd. I took this to mean that a Webmaster Tools disavow file on its own is not enough. The links I kept were largely from PRWeb syndication, which seems legitimate.
Round 2
Here's what I'm doing now. If you can see any way to improve the process below for the maximum chance of a successful request, please let me know.
- Get all linking pages from Webmaster Tools as before, and also from MajesticSEO's Historic Index. This gave me around three times more domains to remove. The additional domains from Majestic that weren't in Webmaster Tools went straight into the disavow file.
- Conduct a manual link removal email campaign. I've got around 2,500 domains to go through, so how can I best do this? My process at the moment is:
- Use software to get email addresses from whois records
- send them an email
- make a spreadsheet of responses
- include link to spreadsheet in Google Docs as well as link to new disavow file
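The source-merging step above (Majestic domains that Webmaster Tools missed going straight into the disavow file) comes down to a set difference. A tiny sketch with made-up domain names:

```python
# Sketch of merging link sources: only the domains Majestic found that
# Webmaster Tools did not report need to be added to the disavow file.
# All domain names here are placeholders.
wmt_domains = {"spam1.example.com", "spam2.example.com"}
majestic_domains = {"spam1.example.com", "spam3.example.com", "spam4.example.com"}

extra = sorted(majestic_domains - wmt_domains)  # Majestic-only domains
disavow_lines = [f"domain:{d}" for d in extra]
print("\n".join(disavow_lines))
# domain:spam3.example.com
# domain:spam4.example.com
```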
Should I research each site manually to get email addresses? It seems rather a waste of an outsourcer's time; from what I've seen, some people use outsourcers and others have used software tools successfully. The other issue is actually sending the emails. No SMTP campaign service will let me use them, because the recipients haven't opted in and the providers classify the emails as spam. Does anyone know a way to send 2,500 emails legitimately, from a webmail account for example? I'm only having to send bulk email to get rid of spam links.
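I don't know of a provider that will knowingly accept non-opt-in bulk mail, but whatever route you take, you'll want to personalise each message and throttle the send rate. A hedged standard-library sketch (the addresses, page URLs, and SMTP host are placeholders, and the actual send is commented out):

```python
# Minimal sketch of a throttled link-removal outreach mailer using only the
# Python standard library. It only builds the messages; the smtplib send is
# commented out because any real host, port, and credentials depend on your
# provider, and most cap daily sending volume.
import time
from email.message import EmailMessage

def draft_removal_request(to_addr, linking_page, target_site):
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["Subject"] = f"Link removal request for {target_site}"
    msg.set_content(
        f"Hello,\n\nCould you please remove the link to {target_site} "
        f"on {linking_page}?\n\nThank you."
    )
    return msg

contacts = [("webmaster@spam1.example.com", "http://spam1.example.com/page")]
for addr, page in contacts:
    msg = draft_removal_request(addr, page, "client-site.example.com")
    # with smtplib.SMTP("smtp.example.com") as s:
    #     s.send_message(msg)
    time.sleep(1)  # throttle: spread sends out to stay under provider limits
```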
Finally, most of the offending links have keyword anchor text from spun articles. I've deleted all the sites except EzineArticles. Would you delete that too? It's an awful site, but the client is hung up on it. The EzineArticles links may have some value; on the other hand, they're more of the same keyword-rich anchor-text articles. Keep, or disavow the individual pages?
Lastly, is there anything else I've missed? Anything to add? Thanks for all your help.
-
I personally do everything manually. The link removal tools can work great for some sites, but your best chance at identifying the bad links and keeping the good ones is to look at them manually. 2,500 domains is a lot, but not impossible. I'm currently working on an account of about that size, and it will take me about 10-14 days to go through them all. Once you get going you will recognize patterns and it will go faster.
I used to gather email addresses on my own, but I've just hired someone to do it for me. I find that the automated tools miss a lot of them. I considered hiring from oDesk or Mechanical Turk, but in my situation, because my business is expanding and most of what I do is penalty removal, it's worth my while to hire and train someone to do this for me.
By the way, if you've got 2,500 domains, you won't need 2,500 emails. Many of the links will be offline, nofollowed, or perhaps even natural.
EzineArticles links definitely need to be removed if they are followed. Often those links are nofollowed, but if you have a high enough account level there, they are followed and need to go.
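On the offline/nofollowed point: before emailing anyone, it's worth filtering the list down to pages that still carry a followed link to the penalized site. A rough standard-library sketch (the HTML here is inline sample data; in practice you'd first fetch each linking page and skip any that 404):

```python
# Sketch of a pre-filter for an outreach list: keep only pages whose HTML
# still contains a followed (non-nofollow) link to the target domain.
from html.parser import HTMLParser

class FollowedLinkFinder(HTMLParser):
    """Flags whether the fed HTML contains a followed link to `target`."""
    def __init__(self, target):
        super().__init__()
        self.target = target
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href") or ""
        rel = a.get("rel") or ""
        if self.target in href and "nofollow" not in rel:
            self.found = True

def has_followed_link(html, target_domain):
    finder = FollowedLinkFinder(target_domain)
    finder.feed(html)
    return finder.found

html = ('<p><a href="http://client.example.com" rel="nofollow">x</a>'
        '<a href="http://client.example.com/page">y</a></p>')
print(has_followed_link(html, "client.example.com"))  # the second link is followed
```

Pages that come back with no followed link can go straight to the disavow file (or be dropped entirely) with no email needed.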
A few other points:
- Yes, you're right. It's not enough to just disavow. Google is going to want to see evidence that you've tried hard to remove links.
- Lately I have only been using links from WMT and not other sources like Majestic and Ahrefs. That may cut down on the number of domains you have to deal with. So far it is working for me.
Hope that helps!