Could this work for Google Reconsideration Request?
-
One of my websites has received the following message:
We've reviewed your site and we still see links to your site that violate our quality guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
I have used Link Detox (DTOX) from LinkResearchTools to locate unnatural links and remove them. So far I've been able to remove or nofollow 50 of 350, and that's as far as I can go; the rest of the websites either don't respond or don't list any contact information.
I added another 300 suspicious websites to my list and I'll try to get the links manually removed.
My estimate is that I'll be able to get at most 100 of the 650 websites (and a few more individual links) removed in total.
I've been thinking of using the Google Disavow Tool for the rest and making sure to submit a well-written report with spreadsheets to Google when I get to the reconsideration stage.
What are your thoughts on this?
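For reference, the file the Disavow Tool accepts is plain text: one `domain:` entry or full URL per line, with `#` lines as comments. A minimal sketch of generating one from your spreadsheet data (the domains and URL below are made-up placeholders, not real examples):

```python
# Generate a disavow.txt in the plain-text format the Disavow Tool accepts:
# one "domain:" entry or full URL per line, "#" lines are comments.
# These domains/URLs are hypothetical placeholders.
unremovable_domains = ["link-farm.example", "spam-directory.example"]
unremovable_urls = ["http://blog.example/paid-post.html"]

lines = ["# Contacted three times with no response; details in attached spreadsheet"]
lines += [f"domain:{d}" for d in sorted(unremovable_domains)]
lines += sorted(unremovable_urls)

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```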
-
It took us four reconsideration requests before our unnatural link building penalty was lifted. To have the manual penalty revoked, we downloaded all our backlink lists from Moz, Bing and Google Webmaster Tools, imported them into a Google Doc and removed the duplicates. We then contacted every site and made a note of the contact details, along with the date, in a new column of the Google Doc. We would then revisit the sites the following week and email them again. After three attempts we would note: "unable to contact webmaster for link removal; added URL to disavow list".
Once we had gone through the whole list (which did take a long time), we submitted the disavow file first, then submitted our reconsideration request explaining what we had done since our last request, and included a link to the Google Doc, since the spam team can access it and see all the hard work you have done.
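The merge-and-dedupe step above is easy to script rather than do by hand. A sketch, assuming each tool's export is a CSV with a column holding the linking URL (the column name differs between tools, so it's a parameter here and `"URL"` is just an assumption):

```python
import csv
from urllib.parse import urlparse

def merged_domains(csv_paths, url_column="URL"):
    """Merge backlink exports (e.g. from Moz, Bing and Google WMT) and
    de-duplicate by linking domain. url_column names the CSV column
    holding the backlink URL -- adjust it per tool."""
    domains = set()
    for path in csv_paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                host = urlparse(row[url_column]).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]  # compare bare domains
                if host:
                    domains.add(host)
    return sorted(domains)
```

The resulting list is what you'd paste into the Google Doc as one row per domain, then add contact-details and date columns alongside it.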
Good luck with it, and I hope you recover soon.
-
Keep in mind that Matt Cutts has said people should usually be taking a machete to their backlinks rather than a scalpel. Don't try to remove just enough to make a difference; remove everything you think could possibly hurt you, as well as anything Google is likely to crack down on in the future. In my experience with Link Detox, over 75% of backlinks marked as suspicious usually warrant removal.
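If you want to apply that "machete" approach programmatically, you can flag everything in the export above a broad risk threshold rather than cherry-picking. This is only a sketch: the column names and the cutoff value are assumptions, not Link Detox's actual export schema, so match them to whatever your export really contains:

```python
import csv

def flag_for_removal(detox_csv_path, domain_col="Domain",
                     risk_col="Risk", threshold=500):
    """Flag every domain whose risk score meets a broad threshold.
    Column names and the 500 cutoff are hypothetical -- adjust them
    to match the actual Link Detox export."""
    flagged = []
    with open(detox_csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if float(row[risk_col]) >= threshold:
                flagged.append(row[domain_col])
    return flagged
```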
-
No, don't remove the links coming from good authoritative sites.
For the rest of the links, download the "Links to your site" report in Google Webmaster Tools and check whether those low-quality sites appear in it. Start by removing the links that are in the report. You can disavow links as well, but you should be very careful in doing so; sometimes people get overly cautious and disavow good backlinks.
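The cross-check described above amounts to intersecting your suspect list with the domains Google itself reports. A small sketch (domain names here are placeholders):

```python
def _norm(domain):
    """Lower-case a host name and drop a leading 'www.' for comparison."""
    domain = domain.strip().lower()
    return domain[4:] if domain.startswith("www.") else domain

def confirmed_in_report(suspect_domains, report_domains):
    """Intersect a suspect-domain list with the domains in the Webmaster
    Tools 'Links to your site' download; start removal with this set,
    since these are links Google has actually seen."""
    reported = {_norm(d) for d in report_domains}
    return sorted({_norm(d) for d in suspect_domains} & reported)
```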
Regards
-
Should I also remove links coming from high authority domains with excessive use of commercial keywords in anchor text? What do you think about using Google Disavow for the ones I cannot remove manually?
-
Exactly! You are on the right path. Google states that you should submit proof of your manual removal work along with the reconsideration request. It will help them see that you are serious and working on it, and it will also expedite the process.
Regards