5th Reconsideration Request, Have i missed anything...
-
Hi Guys,
I wonder if any of you can help me out. I'll shortly be submitting another reconsideration request to Google. I've been working on removing bad/spammy links to our site http://goo.gl/j7OpL over the past six months, and so far every reconsideration request I have submitted has been knocked back with the following message:
----------------------------------------
Dear site owner or webmaster of http://goo.gl/j7OpL,
We received a request from a site owner to reconsider http://goo.gl/j7OpL for compliance with Google's Webmaster Guidelines.
We've reviewed your site and we still see links to your site that violate our quality guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
----------------------------------------
I've removed over 70% of all our links. We had some large sitewide links on big sites with exact-match anchor text for our main money keyword, and I've also removed a large link network that our previous SEO company set up.
Today I completed an overhaul of all our internal links; nearly every blog post we had added to the site linked back to the home page with an exact-match money keyword as the anchor text.
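For anyone doing a similar internal-link overhaul, here is a minimal sketch of how the offending anchors could be found programmatically. The home URL and money keyword below are placeholders, and real sites may need a more robust parser; this just shows the idea of flagging exact-match anchors that point at the home page:

```python
from html.parser import HTMLParser

class AnchorAudit(HTMLParser):
    """Collect (href, anchor text) pairs from a page so exact-match
    money-keyword anchors pointing at the home page can be flagged."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def flag_exact_match(html, home_url, money_keyword):
    """Return home-page links whose anchor text is exactly the money keyword."""
    parser = AnchorAudit()
    parser.feed(html)
    return [(href, text) for href, text in parser.links
            if href.rstrip("/") == home_url.rstrip("/")
            and text.lower() == money_keyword.lower()]
```

Running this over each blog post's HTML gives you a worklist of anchors to rewrite or remove.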
One thing I did notice is that when we got hit by the penalty, it didn't affect every keyword we target, just our main and most competitive keyword. Some of our other keywords took a dip in rankings, but not as much as our main one.
When I submit our next reconsideration request, I'll also attach a spreadsheet of the links I can't remove, either because I can't find any contact details (the WHOIS record is blocked) or because I'm not getting a response when I email the site owners.
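If it helps anyone building a similar spreadsheet, here is a minimal sketch of generating it as a CSV. The column names and reason wording are my own illustration, not a format Google prescribes:

```python
import csv

# Hypothetical outreach log: each entry records one link we could not
# get removed and why. URLs and reasons below are placeholders.
unremovable = [
    {"url": "http://example-directory.com/widgets", "reason": "no contact details (WHOIS privacy)"},
    {"url": "http://example-blog.net/post-123",     "reason": "emailed twice, no response"},
    {"url": "http://example-forum.org/thread/9",    "reason": "contact form submitted, no reply"},
]

with open("unremovable_links.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url", "reason"])
    writer.writeheader()
    writer.writerows(unremovable)
```

Attaching one row per link, with the specific removal attempts made, keeps the evidence easy for a reviewer to scan.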
If anyone can point out anything else I might have missed, that would be great.
Thanks,
Scott
-
Ryan's given you a super generous answer! I wanted to add a couple of things:
You mentioned that you will attach a list of links that you couldn't get removed. It may help to go even further. What I usually do is attach a document containing a copy of each email I sent to the sites I was unsuccessful with. And if I got a negative response back, I include that email as well.
I also include screenshots of every contact form I have submitted. It may be overkill, but from Google's perspective, just saying "I tried to contact them" is not enough.
You're probably already doing this, but be super humble in your request and make sure you tell Google you are committed to following the quality guidelines from this point on. I think part of the reason Google makes webmasters go through this is that they want to be sure webmasters understand the gravity of trying to game the system with SEO tactics.
And like Ryan said, be really tough on yourself when it comes to links. I have seen a number of webmasters who say, "No! That's not an unnatural link! It came from an article that I wrote," or something like that. But in reality, almost every link you had a hand in creating is one that Google considers unnatural.
Good luck! If you are successful, it would be great for you to post about your success here in the Q&A to encourage others.
Marie
-
Great answer yet again Ryan.
Thanks for your detailed response.
Thanks,
Scott
-
Hi Scott,
Removing a manual penalty for manipulative links is a complex task, and the result for most people is to have the reconsideration request declined repeatedly. If you try another five times without changing your approach, the results are not likely to change. At a high level, the problem likely lies in one of three areas:
1. You need to start from a comprehensive list of all known backlinks to your site. The list from Google alone is not even close to enough. I use Google WMT + OSE + Raven (Majestic) + AHREFs + SEMrush + Bing. If you do not start with a comprehensive list of links, you will keep missing manipulative links, and Google will not pay any attention to your reconsideration request.
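Combining exports from several tools can be sketched as follows. This assumes each tool's export has been saved as a CSV with a `url` column (real exports name their columns differently, so adjust `url_column` per file); the normalization rules are a simplification:

```python
import csv
from urllib.parse import urlsplit

def normalize(url):
    """Normalize a backlink URL so the same page exported by two tools
    dedupes to one entry: lowercase host, strip 'www.', drop trailing slash."""
    parts = urlsplit(url.strip())
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return f"{host}{path}"

def merge_exports(files, url_column="url"):
    """Combine backlink CSV exports from several tools into one
    deduplicated, sorted list of linking URLs (first spelling kept)."""
    seen = {}
    for name in files:
        with open(name, newline="") as f:
            for row in csv.DictReader(f):
                seen.setdefault(normalize(row[url_column]), row[url_column].strip())
    return sorted(seen.values())
```

The merged list is then the starting point for classifying each link as organic or manipulative.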
2. You need to ensure your idea of a manipulative link is calibrated with Google's. The process begins with becoming intimately familiar with Google's Webmaster Guidelines. A few questions to ask about each link:
- If search engines did not exist, would this link be here?
- Who created the link and the content? If the link was created by the site owner, it would likely be considered manipulative.
- How credible is the site? The web page? The content? Is it focused on a specific topic, or is it a grab bag?
- What value does this link and page offer to users?
The above list is not comprehensive; there are other factors to weigh, and corner cases as well. What I can share is that the PA and DA of the pages involved should not be given any consideration at all. Additionally, there is no automated tool that can reliably make the organic-versus-manipulative determination for a link. I have reviewed several and, to put it nicely, they seem to offer completely false hope to desperate site owners.
3. You need to make a solid, good-faith effort to contact linking sites and request that the links be removed. Do not simply change the anchor text; that does not make the link any less manipulative. Don't give up just because the WHOIS e-mail is not valid: try the WHOIS e-mail, the site's e-mail, and the contact form (if any) on the site. If a site owner denies your removal request the first time, respond politely and ask in a different way.
I have been involved with reconsideration requests for numerous clients in your situation. Items 1 and 2 are the most common issues, and they are show-stoppers.
Good Luck.