
Google Tells Us Why "Unnatural Link Warnings" Ask For SEO Services Used

Gabriel Goldenberg

This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.

A variety of smart SEOs have been debating why Google's unnatural link warnings prompt webmasters for details on the SEO services they used. Of particular note is this excellent case study on YouMoz.

At stake in the debate: lots of time savings and a fairly easy, predictable and effective way to buy rankings and income by paying blog networks to manufacture backlinks for you.

[Image: poker chips] Caption: Thinking critically is at the core of SEO and solves high-stakes debates.

Obviously this competes with AdWords, because you're buying the same traffic, just from different parties. And it makes Google look stupid. So it's no surprise it's against the guidelines.

(It can't be about Google protecting users from user-unfriendly websites in the rankings because if the websites that ranked didn't convert - the ultimate measure of user satisfaction - no one would be buying these links.)

The debate revolves around how Google uses the data.

Side 1: Google Gets the Data to Whack-a-Mole the Blog Networks' Properties

At Warrior Forum in particular, but elsewhere too, webmasters claim that Google can't be whacking these sites algorithmically; rather, it must be acting on webmasters' reports. Their reasoning:

1) If Google were penalizing sites algorithmically, why bother asking for the data from penalized webmasters? 

2) Furthermore, they argue, an algorithmic approach would make negative SEO quite easy. (Negative SEO means violating Google's guidelines on a competitor's behalf - for example, pointing spammy paid links at their site - so that they get penalized.)

3) If the penalty is algorithmic, why haven't 100% of the networks' sites been hit? The conclusion, on this view, is that Google asks for the data to whack as large a percentage of the networks as possible. It's casting FUD (fear, uncertainty, and doubt) against a competing business model.

Side 2: Google Did it Algorithmically and Blog Networks are Dead

This side claims that Google may have obtained a seed list of sites by using this technique, but once it identified the networks' footprint, the rest was done algorithmically.
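To picture what "identifying a footprint" could look like, here is a minimal sketch in Python. Everything in it - the seed list, the footprint features, and the scoring threshold - is a hypothetical illustration of the idea, not Google's actual method:

```python
# Toy illustration only: the seed list, footprint features, and threshold
# are all hypothetical, not anything Google has disclosed.

SEED_NETWORK_SITES = {
    "blog-a.example": {"ip_block": "192.0.2", "template_hash": "t1", "outlinks_per_post": 12},
    "blog-b.example": {"ip_block": "192.0.2", "template_hash": "t1", "outlinks_per_post": 14},
}

def footprint_score(candidate, seed_sites):
    """Count how many footprint signals the candidate shares with the seed sites."""
    score = 0
    for seed in seed_sites.values():
        if candidate["ip_block"] == seed["ip_block"]:
            score += 1
        if candidate["template_hash"] == seed["template_hash"]:
            score += 1
        if abs(candidate["outlinks_per_post"] - seed["outlinks_per_post"]) <= 3:
            score += 1
    return score

def looks_like_network_member(candidate, seed_sites, threshold=4):
    """Flag a candidate whose footprint overlaps the seed list strongly enough."""
    return footprint_score(candidate, seed_sites) >= threshold

candidate = {"ip_block": "192.0.2", "template_hash": "t1", "outlinks_per_post": 13}
print(looks_like_network_member(candidate, SEED_NETWORK_SITES))  # True (score 6 >= 4)
```

The point of the sketch is simply that once a handful of confirmed network sites is in hand, anything sharing their telltale patterns can be flagged at scale without any further human reports.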

This doesn't enable negative SEO because... well, people don't say. This Warrior Forum post suggests it is possible.

From my experience with reinclusion requests, and from hearing of penalties applied at the keyword level, it appears that Google has minimized the possible impact of negative SEO and gives a site that comes clean its rankings back without too much delay.

The result is that an attempt at dislodging a competitor is expensive, short-lived, or even limited to certain keywords, so the ROI (compared to building or buying your own links) is dubious.

It's quite clever: Google has used lateral thinking to carry over a rule from the world of web and software security. There, experts agree that you can't 100% prevent getting hacked, but you can create measures to minimize the damage.

So why haven't 100% of blog network sites been hit, if it was algorithmic?

Ah - that goes to the heart of the question: Why are Google's spam hunters asking for data on the SEO services used by penalized webmasters?

Matt Cutts provided the answer to why Google seeks human spam report data in December of last year, in the context of human quality raters.

The reason why Google wants details on SEO services used for "unnatural link building" is for quality control on algorithms. Like playing hot and cold, the data tells Google how accurate its algo is at detecting paid links. As Matt said: "Rule #1: Don't muck with the data you use to evaluate algo quality."

That also tells us that the spam reports requested with unnatural link warnings are probably used to audit algo quality, rather than as "mole pointers" for an ongoing whack-a-mole game.
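To make the "hot and cold" idea concrete, here is a minimal sketch (with made-up labels and a hypothetical paid-link classifier, not anything Google has published) of how human reports could serve as a held-out evaluation set for the algorithm:

```python
# Sketch only: labels and predictions are invented to show the idea of using
# human spam reports as a held-out evaluation set for a paid-link detector.

def precision_recall(algo_flagged, human_reported):
    """Compare the algorithm's calls against the human-reported labels."""
    true_positives = len(algo_flagged & human_reported)
    precision = true_positives / len(algo_flagged) if algo_flagged else 0.0
    recall = true_positives / len(human_reported) if human_reported else 0.0
    return precision, recall

algo_flagged   = {"link1", "link2", "link3"}            # what the algo called paid links
human_reported = {"link2", "link3", "link4", "link5"}   # what the webmasters' reports say

p, r = precision_recall(algo_flagged, human_reported)
print(f"precision={p:.2f}, recall={r:.2f}")  # precision=0.67, recall=0.50
```

If the same reports were also fed straight back into penalties, they would stop being an independent yardstick for the algorithm, which is exactly what Matt's rule warns against.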

Finding this answer is another example of lateral thinking, one of the 7 principles of advanced SEO.

Some other useful / interesting items on human raters:

Whack-a-mole pic via Herman Sylvester.
