Google Manual Penalty - Dilemma?
-
Hi Guys,
A while back, my company received a 'partial match' manual penalty from Google for 'unnatural links' pointing to our site.
This glorious feat was accomplished by our previous SEO agency, which quite heavily spammed links (directories and all kinds of low-quality sites).
That being said, when the penalty hit we really didn't see any drop in traffic. In fact, not long after the penalty we launched a new website, and since then our traffic has grown quite significantly; we've doubled our total visits from before the penalty to now.
That previous SEO agency also submitted a couple of reconsideration requests (both done loosely, trying to fool Google by removing only a small number of links, then a bit more the next time when it failed; this was obviously never going to work). Since then, I have submitted a very thorough reconsideration request myself, disavowing 85 domains (every single one at the domain level rather than as individual URLs, as I didn't want to take any chances), as well as getting a fair few links removed when the webmasters responded. I documented all of this and made multiple contacts to the webmasters so I could show it to Google.
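For anyone reading along who hasn't done this before: a disavow file is just a plain text file uploaded through the Disavow Links tool in Webmaster Tools, with one entry per line and `#` for comments. The domains below are made up for illustration; disavowing at the domain level looks like this:

```text
# Links from our previous SEO agency's directory spam campaign.
# Contact attempts documented in removal-requests.xlsx (3 emails each).
domain:spammy-directory-example.com
domain:low-quality-articles-example.net

# Individual URL disavow is also possible, but domain level is safer
# when the whole site is junk:
http://borderline-example.org/our-old-press-release/
```

Lines starting with `domain:` cover every link from that host, which is why it's the "not taking any chances" option described above.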
This reconsideration request was not successful; Google made some new backlinks magically appear that I had not seen previously. But really, my main point is: am I going to do more damage by removing more and more links in order to lift the penalty? As it stands, we haven't actually noticed any negative effects from it. Perhaps that's because not long after the penalty we launched a much-improved site, which would naturally get more traffic than the old one, but overall the penalty has not made itself felt.
What do you guys think: is it worth risking a drop in rankings to remove the penalty so we don't face issues down the line, or should I go easy on the link removal in order to preserve current rankings? (I'm really interested to hear people's views on this, so please leave a comment if you can help!)
-
That's the problem... it's often hard to tell whether a link is natural or not. For example, a local directory listing might be OK, but it could be unnatural. If it helps, I wrote a Moz article that describes different kinds of unnatural links: http://moz.com/ugc/what-is-an-unnatural-link-an-in-depth-look-at-the-google-quality-guidelines
-
Thanks for your response, you've clarified a lot for me here.
Essentially, so long as only the unnatural links are removed, I should not harm my site's rankings? That is, so long as Google agrees on which links are the unnatural ones!
I'd better get to work auditing all of these links. See you again in a few years! Haha.
-
"Google made some new backlinks magically appear that I had not seen previously."
This made me chuckle. Google is a strange animal. John Mueller has said many times that looking at your links in Webmaster Tools is enough, but I will often get back example unnatural links that are not in Webmaster Tools. This is one of the reasons why, when I do a backlink audit, I combine links from a number of different sources, including OSE, Ahrefs, and Majestic.
Now, I have seen sites get penalties lifted by going on their Webmaster Tools links alone, but really it's best to pull links from multiple sources.
BUT... even when I combine every possible source I can find, I will quite often get example links back from Google that don't exist in ANY backlink checker. These are tough, but usually they contain clues that can help you find more links. For example, often when this happens the example given is a scraped version of a press release. What I'll do is take a chunk of text, put it in quotes, and search for it on Google; often I'll find 3-4 additional links that weren't in my audit list.
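That quoted-phrase trick is easy to semi-automate: split the press release into fixed-size chunks and generate the exact-match queries to paste into Google. A minimal sketch (the chunk size is an arbitrary assumption, and the sample text is made up):

```python
# Sketch: turn a press release into exact-phrase queries for hunting
# down scraped copies that backlink tools missed. Chunk length is an
# assumption; tune it so each chunk is distinctive enough.

def phrase_queries(text: str, words_per_chunk: int = 8):
    """Split text into quoted chunks suitable for exact-match searching."""
    words = text.split()
    queries = []
    for i in range(0, len(words), words_per_chunk):
        chunk = words[i:i + words_per_chunk]
        if len(chunk) >= 4:  # skip tiny trailing chunks, too generic to search
            queries.append('"' + " ".join(chunk) + '"')
    return queries

if __name__ == "__main__":
    # Hypothetical press-release excerpt
    release = ("Acme Widgets today announced the launch of its new "
               "widget polishing service for enterprise customers nationwide")
    for q in phrase_queries(release):
        print(q)
```

Each printed query goes into Google as-is; scraped copies tend to reproduce whole sentences verbatim, so exact-match chunks surface them quickly.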
Another thing you can do is periodically download new links from GWT, as new ones will often pop up even if they are years old.
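Combining those sources is mostly mechanical: export links from each tool, reduce each URL to its domain, and dedupe. A minimal sketch, assuming you've already pulled the URL lists out of each tool's export (real export formats differ per tool, so the loading step is left out):

```python
# Sketch: merge backlink URLs from several tools (OSE, Ahrefs, Majestic,
# GWT exports, etc.) into one deduped domain list for auditing.
from urllib.parse import urlparse

def extract_domain(url: str) -> str:
    """Return the bare hostname of a backlink URL, minus any 'www.'."""
    host = urlparse(url.strip()).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def combine_link_sources(*url_lists):
    """Dedupe backlink URLs from any number of sources down to unique domains."""
    domains = set()
    for urls in url_lists:
        for url in urls:
            d = extract_domain(url)
            if d:
                domains.add(d)
    return sorted(domains)

if __name__ == "__main__":
    # Hypothetical exports from two tools; note the www/non-www duplicate
    ose = ["http://www.spammydir.com/listing/42", "http://goodblog.com/post"]
    ahrefs = ["http://spammydir.com/listing/99", "http://another-dir.net/x"]
    print(combine_link_sources(ose, ahrefs))
```

One master domain list like this is also what you'd review line by line before deciding what goes into the disavow file.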
Are you going to do more harm to your site than good? That depends on how good you are at auditing links. If you're only getting rid of unnatural links, then you won't hurt your site, and you may even see an improvement in rankings: either immediately, a few weeks after the penalty is lifted, or when Penguin refreshes. But if you're guessing at your disavow decisions, then yes, disavowing good links is going to do harm to your site.
Best of luck!
-
Keep doing what you're doing. As long as you know how to properly identify if a site/link is good or bad, you shouldn't hurt your site. Better to do this work now and prevent another penalty in the future than to put it off.
RE: total backlinks: I recommend combining and deduping Open Site Explorer, Webmaster Tools, Majestic, and Ahrefs for the most thorough picture.
-
It will often take multiple requests before Google removes a manual penalty, to make sure you've put enough effort into cleaning up your link profile.
What tools did you use to find your links? It's best to use a combination of tools to find all of the possible links to your site. The number of links you remove/disavow is relative to the size of your link profile; some sites have had to remove or disavow thousands of domains.
Make sure the links you remove are those with exact-match anchor text, or those from directories, guest blogging, and the like.
It's better to remove too many links than not enough, as even having poor links will result in Google marking you down, and if you're not thorough there's every chance you could be penalized again in the future. Also make sure your reconsideration request is clear and simple and clearly demonstrates the work you have done to remove or disavow the offending links.