Google disavow and penalty lifted - please help?
-
We disavowed 80% of our backlink profile because our last SEO built cheap, nasty links, and filed a reconsideration request. (We had the Google Webmaster Tools "notice of detected unnatural links to http://www.xxx.co.uk" penalty for a year, from the 24th March 2012, but thought it best to clean up before round 2 - even though we had no real penalty impact and we did some decent link building that moved us up.)
We then received a successful "penalty lifted" note (on the 22nd of May 2013), but our rankings dropped (because the bad links had been propping us up). Since then we have built a fair few high-quality links, but our rankings do not seem to be moving much, if at all (7 weeks clear now).
Has anyone had any experience with the above? Are we in a sandbox-type situation?
Thank you for your time,
Bob
-
If you disavowed 80% of your links, and many of them were being counted by Google prior to the penalty, then you should expect your rankings to drop substantially even after the penalty is removed, because you simply have a fraction of the links you had before. Granted, the ones you disavowed were most likely very weak links, which is good.
But moving forward, what you've got to do is build new, strong, non-spammy links. And expect this to take some time, too. From when you add a new link, it takes Google a while to discover it, then recalculate the PR that flows to your site from that link. At one point, that link juice recalculation seemed to me to be taking about a month; I suspect it's a little faster now. I'm also seeing indications on several clients' sites that there's an artificial delay of 3 to 6 months when major changes happen on a site; then, with no further changes made to the site, the rankings and traffic will inch up over a couple of months.
So my advice: be patient; you need to rebuild the links you've lost (from better-quality sources, of course). And if you can create interesting, shareable content, leverage Facebook, Google+ etc. to get your site visitors to help you build some of those links (and get real traffic from the shares) on the social sites.
Related Questions
-
Recovering from Google Penguin/algorithm penalty?
Anyone think recovery is possible? My site has been in Google limbo for the past 8 months to a year or so. Like a lot of sites, we had SEO work done a while ago and had tons of links that Google now looks down on. I've worked with an SEO company for a few months now and they seem to agree Penguin is the likely culprit; we are on pages 8-10 for keywords that we used to be on page 1 for. Our site is informative and has everything intact. We deleted whatever links we could; some sites are hard to find contact information for, and some want money, so I paid a few a couple of bucks in hopes it might help the process. Anyway, we now have around 600 domains on the disavow file we put up in March-April, with another 100 or 200 added recently. If need be, a new site could be an option, but we'll wait and see if the site can improve on Google with a refresh. Anyone think recovery is possible in a situation like this? Thanks
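When disavow entries accumulate over several rounds like this (600 domains in one batch, 100-200 added later), duplicate lines are easy to introduce. A minimal sketch of merging disavow files while deduplicating entries; the file names are hypothetical, and the line-based format (one `domain:example.com` entry or full URL per line, `#` for comments) follows Google's disavow file convention:

```python
# Merge several Google disavow files, dropping duplicate entries.

def merge_disavow(*paths):
    """Return the unique entries from the given disavow files, in order."""
    seen = set()
    merged = []
    for path in paths:
        with open(path, encoding="utf-8") as f:
            for raw in f:
                line = raw.strip()
                if not line or line.startswith("#"):
                    continue  # skip blanks and comment lines
                key = line.lower()
                if key not in seen:
                    seen.add(key)
                    merged.append(line)
    return merged

if __name__ == "__main__":
    # Hypothetical file names for the March-April batch and the recent additions
    entries = merge_disavow("disavow_march.txt", "disavow_recent.txt")
    with open("disavow_merged.txt", "w", encoding="utf-8") as out:
        out.write("# merged disavow file\n")
        out.write("\n".join(entries) + "\n")
```

Uploading one merged file replaces the previous one in Google's tool, so keeping a single deduplicated master file avoids accidentally dropping older entries.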
White Hat / Black Hat SEO | xelaetaks
-
When you get a new inbound link, do you submit a request to Google to reindex the page pointing at you?
I'm just starting my link building campaign in earnest, and received my first good-quality inbound link less than an hour ago. My initial thought was that I should go directly to Google and ask them to reindex the page that linked to me... If I make a habit of that (getting a new link, then submitting that page directly to Google), would that signal to Google that this might not be a natural link building campaign? The links are from legitimate (non-paid, non-exchange) partners, which Google could probably figure out, but I'm interested to hear opinions on this. Thanks, -Eric
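Before asking Google to recrawl a linking page, it's worth confirming the link is actually live in the page's served HTML (it may be injected by JavaScript, nofollowed, or already removed). A small stdlib sketch; the URLs in the usage comment are placeholders, not real pages:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect href values from anchor tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_links_to(html, target):
    """Return True if any anchor on the page points at the target URL."""
    parser = LinkFinder()
    parser.feed(html)
    return any(target in href for href in parser.hrefs)

# Usage (hypothetical URLs): fetch the linking page, then check it:
# import urllib.request
# html = urllib.request.urlopen("http://partner.example/post").read().decode()
# print(page_links_to(html, "http://www.example.co.uk"))
```

If the link checks out, Google will normally discover it on its own crawl schedule; manually submitting every linking page is the unnatural-looking habit the poster is worried about.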
White Hat / Black Hat SEO | ForForce
-
Re-Post: Unanswered - Loss of rankings due to hack. No manual penalty. Please advise.
Sorry for reposting, but I must have accidentally marked this as answered. I am still seeking advice/solutions. I have a client whose site was hacked. The hack added a fake directory to the site and generated thousands of links to a page that no longer exists. We fixed the hack and the site is fully protected. We disavowed all the malicious/fake links, but the rankings fell off a cliff (they lost top-50 Google rankings for most of their targeted terms). There is no manual penalty set, but it has been 6 weeks and their rankings have not returned. In Webmaster Tools, their priority #1 "Not found" page is the fake page that no longer exists. Is there anything else we can do? We are out of answers and the rankings haven't come back at all. Any advice would be helpful. Thanks!
White Hat / Black Hat SEO | digitalimpulse
-
Website rankings plummeted after a negative SEO attack - help!
Hello Mozzers. A website of a new client (http://bit.ly/PuVNTp) used to rank very well. It was on the top page for any relevant search terms in its industry in Southern Ontario (Canada). Late last year, the client was the victim of a negative SEO attack. Thousands upon thousands of spammy backlinks were built (we suspect they were bought through something like Fiverr). The links came from very questionable or just low-quality sites. The backlink growth window was very small (around 2,000 every 24 hours). Since that happened, the site has all but disappeared from search results. It is still indexed, and the owner has disavowed most of the bad backlinks, but the site can't seem to bounce back. The same happened to another site they own (http://bit.ly/1tErxpu), except the number of backlinks produced was even higher. Both sites suffer from duplicate content issues and at one point (in 2012) were de-indexed due to the very spammy work of a former SEO. They came back in early 2013 and were fine for some time. Thoughts?
White Hat / Black Hat SEO | mattylac
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, Mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
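One practical wrinkle in the engineer's "randomize what is displayed" idea: if the variant is chosen randomly on each request, the same URL serves different text on every crawl. A common alternative is to pick the variant deterministically from the city/topic pair, so every page always renders the same paragraph. A sketch under that assumption; the two templates below are the engineer's illustrative examples, not production copy:

```python
import hashlib

# Illustrative template variants from the question above
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(topic, city, state):
    """Pick a stable template variant for a given location/topic pair."""
    key = f"{topic}|{city}|{state}".encode("utf-8")
    # Hash the pair so the choice is spread across variants but never changes
    idx = int(hashlib.sha1(key).hexdigest(), 16) % len(TEMPLATES)
    return TEMPLATES[idx].format(topic=topic, city=city, state=state)
```

With a handful of variants plus the genuinely location-specific content (directions, course dates), each of the 700+ pages stays consistent between crawls while avoiding a single boilerplate sentence sitewide.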
How to convert an Excel file to a txt file to send to Google Disavow
I have a disavow file made in Excel with lots of columns of information. I want to convert it to a txt file by saving it from Excel, but the resulting file comes out unreadable. Can someone help me with how to convert an Excel file into the Google disavow file format for the final import?
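The disavow file Google expects is plain UTF-8 text: one `domain:example.com` entry or full URL per line, with `#` for comment lines. Saving a multi-column sheet directly as txt keeps all the columns, which is why the result looks garbled. One approach: export the sheet to CSV from Excel, then extract just the relevant column. A sketch, assuming the domains sit in a column named "domain" (adjust the column name to your sheet):

```python
import csv

def csv_to_disavow(csv_path, out_path, column="domain"):
    """Read one column from a CSV export and write it in Google disavow format."""
    domains = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            if value:
                domains.append(value)
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("# disavow file generated from spreadsheet export\n")
        for d in domains:
            # Prefix bare domains with "domain:"; full URLs are written as-is
            out.write(d if d.startswith("http") else f"domain:{d}")
            out.write("\n")
    return domains
```

The other columns (notes, dates, contact attempts) stay in the spreadsheet; only the clean entry list goes into the txt file you upload.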
White Hat / Black Hat SEO | maestrosonrisas
-
Has my site been penalized by google
Hi all, I have noticed a sudden drop in rankings for most of my keywords on kerryblu.co.uk and was thinking the site may have been manually penalized by Google. I have not received any notification of this in Webmaster Tools, but can't think of any other reason for the loss of rankings. I have searched the web for info on this but can't find a definite answer. Is there any way of knowing for sure? At the time of the crash, the only real change I made was adding Google AdSense to my blog. Could this be responsible? Thanks for looking.
White Hat / Black Hat SEO | Dill
-
Multiple H1 tags are OK according to developer. I have my doubts. Please advise...
Hi, my very well-known and widely respected developer is using multiple H1 tags, I see - they like using them in their code, and they argue multiple H1s conform to the HTML5 standard. They are resisting a recode to one H1 tag per page. However, I know this is clearly an issue in Bing, so I don't want to risk it with Google. Any thoughts on whether it's best to avoid multiple H1 tags for Google? (Any evidence and reasoning would be great - I can then put that to my developer...) Many thanks for your help, Luke
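Whichever way the debate lands, it helps to know exactly how many H1s each template actually emits before putting numbers in front of the developer. A quick stdlib audit sketch (fetching the pages is left out; feed it the HTML however you retrieve it):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> opening tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

def count_h1(html):
    counter = H1Counter()
    counter.feed(html)
    return counter.count
```

Running this over a sample of pages gives a concrete per-template count, which is a more persuasive starting point than arguing about the spec in the abstract.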
White Hat / Black Hat SEO | McTaggart