Algorithmic penalty due to pharma hack that created drug links to home page. What to do?
-
Our site, Starcitylimo.com, was destroyed by a pharma-style hack on its WordPress installation. After rebuilding the site from scratch to remove the malware, we found that hundreds or even thousands of pharma links from overseas sites point to the home page (so we can't just 404 the pages).
Contacting the sites for removal does nothing. Adding them to the disavow file four months ago did nothing. The site now sits on page 5 for every keyword it used to rank in position 5 or better for.
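For reference, the disavow file Google accepts is just a plain-text list of `domain:` directives (one per line, with `#` comments). A minimal sketch of generating one from a list of spam referring domains; the domain names below are made-up placeholders, not real offenders:

```python
# Sketch: build a Google disavow file from a list of spam referring domains.
# The domain list here is hypothetical; in practice it would come from a
# backlink export (GWT, Majestic, Ahrefs).

bad_domains = [
    "cheap-pills-example.ru",
    "pharma-spam-example.cn",
    "viagra-links-example.info",
]

def build_disavow(domains):
    """Return disavow.txt contents using the domain: directive,
    which disavows every link from that domain, not just single URLs."""
    lines = ["# Disavow file - pharma-hack cleanup"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

print(build_disavow(bad_domains))
```

Disavowing at the domain level is generally safer than listing individual URLs when the linking sites are pure spam, since new pages on the same domains get covered automatically.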
Is this one of those situations where it's time to move on to a new domain?
-
No problem, it looks like you have a pretty involved project ahead of you. I would do a page-level audit in GWT, Majestic, and Ahrefs. Be very careful which pages you 301.
If you want to be surgical, you can try getting the good links changed to point to the new domain. That will be incredibly time-consuming.
Regarding GWT reporting, I had a client that bought a link farm and did a blanket redirect... right smack into their root domain. Ouch. Luckily they had administrative control of said link farm - when the sitewide footer links were removed - it took about a month for GWT to remove them from link reports. Mileage may vary. Part of it depends on Google knowing it's not there - at least for the purposes of GWT reporting.
They had about 2,000,000+ backlinks. A lot of them were sitewides, another portion was article spam and blog spam. Some of it was just scraped. I wanted to remove at least 500,000 (just to be cautious); they only allowed 30K. But I got the link farm out of there. The site stabilized, but it was clear they needed to remove more. Such is life.
It would have been a great recovery story.
-
Thanks Travis, wow, I had never heard that about WMT. After reading countless link-removal, penalty-removal, and SEO articles on Moz and around the web, I've never heard that the download Google gives you isn't sufficient. I know it lags way behind other tools, but wow, that's news people need to know.
It sounds like it's time to move the site and take the good links we have with it, then. More time is definitely not an option, and with the majority being foreign spam, I don't believe we can control the link removal process successfully.
Thanks Again!
-
GWT definitely doesn't show you the whole picture. Majestic SEO won't show you the whole picture. Ahrefs or OSE can't show you the whole picture either.
Google has a pretty good idea of your link inventory. They share very little. In my understanding, the spreadsheet you download is just what they've shown you until it's no longer there.
No one set of link data is ever 100% complete. It's a big internet, but the one source that appears to share the least data is Google. That's why you should use multiple sets of link data and do some spreadsheet wizardry to sort out the duplicates.
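That spreadsheet wizardry can be scripted instead. A minimal sketch of merging several tools' exports and collapsing them to unique referring domains; the file layouts and URLs here are assumptions, since every tool's CSV export differs:

```python
# Sketch: merge backlink exports from several tools and de-duplicate them
# down to unique referring domains. Column names are assumptions -- map
# each tool's CSV export accordingly.
import csv
from urllib.parse import urlparse

def load_source_urls(path, url_column):
    """Read one tool's CSV export and return the set of linking URLs."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column].strip()
                for row in csv.DictReader(f) if row.get(url_column)}

def merged_referring_domains(url_sets):
    """Collapse all linking URLs from every source into unique domains."""
    domains = set()
    for urls in url_sets:
        for url in urls:
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return sorted(domains)

# Hypothetical in-memory example: two tools, three URLs, two unique domains.
urls_from_gwt = {"http://www.spam-example.ru/page1", "http://blog-example.com/post"}
urls_from_majestic = {"http://spam-example.ru/page2", "http://blog-example.com/post"}
print(merged_referring_domains([urls_from_gwt, urls_from_majestic]))
# → ['blog-example.com', 'spam-example.ru']
```

Normalizing away the `www.` prefix matters here, because GWT and Majestic will often report the same referring domain in both forms.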
The general consensus regarding how long it takes to get a link penalty lifted is: "It takes as long as it takes." Some have taken up to a year and a few turned around fairly fast. Four months isn't exactly irregular.
Majestic is showing a lot of deleted links, so there's definitely some progress there. It's just hard to separate the wheat from the chaff at a glance. Yeah, the backlink profile was hit hard.
Regarding the 404s... letting them 404 just wastes crawl budget. It may be a better idea to 410 those pages and let the bots know they're gone for good. As for the pages that were dropped, you might have lost some pretty good links. I haven't done a page-level audit, but losing good links along with a ton of bad ones can delay the process as well.
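If the site runs on Apache (typical for WordPress hosting), a minimal `.htaccess` sketch for serving 410 Gone on the removed hack pages might look like this; the path patterns are hypothetical placeholders, not the actual hacked URLs:

```apache
# Sketch (assumes Apache + mod_alias): return 410 Gone for removed
# pharma-hack URLs so crawlers stop re-requesting them.
# These path patterns are made-up examples.
RedirectMatch 410 ^/buy-cheap-pills/.*$
RedirectMatch 410 ^/pharma-landing\.html$
```

A 410 is a stronger signal than a 404: it tells crawlers the page is intentionally and permanently gone, which tends to get the URLs dropped from the index faster.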
Someone is going to chime in about the 'foundation' of links, and they would be correct. It's a whole other screed - so I'll leave it to them. Seriously, this one looks pretty nasty. Best of luck.
-
Hello Rob, as I mentioned, hundreds of these links point to the home page. I have already changed the URLs and content on the other pages and let them 404.
If you search "WordPress pharma hack" you can read all about it. Needless to say, we fixed the hack; the problem is just the inbound links and the nasty algorithmic penalty. The notice from Google that the site was compromised was removed four weeks ago, and rankings went from beyond page 10 to mostly page 5-6.
-
GWT backlinks are only 10% of Majestic? I've never seen that data before, and it would make zero sense if Google imposes penalties on websites for links but does not provide the list of all the external links it finds. I know only a summary is shown in GWT, but the download is supposed to be all-inclusive, or so I have been led to understand...
Either way, these links are IMPOSSIBLE to remove. So if the Google Webmaster Tools disavow doesn't actually work at removing penalties (algorithmic in nature), what can we do?
-
Majestic SEO is still crawling thousands of 'new' bad links. I can't really say what Googlebot is doing. I also don't know which domains you've disavowed.
The good news is, Majestic shows that it appears to be at least ebbing. I would revisit GWT backlinks and look for anything new. Know that you're barely getting 10% of the picture, if that.
I would hate to give up on a domain that's ten years old, but it looks like there's a little more history to understand. Plug it into SEMrush and slide the scale back to 'all time'.
-
Where are the links embedded? On the actual pages of your site (internal)? On the homepage? In the back end of WordPress (as you said it was a virus)? Provide a little more detail and we may be able to find the right solution; each scenario would call for a different fix depending on the types of links pointing to the site.
Thanks, Rob