What is the best way to remove and fight back against backlink spam?
-
Removing low-quality and spam backlinks - what is the most effective clean-up process?
-
Hey Matti
Glad it helped buddy. Check out those tools, they won't do the job for you but they will certainly help out with some of the manual labour aspects.
Marcus
-
Thank you for the response, Marcus. So far it's not really that bad. I discovered some pretty bizarre links that facilitated the atrophy of our rankings, but they don't come anywhere near 90% spam. Starting over on a new domain feels like far more than I think is needed. I'll probably look at the tools you suggested and see what's out there, then.
Regards,
Matti
-
Hey Matti
In a nutshell, if it is really bad, then start again on a new domain.
What I am seeing with a few people I am helping is that where a site has had historical results but is now penalised, attempting a clean-up when the backlink profile is pretty rotten (90%+ placed links) is a pretty tough gig.
There are some tools out there that are proving useful and the pick of the bunch would be:
- rmoov
- Link Cleanup and Contact
- Remove’em
These all have pros and cons so you will likely want to use all of them.
Additionally, you will want to make sure the site is worth saving, and likely invest some time and effort in generating some honest links through some solid content marketing. Maybe build some kind of free report or something specific to the site that you can use for outreach-based link building. Do some blogging, and invest some time and effort in the quality of the site.
Additionally, if you have a penalty, be prepared to put in a few reconsideration requests, and if you intend to disavow, be thorough.
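For reference, the disavow file Google expects is just a plain text list, one entry per line, with # for comments - a minimal sketch (the domains and URL below are placeholders, not pulled from any real profile):
# placed links from low-quality directories
domain:spammy-directory-example.com
domain:paid-links-example.net
# a single URL rather than a whole domain
http://blog-network-example.org/some-page.html
Using domain: entries rather than individual URLs tends to be the thorough option when a whole domain is clearly part of a link network.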
With some experience here, you also have to ask yourself - what are you trying to save? If the answer to that question is that you are trying to save some spam links that still seem to be working at the moment, then, seriously, start again.
Without a link to the site and some research it is hard to make a call, but know this: it is a tough job to remove bad links, and unless you have a link profile with something worth saving, a new domain is likely the fastest way to sort out this mess and make sure you don't get hit again when they tighten up the link penalties down the road (you know it's going to happen).
There really is no generic answer here and every situation is different, but be sure to know what you are getting yourself into before you undertake this, and do an honest review of the site, the content and the links to make sure this is a battle you can win.
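If you want to put rough numbers on that review, a quick script over a backlink export can show how concentrated the linking domains and anchor texts are - a minimal sketch, assuming a CSV called backlinks.csv with source_url and anchor_text columns (adjust the file name and field names to whatever your tool actually exports):
import csv
from collections import Counter
from urllib.parse import urlparse

# Assumed export format: backlinks.csv with "source_url" and "anchor_text"
# columns; rename these to match your backlink tool's export.
domains = Counter()
anchors = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domains[urlparse(row["source_url"]).netloc] += 1
        anchors[row["anchor_text"].strip().lower()] += 1

print("Top linking domains:", domains.most_common(10))
print("Top anchor texts:", anchors.most_common(10))
# A profile dominated by a handful of exact-match commercial anchors or
# repeated domains usually points to placed links rather than earned ones.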
This is a good read:
http://cyrusshepard.com/penalty-lifted/
Hope that helps buddy
Marcus
Related Questions
-
Do I submit a sitemap for a highly dynamic site or not? If so, what's the best way to go about doing it?
I do SEO for an online boutique marketplace. I've been here for about 4 weeks and no one's done their SEO (they've been around for about 5 years), so there's lots to do. A big concern is whether or not to submit a sitemap, and if I do submit one, what's the best way to go about it.
Technical SEO | Jane.com
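For what it's worth, highly dynamic sites usually deal with this by generating their sitemaps automatically and submitting a sitemap index that points at them; a minimal sketch of the index format (the URLs are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
-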
Deindexed site - is it best to start over?
A potential client's website has been deindexed from Google. We'd be completely redesigning his site with all new content. Would it be best to purchase a new URL and redirect the old deindexed site to the new one, or try to stick with the old domain?
Technical SEO | WillWatrous
-
Changing the city of operation and trying to know the best way of informing Google
We have a business operating out of three cities, A, B and C, with A being the primary address; the business provides its services in B and C as well. The business has decided to shut up shop in C and add D as another city instead. Currently the URLs look like:
www.domainname.com/A/products
www.domainname.com/B/products
www.domainname.com/C/products
Please help us understand the best way to inform Google that city C is no longer operational. Do we need to set up redirects, and if so, should we redirect to the home page? Or can we just remove the city C URLs via Webmaster Tools and inform Google that way?
Technical SEO | deep_irvin
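If redirects are the route taken, the usual advice is a 301 from each retired URL to its closest live equivalent rather than the home page; a minimal Apache sketch, assuming mod_alias is available and using the placeholder paths from the question:
# Send the retired city C pages to the closest equivalent section with a 301
Redirect 301 /C/products /A/products
-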
What is the best strategy for a company in various countries?
Hello, I have to create an SEO marketing strategy for a company that provides services in Spain, Colombia and Mexico. I'm looking at two options:
- Buy different country domains (ccTLDs): this seems feasible but very expensive, and to position each domain it would need different content on each (otherwise you couldn't tell them apart, because it would be exactly the same site on every domain).
- Put each service in country folders, e.g.
www.dominio.com/mexico/training-financiero.html
www.dominio.com/espana/training-financiero.html
I understand that option 1 is no longer necessary, since you can use HTML tags within the code to tell Google that the content is targeted at customers in a different country. In principle we would use the same content, changing only a few words and, of course, the currency to suit each country. However, I believe customers might trust a domain for their own country more, and I'm also afraid Google would index it as duplicate content. Which country the main site should target, and whether that could confuse visitors, is another question.
Technical SEO | interficto
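The HTML tags referred to above are hreflang annotations, which let a single domain serve country folders without the versions competing as duplicates; a minimal sketch using the folder URLs from the question (the colombia folder and the x-default URL are assumptions, not taken from the question):
<link rel="alternate" hreflang="es-es" href="http://www.dominio.com/espana/training-financiero.html" />
<link rel="alternate" hreflang="es-mx" href="http://www.dominio.com/mexico/training-financiero.html" />
<link rel="alternate" hreflang="es-co" href="http://www.dominio.com/colombia/training-financiero.html" />
<link rel="alternate" hreflang="x-default" href="http://www.dominio.com/training-financiero.html" />
Each page would carry the full set of these tags in its head, pointing at every country version including itself.
-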
Best TLD for China
In China there are two commonly used TLDs, .cn and .com.cn. We own both versions for a new domain. Does anyone know if there is research on which one is the best TLD "in the eyes" of the search engines Baidu and Google? Or maybe there is a methodology for selecting the best one? Thanks!
Technical SEO | Paul-G
-
Removing irrelevant items from Google News?
A client wants to know if it's possible to get Google to remove stories from Google News feeds if those stories have nothing to do with the client? Any advice would be greatly appreciated. Thank you.
Technical SEO | JamesAMartin
-
What's the best way to solve this site's duplicate content issues?
Hi, the site is www.expressgolf.co.uk, an e-commerce website with lots of categories and brands. I'm trying to achieve one single unique URL for each category/brand page, to avoid duplicate content and to get the correct URLs indexed. Currently it looks like this:
Main URL:
http://www.expressgolf.co.uk/shop/clothing/galvin-green
Different versions:
http://www.expressgolf.co.uk/shop/clothing/galvin-green/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all/
Firstly, what is the best course of action to make all versions point to the main URL and keep them from being indexed - a canonical tag, noindex, or blocking them in robots.txt? Secondly, do I just need to 301 the trailing-slash URLs to the non-trailing-slash URLs? I'm sure this question has been answered before, but I was having trouble coming to a solution for this one site. Cheers, Paul
Technical SEO | paulmalin
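To illustrate the canonical option above, each variant page would carry a tag in its head pointing back at the main URL - a sketch, not taken from the live site:
<link rel="canonical" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green" />
And if you do want to 301 the trailing-slash versions to the slash-less ones in one pass, an Apache sketch (assuming mod_rewrite, and skipping real directories) looks like this:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
-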
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
- Add the meta noindex,follow tag to each URL you want de-indexed
- Use GWT to help remove the pages
- Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW */beerbottles/
or add this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question." Thanks! Dave
Technical SEO | goodnewscowboy
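For what it's worth, robots.txt rules match from the start of the URL path, so Disallow: /beerbottles/ would only catch URLs that literally begin with /beerbottles/; to catch the folder at any depth you need the wildcard form, which Google supports as an extension. A sketch of both pieces, with the noindex tag going in first:
<meta name="robots" content="noindex, follow" />
User-agent: *
Disallow: /*/beerbottles/
Blocking crawling before Googlebot has recrawled and seen the noindex would just freeze the pages in the index, so the robots.txt rule should only go live once the URLs have actually dropped out.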