Best way to clean up a nasty backlink profile?
-
A new client of mine sadly has a TON of terrible links (3,800 links from 1,500 domains) pointing to landing pages that were created specifically to manipulate search engines. Besides contacting these sites and asking to have the links removed, the only solution I can think of is to delete these pages and let them 404. Obviously I'm not thrilled about that, but I'm not sure what else to do. Does anyone have any other ideas for how to clean up this backlink profile? Thanks
-
Thank you everyone for the comments and confirmation. Now I just have to break the news to the client that we are going to have to delete those pages haha. Wish me luck
-
Definitely remove the pages from the server and let them 404. You're lucky they linked to internal pages and not the homepage, or you'd have many, many hours of webmaster outreach on your hands.
This is a simple fix. 404s are fine as long as they aren't the result of your own internal links pointing to pages that no longer exist; external sites linking to URLs on your site that now 404 won't hurt you.
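Once the doorway pages are deleted, it's worth confirming each old URL really does return a 404 (or 410 "Gone"). A minimal sketch, assuming you have the list of deleted URLs from your own records; the example URLs and helper names here are made up for illustration:

```python
# Sketch: confirm deleted pages now return 404/410.
from urllib.request import urlopen
from urllib.error import HTTPError

def status_of(url):
    """Return the HTTP status code for a URL."""
    try:
        return urlopen(url).getcode()
    except HTTPError as e:
        # urlopen raises on 4xx/5xx; the code is on the exception.
        return e.code

def still_live(urls, get_status=status_of):
    """Return the deleted pages that are NOT yet serving 404/410."""
    return [u for u in urls if get_status(u) not in (404, 410)]
```

Anything `still_live` returns is a page that hasn't actually been removed (or is being redirected), which would undermine the cleanup.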
-
No, 404 errors are a natural part of the Internet.
-
If you have 404 errors when Google crawls your site, does this not also impact your rankings?
-
Nakul,
Thank you for clarifying this question. We are in the same situation. It is so good to have someone in the field offering advice.
-
Honestly, I think your gut is telling you the right thing, and thankfully you answered the question yourself.
Were those pages created to manipulate rankings (as in doorway pages)?
If that's the case, and there are external links to those pages, I would do exactly what you're thinking: delete those pages and let them return a 404.
Did your client also receive Google's unnatural links penalty notice? If so, it's a no-brainer to do this right away. You could also try to find out whether there are link networks involved and whether certain webmasters control multiple websites, especially given 3,800 links from 1,500 domains. I would at least try to get "some" of them removed, or at least make some effort. If there's a way to send a blanket email to all 1,500 domains, it's well worth at least trying.
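To spot those networks, one quick approach is to group the backlink export by linking domain and see which domains contribute the most links. A minimal sketch, assuming you can export the 3,800 backlink URLs to a list; the sample URLs are invented:

```python
# Sketch: count how many links each linking domain contributes,
# to surface webmasters who control multiple sites in the profile.
from collections import Counter
from urllib.parse import urlparse

def domains_by_link_count(backlink_urls):
    """Return (domain, link_count) pairs, highest counts first."""
    counts = Counter(
        urlparse(u).netloc.removeprefix("www.")  # normalize www. prefix
        for u in backlink_urls
    )
    return counts.most_common()
```

Domains at the top of that list are the highest-value outreach targets: one removal request there can kill many links at once.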
I hope this helps.