Suggestions on Link Auditing a 70,000 URL list?
-
I have a website with nearly 70,000 incoming links, since it's a somewhat large site that has been online for 19 years.
The rate I was quoted for a link audit from a reputable SEO professional was $2 per link, and clearly I don't have $140,000 to spend on a link audit!
I was thinking of asking you guys for the gold-standard link-auditing tutorial or checklist and doing it myself. But then I thought it might be easier to shorten the list first by knocking out all the "obviously good" links. My only concern is being 100% certain that they really are good links.
Is there an "easiest approach" to take for shortening this list, so I can give it to a professional to handle the rest?
-
Hi! - I wrote this guide a few years ago on penalty recovery which may help you as it contains a lot of methods around auditing the links - https://mza.bundledseo.com/blog/ultimate-guide-to-google-penalty-removal
If we were to approach a project with 70k URLs, we'd take the following steps:
- Pull all the URLs into a Spreadsheet
- Split the URLs into domains
- Filter the URLs and search for common spammy words, e.g. 'Link', 'Best', 'Free', 'Cheap', 'Dir', 'SEO', etc. (mark as spam accordingly)
- Run contact finding across all URLs using a tool such as URL Profiler with Whois Lookups
- Filter by contact name and find duplicates (mark as spam accordingly)
- Filter by website type and mark as spam accordingly
- Manually check remaining links
By working through the list domain by domain, you'll rule out thousands of spammy links very quickly, though 70k will ultimately take a few solid days of work.
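To give a sense of how little code the first few steps need, here is a minimal Python sketch of the domain-splitting and spammy-word filtering. The URLs and flagged tokens are made-up examples; in practice you'd load your full backlink export into the list and tune the word list to your niche:

```python
from urllib.parse import urlparse
from collections import defaultdict

# Hypothetical example URLs; in practice, load the full export
# from your backlink tool into this list.
urls = [
    "http://cheap-seo-links.example/best-free-dir/page1",
    "http://cheap-seo-links.example/best-free-dir/page2",
    "https://industry-news.example/article-about-you",
]

# Common spammy tokens from the checklist above; extend to suit your niche.
SPAM_WORDS = {"link", "best", "free", "cheap", "dir", "seo"}

# Step 2: split the URLs into domains.
by_domain = defaultdict(list)
for url in urls:
    by_domain[urlparse(url).netloc].append(url)

# Step 3: flag any domain whose URLs contain a spammy token,
# so you review whole domains rather than 70k individual URLs.
flagged = {
    domain: domain_urls
    for domain, domain_urls in by_domain.items()
    if any(word in u.lower() for word in SPAM_WORDS for u in domain_urls)
}
```

Grouping by domain first is what makes this fast: one decision per domain can clear hundreds of URLs at once.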
Hope this helps,
Lewis
-
Have you looked at www.monitorbacklinks.com? It's a good tool.
-
Hello,
Although it's important to do a link audit if you feel you have been penalized, for some sites a link audit isn't necessary. That being said, if you feel you need a link audit, there are a few options. Ideally, you would go through each link and review how it may be impacting your site, but site owners often don't have the time to do this.
- Review obvious links - Grab 50-100 links at a time and take a quick glance at each one to decide whether it belongs on a list of potentially bad links. This way you can quickly rule out links you know are not hurting your rankings. Over time you can slowly work through your list and pin down which links are bad.
- Focus on spam analysis - Run your site through Moz's Open Site Explorer and review the spam analysis. You're not going to catch every single link here, but you can get an idea of which links are lower quality.
- Look into other companies - $2 per link is quite high, and there are other companies out there that will do a link audit, removal, and disavow for much less. If you would like a quote, please contact us. Look into multiple options; don't get sold on just what one place tells you.
Hope this is helpful, if you have any additional questions please feel free to ask.
Chris
-
$2 per link is very expensive when you are looking at so many, especially as a big part of this can be automated (hint: this should cost you no more than about $5-$10k if outsourced).
Linda has given you some good tips there, but I do agree that you need to tread carefully because you can often go too far and end up jumping out of the frying pan and into the fire.
It really does help to first gather all of the links from as many sources as you can and, as already mentioned, create your de-duped list. Depending on who you speak to at this point, there are different ways to go through the data and start segmenting the links into those you know are dangerous, those that are perhaps a bit of a grey area, and those that are safe.
Cheers,
Andy
-
I concentrate on the "most normal or typical sites will not need to use this tool" part, myself. (Though it sounds like you may not fall into that category.)
So then it's back to downloading as comprehensive a list of links as you can by using various sources and looking them over. (Also, in the past I have used LinkResearchTools to get an overview--it isn't cheap but it is a lot less than $140,000.)
-
Yes. We have confirmed with Sucuri that there was a concerted, intentional spam campaign against our site in 2013 that has since destroyed our rankings. Though Google hasn't given us any warnings, Sucuri had us on a blacklist because of it, and was kind enough to remove us without any cost or obligation on our part to sign up. They also provided us with a list of some of the most offending links so I could disavow them.
With up to 70,000 links in total, I am confident there are more, and to be honest, I see no reason to "leave some", or to leave any. I believe the focus of Google's warning should be on the phrase "...if used incorrectly". That means: simply use it correctly, and disavow bad links, period. That's my take, at least.
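For anyone following along, once you've settled on what to disavow, the file Google accepts is just a plain UTF-8 text file, one entry per line: `#` lines are comments, `domain:` entries disavow every link from a site, and a bare URL disavows a single page. A hypothetical example (the domains here are made up):

```text
# Spam campaign identified in 2013; site owner never responded to removal requests
domain:cheap-links.example

# Single bad page on an otherwise fine site
https://blog.example/spammy-guest-post.html
```

Disavowing at the domain level is usually the safer bet for spam networks, since they tend to link from many pages on the same site.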
-
First, are you sure you need a link audit? Google is pretty good at ignoring regular spammy links that get picked up over time by large sites, as they say in their "Disavow backlinks" help page.
If you think there is a cause for concern, Moz's own Open Site Explorer can give you a list of incoming links that includes a spam score for those links, which can be used as a first pass.
The general drill for a manual link audit is to find all of the links you can (search console, moz, ahrefs, majestic, etc.) and create a de-duped list. From there, the "definitely good links" are usually easy to spot--you will recognize them from your industry or from other authoritative sources. And you will probably recognize the spammy "Get Rich/Viagra" backlinks as well. (If you sort your list by domain, it is easier to pick them out as a group.)
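The gather/de-dupe/sort-by-domain drill above can be sketched in a few lines of Python. The two link lists are hypothetical stand-ins; in practice you'd read them from the CSV exports each tool gives you:

```python
from urllib.parse import urlparse

# Hypothetical link exports from two tools; in practice, read these from
# the CSVs that Search Console, Ahrefs, Majestic, etc. give you.
search_console = ["https://a.example/post", "https://b.example/page?ref=1"]
ahrefs = ["https://A.example/post", "https://c.example/dir/page"]

def normalize(url):
    # Lowercase the host and drop the query string so trivial variants de-dupe.
    p = urlparse(url)
    return f"{p.scheme}://{p.netloc.lower()}{p.path}"

# De-dupe across all sources, then sort by domain so each site's links
# sit together for manual review.
deduped = sorted({normalize(u) for u in search_console + ahrefs},
                 key=lambda u: (urlparse(u).netloc, u))
```

Sorting by domain (rather than by raw URL) is the key convenience: the obvious good and obvious bad sites each form one contiguous block in the list.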
The rest are the ones to look at more closely.
But as I said to start, unless you think you are being penalized, tread lightly when it comes to disavowals.
To quote from Google [about disavowal]:
"This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool."