Tools For Analysing Backlinks
-
Does anyone know of a good tool that analyses backlinks?
I have already tried Link Research Tools and found it very useful, but it is very expensive. Does anyone know of an alternative that does the same thing, i.e. one that will tell you which links are harming your site?
-
Hi Brian,
I use http://linkrisk.com/; it is really good and has been endorsed by many of us in the industry. There are different packages available depending on how big your backlink profile is, but it categorises links into suspect, high, neutral, reduced and low risk, which helps you identify the links you need to concentrate on. I'd recommend you have a look at it.
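To make the bucketing idea concrete, here is a minimal sketch of how links might be sorted into the risk categories that reply mentions. This is a hypothetical illustration, not LinkRisk's actual algorithm: the toxicity scores, thresholds and example URLs are all made up.

```python
# Hypothetical sketch: bucket backlinks into the risk categories
# described above (suspect / high / neutral / reduced / low).
# The scores and thresholds are invented for illustration only.

def bucket_link(score):
    """Map a 0-100 toxicity score to a risk category."""
    if score >= 80:
        return "suspect"
    if score >= 60:
        return "high"
    if score >= 40:
        return "neutral"
    if score >= 20:
        return "reduced"
    return "low"

# Made-up (url, score) pairs standing in for an audit tool's export.
links = {
    "http://spammy-directory.example/page": 92,
    "http://industry-blog.example/post": 15,
    "http://expired-domain.example/": 55,
}

by_bucket = {}
for url, score in links.items():
    by_bucket.setdefault(bucket_link(score), []).append(url)
```

Whatever tool you pick, the useful part is exactly this grouping: it tells you which small set of "suspect" links to review by hand first.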
-
I had a quick look at all the sites you have suggested, but there doesn't seem to be one that actually tells you in black and white whether your backlinks are good or bad.
Does anyone know of any sites that provide that?
-
The only other tool I know of that is similar to the Link Detox tool (which is only $40 per run) is remove'em, which is $250 per site.
-
Hi
On Moz there's Open Site Explorer for backlink analysis
Also look at aHrefs & MajesticSEO
All Best
Dan
Related Questions
-
Google webmaster tools hiccup?
Our flagship website, up until March 16, was getting 1,600 impressions and 300 branded clicks per day according to GWT. After 3/16, branded search fell to 300 impressions and 25 clicks per day. Our rankings haven't changed, and neither has our traffic; we would definitely notice the decline in GA and Core Metrics, and both are running about the same. According to GWT, 75% fewer people started searching for our brand on 3/16, but all of our other metrics indicate otherwise. Has anyone seen this before? Is it a tracking issue on our side?
Reporting & Analytics | AMHC0
Changing URL Parameters in Webmaster Tools
We have a bit of a conundrum. Webmaster Tools is telling us that it is crawling too many URLs: "Googlebot found an extremely high number of URLs on your site: http://www.uncommongoods.com/". In its list of example URLs, all of the URLs have tons of parameters. We would probably be OK telling Google not to index any of the URLs with parameters. We have a great URL structure: all of our category and product pages have clean links (no parameters), and the parameters come only from sorts and filters. We don't need Google to index all of these pages. However, Google Analytics is showing us that over the last year we received a substantial amount of search revenue from many of these URLs (800+ of them converted). So Google is telling us it is unhappy; we want to make Google happy by ignoring all of the parameter URLs, but we're worried this will kill the revenue we're seeing. Two questions here: 1. What do we have to lose by keeping everything as-is? Google is giving us errors, but other than that, what are the negative repercussions? 2. If we were to de-index all of the parameter URLs via Webmaster Tools, how much of the revenue would likely be recovered by our non-parameter URLs? I've linked to a screenshot from Google Analytics: ArxMSMG.jpg
Reporting & Analytics | znotes0
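Before de-indexing anything, it's worth quantifying how much revenue actually sits on parameterised URLs. A minimal sketch, assuming a landing-page/revenue export from analytics (the rows below are invented stand-ins): split URLs by whether they carry a query string, then sum revenue per group.

```python
from urllib.parse import urlparse

# Hypothetical rows from an analytics export: (landing URL, revenue).
rows = [
    ("http://www.uncommongoods.com/gifts", 1200.0),
    ("http://www.uncommongoods.com/gifts?sort=price&view=48", 75.5),
    ("http://www.uncommongoods.com/product/mug", 300.0),
    ("http://www.uncommongoods.com/product/mug?ref=home", 20.0),
]

def has_parameters(url):
    """True if the URL carries a query string (sort/filter parameters)."""
    return bool(urlparse(url).query)

clean_revenue = sum(rev for url, rev in rows if not has_parameters(url))
param_revenue = sum(rev for url, rev in rows if has_parameters(url))
```

If the parameterised share turns out to be small relative to the clean-URL revenue, the de-indexing decision gets much easier.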
Webmaster tools crawl errors
Hi there, I've been tracking my Webmaster Tools crawl errors for a while now (6 months) and I'm noticing that some pages that have long been 404 are still popping up in the crawl errors. Those pages have no XML sitemap data, and the pages linking to them are also long-gone 404s. The pages return a 404 error page plus a redirect to the homepage, and Google still reports them with old cached content. Does anyone have a clue why this is happening?
Reporting & Analytics | Or.Shvartz0
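One thing worth checking here is whether those URLs are behaving as "soft 404s" (a redirect to the homepage instead of a clean 404), since Google tends to keep re-crawling those. A small sketch of the classification logic, working from redirect chains you'd collect with a fetch tool; the function and its labels are hypothetical, not anything Google exposes:

```python
def classify_crawl_error(status_chain, final_url, homepage):
    """Classify a crawl-error URL from the HTTP statuses seen while
    following it. Hypothetical helper: in practice you'd gather the
    status_chain by fetching the URL (e.g. with urllib) and recording
    each response code in order."""
    if 404 in status_chain:
        return "hard 404"
    if any(300 <= s < 400 for s in status_chain) and final_url == homepage:
        return "soft 404 (redirects to homepage)"
    if status_chain and status_chain[-1] == 200:
        return "resolves OK"
    return "unknown"
```

A page that 301s to the homepage instead of returning a plain 404 is exactly the pattern described in the question, and serving a real 404 (or 410) usually makes the reports clear faster.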
Google Webmaster Tools is showing wrong data - help?
Hey all, I'm seeing some weird problems with Webmaster Tools. Specifically: we've submitted a sitemap with 174k URLs, but according to the WMT dashboard only 21 are indexed, even though a site:sitename.com search returns thousands of pages. Why is WMT showing only 21 indexed pages? Yet if I go to Health -> Index Status, it shows nearly 199k URLs indexed, which seems consistent with the site:sitename.com search on Google. Also, under "Search Queries" it shows "no data available"; not sure why, as it's linked to the proper Google Analytics account, which has keyword data. Any ideas what I'm doing wrong here? Thanks.
Reporting & Analytics | chimptech0
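When the "submitted" and "indexed" counts disagree this badly, a useful first step is to count what the sitemap actually contains, independent of what WMT reports. A minimal sketch using Python's standard XML parser; the tiny inline sitemap and `sitename.example` domain are stand-ins for the real 174k-URL file:

```python
import xml.etree.ElementTree as ET

# A tiny inline sitemap standing in for the real 174k-URL file.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://sitename.example/</loc></url>
  <url><loc>http://sitename.example/page-1</loc></url>
  <url><loc>http://sitename.example/page-2</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
```

If the parsed count doesn't match what you think you submitted (malformed entries, wrong host, http vs https), that alone can explain a tiny "indexed from sitemap" figure.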
Backlink Profile in Webmaster Tools
The quest for the "accurate" backlink profile never ends. We now have Webmaster Tools to help us with this search. My question is whether the backlink profile according to WT is the backlink profile according to Google. Do Google's search algorithms and WT sync up? If that's the case, forget OSE and MajesticSEO; WT will be the main backlink analysis tool I use going forward. Thanks for any insight.
Reporting & Analytics | b40040400
Fb, twitter, etc stats tool?
I once saw a tool (from Distilled, I think?) where you would input your blog post or page URL, and it would give you a table showing how many Facebook likes, Tweets, etc that page has gotten. Does anyone have the URL for this or a similar tool?
Reporting & Analytics | AdamThompson0
How do you set www vs non-www in Bing Webmaster Tools?
A site of mine is showing as example.com instead of www.example.com in Bing and Yahoo. I know where to set the preferred domain to www in Google Webmaster Tools, but I can't seem to find that function in Bing Webmaster Tools. Can someone please point me in the right direction?
Reporting & Analytics | DentalPlanSEO0
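As far as I know, Bing Webmaster Tools has no preferred-domain toggle comparable to Google's, so the usual approach is to pick the canonical host server-side and 301 the other one to it. A minimal sketch for Apache with mod_rewrite enabled, using `example.com` as a placeholder for the real domain:

```apache
# Force the www host with a site-wide 301 redirect
# (Apache, assuming mod_rewrite is available).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Once every engine sees only one host answering with 200s, Bing and Yahoo generally consolidate on it without any dashboard setting.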
Why are Seemingly Randomly Generated URLs Appearing as Errors in Google Webmaster Tools?
I've been confused by some URLs that are showing up as errors in our GWT account. They seem to be randomly generated alphanumeric strings that Google is reporting as 404 errors. The pages do 404, because nothing ever existed there or was linked to. Here are some examples, just off of our root domain: /JEzjLs2wBR0D6wILPy0RCkM/WFRnUK9JrDyRoVCnR8= /MevaBpcKoXnbHJpoTI5P42QPmQpjEPBlYffwY8Mc5I= /YAKM15iU846X/ymikGEPsdq 26PUoIYSwfb8 FBh34= I haven't been able to track down these character strings in any internet index or anywhere in our source code, so I have no idea why Google is reporting them. We've been pretty vigilant lately about duplicate content and thin content issues, and my concern is that there are an unspecified number of URLs like this that Google thinks exist but don't really. Has anyone else seen GWT reporting errors like this for their site? Does anyone have any clue why Google would report them as errors?
Reporting & Analytics | kimwetter0
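Those strings look like base64-style tokens (note the trailing `=` padding), which sometimes leak into crawl data from scripts or mangled links. Whatever the cause, it helps to separate them from real 404s in the export. A heuristic sketch; the regex, length threshold, and sample paths are assumptions for illustration, not a definitive detector:

```python
import re

# Heuristic: a path segment of 16+ base64-style characters,
# optionally ending in '=' padding, is probably a stray token,
# not a real page. The threshold is an arbitrary guess.
BASE64ISH = re.compile(r"^[A-Za-z0-9+/ ]{16,}={0,2}$")

def looks_generated(path):
    """Flag crawl-error paths whose every segment looks like a
    random base64-style string."""
    segments = [s for s in path.strip("/").split("/") if s]
    return bool(segments) and all(BASE64ISH.match(s) for s in segments)

# One token path from the question, plus a hypothetical real URL.
errors = [
    "/JEzjLs2wBR0D6wILPy0RCkM/WFRnUK9JrDyRoVCnR8=",
    "/category/red-mugs",
]
flagged = [p for p in errors if looks_generated(p)]
```

Filtering the export this way makes it easy to confirm the junk URLs return clean 404s (harmless) and focus any real cleanup on the remainder.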