Non-Recognition of Links
-
Hi All,
I asked about a client last month and have had to do some other digging to try to find out what's going on with its Google rankings.
According to our link-building spreadsheet, we have around 50 links (from 50 domains) in the process of being actioned, and a large proportion of these are now live.
There are two questions:-
1. Open Site Explorer only recognises 3 linking domains. I know the other domains exist and are pointing at the site (mostly 'followed'), so what could be the reason OSE doesn't recognise them?
2. What can be done to make these external links more easily discoverable by OSE and, presumably, other bots?
Other Points:-
1. I initially thought a crawl-blocking issue might be causing the ranking problems, but the Bing/Yahoo rankings are slowly dragging themselves upwards.
2. Robots.txt is not blocking any part of the site
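For reference (this is a generic sketch, not the client's actual file), a robots.txt that blocks nothing at all looks like this:

```text
User-agent: *
Disallow:
```

An empty Disallow value allows every crawler everywhere, whereas "Disallow: /" would block the entire site, so it's worth confirming the live file matches the first form.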
3. Pro on-site analysis for the target keyword is 'A'
4. The website's stats per OSE are better than those of some competitors in the top 20, except on the root domain issue, which is why the point above is important.
Link building for other clients has worked really well without hiccups and with gradual recognition in general, so any tips from more experienced folks out there would be gratefully received.
Many thanks,
Martin
-
Hi Martin,
You might find it useful to take a look at the Linkscape Update Schedule in case timing is a factor.
I believe Rand outlined the recent changes to the indexing rationale in the webinar Using Open Site Explorer to Uncover New Marketing Opportunities, but if you still have questions then, as Brian suggested, it may be a good idea to lodge a ticket or email the Help Team: help [at] seomoz.org.
Hope that helps,
Sha
-
Hey, if the page you got the link on was interesting enough that you wanted a link on it, then what harm is there in letting the world know about that resource via Twitter, Facebook, or whatever other service you choose? And if it's not worth talking about, or you would be embarrassed to speak of it, then how "quality" was that link anyway?
On the OSE Catch-22, gotcha... all I can think of is that perhaps the low-quality sites are not always re-crawled with each update, and thus the new links are not picked up. An SEOmoz staffer with intimate knowledge of the crawl behaviour could better answer that one, though.
Brian
-
Hi guys,
Thanks for the feedback so far and I will be definitely checking GWT and maybe even tweeting out the links. I did think that seemed a little bit... you know, false - but I guess it's just ensuring Google takes note of the actual page? What do people think? I'm unwilling to Facebook them out, because that's even more 'in your face' and I'm unwilling to SPAM out 50 domains just to get them indexed. Advice welcomed on these points.
@Brian - yes, I suppose they could be lower-quality domains, but equally many were pulled from competitor link data in OSE, so Catch-22?
@Theo - I will double-check
@Ross - firmly NO to black hat. I don't do this anyway, but something is already hurting the SEO, and going down that route could permanently jeopardise the site, which is not what the client's paying for.
-
Like Theo said, I would start with Webmaster Tools (Links to your site > All domains). If the links are in there, Google knows about them, and if they have any value to pass through, they are passing it.
One other quick note: if you know the pages you are getting links from are all index, follow pages, you may want to double-check that they have actually been indexed (Google search for site:www.the-exact-domain.com/and-page-url.html). If you get no results back, then you know those pages are not in the index (not found yet, or otherwise dropped).
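If you have a whole spreadsheet of link URLs to check, one shortcut (a rough sketch; the URLs below are made up for illustration) is to generate the site: query for each linking page programmatically and then run them through Google one by one:

```python
from urllib.parse import urlparse

def site_query(url):
    """Build a Google `site:` query targeting one exact page."""
    parts = urlparse(url)
    # site: accepts a domain plus path, e.g. site:www.example.com/page.html
    return "site:" + parts.netloc + parts.path

# Hypothetical linking pages pulled from a link-building spreadsheet
links = [
    "http://www.example.com/great-resource.html",
    "http://blog.example.org/reviews/client-mention.html",
]

for link in links:
    print(site_query(link))
```

A query that returns no results suggests the linking page has not been indexed yet (or has been dropped).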
On the OSE thing, if I am remembering correctly, Rand said something about how they were focusing the crawl, pulling in fewer low-quality sites. Could it be that the domains you are getting links from are low quality?
Brian
-
Hi Martin,
Although OSE is an awesome tool, it is still in its infancy and may not yet have crawled the links you are talking about. Another way to check the links is to have a look via Majestic SEO; they have a much bigger index than OSE and tend to show a good deal more links.
I would also have a look at the Google Webmaster Tools and see if the links are present in there.
If you are worried about whether the links have been crawled and indexed by Google, take the URL and run it through Google itself with the site: command. If it does not turn up, there is a chance it has not been indexed. I believe a site: query that returns no results may prompt Googlebot to visit and crawl that URL; I can't confirm this is true, but it seems plausible.
If you want to be doubly sure your links are getting crawled, you can encourage a crawl by Google by bookmarking the page through a bookmarking service or sharing it on a social network.
***** WARNING: MESSY BLACK HAT TACTICS COMING UP *****
And if you really want to give it a good ole kick up the jaxie, you can load up an automatic bookmarking tool and bookmark the URL your link is on across a couple of hundred domains. Problems with this method include:
- you need to buy spammy software like Bookmark Demon
- you are in effect creating a link wheel which may devalue your efforts
- it sticks out like a sore thumb
- links on bookmarking sites drop off the link graph or get devalued very quickly
However, the positive of this technique is that your link will be crawled and indexed, and it will have another couple of hundred links pointing at it... for a while.
If you are working with a client, I would recommend just running it through Facebook or tweeting out the link, and staying away from forcing any crawls. However, if it is the middle of November and you have a Christmas shop that needs to rank quickly, get that black hat on.
Hope that helps.
-
The fact that OSE doesn't pick up a link doesn't necessarily mean the link isn't 'active' and giving your site value. Even though Linkscape captures a vast number of URLs, it only crawls a portion of the web, most likely working from the bigger pages down. If many of the links to your site are coming from smaller / less powerful domains, they might not (yet) have been picked up by Linkscape.
Try looking in Google Webmaster Central to see if the links are included there. If Google lists them as links, they are very likely to be counted as well.