Non-Recognition of Links
-
Hi All,
I asked about a client last month and have had to do some other digging to try to find out what's going on with its Google rankings.
According to our link-building spreadsheet, we have up to 50 links (from 50 domains) in the process of being actioned, and a large proportion of these are now actually live.
There are two questions:-
1. Open Site Explorer only recognises 3 linking domains. As I know other domains exist and are pointing at the site (mostly 'followed'), what could be the reason OSE doesn't recognise them?
2. What can be done to encourage these external links to be picked up more readily by OSE and, presumably, other crawlers?
Other Points:-
1. I initially thought a crawl-blocking issue might be behind the rankings, but Bing/Yahoo rankings are slowly dragging themselves upwards.
2. Robots.txt is not blocking any part of the site.
3. The Pro on-page analysis grade for the target keyword is 'A'.
4. The website's stats per OSE are better than some competitors' in the top 20, except on linking root domains, which is why the point above is important.
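On point 2, the robots.txt claim is easy to double-check programmatically with Python's standard `urllib.robotparser`. This is a minimal sketch: `example.com` is a placeholder for the client's domain, and the rules are parsed inline for illustration; against the live site you would load the real file with `set_url()` and `read()`.

```python
from urllib.robotparser import RobotFileParser

# Rules parsed inline for illustration; against the live site you would use:
#   rp.set_url("http://www.example.com/robots.txt"); rp.read()
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",  # an empty Disallow directive blocks nothing
])

# True means crawlers are allowed to fetch the page
print(rp.can_fetch("*", "http://www.example.com/some-page.html"))
```

Running this against the client's actual robots.txt for a handful of ranking pages would rule crawl-blocking in or out in seconds.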
Link building for other clients has worked really well, without hiccups and with gradual recognition, so any tips from more experienced folks out there would be gratefully received.
Many thanks,
Martin
-
Hi Martin,
You might find it useful to take a look at the Linkscape Update Schedule in case timing is a factor.
I believe Rand outlined the recent changes to the indexing rationale in this webinar (Using Open Site Explorer to Uncover New Marketing Opportunities), but if you still have questions then, as Brian suggested, it may be a good idea to lodge a ticket or email the Help Team at help [at] seomoz.org.
Hope that helps,
Sha
-
Hey, if the page you got the link on was interesting enough that you wanted a link on it, then what harm is there in letting the world know about that resource via Twitter, Facebook, or whatever other service you choose? And if it's not worth talking about, or you would be embarrassed to speak of it, then how "quality" was that link anyway?
On the OSE Catch-22, gotcha... all I can think of is that perhaps the low-quality sites are not always re-crawled with each update, and thus the new links aren't picked up. An SEOmoz staffer with intimate knowledge of the crawl behaviour could better answer that one, though.
Brian
-
Hi guys,
Thanks for the feedback so far; I will definitely be checking GWT and maybe even tweeting out the links. I did think that seemed a little bit... you know, false, but I guess it's just ensuring Google takes note of the actual page? What do people think? I'm unwilling to Facebook them out, because that's even more 'in your face', and I'm unwilling to spam out 50 domains just to get them indexed. Advice welcomed on these points.
@Brian - yes, I suppose they could be coming from lower-quality domains, but equally many have been pulled from competitors' link data in OSE, so it's a Catch-22?
@Theo - I will double-check
@Ross - firmly NO to black hat. I don't do this anyway, but something's already hurting the SEO, so going down that route could permanently jeopardise the site, and that's not what the client's paying for.
-
Like Theo said, I would start with Webmaster Tools (Links to your site > All domains). If the links are in there, Google knows about them, and if they have any value to pass through, they are passing it.
One other quick note: if you believe the pages you are getting links from are all indexable, followed pages, you may want to double-check that they have actually been indexed (Google search for site:www.the-exact-domain.com/and-page-url.html). If you get no results back, then you know those pages are not in the index (not found yet, or otherwise dropped).
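With 50 domains in the spreadsheet, checking each linking page by hand gets tedious. Here is a minimal stdlib-only sketch of the first half of that check: given the HTML of a page that is supposed to link to you, confirm the link exists and whether it is followed. `client-site.com` is a placeholder; in practice you would fetch each linking URL from the spreadsheet and feed its HTML through this.

```python
from html.parser import HTMLParser

class BacklinkFinder(HTMLParser):
    """Collect every <a href> pointing at a target domain, noting
    whether the link carries rel="nofollow"."""
    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.links = []  # list of (href, is_followed) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        if self.target_domain in href:
            rel = (attrs.get("rel") or "").lower()
            self.links.append((href, "nofollow" not in rel))

def check_page(html, target_domain):
    parser = BacklinkFinder(target_domain)
    parser.feed(html)
    return parser.links

# Hypothetical linking page: one nofollowed link, one followed link
page = ('<p><a href="http://www.client-site.com/" rel="nofollow">x</a>'
        '<a href="http://www.client-site.com/page">y</a></p>')
print(check_page(page, "client-site.com"))
```

An empty result for a page that should carry your link means the link was never placed (or was removed), which is worth knowing before worrying about crawlers at all.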
On the OSE thing, if I am remembering correctly, Rand said something about how they were focusing the crawl and pulling in fewer low-quality sites. Could it be that the domains you are getting links from are low quality?
Brian
-
Hi Martin,
Although OSE is an awesome tool, it is still in its infancy and may not yet have crawled the links you are talking about. Another way to check the links is to have a look via Majestic SEO; they have a much bigger index than OSE and tend to show a good deal more links.
I would also have a look at the Google Webmaster Tools and see if the links are present in there.
If you are worried about the links being crawled and indexed by Google, then take the URL and run it through Google itself with the site: command. If it does not turn up, there is a chance it has not been indexed. I believe a site: query that returns no results may prompt Googlebot to crawl the URL; I can't confirm this is true, but it seems plausible.
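The site: check above can be batch-prepared for all 50 URLs rather than typed one at a time. This stdlib-only sketch just builds the query URL for each page; the example URL is the one from Brian's reply, and the spreadsheet loop is hypothetical.

```python
from urllib.parse import quote, urlsplit

def site_query(url):
    """Build a Google 'site:' query URL that checks whether one exact page is indexed."""
    parts = urlsplit(url if "://" in url else "http://" + url)
    target = parts.netloc + parts.path
    return "https://www.google.com/search?q=" + quote("site:" + target, safe="")

# One query per row of the link-building spreadsheet (placeholder list here)
for page in ["http://www.the-exact-domain.com/and-page-url.html"]:
    print(site_query(page))
```

You can then open (or paste) the generated queries in batches; any that come back empty flag linking pages Google has not indexed yet.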
If you want to be doubly sure your links are getting crawled, you can prompt a crawl by bookmarking the page through a bookmarking service or sharing it on a social network.
WARNING, MESSY BLACK HAT TACTICS COMING UP*****
And if you really want to give it a good old kick up the jaxie, you can load up an automatic bookmarking tool and bookmark the URL carrying your link across a couple of hundred domains. Problems with this method include:
- you need to buy spammy software like Bookmark Demon
- you are in effect creating a link wheel, which may devalue your efforts
- it sticks out like a sore thumb
- links on bookmarking sites drop off the link graph or get devalued very quickly
However, the positives of this technique are that your link will be crawled and indexed, and it will have another couple of hundred links pointing at it... for a while.
If you are working with a client, I would recommend just running it through Facebook or tweeting out the link, and staying away from forcing any crawls. However, if it is the middle of November and you have a Christmas shop that needs to rank quickly, get that black hat on.
Hope that helps.
-
The fact that OSE doesn't pick up a link doesn't necessarily mean the link isn't 'active' and passing value to your site. Even though Linkscape captures a vast number of URLs, it only crawls a portion of the web, most likely from the bigger pages down. If many of these links to your site are coming from smaller / less powerful domains, they might not (yet) have been picked up by Linkscape.
Try looking at Google Webmaster Central to see if the links are included there. If Google lists them as links, it is very likely counting them as well.