Why aren't certain links showing in SEOmoz?
-
Hi, I have been trying to understand our page rank and the domains that are linking to us. When I look at the list of linking domains, I see some bigger ones are missing and I don't know why. For example, we are in the Yahoo Directory with a link to trophycentral.com, but SEOmoz is not showing the link. If SEOmoz is not seeing it, my guess is Google is not either, which concerns me. There are several other high page rank domains also not showing. Anyone have any idea why? Thanks!
BTW, our domain is trophycentral.com
-
Thanks - I just looked up our site (www.trophycentral.com) on Google and see many of the missing links. My guess is that it is mostly a timing issue. SEOmoz is helpful because I can see the ratings and overall estimate, so I guess over time I will look at a few sites.
-
This is exactly what I do, and I have found the same result. In fact, at the moment well over half my links don't show in SEOmoz.
-
Thanks! I checked Google and most of them are there! ... Neil.
-
SEOmoz's link data is the best available, in my opinion, but you may want to consider looking at other sources as well, like Majestic SEO and Google Webmaster Tools, to supplement the SEOmoz data. Sometimes these other sources find links that SEOmoz doesn't have in its index.
-
Thank you!
-
Thank you!
-
SEOmoz's database of links isn't exhaustive, and it is only updated about once a month, so I wouldn't sweat it too much if the links aren't showing up in there.
You can check backlinks in Google Webmaster Tools. Personally, I would use WMT over SEOmoz for this kind of info; after all, it's Google's data that determines your rankings.
-
SEOmoz crawls and updates its link index on a schedule you can see here:
http://apiwiki.seomoz.org/w/page/25141119/Linkscape Schedule
If this is a fairly new link, it may not have gotten indexed yet.
Also, note that PageRank may have little or nothing to do with MozRank. Toolbar PageRank is notoriously inaccurate and can be up to six months old. Plus, toolbar PageRank doesn't reflect all the factors Google uses in its PR calculation. And (as if that weren't enough) toolbar PR is based on a sort of Richter scale, so a jump from, say, 4 to 5 could reflect a HUGE change, or a tiny one.
We normally focus on MozRank, instead.
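To see why a one-point toolbar PR jump can hide a huge underlying change, here is a minimal sketch. Google has never published the scale's base, so the base of 8 below (and the `raw_value` helper) is purely a hypothetical assumption for illustration:

```python
# Illustrative only: toolbar PageRank is widely believed to be roughly
# logarithmic, but the base is unpublished. We assume a hypothetical
# base of 8 to show the size of a one-point jump on a log scale.

def raw_value(toolbar_pr, base=8):
    """Hypothetical underlying score behind a given toolbar PR."""
    return base ** toolbar_pr

low = raw_value(4)   # hypothetical score at PR 4
high = raw_value(5)  # hypothetical score at PR 5
print(f"PR 4 -> {low}, PR 5 -> {high}, ratio {high / low:.0f}x")
```

Under that assumed base, moving from PR 4 to PR 5 means the underlying score grew eightfold, while a site could also grow almost eightfold and still show the same toolbar number, which is why the jump can reflect either a huge change or a tiny one.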