Difference between SEOmoz and Webmaster Tools information
-
Hello,
There is an issue that confuses me, and I thought perhaps you might be able to help me shed some light on it.
I have a website that shows 2,549 crawled pages in SEOmoz and 24,542 pages in Webmaster Tools!
Obviously there is some technical issue with the site, but my question is: why is there such a vast difference between what the SEOmoz crawl report and the Webmaster Tools report show?
Thanks!
Guy Cizner
-
Thanks for stepping in, everyone, though it looks like we were trying to answer the wrong question. This one is about Roger's crawl of the OP's own site, rather than links indexed in OSE.
Guy, do you have a feel for how many pages SHOULD be in the index? If you only have a couple of thousand pages, then it could be that Google is crawling and indexing some parameters. If you've got 20k+ pages in the index, then Roger isn't finding some things.
Also, are you perhaps looking at just the www.domain subdomain in SEOmoz while GWT is looking at the entire site? If you had a compact www.domain site but also had forum.domain and wiki.domain, and GWT was reporting pages for all of the subdomains on domain.com, that would explain things too. If you can export the URL lists from both tools, a quick way to check both possibilities is sketched below.
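Here's a rough Python sketch for comparing the two exports. The filenames and the assumption that each file has one URL in its first column are placeholders, so adjust them to whatever your SEOmoz and GWT exports actually look like. It tallies URLs by subdomain and counts parameterized URLs, which are two of the most common reasons the totals diverge:

```python
# Minimal sketch, assuming you can export a list of URLs from each tool as CSV.
# The filenames and "URL in first column" layout are assumptions -- adjust them.
import csv
from collections import Counter
from urllib.parse import urlparse

def summarize(csv_path):
    by_host = Counter()
    with_params = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if not row or not row[0].startswith("http"):
                continue  # skip header rows and blanks
            parsed = urlparse(row[0])
            by_host[parsed.netloc.lower()] += 1
            if parsed.query:
                with_params += 1
    return by_host, with_params

moz_hosts, moz_params = summarize("seomoz_crawl_export.csv")   # hypothetical filename
gwt_hosts, gwt_params = summarize("gwt_indexed_urls.csv")      # hypothetical filename

print(f"URLs with query parameters: SEOmoz={moz_params}, GWT={gwt_params}")
for host in sorted(set(moz_hosts) | set(gwt_hosts)):
    print(f"{host}: SEOmoz={moz_hosts.get(host, 0)}, GWT={gwt_hosts.get(host, 0)}")
```

If most of the extra GWT URLs turn out to sit on other subdomains or carry query strings, that points straight at the cause of the gap.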
-
Hello,
Thanks for all the replies.
The pages crawled are part of an SEO campaign I am running.
How is the crawl done when a campaign is defined?
I assume the whole site is being crawled.
Thanks
-
This may also shed some light:
Oct 9, 2012, Keri Morgret, On-site Community Manager at SEOmoz:
Another reason is that we just don't have the same size server farm that Google and Bing have. We could crawl all of Twitter and get nothing else crawled, or we could crawl some of Twitter, and some of the rest of the web. We aren't able to crawl all of the web, and we release a new index about once a month, so that's why you don't see all of your links or see them right away.
However, what we do offer that is different from Google and Bing is that we show you links for sites that are not your own, we add metrics about the trust and authority of the page, etc.
-
The Mozscape index, as brilliant as it is, can in no way compete with the size of the index that Google can handle.
As a result, your WMT report will almost always show a larger number of crawled pages, links, etc. It's simply a bigger index.
-
Either of those 'issues' might be the cause. For example, incorrect canonicalization can be picked up differently by Google and the SEOmoz bot, Roger. Another possibility is that Google tries very hard to index each and every page on the web, while Roger crawls more selectively, for example only crawling pages above a certain level of authority or within a certain number of clicks from the homepage. A quick way to spot-check canonical tags is sketched below.
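If you want to spot-check canonicalization yourself, something like this rough Python sketch could help. The URLs are placeholders, it assumes the requests and beautifulsoup4 packages are installed, and the URL comparison is deliberately naive (no trailing-slash or protocol normalization), so treat it as a starting point rather than a definitive audit:

```python
# Minimal sketch: check whether pages declare a rel=canonical that differs
# from the URL actually requested. Sample URLs are placeholders.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

urls_to_check = [
    "http://www.example.com/",
    "http://www.example.com/some-page?sort=price",
]

for url in urls_to_check:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", attrs={"rel": "canonical"})
    canonical = tag["href"] if tag and tag.has_attr("href") else None
    # Naive string comparison -- a real check would normalize both URLs first.
    status = "OK" if canonical == url else "MISMATCH"
    print(f"{status}  crawled={url}  canonical={canonical}")
```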