How do you diagnose if your site is only 50% crawled?
-
Good morning from 7 degrees C Wetherby, UK. Goodbye, arctic conditions!
If a site had 100 pages, for example, and that site was plugged into Webmaster Tools, how could you diagnose whether all the pages had been crawled?
The thing is, I want to learn how to diagnose crawl issues with sites. Is there a known methodology for this?
Thanks in advance,
David
-
Thanks Karen
-
Hello from a not-so-cold (but not sunny yet) Bristol.
Remember, Google can crawl a page and still not index it, so you probably want to know which pages Google has indexed.
To see how many pages are indexed, the Sitemaps page in Webmaster Tools is useful: it will tell you how many pages have been submitted and how many of those have been indexed.
One thing you can do to see which pages are in Google's index is to type the below into the Google search engine:
site:http://......
There is a good thread on this in the Webmaster Tools forum that you might find useful: http://www.google.com/support/forum/p/Webmasters/thread?tid=378b3cdf485ee3a7&hl=en
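If you want to sanity-check the "submitted" side yourself, here is a minimal sketch in Python (using a hypothetical example.com sitemap, not your real one) that parses a sitemap and counts the URLs it submits, so you can compare that figure with the indexed count Webmaster Tools reports:

```python
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace for all their elements
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs submitted in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical sitemap snippet standing in for a fetched sitemap.xml
example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
  <url><loc>https://www.example.com/contact</loc></url>
</urlset>"""

urls = sitemap_urls(example)
print(len(urls), "URLs submitted in the sitemap")
```

Running the same count against your live sitemap (fetch the XML first) gives you the exact list of URLs you asked Google to index, which you can then compare against what the Sitemaps report shows.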
Thanks,
Karen
Related Questions
-
Recommendations on social bookmarking sites
I usually do social bookmarking on StumbleUpon, Scoop.it, Pinterest, Bundlr, Folkd, Diigo, Reddit and Delicious. Does anyone have recommendations for other good social bookmarking sites?
Technical SEO | Armen-SEO
-
How to stop crawls for product review pages? Volusion site
Hi guys, I have a new Volusion website. The template we are using has its own product review page for EVERY product I sell (1,500+). When a customer purchases a product, a week later they receive a link back to review the product. This link sends them to my site, but to an individual page strictly for reviewing the product (as opposed to a page like Amazon, where you review the product on the same page as the actual listing). **This is creating countless "duplicate content" and missing "title" errors. What is the most effective way to block a bot from crawling all these pages? Via robots.txt? A meta tag?** Here's the catch: I do not have access to every individual review page, so I think it will need to be blocked by a robots.txt file. What code will I need to implement? Do I need to do this on the admin side of my site? Do I also have to do something on the Google Analytics side to tell Google about the crawl block? Note: the individual URLs for these pages end with: *****.com/ReviewNew.asp?ProductCode=458VB. Can I create a block for all URLs that include /ReviewNew.asp? Thanks! Pardon my ignorance. Learning slowly, loving the Moz community 😃
Technical SEO | Jerrion
-
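(A plausible approach, sketched here as a general assumption rather than Volusion-specific advice: because robots.txt Disallow rules match by path prefix, a single rule covers every review URL of the shape in the question, query string and all.)

```
User-agent: *
Disallow: /ReviewNew.asp
```

Note this only blocks crawling; pages that are already indexed may also need a noindex meta tag or a removal request to drop out of the index.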
Nofollow links from banned sites
Hello boys and girls. I'm auditing my link profile and came across links pointing to my sites from a banned site (site:domain.com gives back no results), yet the links are nofollow. Should I try to remove them all? I know nofollow should be enough, yet my links sit in a bad neighbourhood. What would you recommend?
Technical SEO | Tit
-
Local site under generic domain
Howdy Mozers, We have our main website on a .com domain and local websites for each language, like .es, .fr, .in, etc. We decided to move all the local sites under the main .com domain as subdirectories. One of the local sites has a manual penalty. Right now we are redirecting the local site that has the penalty using a 302 redirect. So my question is: will the 302 redirect hurt our main site? Is there any other way to redirect visitors from the local site without passing the penalty? We have a few thousand monthly users who still use the local domain links to get to our site, so we can't remove the redirect at all. Best Regards,
Juris
Technical SEO | juris_l
-
Why is my site not indexing in Google?
In Google Webmaster Tools I updated my sitemap on Mar 6th. There are around 22,000 links, but for a long time Google fetched only 5,300 of them.
I waited a month with no improvement in the Google index, so on Apr 6th we uploaded a new sitemap (1,200 links in total), but only 4 links were indexed by Google.
Why is Google not indexing my URLs? Does this affect our ranking in the SERPs? How many links is it advisable to submit in a sitemap for a website?
Technical SEO | Rajesh.Chandran
-
Local business with multiple sites
I'm auditing a local business's sites (a spa) and wanted to run my recommendations by everyone. There are 3 sites: www.sitename1.com (main store location, used for Google Places listing #1), www.sitename2.com (2nd store location, used for Google Places listing #2), and www.sitename3.com (used for product sales for both locations). Sitename1.com has the most ranking power. I'm going to recommend that they move sitename2.com and sitename3.com to sitename1.com as subfolders, 301 redirecting each page to the corresponding page on sitename1.com/subfolder. Google Places listing #2 would be changed from www.sitename2.com to www.sitename1.com/location2. Any risks or problems with this strategy that anyone can see?
Technical SEO | 540SEO
-
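(One way to sketch the 301 step, assuming an Apache host and the hypothetical sitename domains from the question: an .htaccess file on the second site that maps every path to the matching /location2/ path on the main site.)

```
# Hypothetical .htaccess for www.sitename2.com:
# permanently redirect every path to the matching
# /location2/ path on the consolidated main domain
RewriteEngine On
RewriteRule ^(.*)$ https://www.sitename1.com/location2/$1 [R=301,L]
```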
Crawling and indexing content
If a page element (a div, for example) is initially hidden and shown only on hover or by a JavaScript call, will Google crawl and index its content?
Technical SEO | Mont