Run Crawl Diagnostics
-
Hi,
I have fixed some errors from the error list.
How can I re-run Crawl Diagnostics immediately to re-check for the errors?
Thanks
-
Our Crawl Diagnostics at http://pro.seomoz.org/tools/crawl-test will do that for you, up to 3000 URLs.
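While waiting for a scheduled crawl, a quick self-check of a handful of URLs is also possible. Below is a minimal sketch (the URL list is a placeholder, not anyone's actual site) that reports the HTTP status code for each page, which covers the most common crawl-error categories (404s, 5xx errors, unreachable hosts):

```python
# Spot-check a list of URLs for crawl errors. URLs here are placeholders.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_status(url, timeout=10):
    """Return the HTTP status code for url, or None if unreachable."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (crawl-check)"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code          # e.g. 404, 500
    except URLError:
        return None            # DNS failure, refused connection, timeout

for url in ["https://example.com/", "https://example.com/missing-page"]:
    print(url, check_status(url))
```

This only checks status codes, not duplicate content or metadata issues, so it complements rather than replaces a full diagnostic crawl.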
Related Questions
-
Why might my website's crawl rate... explode?
Hi Mozzers, I have a website with approx. 110,000 pages. According to Search Console, Google will usually crawl, on average, anywhere between 500 - 1500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809
I was wondering if anyone could offer any insight into why this may be happening and if I should be concerned? Thanks in advance for all advice.
Reporting & Analytics | | Silkstream1 -
How can I see what Google sees when it crawls my page?
In other words, how can I see the text and other content it sees, from start to finish, on each page? I know there was a site for this, but I can't remember it.
Reporting & Analytics | | tiffany11030 -
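A rough stand-in for "seeing what Google sees" is to strip a page's HTML down to its visible text. The sketch below does that with the standard library; note it only reflects the raw HTML and will not execute JavaScript the way Google's renderer does, and the inline HTML is an invented example:

```python
# Extract the visible text from an HTML document, skipping script/style.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect text nodes, ignoring <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts, self._skip = [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)

# Inline example; for a real page, fetch the HTML first and pass it in:
print(visible_text("<p>Hello</p><script>var x=1;</script><p>world</p>"))
```

To approximate Googlebot more closely, fetch the page with a Googlebot user-agent string before passing the HTML to `visible_text`.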
My GWT tells me that verification has failed on numerous occasions - will this stop my site being crawled?
I launched www.over50choices.co.uk 6 weeks ago and have had trouble with Google indexing and crawling all pages. It tells me 143 submitted & 129 indexed, but the site has 166 pages. It still shows the old home page image in GWT - which is very annoying! While the site is verified by GA & HTML tag, the Verification section tells me that "reverification failed" on numerous occasions - these seem to correspond with when Google tries to process the sitemap. Is this a coincidence, i.e. verification fails when it's trying to process the sitemap, which in turn is leaving me with an out-of-date sitemap and therefore not all my pages submitted or crawled? Or will this not affect Google's ability to crawl the site? Your help please. Ash
Reporting & Analytics | | AshShep10 -
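One thing worth ruling out when reverification fails is whether the verification meta tag is actually present in the HTML the server sends. A small sketch (the HTML and token value are made up for illustration):

```python
# Check served HTML for a google-site-verification meta tag.
from html.parser import HTMLParser

class VerificationFinder(HTMLParser):
    """Record the content value of any google-site-verification meta tag."""
    def __init__(self):
        super().__init__()
        self.tokens = []
    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name") == "google-site-verification":
            self.tokens.append(d.get("content"))

html = '<head><meta name="google-site-verification" content="abc123" /></head>'
finder = VerificationFinder()
finder.feed(html)
print(finder.tokens)   # the verification token(s) found in the page
```

If the list comes back empty when run against the live homepage's HTML, the tag is being dropped somewhere (caching, templates, redirects), which would explain reverification failing.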
Webmaster Tools crawl errors
Hi there, I've been tracking my Webmaster Tools crawl errors for a while now (6 months), and I'm noticing that some pages which have long been 404s still keep popping up in the crawl errors. Those pages have no data for XML linking, and the remote links pointing to them are from pages that are long-gone 404s as well. The pages serve a 404 error page plus a redirect to the homepage, and Google still lists them with old cached content. Does anyone have a clue why this is happening?
Reporting & Analytics | | Or.Shvartz0 -
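A "404 page plus a redirect to the homepage", as described above, is commonly treated by Google as a soft 404 rather than a hard one, which can keep the old URLs reappearing. A sketch for checking what a URL actually returns after redirects (the URLs are placeholders):

```python
# Distinguish a hard 404 from a redirect-to-homepage (soft 404).
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def final_status(url, timeout=10):
    """Follow redirects and return (final_url, status_code)."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (404-check)"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.geturl(), resp.status
    except HTTPError as e:
        return e.geturl(), e.code
    except URLError:
        return url, None

url, code = final_status("https://example.com/old-page")
if code == 404:
    print("hard 404 - Google should eventually drop it")
elif code == 200 and url.rstrip("/") == "https://example.com":
    print("redirects to homepage - likely treated as a soft 404")
```

Serving a clean 404 (or 410) without the homepage redirect is the usual way to get such URLs to drop out of the error report over time.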
How to get crawled pages indexed?
Hi, I've got over 1,000 pages crawled but only approx. 100 pages indexed. Although I submit them via Google Fetch and the links are indexable, they are not indexed. What should I do to get the maximum number of pages indexed? Any input highly appreciated. Thanks!
Reporting & Analytics | | Rubix0 -
Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
Some of my key pages got blocked by the robots.txt file, and I have made the required changes to the file, but how can I get the list of blocked URLs? My Webmaster Tools page under Health > Blocked URLs shows only a number, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results? One other interesting point I see is that the blocked pages are still showing up in searches - the title appears fine, but the description says "blocked by robots.txt". I need an urgent recommendation, as I do not want to see my traffic drop any more.
Reporting & Analytics | | csfarnsworth0 -
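Since Webmaster Tools only shows a count, one way to recover the list yourself is to run your known URLs through the site's robots.txt rules with the standard library parser. The rules and URLs below are illustrative, not the asker's actual file:

```python
# Check which candidate URLs a robots.txt file blocks for all user agents.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

candidates = [
    "https://example.com/products",
    "https://example.com/private/account",
    "https://example.com/tmp/cache.html",
]
blocked = [u for u in candidates if not rp.can_fetch("*", u)]
print(blocked)
```

Feeding the site's full URL list (e.g. from its sitemap) through `can_fetch` against the old robots.txt reproduces exactly which pages were blocked.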
Increase number of pages crawled
Only one page is being crawled - how do I increase the number to include most of our site?
Reporting & Analytics | | NorthCoast0 -
Unexplained Crawl Diagnostics errors & OpenCart
Hi, I've been looking at the crawl diagnostics for my site and trying to fix the errors that are showing up, but SEOmoz is producing some strange results. It's saying pages are duplicated up to 16 times, but those pages don't exist. It's adding "page=3", "page=4", etc. to the end of the product URL, but I don't see how it's finding those pages; nothing on the site (as far as I can tell) is linking to them. There is no "page=3", just the one product page. Again on the duplicate content, under "other URLs" it lists URLs like "http:///product-a", but again I don't see where it's finding these URLs, and obviously those URLs don't work. Those three slashes aren't a typo either. So far I've reduced the number of errors from 2,005 to 543, but the rest of them I can't make sense of. Also, what does one do when you have two products, e.g. "product-a-white" and "product-a-black", to prevent SEOmoz from seeing duplicates? Canonical links won't work because there's no parent item, just those two. Google Webmaster Tools doesn't seem to have a problem, though. Using OpenCart 1.5, if it helps. Cheers,
Reporting & Analytics | | AsOneDesign0
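The "page=3" / "page=4" duplicates described above are the classic pattern of pagination query parameters producing distinct URLs with identical content. A hedged sketch of how such URLs collapse once the parameter is stripped (the URLs are invented for illustration, not from the asker's store):

```python
# Group URLs that differ only by pagination-style query parameters.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse
from collections import defaultdict

def normalise(url, drop=("page",)):
    """Remove the named query parameters from url."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://shop.example/product-a",
    "http://shop.example/product-a?page=3",
    "http://shop.example/product-a?page=4",
]
groups = defaultdict(list)
for u in urls:
    groups[normalise(u)].append(u)
print({k: len(v) for k, v in groups.items()})
```

If the parameterised variants all normalise to one URL, a `rel="canonical"` tag pointing each variant at that URL (or a crawler setting that ignores the `page` parameter) is the usual fix for this kind of duplicate report.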