Re-running Crawl Diagnostics
-
I have made a bunch of changes thanks to the Crawl Diagnostics tool, but now I need to re-run it, as I have lost track of where I started and what still needs to be done. How do I re-run the Crawl Diagnostics tool?
-
Thanks everyone!
-
Thanks Roberto, you beat me to that answer. The one limitation with that is the max of 3,000 URLs per crawl, but it is the best way to crawl your site outside of the regular cycle.
-
The dashboard report only runs every 7 days, so you will have to wait until then. If you want the information sooner, you can use the crawl test tool (http://pro.seomoz.org/tools/crawl-test) to generate the raw data, or use Screaming Frog.
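Tools like the crawl test and Screaming Frog boil down to the same idea: fetch pages, record status codes, and follow internal links until a URL cap is reached. A minimal sketch of that kind of status-code crawler, using only Python's standard library (the `fetch` callable is left injectable here rather than tied to a specific HTTP client, and `example.com` is a placeholder):

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collect the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_urls=3000):
    """Breadth-first crawl limited to the start URL's host.

    `fetch(url)` must return a (status_code, html_text) tuple.
    Returns a dict mapping each crawled URL to its status code.
    """
    host = urlparse(start_url).netloc
    seen, queue, statuses = {start_url}, deque([start_url]), {}
    while queue and len(statuses) < max_urls:
        url = queue.popleft()
        status, html = fetch(url)
        statuses[url] = status
        if status != 200:
            continue  # only parse links out of pages that resolved
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return statuses
```

In practice you would wire `fetch` to a real HTTP client and dump `statuses` to a CSV, which gives you a rough stand-in for the raw crawl data between the weekly dashboard runs.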
Related Questions
-
Unsolved: URL Crawl Reports providing drastic differences: Is there something wrong?
A bit at a loss here. I ran a URL crawl report at the end of January on a website (https://www.welchforbes.com/). There were no major critical issues at the time. No updates were made to the website (that I'm aware of), but after running another crawl on March 14, the report was short about 90 pages and suddenly had a ton of 403 errors. I ran a crawl again on March 15 to check whether there was a discrepancy, and that report crawled even fewer pages and had completely different results again. Is there a reason the results differ from report to report? Is there something about the reports that I'm not understanding, or is there a serious issue within the website that needs to be addressed?
Jan. 28 results: Screen Shot 2022-03-16 at 3.00.52 PM.png
March 14 results: Screen Shot 2022-03-15 at 10.31.22 AM.png
March 15 results: Screen Shot 2022-03-15 at 4.06.42 PM.png
Reporting & Analytics | | OliviaKantyka
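Sudden 403s that vary from crawl to crawl are often the server (or a firewall/CDN in front of it) rate-limiting or blocking crawler user agents rather than a change on the site itself. One way to check is to request the same URL with a browser-like and a crawler-like User-Agent and compare the status codes. A hedged sketch with Python's standard library (the exact rogerbot User-Agent string below is an assumption, not copied from Moz's documentation):

```python
import urllib.request
import urllib.error

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    # Assumed shape of Moz's crawler UA; substitute the real string.
    "crawler": "rogerbot/1.2",
}

def status_for(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code  # e.g. 403 comes back as an HTTPError

def compare_user_agents(url, get_status=status_for):
    """Map each labelled User-Agent to the status code it receives."""
    return {label: get_status(url, agent) for label, agent in USER_AGENTS.items()}
```

If the browser UA gets a 200 while the crawler UA gets a 403, the differing reports are a server-side blocking rule, not a content problem; inconsistent results between back-to-back crawls also point to rate limiting kicking in partway through.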
Search Console Crawl Errors/Not Found - Strange URLs
Hello, in Google Search Console under Crawl > Crawl Errors > Not found, I have strange URLs like the following:
https://www.domain.com//UbaOZ/
https://www.domain.com//UPhXZ/
https://www.domain.com//KaUpZ/WYdhZ/SnQZZ/MOcUZ/
There is no info in the Linked From tab. Have you seen this type of error? Does anyone know what's causing it? How should it be fixed? Thanks for reading and the help!
Reporting & Analytics | | chuck-layton
Google Webmaster Tools shows that my site has been crawled, but search results show old title tags, etc.
Does the index report reflect what will be displayed in Google results? Google seems to be indexing my site every Sunday...
Reporting & Analytics | | 928shopper
Does GWT's "Fetch as Googlebot" feature affect crawl rate?
Hello Mozzers, I have noticed many people saying that using GWT's Fetch as Googlebot regularly can affect your crawl rate in the future, though I am not sure if this is true or just another stale SEO myth, as GWT currently provides a limit of 500 URL fetches per month. I hope my doubts will be cleared by the Moz community experts. Thanks!
Reporting & Analytics | | pushkar63
How to get crawled pages indexed?
Hi, I've got over 1k pages crawled but only approx. 100 pages indexed. Although I submit them via Google Fetch and the links are indexable, they are not indexed. What shall I do to get the maximum number of pages indexed? Any input highly appreciated. Thanks!
Reporting & Analytics | | Rubix
Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
Some of my key pages got blocked by the robots.txt file. I have made the required changes in robots.txt, but how can I get the list of blocked URLs? My webmaster page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into searches? One other interesting point is that blocked pages are still showing up in searches: the title appears fine, but the description says blocked by robots.txt. I need an urgent recommendation, as I do not want to see my traffic drop any further.
Reporting & Analytics | | csfarnsworth
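When the webmaster report only shows a count, you can reproduce the blocked-URL list locally: take your own URL list (from a sitemap or crawl export) and keep the ones your robots.txt disallows. Python's standard `urllib.robotparser` module does the rule matching; a minimal sketch (the robots.txt text and URLs in the example are illustrative):

```python
from urllib import robotparser

def blocked_urls(robots_txt, urls, user_agent="*"):
    """Return the subset of `urls` disallowed by the given robots.txt text."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [url for url in urls if not parser.can_fetch(user_agent, url)]
```

Once you have the list and the robots.txt fix is live, resubmitting those URLs (e.g. via a sitemap) prompts a recrawl; pages that showed a title but a "blocked by robots.txt" description were indexed from links alone, and they recover a normal snippet once Googlebot can fetch them again.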
URL Re-Structure - Tracking success of it
Hi guys, I was wondering what would be the best approach to track the success of a URL restructure. We plan to roll the new URL structure out gradually, applying it only to new property listing pages as they go live; previous listings will keep the old URL structure. I thought it would be best to limit any potential problems by testing it on a smaller number of pages. So my question really is: what metrics should I be looking at to determine the success of this, given that we remove property listings once they are rented or sold?
Reporting & Analytics | | MarkScully
RE: Google Analytics keywords metric and appropriate keywords
Greetings, when running Google Analytics' keyword report, I see that over 85% of the top 100 keywords used to find us include a word from our name (Eagle's Nest Foundation and Camp, with "eagle" or "eagle's nest" being the most frequent) or the name of one of our programs. Does this mean that most folks searching for summer camps in North Carolina already know about us, and that we therefore need to optimize for broader keywords to cast a wider net for folks who don't already know about us? Thanks, Dave
Reporting & Analytics | | DMoff
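The branded-versus-non-branded split described above is straightforward to compute from an exported keyword list: count the phrases containing any brand term and divide by the total. A hedged sketch (the sample keywords in the test are made up for illustration):

```python
def branded_share(keywords, brand_terms):
    """Fraction of keyword phrases containing any brand term (case-insensitive)."""
    terms = [term.lower() for term in brand_terms]
    branded = [kw for kw in keywords if any(t in kw.lower() for t in terms)]
    return len(branded) / len(keywords) if keywords else 0.0
```

Running this over the full export (not just the top 100) gives a more honest picture: a high branded share means most search traffic already knows the organisation, which is exactly the signal that broader, non-branded keywords are the growth opportunity.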