Site crawl errors - download list of all URLs
-
Hi,
I've provided my client's developers with the PDF reports of crawl errors, but these seem to miss some URLs.
I see there are lots of CSV file download/email options.
Will the "email CSV" button send a report of everything, listing all URLs that are missing from the PDFs? If not, will the more specific CSV reports?
It would be good if I could press one button and get all issues listed with all URLs.
It does look like this happens, but I just want to confirm the best way ASAP, since I need to provide the reports urgently. Any guidance much appreciated!
All Best
Dan
-
You are welcome! I know the "manual" method often takes the longest, but in reality, it is often the most accurate. Hope this helps!
-
Thanks David!
-
I have tried both options before, and tend to find the CSV document to be the more reliable of the two. I have seen the same thing as you, where the PDF seems to leave out information. Unfortunately, a manual check would be required to make sure that all URLs are included.
I always have the team download the reports first, double-check them for errors, then email them from our company email address. It makes things a bit more personal that way, rather than having them emailed directly from the analytics tool.
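As a rough illustration of that manual check, here is a minimal Python sketch that lists URLs present in one crawl export but missing from another. The filenames and the "URL" column header are assumptions; adjust them to match the columns in the actual Moz CSV exports.

```python
import csv

def load_urls(path, column="URL"):
    """Read one column of URLs from a crawl-report CSV into a set."""
    with open(path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

def missing_urls(full_csv, partial_csv, column="URL"):
    """Return URLs found in the full export but absent from the partial one."""
    return sorted(load_urls(full_csv, column) - load_urls(partial_csv, column))

# Example: compare the "all issues" CSV against the URLs shown in the PDF
# (copied or exported into their own CSV first).
# for url in missing_urls("all_crawl_issues.csv", "pdf_urls.csv"):
#     print(url)
```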
Related Questions
-
Site crawl warning - concatenated urls from Wordpress
I could use some help on how to fix this. I asked at the walkthrough but was told it was a WordPress issue, and so far I can't find anything to point me in the right direction. There are no errors in the files on the server side, and I have asked my hosting company too. I am hoping someone here may be able to shed some light on it. One of my websites is giving 404 errors on links formed as below, and there are over 12.7K of them! Example: [mydomainurl]/www.instagram.com/www.instagram.com/[instagram username] The link that relates to my website is valid and working, but I don't understand the rest. I am totally stumped on how to move forward with this. Any advice, suggestions, or tips on how to fix these errors and stop these types of links being generated? Thanks.
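For what it's worth, this doubled-URL pattern is typical of links written without a scheme (`href="www.instagram.com/..."` instead of `href="https://www.instagram.com/..."`), which browsers and crawlers resolve relative to the current page rather than as external links. A small Python sketch of that resolution behavior (the page URLs here are made up for illustration):

```python
from urllib.parse import urljoin

# A scheme-less href is treated as a relative path, not an external link.
page = "https://example.com/page/"
bad_href = "www.instagram.com/someuser"  # missing "https://"

first_hop = urljoin(page, bad_href)
print(first_hop)  # the Instagram "link" stays on example.com and 404s

# If the crawler follows that 404 page and finds the same scheme-less
# href again, the path keeps growing -- hence thousands of errors.
second_hop = urljoin(first_hop, bad_href)
print(second_hop)
```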
Moz Bar | emercarr
-
Moz Crawl Report Increase in Errors?
Has anyone else noticed a huge increase over the past couple weeks in crawl issues in their dashboards? Without being able to see historical data week over week, I can't tell what's been added. Is this some update with the tool? I'm not seeing any health issues with this feature on the Moz Health page, it just seems strange that I'm seeing this across all our accounts.
Moz Bar | WWWSEO
-
Moz crawl issues: All pages keep resolving to our "cookies not enabled" page
Upon running the Moz Pro site crawler, I noticed that I received quite a few duplicate titles along with 302 redirects (our site creates a temporary 302 to our "cookies not enabled" page). How would I get around the crawler being redirected to this page? I've never run across this issue before, despite using the crawler with sites that use the same framework as the one that's affected. Any ideas?
Moz Bar | responsivelabs
-
I requested a new crawl; it was done, but my dashboard only shows the crawl from last week?
We recently moved our old website to a new CMS and structure. There have been some configuration errors, and I needed to make some changes to things like canonical URLs. However, I need to check whether these changes have made a difference, so I requested a new crawl through the crawl test page. I was emailed each time that a new crawl had been done, but my reporting and dashboards still only show data from the last scheduled crawl. Regards, Chris
Moz Bar | LRQA-Marketing
-
Moz Dupe content crawl anomaly
Hi, Moz has completed a crawl for a site I'm working on which also has a development area (hence lots of duplicate content) on a subdomain (and this dev area hasn't been hidden from crawlers via password, robots.txt, GWT, etc.). The Moz duplicate content report is not showing any of these URLs, though, even though my campaign setting is on the 'root' domain, so I would have thought the report should list the subdomain URLs as duplicate content (because they are duplicate content). Any ideas? Cheers, Dan
Moz Bar | Dan-Lawrence
-
How do you stop Moz crawling a page?
Hello, I have a contact form which generates thousands of duplicate crawl errors. I'm going to use to block Google indexing these pages. Will this also block Moz from crawling these pages and displaying the error? Thanks!
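(The snippet referenced in "I'm going to use ... to block" appears to have been stripped from the post. For reference, one common way to keep crawlers, including Moz's rogerbot, away from such pages is a robots.txt rule like the following; the /contact/ path is an assumption for illustration:)

```
User-agent: rogerbot
Disallow: /contact/

User-agent: *
Disallow: /contact/
```

Note that robots.txt blocks crawling, whereas a meta robots noindex tag blocks indexing; a page blocked in robots.txt may still appear in an index if it was crawled before the rule was added.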
Moz Bar | Seaward-Group