Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. Looking at it more generally, though, there is so much information that I was wondering, for those of you who use it: what are the top tasks, the "go to" things you like to check, that perhaps aren't covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow...here's the problem, and here's where to fix it
-
Allie,
That's a great example use case. After my audits, clients are like, "You found thousands of internal redirects and 404s - where are they?"
And I'm like, "Hold on, I have a spreadsheet of that!"
-
I love Screaming Frog! One use case I've had recently is using it to find internal 404 errors prior to, and immediately after, a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows the offending URL and the URL referring to it, which makes it easier to update the bad link.
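For anyone curious what that export boils down to, here's a minimal Python sketch (not anything Screaming Frog ships - the crawl data structures and URLs are made up for illustration): for each broken URL, you get the page that links to it.

```python
# A sketch of what the "Client Error (4xx) Inlinks" export captures.
# Assumes you already have crawl data as {page: [linked_urls]} and
# {url: status_code} -- hypothetical structures; Screaming Frog gathers
# these for you during the crawl.

def broken_inlinks(links_by_page, status_by_url):
    """Return (referring_page, broken_url, status) rows, like the export."""
    rows = []
    for page, links in links_by_page.items():
        for url in links:
            status = status_by_url.get(url)
            if status is not None and 400 <= status < 500:
                rows.append((page, url, status))
    return rows

crawl_links = {
    "/about": ["/team", "/old-page"],
    "/blog": ["/old-page", "/contact"],
}
statuses = {"/team": 200, "/old-page": 404, "/contact": 200}

for referrer, broken, code in broken_inlinks(crawl_links, statuses):
    print(f"{referrer} -> {broken} ({code})")
```

With the referring page in hand, fixing the bad link is a quick find-and-replace rather than a hunt.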
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some of my uses include:
-
Title / meta duplication & finding parameters on ecomm stores
-
Title length & meta desc length
-
Removing meta keywords fields
-
Finding errant pages (anything but 200, 301, 302, or 404 status code)
-
Large sitemap export (most tools do "up to 500 pages." Useless.)
-
Bulk export of external links (what ARE we linking to??)
-
Quickly opening a page in Wayback Machine or Google cache
-
Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped-fragment URL, identifying pages with two titles, two canonicals, two H1 tags, etc. Even seeing whether www and non-www versions are both live, spotting links to pages that shouldn't be linked, and checking http vs. https.
Very cool tool - useful for pretty much everything! haha
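That "two titles, two canonicals, two H1 tags" check is simpler than it sounds. A rough sketch of the idea in Python (this is not Screaming Frog's code, just an illustration using the stdlib parser; the sample HTML is made up):

```python
# Count <title>, <h1>, and rel=canonical tags in a page's HTML and
# flag anything that appears more than once.

from html.parser import HTMLParser

class TagCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {"title": 0, "h1": 0, "canonical": 0}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self.counts[tag] += 1
        elif tag == "link" and dict(attrs).get("rel") == "canonical":
            self.counts["canonical"] += 1

def duplicate_tags(html):
    """Return {tag: count} for tags that occur more than once."""
    parser = TagCounter()
    parser.feed(html)
    return {tag: n for tag, n in parser.counts.items() if n > 1}

page = """<html><head><title>A</title><title>B</title>
<link rel="canonical" href="/a"></head>
<body><h1>One</h1><h1>Two</h1></body></html>"""
print(duplicate_tags(page))  # -> {'title': 2, 'h1': 2}
```

Run that over every crawled page and you have the duplicate-tag report in miniature.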
-
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (with Googlebot as the user-agent) to check for all the standard stuff, then go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
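As a quick illustration of the http/https internal-linking point from that list: group internal links by scheme so mixed linking (which can reinforce duplicate-content conflicts) stands out. A hypothetical Python sketch - the domain and URLs are invented:

```python
# Classify internal links by scheme; links to other hosts are ignored.

from urllib.parse import urlparse

def scheme_report(links, site_host="example.com"):
    """Return internal links grouped as http vs https."""
    report = {"http": [], "https": []}
    for link in links:
        parsed = urlparse(link)
        if parsed.hostname == site_host and parsed.scheme in report:
            report[parsed.scheme].append(link)
    return report

links = [
    "http://example.com/page-a",
    "https://example.com/page-a",
    "https://example.com/page-b",
    "https://cdn.other.com/img.png",
]
print(scheme_report(links))
```

If the "http" bucket isn't empty on an https site, those links are candidates for cleanup.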
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which surfaced a few issues to resolve: multiple redirects on some old links. Fantastic stuff.
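Conceptually, a redirect chain report is just following Location headers hop by hop. A sketch of the idea (mine, not Screaming Frog's) working over a hypothetical {url: (status, location)} table instead of live requests:

```python
# Walk a URL through its redirects and return the full chain,
# capped at max_hops to guard against redirect loops.

def redirect_chain(start, responses, max_hops=10):
    """Return the list of URLs visited, starting with `start`."""
    chain = [start]
    url = start
    for _ in range(max_hops):
        status, location = responses.get(url, (200, None))
        if status not in (301, 302) or location is None:
            break
        chain.append(location)
        url = location
    return chain

# An old link that 301s twice before landing on the final page:
responses = {
    "/old": (301, "/interim"),
    "/interim": (301, "/new"),
    "/new": (200, None),
}
print(redirect_chain("/old", responses))  # -> ['/old', '/interim', '/new']
```

Any chain longer than two entries is a multi-hop redirect worth collapsing to a single 301.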
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag, and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check that all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "Crawl Path Report", which generates an XLS showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you're testing a site in a pre-production environment. The same goes for the ability to use regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
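To make that regex filtering concrete, here's the same idea in plain Python (the patterns are examples I made up, not Screaming Frog defaults): skip any candidate URL that matches an exclude pattern before crawling it.

```python
# Filter candidate URLs against exclude patterns, the way a crawler's
# exclude configuration would.

import re

EXCLUDE_PATTERNS = [
    r".*\?sort=.*",   # parameterised sort variants of listing pages
    r".*/print/.*",   # print versions of articles
]

def should_crawl(url):
    """True if the URL matches none of the exclude patterns."""
    return not any(re.match(p, url) for p in EXCLUDE_PATTERNS)

urls = [
    "https://example.com/shop/widgets",
    "https://example.com/shop/widgets?sort=price",
    "https://example.com/article/print/42",
]
print([u for u in urls if should_crawl(u)])
```

On a big e-commerce site, excluding parameter variants like this keeps the crawl from ballooning into millions of near-duplicate URLs.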
rgds,
Dirk