Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. However, looking at the tool more generally, there is so much information that I was wondering, for those who use it regularly: what are the key tasks you use it for? What are your "go to" checks that perhaps aren't covered by the Moz Crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow... here's the problem, and here's where to fix it.
-
Allie,
That's a great example use case. After my audits, clients ask, "You found thousands of internal redirects and 404s - where are they?"
And I'm like: hold on, I have a spreadsheet for that!
-
I love Screaming Frog! One use case I've had recently is finding internal 404 errors just before and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows each offending URL along with the URL linking to it, which makes it easier to update the bad link.
I also have this page bookmarked, and it's my go-to guide:
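Once you have that export, it's easy to flip it around so each broken URL is grouped with every page that links to it - exactly the "spreadsheet for that" mentioned above. A minimal sketch in Python, assuming the export has `Source` and `Destination` columns (the column names and sample URLs here are assumptions, not guaranteed to match your Screaming Frog version):

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of a "Client Error (4xx) Inlinks" export.
# In practice you would read the CSV file Screaming Frog saved
# from Bulk Export instead of this inline string.
SAMPLE_EXPORT = """Source,Destination,Status Code
https://example.com/blog/,https://example.com/old-page,404
https://example.com/about/,https://example.com/old-page,404
https://example.com/blog/,https://example.com/missing.pdf,404
"""

def group_broken_links(csv_text):
    """Map each broken destination URL to the pages linking to it."""
    broken = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        broken[row["Destination"]].append(row["Source"])
    return dict(broken)

report = group_broken_links(SAMPLE_EXPORT)
for dest, sources in report.items():
    print(dest, "<-", ", ".join(sources))
```

Grouping by destination means one fix (updating or redirecting the dead URL) can be prioritized by how many pages point at it.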
-
It's one of the best tools, so I feel like I use it "for everything." But some highlights include:
-
Title / meta duplication & finding parameters on ecomm stores
-
Title length & meta desc length
-
Removing meta keywords fields
-
Finding errant pages (anything returning a status code other than 200, 301, 302, or 404)
-
Large sitemap export (most tools do "up to 500 pages." Useless.)
-
Bulk export of external links (what ARE we linking to??)
-
Quickly opening a page in Wayback Machine or Google cache
-
Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped-fragment URL, identifying pages with two titles, two canonicals, two H1 tags, and so on. Even checking whether the www and non-www versions are both live, spotting links to pages that shouldn't be linked, and comparing http vs. https.
Very cool tool - useful for pretty much everything! haha
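The "errant pages" check above is really just a filter over (URL, status code) pairs from the crawl. A minimal sketch, with hypothetical sample data standing in for a real crawl export:

```python
# Status codes the crawl is expected to contain; anything else is "errant".
EXPECTED = {200, 301, 302, 404}

# Hypothetical (URL, status code) pairs as they might come out of a crawl.
crawl = [
    ("https://example.com/", 200),
    ("https://example.com/old", 301),
    ("https://example.com/broken", 500),
    ("https://example.com/teapot", 418),
]

def errant_pages(rows, expected=EXPECTED):
    """Return only the rows whose status code is outside the expected set."""
    return [(url, code) for url, code in rows if code not in expected]

for url, code in errant_pages(crawl):
    print(code, url)
```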
-
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using Googlebot as the crawler) to check for all the standard stuff and go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
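That last check - tags that should only appear once per page - is easy to reproduce outside the tool too, which helps when spot-checking a single template. A minimal stdlib sketch (the sample HTML is hypothetical):

```python
from collections import Counter
from html.parser import HTMLParser

class SingletonTagCounter(HTMLParser):
    """Count occurrences of tags that should appear at most once per page."""

    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.counts["h1"] += 1
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.counts["canonical"] += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.counts["meta robots"] += 1

# Hypothetical page with duplicated canonical and H1 tags.
HTML = """<html><head>
<link rel="canonical" href="https://example.com/a">
<link rel="canonical" href="https://example.com/b">
<meta name="robots" content="noindex">
</head><body><h1>One</h1><h1>Two</h1></body></html>"""

parser = SingletonTagCounter()
parser.feed(HTML)
duplicates = {tag: n for tag, n in parser.counts.items() if n > 1}
print(duplicates)
```

Anything in `duplicates` is a conflict worth flagging, since crawlers may pick either instance.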
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which surfaced a few issues I need to resolve - multiple chained redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check whether all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "crawl path report", which generates a spreadsheet showing the page where the problem originates.
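Conceptually, what that report surfaces is a walk through the redirect graph until it either resolves or revisits a URL. A minimal sketch of the idea, using a hypothetical redirect map (URL to target, with `None` meaning the URL finally resolves):

```python
def trace_redirects(start, redirect_map, max_hops=10):
    """Return the redirect path from `start`, and whether it loops."""
    path, seen = [start], {start}
    url = start
    while True:
        target = redirect_map.get(url)
        if target is None:                      # final destination reached
            return path, False
        if target in seen or len(path) >= max_hops:
            path.append(target)
            return path, True                   # loop (or chain past the hop limit)
        path.append(target)
        seen.add(target)
        url = target

# Hypothetical redirect map, as a crawler might reconstruct it.
redirects = {
    "/old": "/interim",
    "/interim": "/new",
    "/new": None,        # resolves with a 200
    "/a": "/b",
    "/b": "/a",          # the kind of loop relative links can create
}

print(trace_redirects("/old", redirects))   # chain of two redirects, no loop
print(trace_redirects("/a", redirects))     # loop detected
```

Chains that don't loop are still worth collapsing: each hop costs the crawler (and the user) a round trip.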
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you are testing a site in a pre-production environment. The same goes for the ability to use regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
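The regex-exclusion idea above boils down to dropping any URL that matches a pattern list before it's crawled. A minimal sketch (the patterns and URLs are hypothetical examples, not Screaming Frog's own syntax):

```python
import re

# Hypothetical exclusion patterns for faceted/parameterized URLs.
EXCLUDE_PATTERNS = [
    re.compile(r".*\?.*sort="),      # sort parameters
    re.compile(r".*/filter/.*"),     # faceted navigation paths
]

def keep_url(url):
    """True if the URL matches none of the exclusion patterns."""
    return not any(p.match(url) for p in EXCLUDE_PATTERNS)

urls = [
    "https://example.com/shop/",
    "https://example.com/shop/?sort=price",
    "https://example.com/shop/filter/red/",
]
print([u for u in urls if keep_url(u)])
```

On a big e-commerce site this kind of filter can cut a crawl from millions of parameterized URLs down to the canonical set.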
rgds,
Dirk