Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. However, looking at Screaming Frog generally, there is so much information that I was wondering, for those who use it: what are the top tasks you use it for? What are your "go to" things you like to check that perhaps are not covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow: here's the problem, and here's where to fix it.
-
Allie,
That's a great example use-case. After my audits, clients are like "you found thousands of internal redirects and 404s - where are they?"
I'm like - hold on I have a spreadsheet of that!
-
I love Screaming Frog! One use case I've had recently is using it to find internal 404 errors prior to, and immediately after, a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows the offending URL and the URL referring to it, which makes it easier to update the bad link.
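If you'd rather slice that export with a script than in Excel, here's a minimal Python sketch that groups the broken URLs by the pages linking to them (the column headers below are an assumption; check the ones in your actual export):

```python
import csv
import io
from collections import defaultdict

def broken_inlinks(report_csv):
    """Group 4xx destinations by the pages that link to them.

    Expects columns like a Screaming Frog 'Client Error (4xx) Inlinks'
    export; the header names here are an assumption, so adjust to match
    your own file.
    """
    by_target = defaultdict(list)
    for row in csv.DictReader(io.StringIO(report_csv)):
        if row["Status Code"].startswith("4"):
            by_target[row["Destination"]].append(row["Source"])
    return dict(by_target)

# Tiny inline sample standing in for a real export:
sample = """Source,Destination,Status Code
https://example.com/blog/,https://example.com/old-page,404
https://example.com/about/,https://example.com/old-page,404
https://example.com/blog/,https://example.com/gone,410
"""

for target, sources in broken_inlinks(sample).items():
    print(f"{target} is broken; linked from {len(sources)} page(s)")
```

From there it's easy to hand developers one row per broken target with every referring page listed next to it.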
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some of my uses include:
-
Title / meta duplication & finding parameters on ecomm stores
-
Title length & meta desc length
-
Finding meta keywords fields to remove
-
Finding errant pages (anything but a 200, 301, 302, or 404 status code)
-
Large sitemap export (most tools do "up to 500 pages." Useless.)
-
Bulk export of external links (what ARE we linking to??)
-
Quickly opening a page in Wayback Machine or Google cache
-
Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped fragment URL; identifying pages with two titles, two canonicals, or two H1 tags; even seeing www and non-www versions live, links to pages that shouldn't be linked to, and http vs https.
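For anyone curious, that duplicate-tag check can be approximated with a few lines of Python's stdlib HTML parser (a rough sketch of the idea, not how Screaming Frog itself does it):

```python
from html.parser import HTMLParser

class HeadTagCounter(HTMLParser):
    """Count tags that should appear at most once per page."""
    def __init__(self):
        super().__init__()
        self.counts = {"title": 0, "h1": 0, "canonical": 0}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self.counts[tag] += 1
        elif tag == "link" and ("rel", "canonical") in attrs:
            self.counts["canonical"] += 1

def duplicate_tags(html):
    """Return only the tags that occur more than once."""
    parser = HeadTagCounter()
    parser.feed(html)
    return {tag: n for tag, n in parser.counts.items() if n > 1}

page = """<html><head>
<title>One</title><title>Two</title>
<link rel="canonical" href="/a"><link rel="canonical" href="/b">
</head><body><h1>Only one H1</h1></body></html>"""

print(duplicate_tags(page))
```

Run that over every crawled page and any non-empty result is a page worth flagging.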
Very cool tool - useful for pretty much everything! haha
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (crawling as Googlebot) to check for all the standard stuff, and then go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
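The thin content check in that list boils down to stripping the markup and counting words. A quick stdlib sketch (the "thin" threshold itself is a judgment call, so this just reports the count):

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def word_count(html):
    """Count visible words on a page, ignoring script and style content."""
    extractor = TextExtractor()
    extractor.feed(html)
    return len(re.findall(r"\w+", " ".join(extractor.chunks)))

page = "<html><body><script>var x = 1;</script><p>Just five words of copy</p></body></html>"
print(word_count(page))
```

Pages coming in well under whatever threshold you pick (300 words is a common rule of thumb) go on the review list.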
-
That crawl path report is pretty cool, and it led me to the redirect chain report; I have a few issues to resolve there, with multiple redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful to check whether all your pages contain your analytics tag, and to check the size of the images on the site (things Moz can't do).
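The analytics-tag check is really just pattern-matching the page source. A rough Python sketch (the snippet patterns below are assumptions; swap in whatever tag your site actually uses):

```python
import re

# These patterns are assumptions covering common Google tags
# (gtag.js, the older analytics.js, and Google Tag Manager);
# adjust them for the snippet your site actually deploys.
ANALYTICS_PATTERNS = [
    re.compile(r"googletagmanager\.com/gtag/js"),
    re.compile(r"google-analytics\.com/analytics\.js"),
    re.compile(r"\bGTM-[A-Z0-9]+\b"),
]

def has_analytics(html):
    """True if any known analytics snippet appears in the page source."""
    return any(p.search(html) for p in ANALYTICS_PATTERNS)

tagged = '<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>'
print(has_analytics(tagged))
print(has_analytics("<p>No tag here</p>"))
```

Pages where this comes back false are the ones leaking out of your reporting.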
It's also extremely useful when you're changing the URL structure, to check that all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "Crawl Path Report", which generates an export showing the page where the problem originates.
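The loop detection itself comes down to walking the redirect map and watching for a URL you've already visited. A sketch of the concept (not Screaming Frog's actual report format):

```python
def follow_redirects(start, redirects, max_hops=10):
    """Walk a {url: redirect_target} mapping and report the path taken.

    Returns (path, status) where status is 'ok', 'loop', or 'too_long'.
    The mapping would come from a crawl export; this is a sketch of the
    idea, not a real crawler.
    """
    path = [start]
    seen = {start}
    url = start
    while url in redirects:
        url = redirects[url]
        if url in seen:
            path.append(url)
            return path, "loop"   # we've been here before: a loop
        path.append(url)
        seen.add(url)
        if len(path) > max_hops:
            return path, "too_long"
    return path, "ok"

# A relative-link mistake that bounces two URLs between each other:
redirects = {"/a": "/b", "/b": "/a", "/old": "/new"}
print(follow_redirects("/a", redirects))
print(follow_redirects("/old", redirects))
```

The returned path is exactly the kind of trail the crawl path report hands you: the full chain from the first link to the page where things go wrong.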
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you're testing a site in a pre-production environment. The same goes for the option to use regex to include or exclude some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
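The regex include/exclude idea looks roughly like this in Python (a sketch of the concept only; Screaming Frog has its own include/exclude configuration):

```python
import re

def url_filter(urls, include=None, exclude=None):
    """Keep URLs matching the include pattern and drop those
    matching the exclude pattern, mimicking crawl include/exclude rules."""
    keep = []
    for url in urls:
        if include and not re.search(include, url):
            continue
        if exclude and re.search(exclude, url):
            continue
        keep.append(url)
    return keep

urls = [
    "https://example.com/shop/widget?sort=price",
    "https://example.com/shop/widget",
    "https://example.com/blog/post",
]

# Crawl only the shop section and skip parameterised/faceted URLs:
print(url_filter(urls, include=r"/shop/", exclude=r"\?"))
```

On a big e-commerce site, excluding parameterised URLs like this keeps the crawl from drowning in faceted-navigation duplicates.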
rgds,
Dirk