Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. Looking at it more generally, though, there is so much it can do that I was wondering, for those of you who use it: what are the top tasks you rely on it for? What are your "go to" checks, particularly ones that aren't covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow... here's the problem, and here's where to fix it.
-
Allie,
That's a great example use case. After my audits, clients ask, "You found thousands of internal redirects and 404s - where are they?"
And I get to say: hold on, I have a spreadsheet of that!
-
I love Screaming Frog! One recent use case: finding internal 404 errors just before and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows each offending URL and the URL referring to it, which makes it much easier to update the bad links.
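If the export is large, it helps to group the referring pages by the broken URL they point to, so the most-linked 404s get fixed first. Here's a minimal sketch that does that with the standard library; it assumes the CSV has "Source" and "Destination" columns (check your export's actual headers, as they can vary by version):

```python
import csv
from collections import defaultdict

def broken_links_by_target(report_path):
    """Group referring pages by the broken URL they link to."""
    targets = defaultdict(list)
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Assumed columns: "Source" = page containing the link,
            # "Destination" = the URL that returned a 4xx
            targets[row["Destination"]].append(row["Source"])
    # Most-linked broken URLs first, so the biggest fixes surface at the top
    return sorted(targets.items(), key=lambda kv: len(kv[1]), reverse=True)
```

Sorting by inlink count is a judgment call; it simply prioritises the 404s that the most pages still link to.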
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some of my regular uses include:
-
Title / meta duplication & finding parameters on ecomm stores
-
Title length & meta desc length
-
Removing meta keywords fields
-
Finding errant pages (anything returning a status code other than 200, 301, 302, or 404)
-
Large sitemap export (most tools do "up to 500 pages." Useless.)
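If you ever need to go beyond a tool's export limits, generating a sitemap from a crawled URL list is straightforward. A minimal sketch following the sitemaps.org format (the function name is just illustrative; `lastmod`/`priority` fields are omitted for brevity):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal XML sitemap from a list of crawled URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # escape() handles &, <, > so query-string URLs stay valid XML
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)
```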
-
Bulk export of external links (what ARE we linking to??)
-
Quickly opening a page in Wayback Machine or Google cache
-
Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped fragment URL, identifying pages with two titles, two canonicals, two H1 tags, and so on. It's also handy for spotting when both www and non-www versions are live, links to pages that shouldn't be linked to, and http vs. https issues.
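The "two titles / two canonicals / two H1s" check is easy to replicate on a single page if you want to verify a crawler's finding by hand. A rough stdlib-only sketch (class and function names are my own; a production check would use a proper HTML library):

```python
from html.parser import HTMLParser

class SingletonTagCounter(HTMLParser):
    """Count tags that should appear at most once per page."""
    def __init__(self):
        super().__init__()
        self.counts = {"title": 0, "h1": 0, "canonical": 0}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self.counts[tag] += 1
        elif tag == "link" and dict(attrs).get("rel") == "canonical":
            self.counts["canonical"] += 1

def duplicate_singletons(html):
    """Return only the tags that appear more than once."""
    parser = SingletonTagCounter()
    parser.feed(html)
    return {tag: n for tag, n in parser.counts.items() if n > 1}
```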
Very cool tool - useful for pretty much everything! haha
-
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using Googlebot as the crawler) to check for all the standard stuff and go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
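One item from the list above - conflicts between meta robots and canonical tags - is worth a concrete illustration: a page that is noindexed but also declares a canonical is sending mixed signals, since the canonical hint on a noindexed page is generally disregarded. A rough regex-based sketch of that check (it assumes `name` comes before `content` in the meta tag; a real audit should parse the HTML properly):

```python
import re

def robots_canonical_conflict(html):
    """Flag pages that are both noindexed and canonicalised."""
    noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        html, re.I)
    canonical = re.search(r'<link[^>]+rel=["\']canonical["\']', html, re.I)
    return bool(noindex and canonical)
```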
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which turned up a few issues for me to resolve - multiple redirects chained together on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check whether all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "crawl path report", which generates a spreadsheet showing the page where the problem originates.
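The loop detection itself is simple once you have a source-to-destination redirect map (which any crawl export gives you). A minimal sketch, with the function name and map shape being my own choices:

```python
def redirect_path(redirects, start, max_hops=20):
    """Follow a {source: destination} redirect map from `start`.

    Returns (chain, looped): the list of URLs visited, and True if
    a URL repeated, i.e. the redirects form a loop.
    """
    chain = [start]
    seen = {start}
    url = start
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
        if url in seen:
            return chain, True  # loop detected
        seen.add(url)
    return chain, False
```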
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you are testing a site in a pre-production environment. The same goes for the ability to use regex to include or exclude certain URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
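For anyone unfamiliar with that kind of include/exclude configuration, the logic behind it amounts to something like this sketch (function name and parameters are illustrative, not the tool's actual API):

```python
import re

def should_crawl(url, include=None, exclude=None):
    """Apply include/exclude regex filters to a URL, mimicking a
    crawler's URL-filtering configuration."""
    # If include patterns are given, the URL must match at least one
    if include and not any(re.search(p, url) for p in include):
        return False
    # Any exclude match drops the URL regardless of includes
    if exclude and any(re.search(p, url) for p in exclude):
        return False
    return True
```

Excluding parameterised URLs (e.g. an `exclude` pattern like `\?sort=`) is the typical way to keep a crawl of a big faceted site manageable.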
rgds,
Dirk