Screaming Frog - What are your "go to" tasks you use it for?
-
So, I've just purchased Screaming Frog because I have some specific tasks that need completing. Looking at the tool more generally, though, there is so much in it that I was wondering: for those of you who use it, what are the key tasks you rely on it for? What are your "go to" checks, the ones that perhaps aren't covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought of that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow: here's the problem, and here's where to fix it.
-
Allie,
That's a great example use case. After my audits, clients are like, "You found thousands of internal redirects and 404s - where are they?"
I'm like - hold on, I have a spreadsheet of that!
-
I love Screaming Frog! One use case I've turned to recently is finding internal 404 errors before and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows each offending URL alongside the URL referring to it, which makes it much easier to update the bad links.
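If you want to hand developers a tidy list, that export can be regrouped by referring page with a few lines of Python. This is a minimal sketch, not anything from Screaming Frog itself - the column names "Source" and "Destination" are assumptions and may differ between versions of the export:

```python
import csv
from collections import defaultdict

def broken_links_by_source(csv_path):
    """Group 4xx destinations by the page that links to them.

    Assumes the export has "Source" and "Destination" columns,
    as in a typical inlinks report.
    """
    by_source = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            by_source[row["Source"]].append(row["Destination"])
    return dict(by_source)
```

Running it over the export gives you one entry per page to fix rather than one row per broken link.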
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some uses include:
- Title / meta duplication, and finding parameters on ecommerce stores
- Title length and meta description length
- Finding meta keywords fields that should be removed
- Finding errant pages (anything but a 200, 301, 302, or 404 status code)
- Large sitemap exports (most tools cap out at "up to 500 pages." Useless.)
- Bulk export of external links (what ARE we linking to??)
- Quickly opening a page in the Wayback Machine or Google cache
- Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped fragment URL, identifying pages with two titles, two canonicals, or two H1 tags, and so on. It's even useful for spotting www and non-www versions both live, links to pages that shouldn't be linked to, and http vs https mixes.
Very cool tool - useful for pretty much everything! haha
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using Googlebot as the user-agent) to check for all the standard stuff, and then go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
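That last check - tags that should only appear once per page - is easy to reproduce outside of SF too. Here's a minimal Python sketch using only the standard library; the tag handling is a simplified assumption for illustration, not Screaming Frog's actual logic:

```python
from html.parser import HTMLParser

class TagCounter(HTMLParser):
    """Count tags that should appear at most once per page."""

    def __init__(self):
        super().__init__()
        self.counts = {"title": 0, "h1": 0, "canonical": 0, "meta robots": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self.counts[tag] += 1
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.counts["canonical"] += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.counts["meta robots"] += 1

def duplicate_tags(html):
    """Return only the tags that occur more than once in the page."""
    parser = TagCounter()
    parser.feed(html)
    parser.close()
    return {tag: n for tag, n in parser.counts.items() if n > 1}
```

Run it against a page's source and anything it returns is a candidate for cleanup.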
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which surfaced a few multiple-redirect chains on some old links that I now need to resolve. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag, and for checking the size of the images on the site (things Moz can't do).
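As a rough illustration of that analytics check, here's what scanning a page's source for a tracking snippet might look like in Python. The patterns below are hypothetical examples only - substitute the markers from your own tracking code:

```python
import re

# Hypothetical snippet patterns - replace with markers from your own
# tracking code (property ID, script URL, etc.).
ANALYTICS_PATTERNS = [
    re.compile(r"www\.google-analytics\.com/analytics\.js"),
    re.compile(r"gtag\(\s*['\"]config['\"]"),
    re.compile(r"UA-\d{4,10}-\d{1,4}"),
]

def has_analytics(html):
    """Return True if any known tracking marker appears in the page source."""
    return any(p.search(html) for p in ANALYTICS_PATTERNS)
```

Pages where this returns False are the ones missing the tag.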
It's also extremely useful when you're changing the URL structure, to check that all the redirects are properly implemented.
Sometimes you get crawl loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just right-click the URL and select "Crawl Path Report", which generates a spreadsheet showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you are testing a site in a pre-production environment. The same goes for the possibility of using regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
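As a sketch of what that regex filtering does conceptually, here is the same idea in a few lines of Python. The exclude patterns are made-up examples, not ones from the thread or from Screaming Frog's defaults:

```python
import re

# Hypothetical exclude patterns, in the spirit of a crawler's
# regex exclude list - adjust to your own site's URL structure.
EXCLUDE_PATTERNS = [
    re.compile(r".*\?sort=.*"),   # faceted-sort parameter pages
    re.compile(r".*/tag/.*"),     # tag archive pages
]

def should_crawl(url):
    """Return False for any URL matching an exclude pattern."""
    return not any(p.fullmatch(url) for p in EXCLUDE_PATTERNS)
```

On a large ecommerce site, a couple of patterns like these can keep a crawl from ballooning into millions of near-duplicate parameter URLs.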
rgds,
Dirk