Mac-Friendly, DOM-Rendering Spidering Tool for Multiple Users
-
Hello!
I am looking for a spidering tool that:
- Is Mac-friendly
- Can render the DOM and find JS links
- Can spider password-protected sites (prompts for password and then continues spider, etc.)
- Has competitive pricing for 8+ users.
Screaming Frog is amazing - and maybe we're just going to have to bite the bullet there. But if anyone has any other ideas, I'd love to hear them. Thanks!
-
So - after digging around a lot and reading and re-reading every article that popped up for "screaming frog alternative", I've come to the conclusion that for the price, there really is nothing better than Screaming Frog right now.
I was impressed, however, with the incredibly helpful team from Deep Crawl. This enterprise tool is designed for larger websites, whereas Screaming Frog can crap out if your local machine runs out of memory. Because it's a more powerful tool, it's more expensive than Screaming Frog - but if you need an enterprise solution, it's definitely worth looking into. Another big differentiator is that Deep Crawl has no limit on the number of users, which is our primary pain point with Screaming Frog.
-
Right now we're updating SEOSpyder ( http://www.mobiliodevelopment.com/seospyder/ ) to render pages, but I can't give you a timeframe for when it will be done.
So far the memory requirements aren't too high; it crawled a 250k-page site on a machine with 8 GB of RAM.
-
Oh, actually, something I just realized: Screaming Frog can potentially do what you want and give all 8 users access, but the setup is complicated. You would need to run it in a big virtual machine on AWS or Google Cloud Platform. That way you can scale the machine so it won't time out, and everybody will still have access to it.
Back to your question: I've worked with Deepcrawl, a bit with Ryte and more with Botify. They're all great tools that are able to crawl your site. But you probably already looked into some of them.
-
Oh, interesting - can you help me understand more about the cloud solution you are using...? Thanks!
-
Going to follow this, as I've been looking for something too. But we went with a cloud service, as there is nothing that I came across that can otherwise fulfill all these needs.
Related Questions
-
Possible duplicate content issues on same page with urls to multiple tabs?
Hello everyone! I'm here for the first time, and glad to be part of the Moz community! Jumping right into the question I have.

For a type of page we have on our website, there are multiple tabs on each page. To give an example, let's say a page is for information about a place called "Ladakh". The various URLs the page is accessible from can take the form of: mywanderlust.in/place/ladakh/, mywanderlust.in/place/ladakh/photos/, mywanderlust.in/place/ladakh/places-to-visit/, and so on. To keep the UX smooth when the user switches from one tab to another, we load everything in advance with AJAX, but it remains hidden until the user switches to the required tab.

Now since the content is actually there in the HTML, does Google count it as duplicate content? I'm afraid this might be the case, as when I Google for text that's visible only on one of the tabs, I still see all the tabs in the Google results. I also see internal links in GSC to, say, a page mywanderlust.in/questions that is only supposed to be linked from one tab, but GSC reports internal links to this page (mywanderlust.in/questions) from all three tabs. Also, Moz Pro crawl reports informed me about duplicate content issues, although surprisingly it says the issue exists only on a small fraction of our indexable pages.

Is it hurting our SEO? Any suggestions on how we could handle the URL structure better to make it optimal for indexing? FWIW, we're using a fully responsive design with the displayed content being exactly the same on both desktop and mobile web. Thanks a ton in advance!
Intermediate & Advanced SEO | atulgoyal0 -
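One common way to handle tab URLs like these (not necessarily the only right fix for this site) is to keep the tab URLs working for users but point a canonical at the main place page, so Google treats the tabs as variants of one page rather than duplicates. A minimal sketch using the URLs from the question - the trade-off is that the tab URLs then won't rank separately:

```html
<!-- In the <head> of mywanderlust.in/place/ladakh/photos/
     and mywanderlust.in/place/ladakh/places-to-visit/ -->
<head>
  <link rel="canonical" href="https://mywanderlust.in/place/ladakh/" />
</head>
```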
Tool to identify if meta descriptions are showing?
Hi, we have an ecommerce client with 1000s of meta descriptions. We have noticed that some meta descriptions are not showing properly, and we want to check which ones are showing in Google SERP results. You can use tools like Screaming Frog to pull the meta description from a page, but we want to see if it's showing for certain keywords. Any ideas on how to automate this? Cheers.
Intermediate & Advanced SEO | brianna00 -
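Pulling the on-page meta description can be scripted without Screaming Frog; comparing it against what actually shows for a given keyword would still need SERP data from a rank-tracking API, since scraping Google results directly is against its terms. A minimal sketch of the extraction half, using only the Python standard library (in practice you'd fetch each page with urllib or requests and feed the HTML in):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of the <meta name="description"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

def extract_meta_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description

page = '<html><head><meta name="description" content="Example description."></head></html>'
print(extract_meta_description(page))  # Example description.
```

Run this over the URL list, then diff each page's declared description against the snippet your rank tracker reports for the target keyword.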
After Ranking Drop Continue SEO or Focus on Improving User Experience Instead?
Six months after starting a marketing campaign and spending a lot of money on SEO audits, link removals, wireframes, copywriting, and coding my web site (www.nyc-officespace-leader.com), traffic dropped significantly after I launched a new version of the site in early June. Traffic is down about 27%, most of the traffic from competitive terms is gone, and the number of leads (phone calls, form completions) is off by about 70%.

On June 6th an upgraded version of the site with mostly cosmetic changes (narrower header without social media buttons, streamlined conversion forms, a new right rail) was launched. No URLs were changed, and the text remained mostly the same. But somehow my developers botched either the canonical tags or robots.txt, and 175 URLs with very little or no content were indexed by Google. At that point my ranking and traffic dropped. A few days ago a request to remove those pages was made via Google Webmaster Tools, and now the number of pages indexed is down to 675 rather than the incorrect 850 from before. But ranking, traffic, and lead generation have not yet recovered. After spending almost $25,000 over nine months this is rather frustrating. I might add the site has very few links from incoming domains, and those links are not high quality.

An SEO audit was performed in February, and in April a link removal campaign occurred, with about 30 domains agreeing to remove links and a disavow file being submitted for another 70-80 domains that would not agree to remove links. My SEO believes that we should focus on improving visitor engagement rather than on more esoteric SEO like trying to build incoming links. They think that improving usability will improve conversions and would generate results faster than traditional SEO. Also, they think that improving click-through rates and reducing bounce rates will improve ranking by signaling to Google that the site is providing value to visitors. Does this sound like a reasonable approach?
On one hand I don't see how my site, given its Moz domain authority, could possibly compete against sites with a high number of quality incoming links, and maybe building a better link profile would yield faster results. On the other hand, it seems logical that Google would reward a site that creates a better user experience. Any thoughts from the Moz community? Does it sound like the recent loss of traffic is due to the indexing of the 175 pages? If so, when should my traffic and ranking return?

Incidentally, these are the steps taken since last November to improve SEO:
- SEO Traffic & Ranking Drop Analysis and Recommendations (included in-depth SEO technical audit and recommendations)
- Unnatural Link Removal Program
- Content Optimization (Audit & Strategy with 20-page keyword matrix)
- CORE (also provided wireframe for /visitor-details pages at no charge)
- SEO Copywriting for 10 pages
- New wireframes implemented on site on June 6th
- Jump in indexed pages by 175 on June 10th
- Google Webmaster Tools removal request made for those low-quality pages on June 23rd

Thanks, Alan
Intermediate & Advanced SEO | Kingalan11 -
Best link analysis tool before moving to disavow them
I want to use the Google disavow tool. I have 2 questions relating to it: 1. How to use the Google Disavow tool? 2. How to use Moz or OSE for deep backlink mining? If Moz is not the best fit, I need an alternative for it.
Intermediate & Advanced SEO | csfarnsworth0 -
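On the mechanics of the disavow file itself: the file Search Console accepts is plain text, with `#` comment lines, `domain:` entries to disavow a whole domain, and bare URLs to disavow single pages. A small sketch that assembles one from the output of your backlink mining (the helper name is ours, not part of any tool):

```python
def build_disavow_file(domains, urls, note="Links we could not get removed"):
    """Build the text of a Google disavow file.

    The format accepts '#' comment lines, 'domain:example.com' entries
    to disavow an entire domain, and bare URLs to disavow single pages.
    """
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in domains]  # whole-domain entries
    lines += list(urls)                        # single-page entries
    return "\n".join(lines) + "\n"

print(build_disavow_file(["spammy-links.example"], ["http://bad.example/page1"]))
```

Export the toxic-link list from Moz/OSE (or an alternative like Ahrefs or Majestic), feed the domains and URLs in, then upload the resulting .txt in Search Console's disavow tool.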
One website, multiple service points
Hi all. Wondering if anyone could offer an opinion on this... I am talking to a new client who offers kitchen installation and design. They have a central headquarters and cover a 100-mile radius around their location. A lot of the search terms they are aiming to target - Kitchen Design, Kitchen Fitters, etc. - return localised results. This is where my issue lies. I have worked with plenty of clients in the past which have a physical presence in multiple locations and have marked up the site so that it ranks for each of the stores, but trying to make one site appear in many locations where it doesn't have an address is a different issue completely. Not only do they have only one address, they also have only one phone number. We will target, as best we can, the non-localised keywords, but need to work out what to do to cover the locations 20/30/40 miles from the office which they cover. I welcome any opinions on this please.
Intermediate & Advanced SEO | Grumpy_Carl0 -
Duplicate content throughout multiple URLs dilemma
We have a website with lots of categories, and the problem is that some subcategories have identical content on them. So, is it enough to just add different text to those problematic subcategories, or do we need to use a "canonical" tag pointing to the main category? The same dilemma exists with our search system and duplicate content. For example, the "/category/sports" URL would have similar-to-identical content to the "/search/sports" and "/search/sports-fitness/" URLs. Ranking is important for all the different categories and subcategories, and also for individual search keywords. So, the question is, how do we make them somehow unique/different so they all rank well? Would love to hear advice on how this can be solved using different methods and how it would affect our rankings, and when we actually need to use the "canonical" tag versus when a 301 redirect is better. Thanks!
Intermediate & Advanced SEO | versliukai0 -
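On the canonical-vs-301 part, a rough rule of thumb: use a 301 when the duplicate URL should stop existing for users as well, and a canonical tag when the URL must keep working (like a search results page) but shouldn't rank separately. A hypothetical nginx sketch for the URLs in the question - paths and server setup are assumptions, not the asker's actual config:

```nginx
# 301: the search URL permanently hands its equity to the category page
# and visitors land on the category page from then on.
location = /search/sports {
    return 301 /category/sports;
}

# Alternative (no redirect): keep /search/sports working for users, and in
# its HTML point <link rel="canonical"> at /category/sports instead.
```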
SEO Tools for Content Audit
Hi, I'm looking for a tool which can do a full content audit for a site - for instance, find pages which:
- Lack text content
- Have lengthy meta descriptions
- Are missing H1 tags or have multiple H1 tags
- Have duplicate meta descriptions
- Have images with no alt text
Are there any tools besides the ones on SEOmoz which can enable me to do a full content audit on factors like these? Or any SEO audit tools out there which you can recommend. Cheers, Mark
Intermediate & Advanced SEO | monster990 -
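If no off-the-shelf tool fits, most of these checks are simple enough to script yourself. A sketch of a per-page checker using only the Python standard library - the length thresholds are arbitrary assumptions, and catching duplicate meta descriptions would additionally need comparing results across pages:

```python
from html.parser import HTMLParser

class ContentAuditParser(HTMLParser):
    """Collects the on-page elements needed for a basic content audit."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.meta_description = None
        self.images_missing_alt = 0
        self.text_length = 0

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and d.get("name", "").lower() == "description":
            self.meta_description = d.get("content", "")
        elif tag == "img" and not d.get("alt"):
            self.images_missing_alt += 1

    def handle_data(self, data):
        self.text_length += len(data.strip())

def audit_page(html, max_description_length=160, min_text_length=250):
    """Return a list of issue descriptions for one page's HTML."""
    p = ContentAuditParser()
    p.feed(html)
    issues = []
    if p.h1_count == 0:
        issues.append("missing H1")
    elif p.h1_count > 1:
        issues.append("multiple H1 tags")
    if p.meta_description and len(p.meta_description) > max_description_length:
        issues.append("meta description too long")
    if p.images_missing_alt:
        issues.append(f"{p.images_missing_alt} image(s) without alt text")
    if p.text_length < min_text_length:
        issues.append("thin text content")
    return issues

sample = "<html><head></head><body><h1>A</h1><h1>B</h1><img src='x.png'></body></html>"
print(audit_page(sample))
```

Collect the meta descriptions returned per URL into a dict and flag any value that appears more than once to cover the duplicate-description check.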
Submitting URLs multiple times in different sitemaps
We have a very dynamic site, with a large number of pages. We use a sitemap index file, that points to several smaller sitemap files. The question is: Would there be any issue if we include the same URL in multiple sitemap files? Scenario: URL1 appears on sitemap1. 2 weeks later, the page at URL1 changes and we'd like to update it on a sitemap. Would it be acceptable to add URL1 as an entry in sitemap2? Would there be any issues with the same URL appearing multiple times? Thanks.
Intermediate & Advanced SEO | msquare0
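Whatever the crawler's tolerance for repeats turns out to be, it is easy to at least know when a URL has landed in more than one child sitemap. A small sketch that parses sitemap XML and reports URLs listed in several files - the sitemap names and URLs here are made up for illustration:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_in_sitemap(xml_text):
    """Extract the <loc> values from one sitemap file's XML."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

def duplicate_urls(sitemaps):
    """Map each URL that appears in more than one sitemap to those sitemaps' names."""
    seen = defaultdict(list)
    for name, xml_text in sitemaps.items():
        for url in urls_in_sitemap(xml_text):
            seen[url].append(name)
    return {u: names for u, names in seen.items() if len(names) > 1}

sitemap1 = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/url1</loc></url>
</urlset>"""
sitemap2 = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/url1</loc></url>
  <url><loc>https://example.com/url2</loc></url>
</urlset>"""

print(duplicate_urls({"sitemap1.xml": sitemap1, "sitemap2.xml": sitemap2}))
```

Running this against the files behind the sitemap index would let you either dedupe on generation or confirm that the overlap (URL1 in sitemap1 and sitemap2) is intentional.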