Mac-Friendly, DOM-Rendering Spidering Tool for Multiple Users
-
Hello!
I am looking for a spidering tool that:
- Is Mac-friendly
- Can render the DOM and find JS links
- Can spider password-protected sites (prompts for password and then continues spider, etc.)
- Has competitive pricing for 8+ users.
Screaming Frog is amazing - and maybe we're just going to have to bite the bullet there. But if anyone has any other ideas, I'd love to hear them. Thanks!
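For what it's worth, if anyone on the team is comfortable scripting, the "log in, then keep spidering" requirement can be prototyped in a few lines. Below is a minimal sketch in Python; the login flow is assumed to be a simple form post (real sites add CSRF tokens, SSO, etc.), and note this parses raw HTML only, so links injected by JavaScript won't show up without a separate rendering step such as a headless browser.

```python
# Sketch of a same-host, breadth-first spider. The `fetch` callable is
# where authentication lives -- e.g. an already-logged-in
# requests.Session().get(url).text -- so the crawl logic stays generic.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in a page's raw HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(page_url, html):
    """Return all links on the page, resolved to absolute URLs."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(page_url, href) for href in parser.links]

def crawl(fetch, start_url, max_pages=100):
    """Breadth-first crawl, restricted to the start URL's host."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        for link in extract_links(url, fetch(url)):
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen
```

This obviously isn't a Screaming Frog replacement (no DOM rendering, no reporting), but it makes the password-protected-site requirement concrete: the crawler itself doesn't care how `fetch` authenticates.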
-
So - after digging around a lot and reading and re-reading every article that popped up for "screaming frog alternative", I've come to the conclusion that for the price, there really is nothing better than Screaming Frog right now.
I was impressed, however, with the incredibly helpful team from Deep Crawl. This enterprise tool is designed for larger websites - whereas Screaming Frog can crap out if your local machine runs out of memory. Because it's a more powerful tool, it's more expensive than Screaming Frog - but if you need an enterprise solution, it's definitely worth looking into. Another big differentiator is that Deep Crawl has no limit on the number of users, which is our primary pain point with Screaming Frog.
-
Right now we're updating SEOSpyder ( http://www.mobiliodevelopment.com/seospyder/ ) to render pages, but I can't give you a timeframe for when it will be done.
So far the memory requirements aren't too high: it crawled a 250k-page site on a machine with 8 GB of RAM.
-
Oh, actually, something I just realized: Screaming Frog can potentially do what you want and give all 8 users access, but the setup is complicated. You would need to run it in a big virtual machine on AWS or Google Cloud Platform. That way you can scale the machine so it won't run out of memory, and everybody will still have access to it.
Back to your question: I've worked with Deepcrawl, a bit with Ryte and more with Botify. They're all great tools that are able to crawl your site. But you probably already looked into some of them.
-
Oh, interesting - can you help me understand more about the cloud solution you're using...? Thanks!
-
Going to follow this, as I've been looking for something too. But we went with a cloud service, as there is nothing that I came across that can otherwise fulfill all these needs.