Mac-Friendly, DOM-Rendering Spidering Tool for Multiple Users
-
Hello!
I am looking for a spidering tool that:
- Is Mac-friendly
- Can render the DOM and find JS links
- Can spider password-protected sites (prompts for password and then continues spider, etc.)
- Has competitive pricing for 8+ users.
Screaming Frog is amazing - and maybe we're just going to have to bite the bullet there. But if anyone has any other ideas, I'd love to hear them. Thanks!
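To illustrate why the DOM-rendering requirement matters: a plain HTML fetch only sees `href` links, while links that JavaScript injects at runtime need a rendering engine (which is what Screaming Frog's JS mode provides). Here's a minimal, hypothetical sketch in Python using only the standard library - it extracts static links and crudely regex-scans inline scripts for URLs, which is exactly the kind of half-measure that a real headless-browser crawl avoids. The HTML snippet and URL heuristic are illustrative, not from any of the tools discussed.

```python
import re
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href links from static HTML and raw URLs found in inline scripts."""
    def __init__(self):
        super().__init__()
        self.href_links = []   # links visible without rendering
        self.script_urls = []  # URLs buried in JS - a renderer would discover these as real links
        self._in_script = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.href_links.append(value)
        elif tag == "script":
            self._in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_script = False

    def handle_data(self, data):
        if self._in_script:
            # Crude heuristic: pull quoted absolute URLs out of script text.
            self.script_urls.extend(re.findall(r'["\'](https?://[^"\']+)["\']', data))

html = """
<a href="/about">About</a>
<script>fetch("https://example.com/api/products");</script>
"""
parser = LinkExtractor()
parser.feed(html)
print(parser.href_links)   # ['/about']
print(parser.script_urls)  # ['https://example.com/api/products']
```

A regex scan like this misses URLs that are assembled at runtime (string concatenation, data from XHR responses), which is why actually executing the page in a headless browser is the only reliable way to find JS links.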
-
So - after digging around a lot and reading and re-reading every article that popped up for "screaming frog alternative", I've come to the conclusion that for the price, there really is nothing better than Screaming Frog right now.
I was impressed, however, with the incredibly helpful team from Deep Crawl. This enterprise tool is designed for larger websites - whereas Screaming Frog can crap out if your local machine runs out of memory. Because it's a more powerful tool, it's more expensive than Screaming Frog - but if you need an enterprise solution, it's definitely worth looking into. Another big differentiator is that Deep Crawl has no limit on the number of users, which is our primary pain point with Screaming Frog.
-
Right now we're updating SEOSpyder ( http://www.mobiliodevelopment.com/seospyder/ ) to render pages, but I can't give you a timeframe for when it will be done.
So far the memory requirements aren't too high; it crawled a 250k-page site on a machine with 8 GB of RAM.
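How far a crawl gets on a fixed amount of RAM mostly comes down to how the crawler stores its frontier and its set of already-seen URLs. A hypothetical sketch (not SEOSpyder's actual implementation) of a breadth-first crawl that keeps the seen-set small by storing fixed-size URL hashes instead of full URL strings:

```python
from collections import deque
from hashlib import sha1

def crawl(start_url, get_links, max_pages=250_000):
    """Breadth-first crawl. The seen-set holds 20-byte SHA-1 digests rather
    than full URL strings, which caps per-URL memory on large sites.
    `get_links` stands in for the fetch-and-parse step (hypothetical)."""
    seen = {sha1(start_url.encode()).digest()}
    frontier = deque([start_url])
    crawled = []
    while frontier and len(crawled) < max_pages:
        url = frontier.popleft()
        crawled.append(url)
        for link in get_links(url):
            h = sha1(link.encode()).digest()
            if h not in seen:
                seen.add(h)
                frontier.append(link)
    return crawled

# Tiny in-memory link graph to exercise the sketch.
site = {"/": ["/a", "/b"], "/a": ["/b", "/"], "/b": []}
print(crawl("/", site.get))  # ['/', '/a', '/b']
```

Production crawlers go further (disk-backed queues, Bloom filters), but even this simple hashing trick is the kind of choice that decides whether a 250k-page crawl fits in 8 GB.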
-
Oh, actually, something I just realized: Screaming Frog can potentially do what you want and provide access for 8 users, but the setup is complicated. You would need to run it in a big virtual machine on AWS or Google Cloud Platform. That way you can scale the machine so it won't time out, and everybody will still have access to it.
Back to your question: I've worked with Deepcrawl, a bit with Ryte, and more with Botify. They're all great tools that are able to crawl your site. But you've probably already looked into some of them.
-
Oh, interesting - can you help me understand more about the cloud solution you are using? Thanks!
-
Going to follow this, as I've been looking for something too. But we went with the cloud service, as there is nothing that I came across that can otherwise fulfill all these needs.