Tools that crawl 2 million page sites
-
Our site is about 2 million pages deep, 50% of which is stale content. Yes, I know - OMG #unhygienic. Even if we get approval to get rid of half of it, SEOmoz Pro Elite only crawls 20k deep. What can I do to crawl and diagnose the whole site? Are there any tools anyone can suggest? SEOmoz?
-
That's good to know. It sounds like that's probably the best way. I also use Screaming Frog (http://www.screamingfrog.co.uk/seo-spider/) to crawl sites, and with 2 GB of RAM dedicated to it, it can crawl around 50k pages. If your site is structured in sub-folders, you might be able to break the crawl into parts. If not, SEOmoz Enterprise looks like the way to go.
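To expand on the sub-folder idea: if you can export a URL list (e.g. from your sitemap), you can bucket it by first path segment and run one smaller crawl per section instead of one 2-million-page job. A minimal sketch in Python; the example.com URLs and the `split_by_subfolder` helper are hypothetical, just to show the grouping:

```python
from urllib.parse import urlparse

def split_by_subfolder(urls):
    """Group URLs by their first path segment so each group can be
    crawled as a separate, smaller job (e.g. one Screaming Frog run
    per site section)."""
    groups = {}
    for url in urls:
        segments = [s for s in urlparse(url).path.split("/") if s]
        section = segments[0] if segments else "(root)"
        groups.setdefault(section, []).append(url)
    return groups

# Example: three site sections become three separate crawl batches.
urls = [
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-2",
    "http://example.com/products/widget",
    "http://example.com/about",
]
batches = split_by_subfolder(urls)
# batches -> {"blog": [...], "products": [...], "about": [...]}
```

Each batch can then be fed to a crawler as a seed list, keeping every run comfortably under the ~50k-page memory ceiling mentioned above.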
-
There is an enterprise version of SEOmoz that will crawl 1 million pages a month and track up to 30k keywords, which is well worth looking into if you have an enormous web property.