SEO Web Crawler IP addresses
-
What are the IP addresses for the SEO Web Crawler?
There is a firewall on my client's website before it goes live. I would like to crawl the site before it goes live, but I need to provide the web crawler's IP addresses.
Thank you for your time
-
Hi Jason,
This is Megan from the SEOmoz Help Team. Unfortunately, since we crawl from the cloud, RogerBot's IP address changes all the time, so we cannot specify a particular address. Also, we do need the site to be live before we can crawl it, but you could remove the firewall and block all other bots except rogerbot so the site won't get indexed anywhere else before it's ready.
I hope this helps! Please let me know if you have any other questions.
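A sketch of Megan's suggestion, as a robots.txt placed at the site root - the rogerbot token comes from this thread, but keep in mind robots.txt only deters well-behaved bots and is no substitute for real access control:

```text
# Allow Moz's crawler (an empty Disallow permits everything)
User-agent: rogerbot
Disallow:

# Block every other bot from the whole site
User-agent: *
Disallow: /
```

Well-behaved crawlers pick the most specific matching user-agent group, so rogerbot falls under the first block and every other bot under the second.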
Related Questions
-
Syntax for canonical tag for a default page in a sub directory (not subdomain) of a web site?
I'm getting two "no canonical tag" errors (www and root) for the default page of a sub-directory - again, NOT a subdomain. Since the page is not the root of its own site, I tagged it as -- I have tried without the default.asp, but the error remains. Been doing this for 24 years and don't remember running across this before.
Moz Pro | | dcmike0 -
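The canonical tag itself was stripped from the post above; as a hedged reconstruction of the usual syntax (domain and paths invented for illustration), a sub-directory default page would typically canonicalize to the trailing-slash directory URL:

```html
<!-- In the <head> of http://www.example.com/subdir/default.asp -->
<link rel="canonical" href="http://www.example.com/subdir/" />
```

That consolidates the www/root and /subdir/default.asp variants onto a single URL, which is usually what resolves duplicate "no canonical tag" reports.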
1500 Domains... Where to begin? & Web Structure Question.
So, as the title says, I am stuck. I have recently been brought on as the SEO guru for a small-to-mid-size company with the task of rebuilding their web presence. Their website is in pretty unfortunate condition. The more research I do, the farther and farther I go down the rabbit hole of chaos. Essentially, the previous CEO was doing all the SEO work. He purchased 1,500 domains, all keyword-specific, installed WordPress on roughly 1,000, and then began pumping out content. Of those 1,000, roughly 300 have about 600-2,000 characters' worth of content that is absolute fluff. From there the linking began. Now the content is different enough that Google doesn't seem to notice that it's the SAME FREEKIN THING on each domain, but I am very concerned. The company has their main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web. My advice is to cut those links ASAP and remove the previous work. At the same time, I also don't want them to lose rank. So I guess I am asking a whole slew of questions... Am I right in thinking that we have to build a bridge before we burn a bridge? Is it worth fixing up some of those other domains to have original content to try and bolster what we already have? Would it be better to combine everything into one website, or try to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof using different domains. Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, and I am getting the structure of the entire link web. Lastly, any thoughts you all have would be greatly appreciated. I realistically have minimal experience in this realm. I am a major nub. I understand SEO in theory, sorta. So I'm getting there!
Moz Pro | | HashtagHustler0 -
Rogerbot's crawl behaviour vs google spiders and other crawlers - disparate results have me confused.
I'm curious as to how accurately rogerbot replicates Google's searchbot. I've currently got a site which is reporting over 200 pages of duplicate title/content errors in the Moz tools. The pages in question are all session IDs and have been blocked in the robots.txt (about 3 weeks ago), however the errors are still appearing. I've also crawled the site using the Screaming Frog SEO Spider. According to Screaming Frog, the offending pages have been blocked and are not being crawled. Webmaster Tools is also reporting no crawl errors. Is there something I'm missing here? Why would I receive such different results? Which ones should I trust? Does rogerbot ignore robots.txt? Any suggestions would be appreciated.
Moz Pro | | KJDMedia0 -
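One way to sanity-check a robots.txt rule independently of any single crawler is Python's standard-library parser. The rule and URLs below are hypothetical, and note that this parser only handles prefix-style Disallow rules, not Googlebot-style wildcards:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking a session-ID URL path for every user agent
rules = [
    "User-agent: *",
    "Disallow: /session/",
]

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved bot (rogerbot included) should skip the blocked path...
print(parser.can_fetch("rogerbot", "http://example.com/session/abc123"))
# ...but still be allowed to fetch ordinary pages
print(parser.can_fetch("rogerbot", "http://example.com/products.html"))
```

If both crawlers agree the pages are blocked, the stale errors may simply persist in the report until the next scheduled crawl refreshes it.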
How to make an SEO Tools site?
Hello, I want to add SEO tools to my site for my users. I want my visitors to be able to use a PageRank checker, link tracker, backlinks checker, etc. - plus keyword tools, domain tools, and analytics and reporting. Something like http://smallseotools.com/. What scripts do I need? Can I do this all with WordPress? Thanks 🙂
Moz Pro | | Wagdys0 -
What do the dates refer to in SEOmoz reports?
The question is in the title - a new trainee asked me and I couldn't actually answer!
Moz Pro | | Highlandgael0 -
Any ideas about using research tools data to create client interest in dedicating marketing spend on SEO services vs their other current antiquated marketing methods?
I consult in the dental industry. Many professionals in this industry are operating in the dark ages when it comes to marketing. I am approaching potential clients to pitch SEO. They aren't necessarily seeking SEO services currently, but they have a strong marketing budget. I would like to use SEOmoz tools data about their site and competitors as a way to show the possibilities/potential of SEO services and to gain interest in further discussion of SEO benefits.
Moz Pro | | cody_baird0 -
Is there a recommended format when placing a business address on a webpage?
Hi All, I ask the question as I was trying the GeoTarget tool, which happened to not recognise a business address I placed in the footer on one of my sites. The tool states that including the address on the page helps the search engines identify your location, so I'm curious whether a specific format works best when optimizing for local search? Thanks.
Moz Pro | | davebrown19750
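One machine-readable convention for a footer address is schema.org microdata (the business details below are invented for illustration; hCard microformats serve the same purpose):

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Business Ltd</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 High Street</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="postalCode">SP1 2AB</span>
  </div>
  <span itemprop="telephone">+1-555-0100</span>
</div>
```

Whatever format you pick, keeping the address text consistent with your other business listings is generally considered more important than the markup itself.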