SEOmoz Crawler Shuts Down the Website Completely
-
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not great when you are getting 700 unique visitors per day). Naturally I was worried and dug deep to see what was causing it. Unfortunately, the answer was that it was coming from "rogerbot" (see the sample below).
Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to cancel the SEOmoz service if I can't resolve this immediately.
My question is whether there is a way to make sure the site doesn't go down or time out like that because of rogerbot. Please let me know if anyone has an answer for this. I use your service a lot and I really need it.
Here are the log lines that identified the cause:
216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
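(If you want to confirm the same thing on your own server, one quick sanity check is to count the rogerbot hits in your Apache access log. A rough example is below - the log path is just a common default and will differ depending on your setup.)
# total rogerbot requests in the log
grep -ci "rogerbot" /var/log/httpd/access_log
# rogerbot requests per hour, to see when the crawl spikes
grep -i "rogerbot" /var/log/httpd/access_log | cut -d' ' -f4 | cut -d: -f1,2 | uniq -c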
-
After much research and adding a ton of tracking scripts to my Apache server, I confirmed that the bots did cause the shutdown. So that this doesn't happen to you, or in case you ever have a problem of this nature, here is how I resolved it.
I found an excellent article on how to implement a script that restarts Apache immediately once all available threads are exhausted and Apache crashes. The script checks the Apache server status every 5 minutes and, in the event that it has crashed, automatically restarts it and sends you an email notification. Not a bad deal for risking only 5 minutes of downtime if anything major happens. I am also running a cron job every morning at 1 am to restart Apache. Please note that you need some knowledge of SSH commands for this to work. And OMG, I am talking like a geek... All the best to you...
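For anyone who wants the gist without digging up the article, here is a minimal sketch of that kind of watchdog, assuming a CentOS-style box where Apache runs as the httpd service - the script path, service name and email address below are placeholders to adjust for your own server:
#!/bin/bash
# check_apache.sh - run from cron every 5 minutes
# If Apache stops answering on localhost, restart it and send an email alert.
if ! curl -s --max-time 10 http://localhost/ > /dev/null; then
    /sbin/service httpd restart
    echo "Apache was down on $(hostname) and restarted at $(date)" | mail -s "Apache restarted" [email protected]
fi
And the matching crontab entries (the 1 am line is the nightly restart I mentioned):
*/5 * * * * /root/check_apache.sh
0 1 * * * /sbin/service httpd restart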
-
Wow Randy, what a story, man. The funny part is that one of the jobs I do is monitoring for things like that - but I would not go that far and actually shut someone's site down, precisely because I know what that could do. It is great to know that after 5 days you still preserved your rankings. That makes me feel much better. I keep to a rule of 1 dedicated server per 2 domains (both related). In this case we are talking about a domain called babylifetime.com. I am about to embark on custom development for a site similar to squarespace.com but with many more add-ons, so I need this thing to work properly. I think I have organic SEO handled pretty well, but issues like the one in this thread are what keep me on my toes.
-
Googlebot would have to be crawling your site at the very moment it was down for anything to happen, and even if it's down for half a day, in my experience rankings are unaffected.
However, there's a small side effect. If visitors coming from a search engine hit a 404 or server error and either click the back button or get a "Google can't find this" page, it can increase your bounce rate for that period of time. If the originating click starts at, say, Google and the visitor then goes straight back to Google, it tells Google that the page wasn't what they were looking for in relation to the term they used, or that it didn't load, or that there is a problem with it - basically anything that can be tied to bounce rate.
As alarming as that may sound, I don't believe it would affect your rankings.
The easiest way to see if Google noticed is to log in to your Google Webmaster Tools account and check for crawl errors. If it lists any errors such as 404 or "server unavailable" (I'm not sure they have that exact one) for pages that you know are usually live and well, then you'll know Google noticed.
But again, I don't believe it will significantly affect your rankings. I've read in Google's own words that they go back to sites that were unavailable or down and try to continue indexing them.
As for your server being down for 12 hours: that's a long time. I can't even imagine it. You might want to review your hosting setup - you should be back up and running in minutes, not hours.
Just to give you some peace of mind: I have a plethora of affiliate sites that make a small income for me. I once registered a domain name that a very large corporation didn't appreciate - it had a trademarked word in it. Long story short, my domain info was set to private, so they went through legal channels and got the server shut down. I didn't know for days because everything was on autopilot and I wasn't checking the related email addresses. When that server was shut down, 100+ websites on it went down too, because that one (partially) trademarked domain was on the same server and same hosting package. The sites were down for about 5 or 6 days while I sorted through the legal paperwork. After I agreed to hand the big company the domain - without the 20K in damages they originally wanted - the hosting company turned the server and hosting package back on.
Not a single one of the domains lost rankings. Not even 1 spot! Today they still rank in the top 2 or 3 for their biggest terms. So my words are truly from experience, and from a worst-case scenario at that. I think you'll be fine.
Finally, to clear the air: I didn't do anything bad, nor would I ever do anything bad with a domain name (other than keep it in my portfolio). The big company was simply upset that I got the domain before they did. All I had on the index page was their own description of the product named in the domain, and that was enough to be taken down for copyright and trademark infringement.
In the end, that company was actually very cool about it. And it's a Fortune 10 company! I was surprised!
-
EGOL, thanks for your reply.
A) My latest thought is also that unusual activity is triggering a block. But then again, it is a dedicated server and should be able to handle the load. We are talking about the SEOmoz bot and GoDaddy's highest-tier dedicated server, with nothing specifically installed that would interfere with the Apache server.
B) RAM, bandwidth, disk space, PHP memory and other limits are all under 20% utilization.
-
I am willing to bet that the root issue is with the host and that one of these situations is occurring: A) the host is throttling your processor resources and shutting your domain down when unusual activity occurs on your site, or B) total activity on the server (your site and other sites) exceeds a certain level and the server limits the resources available for processing.
I would be looking for a new host.
-
Randy, thanks for the response. There is definitely something going on directly related to rogerbot on this server. I have different crawlers hitting the site at all times and nothing ever happens. This particular problem starts when the SEOmoz bots do their job (Fridays) and has been traced back to that specific bot. As for the crawl delay, I have tried different values up to 20, but the same problem persists.
At the moment I have a tech team reviewing the Apache server to pin down the specifics. I will post what I find here for others to see.
But it is weird, and now I don't know when the site will shut down next. Driving me crazy, man!
An additional question for this thread: when your site goes down for, let's say, 12 hours and you have many high-ranking organic Google listings, does that have a huge impact, and how much downtime is acceptable?
-
Jury,
I'm not sure if rogerBot is doing anything to your site, but I do know a way to slow rogerBot and any other robot/crawler that takes directions from the robots.txt file that should be on your site.
Basically, just add the two lines shown below to your robots.txt file. With this addition, you are telling the user agent (rogerBot) to wait 10 seconds between pages. You can change that number to anything you want - the more seconds you add, the slower it goes. This of course assumes rogerBot obeys the directive, and I'm fairly sure it does!
NON-AGENT SPECIFIC EXAMPLE
User-Agent: *
Crawl-Delay: 10
EXAMPLE FOR ROGERBOT
User-Agent: rogerBot
Crawl-Delay: 10
Good Luck,
Randy
-
Thanks Lewis... I will do that and see if they have any suggestions!
-
Hi Jury
If you haven't already, I would recommend raising the issue through the help email address [email protected].
On the Q&A forum we can pass on thoughts or suggestions, but the support team at SEOmoz will be best placed to answer this.