SeoMoz Crawler Shuts Down The Website Completely
-
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what was causing it. Unfortunately, the direct answer was that it was coming from "rogerbot". (see sample below)
Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to cancel the SEOmoz service if I can't resolve this immediately.
I guess my question is whether there is a way to make sure this doesn't happen, or that the site doesn't time out like that because of rogerbot. Please let me know if anyone has an answer for this. I use your service a lot and I really need it.
Here are the error lines that pointed to the cause:
216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
-
After much research and implementing a ton of added scripts on my Apache server to track it, the bots did trigger the shutdown. However, so this doesn't happen to you, or in case you ever have a problem of that nature, this is how I resolved it.
I found an excellent article about how to implement a script that restarts Apache immediately once all of its available threads are exhausted and it crashes. The script basically checks the Apache server status every 5 minutes, and in the event that it has crashed, it automatically restarts it and sends you an email notification. A pretty good deal, I'd say, for risking only 5 minutes of downtime if anything major happens. I am also running a cron job every morning at 1am to restart Apache. Please note that you need to be comfortable with SSH and shell commands for this to work. And OMG, I am talking like a geek... All the best to you...
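For what it's worth, here is a minimal sketch of the kind of watchdog described above (not the exact script from the article, which I can't reproduce here). The URL, email address, file paths, and the apachectl/mail commands are assumptions - adjust them for your own server:

```shell
#!/bin/sh
# Minimal Apache watchdog: run from cron every 5 minutes, restart Apache
# when the site stops answering, and send an email notification.
# SITE_URL and ALERT_EMAIL below are placeholders.

SITE_URL="${SITE_URL:-http://localhost/}"
ALERT_EMAIL="${ALERT_EMAIL:[email protected]}"

# Treat any 2xx-4xx response as "Apache is answering"; 000 (no response
# from curl at all) or a 5xx error counts as down.
status_ok() {
    [ "$1" -ge 200 ] && [ "$1" -lt 500 ]
}

site_is_up() {
    code=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' "$1")
    status_ok "$code"
}

watchdog() {
    if ! site_is_up "$SITE_URL"; then
        apachectl restart
        echo "Apache was down at $(date); restarted." \
            | mail -s "Apache restarted" "$ALERT_EMAIL"
    fi
}

# Only act when invoked with "run", so the functions can be tested alone.
if [ "${1:-}" = "run" ]; then
    watchdog
fi

# Example crontab entries (paths are placeholders):
#   */5 * * * * /usr/local/bin/apache-watchdog.sh run
#   0 1 * * *   /usr/sbin/apachectl graceful
```

The 5-minute cron interval is what bounds your worst-case downtime, as mentioned above; shorten it if 5 minutes is too long for your traffic.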
-
Wow Randy, what a story, man. Actually the funny part is that one of my jobs is monitoring for things like that - but I would not go so far as to actually shut someone's site down, precisely because I know what that could do. It is great to know that you preserved your rankings even after 5 days down. That makes me feel so much better. I keep a rule of 1 dedicated server per 2 (related) domains. In this whole case we are talking about a domain called babylifetime.com. I am about to embark on custom development for a site similar to squarespace.com but with many more add-ons, so I need this thing to work properly. I think I have the organic SEO side down pretty well, but things like the issue in this thread are what keep me on my toes.
-
Googlebot would have to be indexing your site at the very moment it was down for anything to happen, and even if it's down for half a day, in my experience rankings are unaffected.
However, there's a small side effect. If visitors coming from a search engine hit a 404 or server error and click the back button, or get the "Google can't find this" page, it can, for that period of time, increase your bounce rate. If the click starts at, say, Google and the visitor goes straight back to Google, it tells Google that the page wasn't what they were looking for in relation to the term they used, or that it didn't load, or that there is a problem with it - basically any reason that can be tied to bounce rate.
As alarming as that may sound, I don't believe it would affect your rankings.
The easiest way to see if Google noticed is to log in to your Google Webmaster Tools account and check for errors. If it lists any errors such as 404 or "server unavailable" (which I'm not sure they have) for pages that you know are usually live and well, then you'll know they noticed.
But again, I don't believe it will significantly affect your rankings. I've read in Google's own words that they go back to sites that were unavailable or down and try to continue indexing them.
As for your server being down for 12 hours: that's a lengthy amount of time. I can't even imagine it. You might want to check your hosting capabilities. You should be back up and running in minutes, not hours.
Just to give you some peace of mind: I have a plethora of affiliate sites that make a small income for me. I once registered a domain name that a very large corporation didn't appreciate - it had a trademarked word in it. Long story short, my domain info was set to private, so they legally got the server shut down. I didn't know for days because everything was on auto-pilot and I wasn't checking the related email addresses. When that server was shut down, the 100+ websites on it went down too, because that one (partially) trademarked domain was on the same server and hosting package. The sites were down for about 5 or 6 days while I sorted through the legal paperwork. After I made an agreement to give the big company the domain - minus the 20K in damages they originally wanted - the hosting company turned the server and hosting package back on.
Not a single one of the domains lost ranking. Not even 1 spot! Today, they still rank in the top 2 to 3 of their biggest terms. So my words are truly from experience and are from a worst-case scenario. I think you'll be fine.
Finally, to clear the air. I didn't do anything bad, nor would I ever do anything bad with a domain name (other than keep it in my portfolio). The big company was upset that I got the domain before they did. All I had on the index page was their description of their product that was named in the domain. That was enough to be taken down for copyright and trademark infringement.
In the end, that company was actually very cool about it. And it's a Fortune 10 company! I was surprised!
-
EGOL, thanks for your reply.
A) My latest thought is also that unusual activity is blocking it. But then again, it is a dedicated server and should be able to handle the load. We are talking about the SEOmoz bot and GoDaddy's highest-tier dedicated server, with nothing specifically installed that would interfere with Apache.
B) RAM, bandwidth, disk space, PHP memory and other limits are all under 20% of actual use.
-
I am willing to bet that the root issue is with the host and one of these situations is occurring: A) the host is throttling your processor resources and shutting your domain down after unusual activity occurs on your site, or B) total activity on the server (your site and other sites) exceeds a certain level and the server limits the resources available for processing.
I would be looking for a new host.
-
Randy, thanks for the response. There is definitely something going on related directly to rogerbot on the server. I have different crawlers running at all times and nothing ever happens. This particular problem appears when the SEOmoz bots start doing their job (Fridays) and is backtracked to that specific bot. As for the crawl delay, I tried different values up to 20, but the same problem persists.
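In case anyone wants to do the same backtracking, a couple of small helpers against the Apache access log will show which user agent is responsible for the load. These assume the standard "combined" log format; the log path you pass in is whatever your server uses (e.g. /var/log/apache2/access.log):

```shell
#!/bin/sh
# Helpers for tracing crawler load in an Apache "combined" access log.

# Hits per user agent: the agent is field 6 when the line is split on
# double quotes, as in the sample log lines earlier in this thread.
hits_per_agent() {
    awk -F'"' '{print $6}' "$1" | sort | uniq -c | sort -rn
}

# rogerbot requests bucketed per minute. The timestamp field looks like
# [29/Jul/2011:09:10:39 -0700], so keep characters 2-18 (up to the minute).
rogerbot_hits_per_minute() {
    grep -i 'rogerbot' "$1" | awk '{print substr($4, 2, 17)}' | sort | uniq -c
}

# When run with a log path, print the per-agent summary.
if [ -n "${1:-}" ]; then
    hits_per_agent "$1"
fi
```

If one agent dominates the top of the first report, and the per-minute counts for it spike right before the crashes, that's fairly strong evidence of which crawler is exhausting Apache's worker threads.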
At the moment I have a tech team reviewing the Apache server for specifics. I will also post what I find here for others to see.
But it is weird, and now I don't know when the site will shut down. Driving me crazy, man!
As an additional question to this thread: when your site goes down for, let's say, 12 hours and you have many high-ranking organic Google listings, does that have a huge impact, and how much downtime is acceptable?
-
Jury,
I'm not sure if rogerBot is doing anything to your site, but I do know a way to slow down rogerBot and any other robot/crawler that takes directions from the robots.txt file that should be on your site.
Basically, just add the two lines shown below to your robots.txt file. With this addition, you are telling the user agent (rogerBot) to wait 10 seconds between pages. You can change that number to anything you want; the more seconds you add, the slower it goes. And this of course only works if rogerBot takes directions - I'm fairly sure it does!
NON-AGENT SPECIFIC EXAMPLE
User-Agent: *
Crawl-Delay: 10
EXAMPLE FOR ROGERBOT
User-Agent: rogerBot
Crawl-Delay: 10
Good Luck,
Randy -
Thanks Lewis...I will do that and see if they have any suggestions...!
-
Hi Jury
If you haven't already, I would recommend raising the issue through the help email address [email protected]
On the Q&A forum we can pass along thoughts or suggestions, but the support team at SEOmoz will be best placed to answer this.
Related Questions
-
Redirecting M Dot Mobile Website to Responsive Design Website Questions
Hi amazing Moz community 🙂 Couldn't find this question anywhere, and knew this was the place to ask! We are helping a client redirect an M Dot website to a Responsive Design website. We want to retain our mobile rankings for keywords. Three questions:
1. We should use 301 redirects from the M Dot website to the new website, correct? (not 302s?)
2. How long does it take for Google to understand that we have launched a responsive website?
3. Can we remove the 301 redirects after a few days (if the M Dot website interferes with/breaks the new Responsive website)?
We have verified an account on Google Search Console for the M Dot website, along with a mobile sitemap that has been submitted and verified. What should we do with this M Dot GSC account? Just delete it? Or keep it and upload the NEW XML sitemap with the new WWW links (because the website is responsive). THANK YOU!
Intermediate & Advanced SEO | | accpar0 -
Quickest way to deindex large parts of a website
Hey there, my client's website was set up with subdirectories for almost every country in the world plus multiple languages in each country. The content in each subfolder is (almost) identical. So no surprise: they have a big problem with duplicate content and ranking fluctuations. Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder with robots.txt. However, before doing this we marked the content to be excluded with noindex, nofollow. It's only been 2 days now, but I hardly notice any decline in the number of indexed pages. I was therefore wondering if it would speed things up if I marked the pages with just noindex instead of noindex and nofollow. It would be great if you could share your thoughts on that. Cheers, Jochen
Intermediate & Advanced SEO | | Online-Marketing-Guy0 -
One website, multiple service points
Hi all Wondering if anyone could offer an opinion on this..am talking to a new client who offer kitchen installation and design. They have a central headquarters and cover a 100 mile radius of their location. A lot of search terms they are aiming to target - Kitchen Design, Kitchen Fitters etc offer localised results. This is where my issue lays. I have worked with plenty of clients in the past which have physical presence in multiple locations and have marked up the site so that the site ranks for each of the stores, but trying to make one site appear in many locations where it doesn't have an address is a different issue completely. Not only do they only have one address, they also only have one phone number. We will target, as best we can, the non localised keywords but need to work out what to do to cover the locations 20/30/40 miles from the office which they cover. I welcome any opinions on this please.
Intermediate & Advanced SEO | | Grumpy_Carl0 -
New Website. Changing TLD or not?
Hi, at my company we are making a new website because the days of the old one are numbered. We have already decided to change the folder structure so we have more "clean" URLs. Now we would also like to change from .net/nl to .nl. Since we are already redirecting all URLs (>10,000), we think this is the moment to switch the TLD. What do you think? Does anyone have experience or tips they would like to share?
Intermediate & Advanced SEO | | SEO_ACSI0 -
Our website scores A but on google we are still on 7th page
Hi all, I have run on-page keyword optimizations with the exact terminology used to find our company's service (or our competition) on Google. We ranked A, with almost all points complete. I did the same for our main competitor and they ranked F. Then I checked our positions on Google: they are on page 1, fifth line, and we are on page 7. We have plenty of unique content and an extensive website. Could there be any reason for this other than backlinks? Many thanks for your help.
Intermediate & Advanced SEO | | EMGCSR0 -
Question about multiple websites in same field
I know most people say it is best to have only 1 website for focus, but putting that aside: if we create 2 websites with totally different designs (one upmarket and one targeting the cheaper market) in the same field (printing) and go after 80% of the same keywords, is this OK, or could we be penalized? Please note we will not be interlinking the websites, the websites will be on different servers, and the names will be registered under different people (2 partners in the business). We will, however, be accessing webmaster tools from the same location.
Intermediate & Advanced SEO | | BobAnderson0 -
301 Redirect Dilemma - Website redesign
Hi Guys, We are redesigning a client's ecommerce site. As part of the process, we're changing the URL structure to make it more friendly. I have put together a provisional 301 redirect plan but I'm not sure just how far I need to go with it. So far I have extracted all the pages from the existing site that Google Webmaster Tools says have links pointing at them - this totals 93 pages. I have matched each page like for like to the new website structure. My next step was to pull the landing pages report from Google Analytics and extract the pages that received entrances over the last 6 weeks. This totals 553; less the redirects I have already done, and after cleaning up some Google Translate pages, I have circa 410 pages left. Many of these pages have more than 1 URL pointing at them. I'm debating how important it is that all of these remaining 410 pages have individual redirects set up one by one. I have to rule out regex because there is no pattern that makes sense, given that I have already set up redirects for the first 93 pages that have external links. My question therefore is: how important are 301 redirects on pages that have no external links and received fewer than 10 entrances over the previous 6 weeks? Do I need to 301 every single product on the old site to its corresponding page on the new site? Also, I'm not sure how to treat pages that have multiple URLs on the old site; the existing URL structure is such a mess that in some instances I have 5 URLs for one product page. I could feasibly create 5 separate redirects but is this necessary? Also, what about speed considerations - the server is going to have to load these redirects and it may slow the site down. I'm sitting at 100 odd so far. Any answers are most appreciated. Thanks Derek.
Intermediate & Advanced SEO | | pulseo0 -
Why do branded manufacturer websites have multiple pages for their products?
My favorite golf ball is the Srixon Tour Yellow ball. Srixon has a product detail page here (www.srixon.com) AND there's also a product detail page at shop.srixon.com. Is there any sort of SEO penalty here because of the duplication? Does the fact that the store is a separate subdomain make this more allowable? Many branded manufacturer websites work this way, but it doesn't make sense to me to have two product pages to manage when you could have just 1 with a call to action. I also work for a branded manufacturer and am considering rebuilding our website from the ground up, with the online store and the main/marketing website blended into one to eliminate this duplication. We have this same duplicated marketing/store setup as well. Any feedback is greatly appreciated. Confused.
Intermediate & Advanced SEO | | Timmmmy0