SeoMoz Crawler Shuts Down The Website Completely
-
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what was causing it. Unfortunately, the direct answer was that it was coming from "rogerbot" (see sample below).
Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to cancel the SEOmoz service if I can't resolve this immediately.
I guess my question is: is there a way to make sure the site doesn't go down or time out like that because of rogerbot? Please let me know if anyone has an answer for this. I use your service a lot and I really need it.
Here are the log lines that pointed to the cause:
216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
-
After much research and implementing a ton of added scripts on my Apache server to track it, I confirmed that the bots did cause the shutdown. In case this ever happens to you, here is how I resolved it.
I found an excellent article about how to implement a script that restarts Apache as soon as all of its available threads are exhausted and it crashes. The script basically checks the Apache server status every 5 minutes, and in the event that it has crashed, it automatically restarts it and sends you an email notification. A pretty good deal, I'd say, for risking at most 5 minutes of downtime if anything major happens. I am also running a cron job every morning at 1 AM to restart Apache. Please note that you need some knowledge of SSH commands for this to work. And OMG, I am talking like a geek... All the best to you...
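For anyone who wants to build something similar, here is a minimal sketch of such a watchdog in Python - an illustration under stated assumptions, not the script from the article. It assumes Apache is managed with the service command (use systemctl on newer distros), that a local mail daemon accepts SMTP connections on localhost, and that the alert address is a placeholder. Scheduled from cron (*/5 * * * * /usr/local/bin/apache_watchdog.py), it reproduces the check-restart-notify behaviour described above:

#!/usr/bin/env python3
# apache_watchdog.py - check Apache, restart it if down, email an alert.
import smtplib
import subprocess
from email.message import EmailMessage

ALERT_ADDRESS = "admin@example.com"  # placeholder - change this

def apache_is_up():
    # "service httpd status" exits non-zero when the daemon is down
    result = subprocess.run(["service", "httpd", "status"], capture_output=True)
    return result.returncode == 0

def restart_apache():
    subprocess.run(["service", "httpd", "restart"], check=True)

def send_alert():
    msg = EmailMessage()
    msg["Subject"] = "Apache was down and has been restarted"
    msg["From"] = ALERT_ADDRESS
    msg["To"] = ALERT_ADDRESS
    msg.set_content("The watchdog found Apache down and restarted it.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    if not apache_is_up():
        restart_apache()
        send_alert()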
-
Wow Randy, what a story, man. Actually, the funny part is that one of the jobs I do is monitoring for things like that - but I would never go so far as to actually shut someone's site down, precisely because I know what that could do. It is a great thing to know that you preserved your rankings through 5 days of downtime. That makes me feel so much better. I keep a rule of 1 dedicated server per 2 (related) domains. In this case we are talking about a domain called babylifetime.com. I am about to embark on a journey of custom development for a site similar to squarespace.com, but with many more add-ons - so I need this thing to work properly. I think I have the organic SEO arena figured out pretty well, but things like the issue in this thread are what keep me on my toes.
-
Googlebot would have to be indexing your site at the very moment it was down for anything to happen, and even if the site is down for half a day, in my experience rankings are unaffected.
However, there's a small side effect. If visitors coming from a search engine hit a 404 or a server error on your site and click the back button (or see the "Google can't find this" page), it can increase your bounce rate for that period of time. If the originating click starts at, say, Google and the clicker goes straight back to Google, it tells Google that the page wasn't what they were looking for in relation to the term they used, or that it didn't load, or that there is a problem with it - basically any reason that can be tied to bounce rate.
As alarming as that may sound, I don't believe it would affect your rankings.
The easiest way to see if Google noticed is to log in to your Google Webmaster Tools account and check for errors. If it lists errors such as 404 or "server unavailable" (though I'm not sure that last one exists) for pages that you know are usually live and well, then you'll know Google noticed.
But again, I don't believe it will largely affect your rankings. I've read, in Google's own words, that they go back to sites that were unavailable or down and try to continue indexing them.
As for your server being down for 12 hours: that's a lengthy amount of time - I can't even imagine it. You might want to check your hosting capabilities. You should be back up and running in minutes, not hours.
Just to give you some peace of mind: I have a plethora of affiliate sites that make a small income for me. I once registered a domain name that a very large corporation didn't appreciate - it had a trademarked word in it. Long story short, my domain info was set to private, so they legally got the server shut down. I didn't know for days, because everything was on auto-pilot and I wasn't checking the related email addresses. When that server was shut down, the 100+ websites on it went down too, because that one (partially) trademarked domain was on the same server and the same hosting package. The sites were down for about 5 or 6 days while I sorted through the legal paperwork. After I made an agreement to give the big company the domain - minus the 20K in damages that they originally wanted - the hosting company turned the server and hosting package back on.
Not a single one of the domains lost rankings. Not even 1 spot! Today, they still rank in the top 2 to 3 for their biggest terms. So my words are truly from experience, and from a worst-case scenario. I think you'll be fine.
Finally, to clear the air: I didn't do anything bad, nor would I ever do anything bad with a domain name (other than keep it in my portfolio). The big company was upset that I got the domain before they did. All I had on the index page was their description of the product that was named in the domain. That was enough to be taken down for copyright and trademark infringement.
In the end, that company was actually very cool about it. And it's a Fortune 10 company! I was surprised!
-
EGOL, thanks for your reply.
A) My latest thought is also that unusual activity is triggering a block. But then again, it is a dedicated server and should be capable of handling the load. We are talking about the SEOmoz bot and GoDaddy's highest-tier dedicated server, with nothing specifically installed that would interfere with the Apache server.
B) RAM, bandwidth, disk space, PHP memory and the other limits are all under 20% of actual use.
-
I am willing to bet that the root issue is with the host, and that one of these situations is occurring: A) the host is throttling your processor resources and shutting your domain down when unusual activity occurs on your site, or B) total activity on the server (your site and other sites) exceeds a certain level and the server limits the resources available for processing.
I would be looking for a new host.
-
Randy, thanks for the response. There is definitely something going on related directly to rogerbot on the server. I have different crawlers running at all times and nothing ever happens. This particular problem starts when the SEOmoz bots begin their job (Fridays) and has been traced back to that specific bot. As for the crawl delay: I tried different values, up to 20, but the same problem persists.
At the moment I have a tech team reviewing the Apache server to pin down the specifics. I will post what they find here for others to see.
But it is weird, and now I don't know when the site will shut down. Driving me crazy, man!
An additional question for this thread: when your site goes down for, let's say, 12 hours and you have many high-ranking organic Google listings, does that have a huge impact, and how much downtime is acceptable?
-
Jury,
I'm not sure if rogerBot is doing anything to your site, but I do know a way to slow down rogerBot and any other robot/crawler that takes direction from the robots.txt file that should be on your site.
Basically, just add the two lines shown below to your robots.txt file. This addition tells the user agent (rogerBot) to wait 10 seconds between pages. You can change that number to anything you want - the more seconds you add, the slower the crawler goes. This of course only works if rogerBot follows directions, and I'm fairly sure it does!
NON-AGENT SPECIFIC EXAMPLE
User-Agent: *
Crawl-Delay: 10
EXAMPLE FOR ROGERBOT
User-Agent: rogerBot
Crawl-Delay: 10
Good Luck,
Randy
-
Thanks Lewis... I will do that and see if they have any suggestions!
-
Hi Jury
If you haven't already, I would recommend raising the issue through the help email address [email protected].
On the Q&A forum we can pass along thoughts or suggestions, but the support team at SEOmoz will be best placed to answer this.
-
Related Questions
-
How to index your website pages on Google in 2020?
Hey! Hopefully everyone is fine here. I'll share some steps for getting all of your website's pages indexed on Google in 2020; I'm already following these same steps for my site, Boxes Maker. These are the most important ways to help Google find your pages: Add a sitemap. ... Make sure people know your site. ... Ensure full navigation on your site. ... Apply the indexing application to your homepage. ... Note that sites using URL parameters rather than plain URLs or page names may be more difficult to crawl.
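As a concrete illustration of the sitemap step, a robots.txt file can point crawlers at the sitemap directly - the URL below is a placeholder, not a real site:
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml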
-
Having possible problems with rankings due to development website
Hi all, I've got an interesting issue and a bit of a technical challenge for you. It's a bit complicated to explain, but please bear with me. We have a client website (http://clientwebsite.com) which we have had a hard time ranking for the past few months. Its main keywords simply don't show up in the top 100 results, even though we are constantly building backlinks through guest posts, citations, media mentions, profile links, etc. Normally we use Ahrefs to look at the client's backlinks, but just today we used Majestic to look at the backlink profile, and one backlink stood out: a backlink from a development server (http://developmentwebsite.com) which redirects to http://clientwebsite.com.
The developers who were working on the redesign of the client website put it up on their server and forgot to delete it.
Also, the content of the development website is almost identical to the client website's. We then checked to see if http://developmentwebsite.com is indexed.
It's not - although inside the robots file, http://developmentwebsite.com/robots.txt, there's:
User-agent: *
Allow: /
The funny (and weird) thing is that http://developmentwebsite.com/ and all of the development website's inner pages are not indexed in Google. Yet if we go to http://developmentwebsite.com/inner-page, it doesn't redirect to the corresponding http://clientwebsite.com/inner-page - we stay on the development website's own URL - and the pages even have links to the client website. But like I said, none of the development website's pages are indexed, even though crawlers are allowed in its robots.txt. In your opinion, could this be the reason we are having a hard time ranking the client website? The second question is:
How do we approach solving this issue?
Do we simply delete the whole http://developmentwebsite.com site, with all its inner pages?
Or should we do 301 redirects on a per-page basis?
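For reference, assuming a standard Apache setup like the one discussed in this thread, per-page 301s from the development domain can be done with a single mod_rewrite rule in the development site's .htaccess (the domain names below are the placeholders from the question, and mod_rewrite is assumed to be enabled):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?developmentwebsite\.com$ [NC]
RewriteRule ^(.*)$ http://clientwebsite.com/$1 [R=301,L]
Because the rule captures the path, each inner page redirects to its corresponding page on the client site.
-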
Duplicate content on recruitment website
Hi everyone, it seems that Panda 4.2 has hit some industries more than others. I just started working on a website that has no manual action, but whose organic traffic has dropped massively in the last few months. Its external linking profile seems fine, but I suspect usability issues, especially the duplication, may be the reason. The website is a recruitment website for one specific industry. It posts jobs for its clients, and these can be very similar: at any one time there can be 20 jobs with the same title and very similar job descriptions. The website currently has over 200 pages with potentially duplicate content. Additionally, these jobs get posted on job portals with the same content (this happens automatically through a feed). The questions here are: How bad is this for the website's usability, and could it be the reason the traffic went down? Is this the effect of Panda 4.2, which is still rolling out? What can be done to resolve these issues? Thank you in advance.
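One standard remedy for near-duplicate listings, offered here as a hedged sketch with a made-up URL: pick one job page as the canonical version and point the near-copies at it with a rel="canonical" tag in each duplicate page's head, so Google consolidates their signals onto one page instead of treating them as 200 competing near-copies:
<link rel="canonical" href="https://www.example-recruiter.com/jobs/project-engineer/" />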
-
Guidelines for a second website business domain
Hi there, a client is setting up a second website, selling the same products from a separate domain with the same descriptions etc. The site will have a separate URL but will be administered from the same CMS. The only difference is that the new site carries only one brand instead of the several on the main site, e.g. the main site sells all plumbing brands, the second site just one. Your thoughts and advice on best practice would be much appreciated. Andy (Marz Ventures)
-
Why is my website not ranking? It has an A grade on-site from Moz.
I have a website that I am trying to move up the SERPs, but the site isn't appearing in the search results at all - even when I search for the business name. The site is http://www.jl-engineering.com/ and the targeted keyword is "DPF cleaning". Could anyone explain why the site isn't showing at all, and how to fix it? Thanks
-
External links from banned websites
I am currently working with a client who has seen his rankings diminish since the Penguin update. I've manually analyzed all 600 of his backlinks and identified approximately 85 external links from websites that have been banned by Google. How do these sites affect his current rankings? Should I just disavow all of these links using the Google disavow tool? Any comments would be highly appreciated!
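For reference, the disavow tool accepts a plain text file with one entry per line: a domain: prefix disavows every link from that site, a bare URL disavows a single page, and lines starting with # are comments. The entries below are invented placeholders, not the client's actual list:
# links from sites banned by Google
domain:spammy-example-one.com
domain:spammy-example-two.net
http://www.another-example.com/bad-page.html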
-
Are there discrepancies between GWT and SEOMoz?
In our keyword rank tracking report, we've dominated a keyword in Google and have held the top slot for years - all evidence points in this direction. In Google Webmaster Tools, however, this particular keyword averages a rank of 6.5. Is anyone else experiencing these kinds of discrepancies? What is your take on it?
-
Optimising My Website Link Containers
Hi, I'm looking at my link containers and trying to optimise them. I would be grateful if anyone can give me some feedback on my plan for perfect optimisation. My links are constructed as follows - I have two states:
1/. A non-hover state, which contains an image and text.
2/. A hover state, which contains a bit more text - I do this because putting the full text in the non-hover state would not be good for users and would look ugly as well.
Here's an example block of the HTML - as you can see from the URL, it's quite a deep page level. From the URL and the alt/title attributes, the page I am linking to is about "The Royal Hotel Accommodation New York Holidays". I'm just a bit confused about how I should apply alt and title attributes (titles in particular) given the nested divs etc. - I can apply these at the parent level, apply them at all levels, or apply them to a mix. Also, are there any obvious things you can think of that I am missing that may help on-site SEO? Thanks in advance.
CURRENT UNOPTIMISED CODE: [HTML block not preserved in this copy; only the link text survives: "The Royal Hotel" / "New York Holidays Accommodation" / "The Royal Hotel"]
MY OPTIMISED CODE (adding title and alt attributes): [HTML block not preserved in this copy; same link text]
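For readers left guessing at the markup, here is a generic sketch of such a link container (the URL, file name and wording are invented for illustration - this is not the poster's actual code). A common approach is to put the title attribute on the anchor and the alt attribute on the image, and leave the purely structural divs and spans without them:
<a href="/holidays/new-york/accommodation/the-royal-hotel/"
   title="The Royal Hotel - New York Holidays Accommodation">
  <img src="/images/the-royal-hotel.jpg"
       alt="The Royal Hotel - New York Holidays Accommodation" />
  <span>The Royal Hotel</span>
</a>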