HELP! My client got hit by a DDoS attack! Need advice
-
Here's the setup:
-
The server is hosted in-house. It was hit by a DDoS attack from 20+ IP addresses spoofed to appear from different countries. Our server overloaded and stopped responding.
-
The domain is registered at GoDaddy.
-
We signed up at Dreamhost and successfully pointed DNS there.
-
The attacks kept coming and disrupted other sites on the Dreamhost shared server. We didn't realize at first that the attack had followed us; we originally thought they were only targeting the IP address of our in-house server.
-
Dreamhost noticed the attack, put us on a separate IP, and disabled our domain until the attacks 'stopped'.
MY QUESTION IS:
What do I do if they don't stop? Close up shop? 99% of the business is internet-driven. This has to be the blackest black-hat SEO ever.
-
-
Thanks for sharing, GKLA. Very useful information. Thank you all!
-
Take a look at this option: http://www.cloudflare.com/features-security
-
These IPs were spoofed from many countries and would disappear within minutes. Anyway, we found the main IPs that were attacking. YES, YOU ARE RIGHT about identifying the one common factor. At first we thought blocking IPs would work, but when that didn't, we started blocking the 'style' of requests they were using. A sketch of that idea is below.
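For anyone curious what filtering on a request "style" might look like in practice, here is a minimal Python sketch. The header and value used (a Russian Accept-Language, which comes up later in this thread) are only illustrative assumptions about what the common factor could be, not the exact rule that was used; a real setup would apply this kind of check at the firewall, web server, or CDN layer.

```python
# Minimal sketch: drop requests that match a shared signature ("style")
# instead of chasing individual spoofed IPs. The signature below is a
# hypothetical example, not the actual rule used in this thread.
ATTACK_SIGNATURES = {
    "Accept-Language": "ru",  # assumed common trait of the attacking requests
}

def looks_like_attack(headers: dict) -> bool:
    """Return True if the request headers match every known attack signature."""
    for header, value in ATTACK_SIGNATURES.items():
        if not headers.get(header, "").lower().startswith(value):
            return False
    return True

# Example with a hypothetical request
sample = {"Accept-Language": "ru-RU,ru;q=0.9", "User-Agent": "Mozilla/5.0"}
print(looks_like_attack(sample))  # True -> drop before it reaches the app
```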
-
It looks like you got this resolved. We went through something similar many years ago, but we were lucky because our website serves the US only. The attack was coming in from China, Russia, and several European countries.
We simply blocked all countries except the US, Mexico, and Canada in our firewall.
You just need to identify the one common factor in the attack and filter that out through your firewall.
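As a rough illustration of that kind of geo-filtering at the application layer, here is a minimal Python sketch. It assumes MaxMind's geoip2 library and a local GeoLite2-Country.mmdb database file (both assumptions on my part; the blocking described above was done in the firewall, which is usually the better place for it).

```python
# Minimal sketch of country-based filtering, assuming the MaxMind geoip2
# library and a local GeoLite2-Country.mmdb database (hypothetical path).
# A production setup would normally enforce this at the firewall or CDN.
import geoip2.database
import geoip2.errors

ALLOWED_COUNTRIES = {"US", "MX", "CA"}  # US, Mexico, and Canada only

reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

def is_allowed(ip_address: str) -> bool:
    """Return True if the source IP resolves to an allowed country."""
    try:
        iso_code = reader.country(ip_address).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        return False  # unknown origin -> deny by default
    return iso_code in ALLOWED_COUNTRIES

if __name__ == "__main__":
    for ip in ("8.8.8.8", "203.0.113.5"):
        print(ip, "allowed" if is_allowed(ip) else "blocked")
```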
-
Update:
We switched to Amazon Cloud and got Amazon involved. They helped out by providing some tools. Basically, we filtered the attacks by rejecting IPs that were sending more than a certain number of packets. Woot woot! We have been up and running for about 6 days now with no problems. All I know is that the attacker's browser was set to the Russian language. The site Ship Car Overseas survived!
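For readers who want the gist of that packet-volume filter, here is a minimal Python sketch of the idea: track how often each IP hits you inside a short window and drop the ones that go over a threshold. The window and threshold values are made-up placeholders, not the ones Amazon suggested, and a real deployment would enforce this at the firewall or load-balancer level rather than in application code.

```python
# Minimal sketch of rate-based filtering: block source IPs that exceed a
# hit threshold within a sliding time window. Values are illustrative only.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # length of the sliding window
MAX_HITS = 100        # hits allowed per IP inside that window

hits = defaultdict(deque)  # ip -> timestamps of recent hits
blocked = set()

def register_hit(ip: str) -> bool:
    """Record one request from ip; return False if the ip should be dropped."""
    if ip in blocked:
        return False
    now = time.time()
    q = hits[ip]
    q.append(now)
    # discard timestamps that have fallen out of the window
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) > MAX_HITS:
        blocked.add(ip)  # in practice this would push a firewall rule
        return False
    return True
```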
-
Update:
We dropped Dreamhost.com since they couldn't help. They were useless in this area.
We copied the database and pointed the domain in GoDaddy to our new host on Amazon Cloud. Well, the DDoS attacks are still coming in. The site was up for a short while (I'm talking minutes); then we refreshed the pages and the ISP reported the site wasn't there anymore. Damn, this attacker is relentless. I will be enabling the Amazon load balancer (Elastic Load Balancing) tomorrow. If that renders the DDoS attack ineffective, then Amazon solves it. But I won't find out until tomorrow.
-
Here is what Dreamhost said:
" it does indeed look like you were getting attacked yet again. Unfortunately there isn't much you or myself can do in these cases.. I've disabled your domain again and will re-enable it in a week. I'm hoping that by then, the attacker has given up and moved on. If this is not the case, I regret to say that you will need to find hosting elsewhere as we do not offer a DDoS protection service. Please let me know if you have any questions.Thanks! Jason Y "
In conclusion, Dreamhost can't help.
-
Thanks there, cowboy. Dreamhost still has not replied. I think I'll keep everything tracked here just in case other people run into this DDoS problem in the future. So far this is what has happened:
- Dreamhost disabled our domain and we are still waiting for their response.
- I took the database and transferred all files to a new domain.
- We are launching a massive AdWords campaign to make up for the loss of 3 days of revenue.
The reason I decided to transfer the DB to a new domain was that I didn't want to be a sitting duck if Dreamhost said they couldn't help. I'm pretty sure they can help, but I put my plan B in place just in case. I'll keep everyone posted.
-
Hey again Francisco. Upon rereading your question, it looks like I went off half-cocked when I answered it. I missed that you had solved the immediate problem and that you were wondering what course of action to take if the attacks don't stop.
If someone keeps deliberately attacking your site, I'm thinking the only course of action is to change your domain name. It's not a good solution, so I hope someone else chimes in with a better one.
-
Hello Francisco: Really sorry to hear about this. Bummer!
I've never personally experienced a DDoS attack, so I called the web host I use to get his advice. He said that Dreamhost should be able to offer some kind of DDoS mitigation service, and he seemed surprised that they weren't able to block it if it was coming in from only 20+ IP addresses.
He also said that if the attack continued, they'd probably not want the account after a certain point.
One of the main reasons I use him is that he's always been helpful when I've had problems. He said he'd be willing to host you for a month to see if he could help. His company name is TRK Hosting.