Server downtime is a pain in the butt. Some asshat has thought it would be funny to flood our network for a dozen days now, since the beginning of 2013. We had been filtering out the heavy traffic — mostly UDP and TCP floods, though ICMP showed up sometimes. Things were normal up until 3:34 PM, when our IDS registered a huge spike and almost all of our servers went down simultaneously.
I still don't get why people DDoS our servers. I don't get paid much working for this small hosting provider, and my main job is to keep the servers up — which, technically, I just failed at. Our company is a dedicated hosting shop running the classic LAMP and WAMP stacks (the only difference is we use Perl and Python instead of PHP). We had been using Cloudflare to dodge DDoS attacks, but it seems the attacker found a way to resolve our real hostname. While the other guys worked on recovery, I dug through the network logs to analyze the tiny bits of evidence and figure out what kind of crap hit us this hard. Most of the traffic was the traditional junk we had already filtered. And then something strange came up... it seemed we had a massive number of requests generated via SQL queries.
We had been hit by a huge volume of SQL requests generated by roughly 4,000 computers/bots!
I came to this conclusion because it was highly likely we had a few vulnerabilities in our SQL database: right before Christmas, our maintenance team performed a complete upgrade and a CMS change, and that's what caused the problem. Normally we retest and audit our systems right after every upgrade or change. This time we didn't — which Jarmo, the security admin, was supposed to be responsible for.
The whole attack worked like this: an asshat or two scanned our clients' websites for SQL injection vulnerabilities. I'll point out that we have a solid system for detecting and preventing major command-injection attempts; still, we left out some crucial part. The attacker then used his bots to fire injected queries at our DB. That alone should only have taken out the DB, but because the requests arrived as mass traffic, they took out everything else too.
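For anyone wondering what the hole looks like: the classic mistake is building the SQL string out of user input instead of using placeholders. Here's a minimal sketch in Python (one of our stack's languages) with SQLite standing in for the real database — the `users` table and `login_*` functions are made up for illustration, not our actual code:

```python
import sqlite3

# Hypothetical users table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, pw):
    # BAD: user input is concatenated straight into the SQL string.
    query = "SELECT * FROM users WHERE name = '%s' AND pw = '%s'" % (name, pw)
    return conn.execute(query).fetchall()

def login_safe(name, pw):
    # GOOD: placeholders let the driver treat the input as data, not SQL.
    query = "SELECT * FROM users WHERE name = ? AND pw = ?"
    return conn.execute(query, (name, pw)).fetchall()

# The textbook payload bypasses the password check in the vulnerable version...
print(login_vulnerable("alice", "' OR '1'='1"))  # returns alice's row
# ...while the parameterized version treats it as a literal string.
print(login_safe("alice", "' OR '1'='1"))        # returns nothing
```

Parameterized queries (plus the input sanitizing we re-enabled) are what closed this class of hole for us.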
4:25 PM, patched all the vulnerabilities and re-set up the security system.
5:12 PM, servers were back up but traffic was still high. 5:50 PM, added more filtering.
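The extra filtering was essentially per-source rate limiting at the edge. As a rough illustration of the idea — not our actual firewall rules, and the thresholds here are made-up numbers — a sliding-window counter per source IP looks something like this in Python:

```python
import time
from collections import defaultdict, deque

# Illustrative numbers only; real thresholds get tuned per service.
WINDOW_SECONDS = 10
MAX_REQUESTS = 100

hits = defaultdict(deque)  # source IP -> timestamps of recent requests

def allow(ip, now=None):
    """Return True if this request is under the per-IP rate limit."""
    now = time.monotonic() if now is None else now
    q = hits[ip]
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False  # over the limit: drop/filter this request
    q.append(now)
    return True
```

Any source hammering the servers faster than the window allows simply gets dropped, which is what finally let the legitimate traffic through.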
6:03 PM, traffic died down.
6:25 PM, the IDSs detected more attack attempts; I came to the conclusion that we had been deliberately targeted.
6:44 PM, someone attempted a lousy port scan — looks like this guy forgot to do verbose scanning.
6:56 PM, woot — Jarmo traced the attacker's origin. It was Norway :-/
7:01 PM, I finished writing this post, my eyes still glued to the server logs.
What a day!