According to the latest statistics, the number of DDoS (distributed denial-of-service) attacks has increased over the past year, with those responsible using increasingly sophisticated techniques to prevent target sites from operating. Application layer attacks in particular have become a weapon of choice in recent months.

What is an application layer attack?

Rather than relying on sheer volume (as a more traditional network layer DDoS attack would), an application layer assault targets individual applications within a website, such as any form that requires the site to perform requests for information. When targeted effectively, forms like this can be flooded with requests, slowing the whole site down. As a result, genuine users become frustrated and abandon their visit, which can mean a loss of legitimate business.
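One common first line of defence against this kind of form-flooding is a per-client rate limiter placed in front of the vulnerable endpoint. The sketch below is a minimal illustration, not any particular vendor's implementation; the limit and window values are arbitrary:

```python
import time
from collections import defaultdict, deque

class SlidingWindowRateLimiter:
    """Allow at most `limit` requests per `window` seconds per client IP."""

    def __init__(self, limit=5, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen outside the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the limit: reject, or serve a challenge instead
        q.append(now)
        return True
```

In practice a limiter like this would sit behind a reverse proxy and key on more than the source IP, but it captures the basic idea: the expensive form handler is only reached by clients below the threshold.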

Application Layer DDoS vs. Network Layer DDoS

A major issue with application layer attacks is that, because they’re conducted on a much smaller scale, they’re harder to detect. With the more “traditional” network layer attacks, the bombardment of traffic is immediately visible to administrators as a traffic spike. Application layer attacks, though, are often indistinguishable from normal visitors to the website. As a result, one of the major warning signs of a DDoS is taken away.
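The detection gap can be illustrated with a toy spike detector: a volumetric flood trips a simple trailing-average threshold, while a low-and-slow application layer flood can stay beneath it. The window size and multiplier here are illustrative assumptions:

```python
from collections import deque

def spike_detector(samples, baseline_window=10, factor=3.0):
    """Flag intervals where request volume exceeds `factor` times the
    trailing average -- the kind of spike a network layer flood produces.
    An application layer flood can often stay below this threshold."""
    history = deque(maxlen=baseline_window)
    flags = []
    for count in samples:
        baseline = sum(history) / len(history) if history else count
        flags.append(baseline > 0 and count > factor * baseline)
        history.append(count)
    return flags
```

A flood that raises per-minute volume tenfold is flagged immediately; one that blends in at normal volume while hitting expensive pages never is, which is exactly why application layer attacks need different detection signals.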

A word on traditional network layer ‘blanket’ attacks 

It’s worth highlighting that a full network layer DDoS attack can still have devastating consequences, and that such attacks remain a serious threat. The key difference in terms of mitigation is that in a standard network layer DDoS – in which the attacker floods a website with more traffic than it can handle – the traffic follows a discernible pattern, making it easier to identify initially.

Just how devastating can an application layer attack be?

An application layer attack can have very serious consequences. One recent well-funded, multi-vector DDoS was documented by Incapsula, the website security firm responsible for mitigating the assault. The attack was made on a prominent company in a highly competitive industry.

Stage One. The assault began with a relatively small-scale SYN flood attack, peaking at around 30Gbps and lasting less than an hour. This represented only the very beginning of the bombardment.
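For context, a SYN flood works by sending handshake-opening SYN packets and never completing the handshake, leaving the server holding half-open connections until they time out. A minimal sketch of how a monitor might track that backlog (the timeout value is an assumption, and real mitigation happens in the kernel or at the network edge):

```python
class HalfOpenTracker:
    """Track connections that received a SYN but never completed the
    TCP handshake; a large, persistent backlog of these is the
    signature of a SYN flood."""

    def __init__(self, timeout=5.0):
        self.timeout = timeout
        self.pending = {}  # (src_ip, src_port) -> time the SYN arrived

    def on_syn(self, key, now):
        self.pending[key] = now

    def on_ack(self, key):
        self.pending.pop(key, None)  # handshake completed normally

    def half_open_count(self, now):
        # Expire entries older than the timeout, then count what remains.
        self.pending = {k: t for k, t in self.pending.items()
                        if now - t <= self.timeout}
        return len(self.pending)
```

Legitimate clients complete the handshake within milliseconds, so the half-open count stays near zero; under a flood of spoofed SYNs it climbs until the connection table is exhausted.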

Stage Two. Just as the initial SYN assault subsided, the website was hit again, this time by a volumetric attack in the form of a 10 million requests per second HTTP flood. Unlike the stage one floods, these – which deliberately homed in on certain resource-heavy pages – didn’t cease. Although the client website is fully protected, they are actually still taking place. This indicates the level of determination that some attackers possess.
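One common defence against floods aimed at resource-heavy pages is to budget requests by cost rather than by count, so expensive endpoints drain a client’s allowance faster than cheap static ones. A rough token-bucket sketch; the paths, costs, and capacity are illustrative and not drawn from the documented attack:

```python
class CostAwareTokenBucket:
    """Token bucket where resource-heavy pages (search, reports) cost
    more tokens than static pages, so a flood aimed at expensive
    endpoints exhausts the budget quickly. Costs are illustrative."""

    COSTS = {"/search": 10, "/report": 8, "/index.html": 1}

    def __init__(self, capacity=100, refill_rate=5.0):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_rate = refill_rate  # tokens added per second
        self.last = 0.0

    def allow(self, path, now):
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        cost = self.COSTS.get(path, 1)
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False
```

Under this scheme a client hammering `/search` is cut off an order of magnitude sooner than one browsing static pages, which blunts exactly the resource-exhaustion strategy described above.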

Stage Three. After briefly attempting to target Incapsula’s own resources using the same flood technique (a move which was deflected), the attackers went for the client site’s AJAX objects – objects which are often unprotected by bot filtering methods. AJAX objects can have a direct impact on the database, which is typically a more sensitive chokepoint, so in one sense this was a clever move. However, because this part of the attack was made on a ‘registered users only’ area, the security firm were able to greatly narrow down potential suspects and filter out the malicious bots using pinpoint identification algorithms.
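The narrowing-down step works because a ‘registered users only’ area gives the defender a whitelist: every legitimate request must carry a session issued to a known user, so anything else can be treated as bot traffic. A simplified sketch (the field names are hypothetical, and this is not Incapsula’s actual algorithm):

```python
def filter_ajax_requests(requests, valid_sessions):
    """Split AJAX requests into legitimate and suspect, using the fact
    that the endpoint sits behind a login: a request without a session
    ID issued to a registered user cannot be a real visitor."""
    legit, suspect = [], []
    for req in requests:
        if req.get("session_id") in valid_sessions:
            legit.append(req)
        else:
            suspect.append(req)
    return legit, suspect
```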

Stage Four. After a further wave of HTTP floods designed to capture session cookies, a more transparent attack in the guise of legitimate human visitors took place, and was only caught due to the slight variation in the site’s traffic that occurred as a result. Customers complaining that the security firm had “invaded” their browsers enabled Incapsula to identify the overall cause of the assault: a large-scale botnet that used hijacked computers to bombard the client’s website with browser sessions. Working with the owner of one of the hijacked PCs, the security firm were then able to identify the Trojan responsible. The botnet itself was removed before it could be identified, but the information obtained was enough for Incapsula to build a security patch capable of blocking future assaults.
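One way a defender might spot replayed session cookies of the kind captured in this stage is to look for cookies appearing from implausibly many source IPs: a real user’s cookie travels with one browser, while a stolen cookie replayed by a botnet shows up everywhere. A rough sketch with an arbitrary threshold:

```python
from collections import defaultdict

def cookies_seen_from_many_ips(access_log, max_ips=3):
    """Return the session cookies observed from more than `max_ips`
    distinct source addresses -- one possible sign that cookies stolen
    in an HTTP flood are being replayed by a botnet. The threshold is
    illustrative; NAT and mobile roaming would need allowances."""
    ips_per_cookie = defaultdict(set)
    for ip, cookie in access_log:
        ips_per_cookie[cookie].add(ip)
    return {c for c, ips in ips_per_cookie.items() if len(ips) > max_ips}
```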

Without Incapsula’s mitigation experience, the client would likely have suffered significant downtime – downtime that could have cost thousands.

In Conclusion

Attacks like these will only increase in number and grow in complexity. Anyone looking for protection from such assaults needs to ensure that they ask about application layer mitigation in addition to protection from standard network-based DDoS attacks.