As businesses continue to move critical operations online, distributed denial of service (DDoS) attacks are increasing in frequency, sophistication and range of targets.
In a 2011 Verisign study, 63 per cent of respondents reported experiencing at least one attack that year, while 51 per cent reported revenue loss as a result of downtime from the attack. Those numbers are undoubtedly higher today as the size, frequency and complexity of DDoS attacks continue to grow.
Mitigating these attacks is challenging and generally requires layered defences spanning the data centre and the cloud. The continued success of these attacks, and the damage they do to a company's infrastructure, revenue and reputation, indicates that many IT managers still haven't found the right protection formula to proactively mitigate them.
A DDoS attack occurs when a botnet is used to send an overwhelming amount of bad traffic to an intended target, such as a company's website. Let's use an e-commerce site for example: nearly every e-commerce site has an ‘Add to Cart' button. If a DDoS attacker could script a thousand bots (some botnets have over a million bots) to simulate clicking on that ‘Add to Cart' button and generate more traffic than the site could handle, legitimate shoppers would have no chance of getting their click in.
The key to fighting a complex attack like this is being able to differentiate a real shopper from a bot so the website can service one and ignore the other.
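One of the simplest signals for telling the two apart is request rate: no human shopper clicks 'Add to Cart' hundreds of times a minute. Below is a minimal sketch of a naive rate-based filter; the window and threshold values are illustrative assumptions, not figures from the article, and real DDoS mitigation layers many more signals on top of this.

```python
import time
from collections import defaultdict, deque

# Naive rate-based bot filter: a client sending far more requests per
# second than any human shopper could is likely automated.
# WINDOW and MAX_REQUESTS are illustrative thresholds (assumptions).
WINDOW = 10.0        # seconds of history to keep per client
MAX_REQUESTS = 20    # more than this inside WINDOW looks automated

_history = defaultdict(deque)  # client_ip -> recent request timestamps

def looks_like_bot(client_ip, now=None):
    """Return True if this client's request rate exceeds the threshold."""
    now = time.time() if now is None else now
    q = _history[client_ip]
    q.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    return len(q) > MAX_REQUESTS

# A single click from a legitimate shopper is served normally.
print(looks_like_bot("203.0.113.7"))  # → False
```

A production system would combine rate with other signals (session behaviour, challenge responses, reputation data), since sophisticated botnets deliberately spread their traffic to stay under any single per-IP threshold.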
With the ease of access to the internet and prevalence of social media today, unsuspecting computer users are making it easier than ever for malicious actors to target them with malcode. This trend has helped provide the perfect environment for DDoS attacks to grow both in size and complexity. In fact, attacks of 100 Gigabits per second (Gbps) have been recorded.
To put that into context, the largest recorded DDoS attack was 2 Gbps in 2002. Considering that most websites have less than 1 Gbps of network bandwidth, even small attacks today can quickly prove devastating.
In addition, it's not just web infrastructure that attackers are targeting, but increasingly the Domain Name System (DNS) infrastructure as well. Arbor Networks' 2012 Worldwide Infrastructure Security Report indicated that 41 per cent of respondents experienced DDoS attacks against their DNS infrastructure.
We've seen this with several recent attacks against financial institutions and others, which used new malicious code to attack the DNS sub-system of the victim organisations. This type of attack brought the targets down in two ways: by exhausting bandwidth and by overwhelming processing capacity.
Bandwidth exhaustion, the result of a botnet continuously querying the victim's DNS, caused the network pipes of the target's DNS server to become saturated, leaving legitimate users with an error message. This was then compounded by a flood of very large DNS packets, which overwhelmed the server's transactional capacity. So even if the attack didn't saturate the bandwidth, it would saturate the computing resources of the target. Pretty sneaky.
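The arithmetic behind bandwidth exhaustion is straightforward: multiply bots by query rate by packet size and compare against the victim's pipe. A rough back-of-the-envelope sketch follows; the bot count, query rate and packet size are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope: how a botnet's DNS queries saturate a network pipe.
# All figures below are illustrative assumptions, not from the article.

BOTS = 100_000                 # bots participating in the attack
QUERIES_PER_BOT_PER_SEC = 50   # continuous querying by each bot
PACKET_BYTES = 3_000           # a 'very large' DNS response packet

attack_bps = BOTS * QUERIES_PER_BOT_PER_SEC * PACKET_BYTES * 8
attack_gbps = attack_bps / 1e9

LINK_GBPS = 1.0  # typical site bandwidth per the article (<1 Gbps)

print(f"attack traffic: {attack_gbps:.0f} Gbps")          # → 120 Gbps
print(f"link oversubscribed {attack_gbps / LINK_GBPS:.0f}x")  # → 120x
```

Even with modest per-bot rates, a moderately sized botnet produces traffic on the 100 Gbps scale cited earlier, two orders of magnitude beyond a typical site's capacity.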
DDoS attacks, while previously a nuisance, are now a fact of life on the web. They aren't going anywhere, so enterprises need to have a battle plan for combating this ever-evolving threat.
In 2013, we expect to see more enterprises trying to block harmful traffic before it reaches the network or application to eliminate the many risks associated with cyber-attacks, like data breaches and network downtime.
As the traditional solutions on which many enterprises have relied for this - like over-provisioning bandwidth and firewalls - have proved costly and ineffective, companies will turn to cloud-based DDoS protection and managed DNS services to enable rapid deployment, provide transactional capacity to handle proactive mitigation, and eliminate the need for significant investments in equipment, infrastructure and subject matter expertise.
Taking the cloud approach will help businesses trim operational costs while hardening their defences to thwart even the largest and most complex attacks. These cloud-based solutions will be critical in the future (they arguably already are). As companies and individuals become more reliant on the internet for critical processes and everyday tasks, downtime, no matter the reason, will not be an option.
Sean Leach is vice president of technology for VeriSign