Code repository GitHub has had to deal with another DDoS attack on its servers, the second one in a year.
According to the organisation's status log, the problems started around 10.39am on Tuesday with reports of “connectivity problems”. Nearly an hour later, it reported that these connectivity issues had been identified as a DDoS attack. It wasn't until 1.52pm that service was restored.
The service suffered a minor outage the previous day for around half an hour.
It is not yet known who or what caused the outage, but it follows a similar incident in March, when GitHub came under attack from servers linked to China. That earlier attack lasted four days.
Yesterday's attack used “a wide combination of attack vectors,” according to GitHub, including the hijacking of unsuspecting users' traffic to flood GitHub's servers.
Dave Larson, CTO at Corero Network Security, told SCMagazineUK.com that while the previous attack suffered by GitHub appears to have been driven by a nation state, it's too early to tell who is responsible for this most recent attack or what the motivation might have been.
“GitHub supports a global community of millions of software developers collaborating on countless different software projects. It's likely that one of these projects involves a cause that hackers disagree with – such as developing software to support freedom of speech by avoiding government Internet blocks. Opponents of a cause may then seek to attack these projects by targeting the whole site with DDoS attacks,” he said.
“DDoS attacks are increasing significantly, with many of our customers experiencing an average of four attacks per day. Techniques are becoming more sophisticated and attacks are easier to launch, meaning that any organisation of any size can become a victim.”
Larson added that regardless of the motivations, this significant DDoS event “highlights the need for a proactive defence woven into enterprise IT infrastructure, upstream hosting and internet service provider networks, in order to protect our growing dependence on online business and activity.”
Paco Hope, principal consultant at Cigital, told SC that DDoS is not defended exclusively in operations – for instance, at the network layer or in the cloud.
“When we design and build software, we have to look for ‘expensive’ operations that allow attackers to achieve amplification effects,” he said.
“We spot amplification flaws through secure design review and secure code review. Security reviews of design and code can identify functions that require very little input from a user, but consume significant resources (network, CPU cycles, memory) to produce a response. Even when network monitoring and incident response run well, flawed applications can be hamstrung by their own bad designs. The GitHub attack is not a design failure. It is simple enough to buy services that monitor, defend, and respond to such attacks.”
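The kind of amplification flaw Hope describes – a request costing the attacker a few bytes but the server significant work – can be illustrated with a minimal Python sketch. The function names and the cap value here are hypothetical, chosen for illustration; they do not come from GitHub's systems or Cigital's reviews.

```python
def expensive_row(i: int) -> int:
    # Stand-in for per-row work (a database query, rendering, crypto, ...).
    return i * i

def report_vulnerable(start: int, end: int) -> list[int]:
    # Amplification flaw: ~20 bytes of request ("start=0&end=10000000")
    # makes the server compute an attacker-chosen number of rows.
    return [expensive_row(i) for i in range(start, end)]

MAX_ROWS = 1000  # hypothetical limit; a real cap depends on the service

def report_hardened(start: int, end: int) -> list[int]:
    # Bound the attacker-controlled work factor before doing anything.
    if end - start > MAX_ROWS:
        raise ValueError("requested range too large")
    return [expensive_row(i) for i in range(start, end)]
```

This is the pattern a secure design review hunts for: any place where the ratio of server work to client input is unbounded.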
Adrian Crawley, regional director for northern Europe at Radware, told SC that GitHub's connectivity issues are known to have been caused by a DDoS attack.
“This normally happens when the web application or related infrastructure cannot accept regular requests anymore. The way attackers achieve this is simply by misusing the available resources with non-legitimate requests on the application layer or sending mass amounts of junk data to the network layer. In both cases regular users will not be able to connect anymore,” he said.
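One common first line of defence against the application-layer floods Crawley describes is per-client rate limiting, often implemented as a token bucket: each client gets a refilling budget of requests, and bursts beyond it are rejected before they consume backend resources. The sketch below is illustrative only – real DDoS mitigation happens upstream, distributed across many machines – and the class and parameter names are our own.

```python
import time

class TokenBucket:
    """Per-client token bucket: allows short bursts, rejects sustained floods."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # drop before the request reaches the application
```

With `rate=0` and `capacity=3`, the first three calls to `allow()` succeed and the fourth is refused – the shape of behaviour that keeps junk requests from exhausting the application layer.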
The news comes just after an Akamai report found that DDoS attacks continued to rise in the second quarter of this year, doubling year-on-year for the third quarter in a row. Exploitation of the Shellshock vulnerability alone was responsible for 49 percent of all web application attacks during the period.
“The threat posed by distributed denial of service (DDoS) and web application attacks continues to grow each quarter,” said John Summers, vice president of Akamai's cloud security business unit. “Malicious actors are continually changing the game by switching tactics, seeking out new vulnerabilities and even bringing back old techniques that were considered outdated.”