Poor measurement leaves networks dangerously open to attack

The Heartbleed bug not only shook the IT industry, but also impacted many end-users, making it one of the most high-profile threats in history to upset the online world. With the threat landscape becoming more pervasive, and potentially causing irreversible brand damage, securing an organisation's network should be paramount…right?

Unfortunately, recent research conducted by Vanson Bourne on behalf of Tenable shows that, despite 88 percent of IT decision-makers believing security should be a top or high priority for the business, only 54 percent are using sufficient metrics when trying to determine their IT security status.

The metrics currently being used to measure security environments in UK businesses reveal some worrying findings:

  • 57 percent of respondents are measuring the quantity of malware detected
  • Less than half (41 percent) stated they use time taken to identify an issue as a key metric
  • 42 percent of respondents look at whether the system is running up-to-date anti-virus software
  • 50 percent monitor the state of their environment for vulnerabilities manually
  • Six percent of respondents claimed they did not use any metrics to monitor their security systems

The ever-evolving attack surface, compounded by the multitude of devices coming into the organisation and the rise of cloud use, means businesses are under constant threat of a breach. Understandably, this drives a desire among IT professionals to evaluate what they are getting for their security investment, understand their attack surface, measure the efficiency of controls, and demonstrate tangible reductions in risk to the organisation. However, it seems many IT decision-makers, even with the best of intentions, are going about this in the wrong way.

Looking at the quantity of malware detected tells a business nothing beyond the fact that it is being attacked – there is no insight into how many attacks are being stopped or where they are being targeted. Equally, manually measuring security system vulnerabilities is a losing battle: even setting aside human error, threats evolve so quickly that they must be monitored continuously.

IT decision-makers need to ask themselves the following questions in order to set up metrics that accurately measure and report on a network:

  • What is the time taken to patch critical, automatically exploitable vulnerabilities (per business unit)?
  • What is the percentage of systems with up-to-date malware defences (per business unit)?
  • What percentage of systems are scanned in-depth (credentialed) by a vulnerability scanner?
  • What is the time taken to identify unknown assets on the network?
  • What percentage of staff have completed security awareness training?

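To make the first two questions concrete, a metric such as time-to-patch can be computed directly from patch records. The sketch below is purely illustrative – the business units, dates, and record format are assumptions, not a real scanner's output – but it shows how "time taken to patch, per business unit" reduces to a simple aggregation:

```python
from datetime import date

# Hypothetical patch records: (business_unit, detected, patched).
# Illustrative data only, not drawn from any real environment.
records = [
    ("finance", date(2014, 4, 8), date(2014, 4, 10)),
    ("finance", date(2014, 4, 8), date(2014, 4, 20)),
    ("engineering", date(2014, 4, 8), date(2014, 4, 9)),
]

def avg_days_to_patch(records):
    """Average days between detection and patching, per business unit."""
    totals = {}
    for unit, detected, patched in records:
        days, count = totals.get(unit, (0, 0))
        totals[unit] = (days + (patched - detected).days, count + 1)
    return {unit: days / count for unit, (days, count) in totals.items()}

print(avg_days_to_patch(records))
# finance averaged (2 + 12) / 2 = 7.0 days; engineering 1.0 day
```

Tracked over time and broken down by business unit, a figure like this shows whether remediation is actually getting faster – something a raw count of detected malware can never reveal.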
Having the right metrics could have saved companies copious amounts of time and money in addressing threats such as Heartbleed. Reflecting on the multitude of news headlines that resulted from this vulnerability, it seems very few companies were set up to defend against, or respond effectively to, such a threat.

Guidance from the SANS Institute also serves as best practice when it comes to evaluating the security of a network. The organisation cites three steps to ensuring system security: 1) find all of your systems – gain situational awareness of where the attack surface resides; 2) assess all the systems you have – understand the threat and target critical vulnerabilities continuously and in real time; and 3) prioritise – with the ever-increasing attack surface, there is a need to understand the most critical vulnerabilities on the most business-critical systems and address these first.
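The prioritisation step above can be sketched as a simple ranking: weight each vulnerability's severity score by the business criticality of the asset it sits on, and fix the highest-weighted items first. The asset names, CVSS scores, and criticality weights below are assumptions for illustration only:

```python
# Hypothetical vulnerability inventory; asset names, scores, and
# criticality weights (1 = low, 3 = business-critical) are illustrative.
vulns = [
    {"asset": "billing-db", "cve": "CVE-2014-0160", "cvss": 7.5, "criticality": 3},
    {"asset": "test-vm", "cve": "CVE-2013-2566", "cvss": 5.9, "criticality": 1},
    {"asset": "web-frontend", "cve": "CVE-2014-0160", "cvss": 7.5, "criticality": 2},
]

def prioritise(vulns):
    """Rank vulnerabilities by severity weighted by asset criticality."""
    return sorted(vulns, key=lambda v: v["cvss"] * v["criticality"], reverse=True)

for v in prioritise(vulns):
    print(v["asset"], v["cve"], v["cvss"] * v["criticality"])
```

Here the Heartbleed instance (CVE-2014-0160) on the business-critical database ranks above the same flaw on a less critical front-end, which is exactly the ordering the SANS guidance calls for.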

By following this guidance, and using more accurate metrics, IT departments can be confident of their network's security status, and able to face the evolving threat landscape – whether it is the next generation of Heartbleed, or an even more pervasive attack – head on.

Gavin Millard is EMEA Technical Director at Tenable Network Security