The latest Symantec Internet Security Threat Report has been published, and as usual contains some interesting statistics. Not least that the number of organisations opting not to report the numbers of records lost in a breach was up by 85 percent year on year.
Kevin Haley, director of Symantec Security Response, insists that "the increasing number of companies choosing to hold back critical details after a breach is a disturbing trend. By hiding the full impact of an attack, it becomes more difficult to assess the risk and improve your security posture to prevent future attacks."
Does the IT security industry, however, actually agree with this take on transparency as it relates to meaningful security posture?
There has been an ongoing debate about disclosure in all its forms for the longest time and transparency is a part of that for sure. Yet whether transparency, in terms of the number of records exposed in a breach for example, is core to a secure posture, to truly understanding and tackling risk, has to be debatable. Alex Vovk, CEO and co-founder of Netwrix, told SCMagazineUK.com that "in light of evolving cyber threats the fact that most companies prefer not to disclose information about security violations is highly disturbing", not least as "hiding data breaches from the public makes it more difficult to properly evaluate the threat level and take necessary response actions."
This is hard to argue with, but the question being asked is less about disclosure per se and more about what is being disclosed. Are the numbers key to the disclosure? "It is vital to gather information about the exact number of stolen records", Vovk insists, "since it allows [us] to critically evaluate the real scope of a data breach and develop an appropriate strategy for mitigating the negative impact."
There's an argument to be made that we the industry, as well as we the media, are becoming rather too obsessed with the size-of-a-breach metric. Is this at the cost of asking how a breach was allowed to happen and how it could have been prevented? Surely there are more vital questions to be asked of your security posture than simply doing defence by the numbers?
Ian Trump, the Security Lead at LOGICnow (disclosure: the author of this article also co-authored a paper on 'Mitigating Cyber-crime Through Meaningful Measurement Methodologies' with Ian last year) argues that transparency is not important to a business unless the state or regulators demand it. "Disclosure rules are not designed to single out the company or drive a robust security posture" Trump told SC, continuing, "disclosure is supposed to protect the customer, by informing them when their information may be in unauthorised hands. If a business views fear of disclosure as the main reason for the security posture, in my opinion they have already failed at security."
Trump is, perhaps unsurprisingly, adamant that the two issues of size and disclosure, and why it happened and how it could have been prevented, are not joined at the hip. "In most cases, the why and how may take many weeks if not months to figure out" he says, adding, "the size and disclosure requirements driven by state and federal regulations for companies are actually detrimental."
After all, Trump suggests, if you don't know what has actually been taken, how can you realistically tell your customers they are at risk? "These regulations force companies to err on the side of caution and notify everyone" Trump concludes, "vastly inflating the potential breach's size when the actual scope may be far smaller."
Adrian Sanabria, Senior Security Analyst at 451 Research, has also been thinking about the importance of transparency and SC managed to catch up with him. "I don't see a clear and comprehensive solution", he told SC, but added that after researching breaches for more than 15 years he feels strongly that "transparency isn't just the best choice for the rest of the industry, it's actually the best choice for the business also."
Certainly from the post-breach PR perspective, there's no arguing with this viewpoint, and most people are, to a degree, sympathetic with organisations that are immediately transparent and apologetic. "The knee-jerk reaction of some companies is to immediately deny the event or try to prevent information from getting out", Sanabria suggests, "has the opposite effect on image." But Sanabria goes further, suggesting that perhaps transparency before a breach might not be bad PR either. "There's a theory I have that people have a natural tendency to think that it is safer to do business with the company that's already had its breach" he insists, explaining that "if you think about it, the breached company's security details are out in the open – they have no choice but to take security seriously now. A company that hasn't had a breach is an unknown quantity."
John Hetherton, senior manager for Information Governance at Espion, sees transparency as having a shockwave effect, albeit a temporary one. Call these shockwaves security panic if you prefer, but the impact is the same according to Hetherton who says that "if the breach is significant enough this results in board awareness, which results in questions being asked, which results in resources being made available for security which may not have been previously there." So transparency does affect security posture then?
Not so fast, says Gunter Ollmann, CSO at Vectra Networks, who told SC that there is "no easy answer to the debate over breach disclosure", simply because "there is yet to be an accepted definition of what qualifies as a breach."
He has a point. After all, every large organisation is continually a victim of malware and insider attacks, so where does the threshold kick in before public notification is necessary? "It is generally accepted that there is a need to alert the public if customer records have been accessed by an attacker", Ollmann admits, adding "however, if the records are encrypted and the attacker was unable to derive any PII, then there is no legal requirement to make customers aware."
Furthermore, since many organisations aren't sophisticated enough to identify threat activity to any meaningful level of granularity, he asks, "if you can't observe it, how can you prove you were breached?" As far as the 'size of the breach' mentality is concerned, Ollmann is convinced that this is driven largely by vendor marketing teams and that there are only two real statistics of value: how many breaches per year, and how many breaches per organisation. "The former helps provide insights into the scale of the threat" Ollmann insists, concluding "the latter enumerates companies and organisations that are both perpetual targets and organisations that just aren't doing enough to protect their customers."
Does it really matter if 100 records were breached or 100 million? In other words, isn't the fact that a breach happened at all where the focus should sit? Andy Taylor, Lead CCP Assessor for APMG International and someone who has worked in information security since the mid-1980s, is convinced that the size of the breach is almost irrelevant "except for its ability to grab the headlines."
As Taylor points out, "a breach of 10 or 100 will often go unnoticed by the press even if those 100 are important customers. A breach of a million will usually be reported."
There sits the heart of the matter: the response to a breach should be the most important factor. "If it takes three weeks to work out who might be affected and then longer to do something about it, then that is bad news and needs to be reported", Taylor agrees, continuing "if the breach has been handled correctly and the mitigations have resulted in a minimal amount of collateral damage to customers by responding quickly and effectively, then that is the most important piece of news."
Yet there remains way too much mystery surrounding breaches insofar as people are unaware why they happen, what happens when they do and what they need to do to reduce the chances of them happening again. "This is particularly true in smaller organisations", Taylor concludes, "and broadcasting the big breaches of big companies does nothing to tell SMEs and individuals that they are also at risk and to therefore educate them regarding the problems."
As Professor Steven Furnell, senior member of the IEEE (Institute of Electrical and Electronics Engineers) and Professor of IT Security at Plymouth University, told SC in summing up: "size is often easier to measure and gives the big scary numbers. This is not to say that the numbers are not important and that we shouldn't be worried about them, but it can distract attention from the impact issue that ought to be of more concern. Fixating too much upon the scale (of the threats or a resulting exposure) runs the risk of focusing our attentions upon the wrong thing. It is arguably sufficient to know that significant threats exist in volume, and that they are widespread - beyond this, the significant thing is what we choose to do about them..."