Is responsible disclosure responsible enough?
A Jeep was taken over from 10 miles away via its in-car entertainment system this summer, and just this week news broke of critical medical devices being 'owned' by botnet operators. Vulnerabilities in your web browser are one thing, but when they are in your car or an MRI scanner the potential impact takes on a different hue. As, indeed, does the small matter of how the security researchers who most often uncover these coding flaws should disclose them.
New research from AlienVault reveals that 64 percent of security professionals think that if security researchers get no response from a vendor when disclosing a vulnerability with 'life-threatening implications', the vulnerability should be made public. Some 19 percent of the 650 IT security pros questioned at Black Hat in Las Vegas earlier this year went as far as to say the vulnerability should be fully disclosed to the media. This is in stark contrast to the traditional process of responsible disclosure, whereby all stakeholders agree to a set period for a fix to be produced before any such publication.
Maybe that's not so surprising, given that when a group of researchers disclosed a flaw in the Chevy Impala's software that could give hackers control of the vehicle, it took General Motors five years to fix it. Fast forward to this year, and the public hacking stunt with the Jeep, if not actually a public disclosure, led to an almost immediate recall and fix by Chrysler. AlienVault security advocate Javvad Malik told SCMagazineUK.com that "there's only so much public shaming that will remain effective until fatigue sets in and you'll end up with ‘oh there's another hacker claiming the sky is falling' and lose effectiveness." Indeed, Malik insists that when a vulnerability is disclosed publicly it is often not public opinion that gets a vendor to change, "but rather a government or regulatory body that applies pressure." What Malik wants is a "mechanism whereby researchers could work with regulatory bodies directly", although he admits it is something of a tricky question with many moving parts. SCMagazineUK wondered what other industry insiders thought, so we asked them:
Ian Trump, the security lead at LogicNow, rightly calls this a huge issue, and one for which security professionals largely find themselves in uncharted waters when dropped into the middle of it. "I don't think when human life is at stake we can rely on a system which is semi-voluntary and is governed by people doing the right thing," Trump said, speaking exclusively to SCMagazineUK.com. "This situation reminds me of the debates about regulatory bodies versus free market advocates," he continued. "Was it not the automobile industry that was outraged back in the day with mandatory safety standards like seat belts?" Trump thinks the Internet of Things has thrown us right back into this debate, and that government (or licensing, regulatory, or insurance agencies) need to gain visibility into the risk to human safety and be empowered to act. "This process, invented largely when the Internet could not be imagined to hurt anyone, has to adapt to the new reality that planes, trains and automobiles are really fast moving data centres," Trump insists, adding: "It's not morally or ethically acceptable to disclose anything which could hurt or harm a third party".
Nick Pollard, UK general manager of Guidance Software (which has trained some 50,000 cyber investigators), agrees that "a framework for truly responsible disclosure to the public should be implemented by government agencies who have the resources and capabilities to enforce along with the accepted mandate of jurisdiction in these matters." This framework should be implemented as an international standard for vulnerability handling and disclosure, according to Pollard, who says it should give private vulnerability testers a set disclosure period with the vendor, establish guidelines for handling and response times on the vendor's side, and provide recourse or accepted allowances for testers should vendors not respond. "It should not fall to individuals or commercial interests to have this responsibility and make that possibly fatal decision," Pollard told SC, adding that "there has long been a palpable tension between security researchers and software vendors." With vendors incentivised to focus on new features over closing security holes, and researchers often selling vulnerabilities that require immediate fixes, it's no wonder the debate surrounding responsible disclosure has been marked by lawsuits and PR holy wars.