IoT, and the interface between people and hackable machines and devices, was a major area of concern for panelists on this week's Future of Cyber Security panel at IP Expo.
The panel was chaired by Rory Cellan-Jones, technology correspondent at the BBC, and the panelists were Eugene Kaspersky, CEO and founder of Kaspersky Lab; James Lyne, global head of security research at Sophos; Joshua Corman, CTO of Sonatype; and Rik Ferguson, vice president of security research at Trend Micro.
Asked about their greatest fears in cyber-security, Kaspersky responded that it was “attacks on critical systems. Power stations ... things we depend on as nations.” And as to who might attack them, he suggested, “Cyber mercenaries able to conduct sophisticated attacks and paid by cyber-criminals.”
Lyne added that if we unpick IoT devices – the “whole range of devices and connected junk from toothbrushes to cars” – we find we have come to depend more and more on this technology as a founding pillar of our society; yet lift the lid slightly and you discover the security is very poor.
“We're building bigger and bigger boats and making the holes bigger, opening old wounds that we should have learnt our lesson from. These flaws should be fixed endemically – if you sold a car with flawed brakes there would be consequences. There is a role for regulators to enforce – plus we need to educate consumers, who have never been more aware of cyber security, but there is a certain level of fatigue setting in.”
The biggest problem, said Lyne, is keeping up to date with what counts as good practice. On the plus side, the prevalence of ransomware means people are becoming more aware that their data has a value to criminals because it has a value to them.
For Corman, healthcare was one of the biggest concerns: “Where bits and bytes meet flesh and blood. Planes, trains, cars, medical devices etc. Every PCI-compliant merchant has had a breach – and while [credit] cards are important, they are replaceable, but we are not spending any time on these safety-critical devices. They have low standards and a different set of attackers. With industrial control systems, those with little skill can do significant damage.”
He added that, of all hackable things that can cause harm, he is most concerned about hospitals, which never patch, often have zero or one security person, and are still widely using XP, NT and Win 98.
Corman cited the Hollywood Presbyterian hospital, which refused to pay out after a ransomware attack and had to send patients elsewhere. People almost got killed by accident – so what could a more determined attacker do, such as the hacker who left Anonymous and joined ISIS?
The problem, according to Ferguson, is that people in this industry are not doing a very good job, and the widespread hacks reported are a demonstration of their failure. “These sorts of attacks, like a SQL injection attack, shouldn't be possible. Companies don't do a good job and citizens are impacted. There is also poor implementation in enterprises. Encryption, multi-factor authentication, segmentation of data – these solutions are not rolled out. Who among us really makes information available only on a need-to-know/least-privilege basis?”
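The SQL injection class of attack Ferguson refers to is decades old and well understood. A minimal sketch (using Python's built-in sqlite3 module with a hypothetical `users` table) shows why it still works when input is concatenated into queries, and how a bound parameter prevents it:

```python
import sqlite3

# Illustrative table and data, assumed for this sketch only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "' OR '1'='1"  # attacker-controlled string

# Vulnerable: concatenating input into the SQL text lets the
# attacker rewrite the query so it matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE role = '" + user_input + "'"
).fetchall()

# Safe: a bound parameter is treated purely as data, never as SQL,
# so the injection string simply fails to match any role.
safe = conn.execute(
    "SELECT name FROM users WHERE role = ?", (user_input,)
).fetchall()

print(vulnerable)  # injection succeeds: all rows returned
print(safe)        # injection fails: no rows returned
```

The parameterised form is the fix Ferguson's complaint implies: the query shape is fixed at write time, so user input cannot alter it.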
He reiterated the mantra that education, education, education is needed, but noted how the cyber-security industry is often “preaching to the choir – we need to make the message available in an accessible way. Apathy is part of it. But there is also a widespread view of ‘it's never going to happen to me.' We need to allow interactive scenarios where we can make people know they can be penetrated.”
Kaspersky agreed, noting how penetration testing enables companies ‘attacked' to “learn so much about your organisation.”
Ferguson proposed this approach be extended further, saying, “Right now there is a lot of sandboxing conducted by companies, but we should do it for employees – let them fuck up in a safe space.”
Lyne noted how the security function was still seen as “restricting people's ability to do stuff. We do sometimes fall into the trap of being the NO guy. Often it's because you are the only info-security person in the office; you are overwhelmed with the demands on you and you just can't do any more, and so you will say NO to untested innovations. But if security can be seen as enabling people to do things, people will come onside. Improved ease of use, simplicity, say yes where we can.”
Right now, Lyne said, “We are all failing,” and he cited the example of a company that had spent a lot of money buying ‘all the right boxes' but had failed to turn them on, and another that didn't pay for logging for a week and so had no log records after its system was hacked.
Cellan-Jones asked about the prevalence of state actors: who are they, and are they just used as an excuse by companies to claim there was nothing they could have done?
Kaspersky commented, “We don't have access to the traffic, but in many cases we can guess, we can see the language spoken. Most state-sponsored attacks are in native Atlantic English, native Russian, and simplified Chinese – there is also French, German, Spanish, Arabic – but no Japanese. And there are those that pretend to use native English.”
Corman added, “We tend to say sophisticated state-sponsored attack, but most are very simple attacks that anyone could carry out, and we get distracted by the headlines and the difficult stuff. But there are other adversaries – sub-national, less capable but with high intent to use their capabilities, eg Anonymous. In comparison, nation states are more rational. But the less capable actors are capable enough and very likely to act.”
Corman also noted how today, theoretically, one person could tip the balance of an election if they had the means, motive and opportunity. He suggested that we “need to raise defences so only nation states and criminals can be on the battlefield.”
Corman agreed with Cellan-Jones that companies hide behind nation-state attacks. “People use ‘nation state' as an excuse, or say they are compliant, as if to say, it's not my fault. But these attacks need to be addressed. We now need to build our defences out to the edge, starting with the data at the centre, look at what's going on internally (while still defending the perimeter), to see what's happening and act to mitigate it.” He went on to call for greater collaboration on strategies that work, suggesting the good guys don't work well together.
Lyne noted how a lot of cooperation was happening in the background, saying, “For us, it's about who can build the best tech, not, ‘you can't see my malware.' We share with Eugene [Kaspersky] and have done for 26 years. We are also really hopeful about the new National Cyber Security Centre [NCSC] – they are good people, true believers in their mission, and capable. If they achieve half of what they are setting out to do, they will have achieved a significant improvement.”
Lyne also discussed concerns about blame issues in the reporting of breaches, commenting, “Banks are talking about information sharing. They say, ‘When we have someone drive a van into a branch, we are the victim. If we are breached by a sophisticated attacker, we are to blame.' Do we have the tools to distinguish between the innocent victim company and the lax company – an important consideration for EU GDPR reporting?”
He went on to consider the culpability of vendors, saying, “Mostly we are defending software with known vulnerabilities, and they [software developers] are not responsible for them. They take no liability or accountability for the IT we are buying, so why would they invest more in being more secure?”
Ferguson asked whether it was perhaps time for naming and shaming.
Lyne reiterated his view that people are using the nation-state card as a shield: while there are high-end exploits, more often victims are turned over by simple phishing links in a Word document. Conversely, he agreed that some get hit by zero-days that they could have done nothing about and are also described as idiots.
The Europol ‘future risks' idea of having a cyber-security rating system for countries, which could filter down to an organisation level, was also suggested. One objection was that a 5-star security company would be seen as too expensive.
Lyne brought in the issue of user fallibility, reminding the audience that you can't expect everyone to be an expert. “You can't ignore the user and just keep a safety bubble around them – there needs to be more focus on making security easy to use, not on forcing users to make choices that they don't understand.”
Kaspersky noted that in a free market, particularly in the consumer sector, it will be the case that “the more secure is more expensive and later to market, so built-in security is almost not possible,” but added that for critical security, “It's a very good idea.”
Corman again asked, “What are the negligence levels for software companies, security companies and the user?” He argued that we haven't tried to be transparent about free-market choice – ie, more expensive but more secure versus less expensive but less secure. And when it comes to software, he noted that there's a “big difference between negligence and vulnerabilities in code. If you can prove negligence they are already liable (thus no need for new laws). But you can't hold people liable for previously unknown vulnerabilities.”
Ferguson pointed out that if software creators were held liable for any flaws in their code, it would negatively hit voluntary cooperative efforts, saying: “It would kill open source if you are legally liable for something going wrong with it.”