“We are very much in a race against our attackers,” said Jeff Penrose, a policing veteran currently involved in worldwide business and strategy for IBM's I2, which produces investigative software for law enforcement.
Opening his talk, “The cyber-security arms race”, at FIC 2016 in Lille, Penrose set a broad scene for his audience. Attackers are constantly probing their targets' systems for weaknesses; even where defenders can guard against those probes, they still face the daily ordeals of insider threats, supply-chain vulnerabilities and the vulnerabilities they don't even know about. It really is, said Penrose, “a race for us defenders”.
The management of risk is critical to cyber-security, and given the constantly changing threat landscape and increasingly dispersed sets of data that businesses use every day, “It's impossible to react to every threat,” he said.
To that end, those given the arduous task of policing systems need to ask a few questions about how their data is used. Namely: who is using critical data, how is it being used and what is it being used for?
Dealing with a problem of this scale and complexity might be seen as a job for a machine, if only because of the sheer size and scope of the data, let alone how widely it is distributed. Even so, said Penrose, we must “put humans at the heart of our concerns”: humans, after all, write the law, policy and practice for cyber-security, and it is humans who write the profiles and calibrate the machines that handle this data and protect against threats.
It was at this point that Penrose invited on stage Emmanuel Jacque, general director of the OAK branch of SAS, a man with years of military experience who is currently building a project intended to detect early signs of radicalisation.
Jacque agreed with Penrose on the centrality of humans to cyber-enabled problems: “Machines do not attack machines,” he said.
Jacque has been working on software meant to combat radicalisation, aiming at “early detection with different technologies”. The project uses three levels of analysis. The first looks at semantics and signs: by trawling social media and the wider internet, operators can pick up on the particular language, colloquialisms and slang terms used among certain target groups of people, said Jacque. This level is used to “highlight a kind of vocabulary used by a particular group”.
The second is a technical analysis: what kind of information the targeted individuals are sharing with each other. Finally, a relationship analysis is carried out. With that final facet added, the investigator is given “the basic structure of a social environment”. With this kind of data management and analysis technology, an investigator can model social groups and determine their habits.
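To make the three levels concrete, here is a minimal toy sketch, not SAS's actual system: every name, message and watchlist term below is hypothetical, and real lexicons and graph analytics would be far richer. It flags authors whose posts contain watchlist vocabulary (semantic level), records what those authors share and with whom (technical level), and then groups contacts into connected components as candidate “social environments” (relationship level).

```python
from collections import defaultdict

# Hypothetical watchlist lexicon standing in for the "semantics and signs"
# vocabulary the article describes.
WATCHLIST = {"signalword", "slangterm"}

# Hypothetical messages: (author, recipient, text) triples.
posts = [
    ("alice", "bob", "meet at the usual place signalword"),
    ("bob", "carol", "forwarding this signalword slangterm"),
    ("dave", "erin", "lunch tomorrow?"),
]

def semantic_flags(posts, watchlist):
    """Level 1: flag authors whose posts use watchlist terms."""
    flagged = set()
    for author, _, text in posts:
        if watchlist & set(text.lower().split()):
            flagged.add(author)
    return flagged

def shared_content(posts, flagged):
    """Level 2: record what flagged users share, and with whom."""
    shares = defaultdict(list)
    for author, recipient, text in posts:
        if author in flagged:
            shares[(author, recipient)].append(text)
    return shares

def social_environments(shares):
    """Level 3: build an undirected contact graph and return its
    connected components as candidate social environments."""
    graph = defaultdict(set)
    for a, b in shares:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:          # depth-first traversal of one component
            n = stack.pop()
            if n in group:
                continue
            group.add(n)
            stack.extend(graph[n] - group)
        seen |= group
        groups.append(group)
    return groups

flagged = semantic_flags(posts, WATCHLIST)    # {'alice', 'bob'}
groups = social_environments(shared_content(posts, flagged))
# One candidate environment: {'alice', 'bob', 'carol'}
```

The point of the sketch is only the shape of the pipeline: each level narrows and enriches the picture, from vocabulary hits, to sharing behaviour, to the social graph an investigator can then model.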
All of this builds a ‘social environment' for law enforcement. With a picture of that social environment, police can find not only those who have already been radicalised but also those who are in danger of being radicalised in the future. Nor does it need to be used in an explicitly law-enforcement capacity: according to Jacque, it can also be used to broadcast a counter-narrative to radical ideas.