Don't underestimate the idiots - monitoring behaviour on your network

News by Max Metzger

How do we spot anomalous system behaviour on our networks that indicates a security breach? That was the theme of last week's SC Magazine UK roundtable aboard HMS President, sponsored by Splunk.

A host of big data and cyber-security professionals gathered last week aboard HMS President on the Thames for a Splunk-sponsored roundtable event to discuss a topic at the front of the industry's mind: what is normal? More specifically, how do we spot anomalous system behaviour on our networks that indicates a security breach?

Our keynote speaker was the ponytailed Alex Van Someren, a partner at Amadeus Capital Partners and a venture capital investor with a current focus on the cyber-security sector. Van Someren is also a member of the Royal Society Cyber-Security Research Steering Group and is currently working with Cylon, a non-profit which provides educational programmes to help cyber-sector startups develop more quickly.

Van Someren sees two principal issues. First, looking at incoming attack information: "Is this inbound traffic actually legitimate traffic?" Second, looking at outbound information and "understanding whether we want it to leave". 

Examining these, however, is getting ever harder as more people want to take their devices out of the office and bring-your-own-device solutions prevail. "The walled garden model of cyber-security is clearly over," he says, and so "we have to think about the physical location less and the logical location more".

But establishing a baseline for 'normal' is hard. As ever, the human element is the weakest one. IT professionals' real task here is "to get the human operator to understand when an abnormality is seen".

Van Someren is a self-proclaimed "champion of automation". He told attendees that handing the job to machines takes the weak human element entirely out of an already difficult task. There are already companies that do this: some look for anomalies in high-speed data flows, which humans cannot realistically do; others train machines, through long-term passive training exercises, to know exactly what is normal. The aim of these, Van Someren says, is to "take away from the user the challenges of spotting the needle in the haystack of big data".
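To illustrate the idea of passively training a machine on 'normal' and then flagging deviations, here is a minimal, generic sketch (not a description of any product mentioned at the roundtable): a baseline of typical outbound traffic volumes is learned, and readings far outside that baseline are flagged for a human to review.

```python
from statistics import mean, stdev

def train_baseline(samples):
    """Passively learn what 'normal' looks like from observed traffic volumes."""
    return mean(samples), stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from the mean."""
    mu, sigma = baseline
    return abs(value - mu) > threshold * sigma

# Hypothetical hourly outbound traffic (MB) collected during a quiet training period
normal_traffic = [102, 98, 110, 95, 105, 99, 101, 97, 104, 100]
baseline = train_baseline(normal_traffic)

print(is_anomalous(103, baseline))  # False: a typical hour
print(is_anomalous(900, baseline))  # True: an exfiltration-sized spike
```

Real systems model far richer features than a single volume figure, but the principle is the same: the machine does the needle-spotting and only surfaces the exceptions.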

The other problem is the question of which data is most valuable. "No one really seems to have adequate tools to say which files on our servers should be leaving our office," he says. But the tools to do this job do exist; by looking at file-permission maps and crawling files, automated tools can select which terms and files are sensitive. "We're going to increasingly need these automated systems," he says, but that doesn't mean they'll be taking our jobs quite yet.
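The crawling approach described above can be sketched in a few lines. This is a deliberately simple illustration, with a made-up term list standing in for a real data-classification policy: walk a directory tree and flag any file whose contents contain a sensitive term.

```python
import os

# Hypothetical sensitive terms -- in practice these would come from
# an organisation's data-classification policy, not a hard-coded set.
SENSITIVE_TERMS = {"confidential", "payroll", "passport"}

def classify_file(path):
    """Return True if the file's contents contain any sensitive term."""
    try:
        with open(path, encoding="utf-8", errors="ignore") as f:
            text = f.read().lower()
    except OSError:
        return False
    return any(term in text for term in SENSITIVE_TERMS)

def crawl(root):
    """Walk a directory tree and yield the paths of files flagged as sensitive."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if classify_file(path):
                yield path
```

A production tool would also consult permission maps, so that a file readable only by the finance team leaving the building raises more suspicion than one already shared company-wide.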

"I don't think the robots are here to steal our jobs; I think these systems need supervision." 

He added that "there will be changes of context" which a human operator will need to oversee. An example was provided later by one attendee, who noted how the Chancellor's budget is top secret until announced, and then the very same document, with the same key words, is in the public domain.

So how do we police our big data in an age of ever more prevalent cloud storage? "The business is clearly moving faster than the security," said Mark Ridley, a director of technology at Reed, the recruitment company.

The cloud also poses another problem, one of mobility, noted Nick Ioannou, blogger and author as well as head of IT at RG Partnership Ltd. "When you have so many terabytes and exabytes in their system, you can't move." That said, companies often use a mix of cloud and non-cloud services. Stephen Gailey, a former CISO and the EMEA security specialist at Splunk, our sponsor for the day, said that to secure that kind of arrangement, "from a security and monitoring perspective, what you need is a hybrid solution".

How do we protect our data and deal with the advanced hackers involved in cyber-criminality, geopolitical espionage or corporate spying if, as Phil Cracknell, a risk advisor to Arriva, says, "next gen attacks have already started"?

Echoing Alex's earlier point, Becky Pinkard, director of Pearson's security operations centre, said the way we look at threats has to change: "We have this growing awareness of where data is going" and so "we've got to move away from a device-focused mindset", and "the focus should be on the data, regardless of where it sits." 

The question we should be asking ourselves is "how are we going to make that work?"

Bhoopathi Rapolu, an author and big data strategist, picked up on Pinkard's statement, saying that tracing the location of an attack is not always a dependable way of identifying the attacker. "Most attacks do not attack in a straight A to B fashion," said Rapolu. 

When Citibank was attacked, it looked like the assault originated in France but in fact the attackers were elsewhere and merely taking the easiest path across the network. "The way attackers design their paths is not the easiest to spot," he said. 

The political spooks, cyber-hoodlums and corporate spies aside, carelessness is often the big threat to organisations and businesses. 

That the weak human element is tough to work around was a common theme around the table.

Regarding the contextual value of data, Hiten Vadukul, enterprise architect at Virgin Active, brought up a good question: "Whose role is it to classify data?" We may see a move toward getting employees to start classifying the security required for their individual data. In a world where data is ever expanding, it's a big ask to leave that to CISOs alone. Vadukul thinks this should be the case even if it isn't yet. "The average person in the department doesn't understand what the responsibility is," he said, and they must be made accountable for their data: "until we understand that, we've got a cavalier attitude" towards it.

"Never underestimate your idiots," the attendees were told, on a perhaps slightly cynical note. There is always the need to watch your employees' activity. That might be OK in the US and the UK, but the workers' councils of France and Germany are not so easy to persuade in terms of what is allowable under privacy regulations. 

Stephen Gailey offered up the option that "an automated system would get you round a lot of the jurisdictional privacy laws in Europe" in that it would no longer focus on individual judgements.

The users that you are so often trying to protect from cyber-attack are also often blissfully unaware of that fact. Ongoing awareness training is certainly advised. As Gailey reminded the table, "You need to build a security framework where it's easier for them to do what you want them to do." 

To which Paul Appleton, global head of information security at the international security company G4S, responded that though you can educate and you can control, "you're never going to completely secure" the network.

Like it or not, the tech side of businesses and the business side will have to start working more closely together across the great divide. The table generally agreed that, as Pinkard said, "We need to have CISOs who have the business understanding as well as the technical understanding."

It's also the information officer's duty to tell the other parts of the business what exactly their job is and what the business is paying for, especially if automation is the future. The problem is, nobody knows why they're paying to put out a fire that, as they see it, never happens. 

Nick Ioannou put it plainly: "If you do it well, people ask what you're spending money on."

CISOs and information security personnel will have to start telling their higher-ups why they are valuable, according to Pinkard: "It goes back to the outputs – showing the board what we're doing." 

So, where do we take network monitoring now? Automation clearly seems a favourite in the industry. The question here, says Rapolu, is: "To what extent can we apply machine learning and when can we let the human take over?" 

Vadukul offered a partial solution. He thinks it requires a different mindset to determine what we need to build now. It will need "skillsets that require IT skills, but fall outside" of those skills. Behavioural science, for example, is "critically important" to understanding data alerts, Vadukul thinks.

There may have been more questions asked than answered at the roundtable, but among the participants, there were high hopes that machine reading and machine learning might be the silver bullets that solve the future problems of policing the network in the age of big data, converting information into actionable intelligence.

