Changing the model of security information and event management

News by Tony Morbin

Next generation SIEM is taking information from virtually any source and cleaning it before it even ends up in the hands of incident response (IR) - and is aiming to analyse data beyond security applications.

Security information and event management (SIEM) systems seemed to be disappearing along with the perimeter that they defended, but now next generation SIEM is taking information from virtually any source and cleaning it before it even ends up in the hands of incident response (IR).

Speaking at LogPoint’s ThinkIn conference in Copenhagen this month, Jesper Zerlang, CEO at LogPoint, explained how this transformation has happened and what it means for companies in the sector and SIEM end-users.

LogPoint is planning for 1,000 percent growth over the next three to five years on the back of its new approach to SIEM. "Eventually we want to IPO the company, so in three to five years we would list the company and take it to be one of the top three players in the space," adds Zerlang.

As to the platform itself, it was explained that data put into an analytics system needs to be of sufficient quality and quantity to yield meaningful results.

LogPoint ingests data from some 150 different systems and languages, then translates it into its own proprietary format to run the analytics. "Others need a lot of data cleaning; we do this at the ingest point so the data is cleaner to deal with," explains Zerlang.
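To make that concrete, here is a minimal Python sketch of what ingest-time normalisation looks like in general: events from two invented sources are mapped into one shared schema before any analytics run. The field names and functions are illustrative assumptions, not LogPoint’s proprietary format or code.

    # Illustrative sketch only: not LogPoint's format, just the general idea of
    # normalising events from different sources into one common schema at the
    # ingest point, before any analytics run.
    from datetime import datetime, timezone

    COMMON_FIELDS = ("timestamp", "user", "source_ip", "action", "origin")

    def normalise_windows_event(raw: dict) -> dict:
        # Map a hypothetical Windows logon event into the common schema.
        return {
            "timestamp": raw["TimeCreated"],
            "user": raw["TargetUserName"].lower(),
            "source_ip": raw.get("IpAddress", ""),
            "action": "login" if raw["EventID"] == 4624 else "other",
            "origin": "windows",
        }

    def normalise_firewall_event(raw: dict) -> dict:
        # Map a hypothetical firewall log entry into the same schema.
        return {
            "timestamp": raw.get("time", datetime.now(timezone.utc).isoformat()),
            "user": raw.get("user", "unknown").lower(),
            "source_ip": raw["src"],
            "action": raw.get("verdict", "other"),
            "origin": "firewall",
        }

    # Downstream analytics and IR tools only ever see COMMON_FIELDS, so no
    # per-source cleaning is needed later in the pipeline.
    event = normalise_windows_event(
        {"TimeCreated": "2019-10-01T09:15:00Z", "TargetUserName": "JDOE",
         "IpAddress": "10.0.0.5", "EventID": 4624}
    )
    print(event["user"], event["action"])  # jdoe login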

He continues: "Incident response was saying they need to automate investigations and clean noise in IR, but we say, fix the root cause so the SIEM is not generating pollution in the first place. We want to change from unintelligent noise that needs to be evaluated… SIEM will be at the heart of everything."

Christian Have, CPO at LogPoint, adds: "LogPoint is the provider of the data that enables [IR] by providing one common taxonomy for all the data, so analytics can identify all users with anomalous behaviour, all the triggers that feed into the incident response tools.

"So we may not know how HR wants to deal with it [the anomalies], but curating and providing this set of high-quality, high-fidelity data enables it to be consumed by IR. If the IR tools need more data, they have an API to know what to ask for and know the values we provide, so they can ask for more log-in data for this user."
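As a rough illustration of the kind of callback Have describes, the Python sketch below shows an IR tool asking a SIEM-style API for more login data on a flagged user; the endpoint, parameters and response handling are hypothetical, not LogPoint’s actual interface.

    # Hypothetical sketch of an IR tool pulling extra context from a SIEM query
    # API. The endpoint, parameters and field names are invented for
    # illustration and are not LogPoint's actual API.
    import json
    import urllib.parse
    import urllib.request

    def fetch_login_events(base_url: str, api_key: str, user: str, hours: int = 24) -> list:
        # Ask the SIEM for recent login events for one flagged user.
        query = urllib.parse.urlencode(
            {"user": user, "action": "login", "last_hours": hours}
        )
        req = urllib.request.Request(
            f"{base_url}/api/events?{query}",
            headers={"Authorization": f"Bearer {api_key}"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # An IR playbook could then decide, for example, to lock the account if the
    # returned events show logins from an unexpected country.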

Key to the system is automation, using machine learning to reduce the volume of alerts that need to be handled by human analysts, so SC asked how long it would be before true AI eliminates the need for human analysts altogether.

Zerlang used the analogy of flying an airliner: "You have people to ensure alarms are real, but the majority of the work is done by machine learning and the people are there to apply the right solution to serious issues. If 490 out of 500 issues are automated, then highly skilled analysts handle the remainder. Machine learning is here and now. And there’s cognitive AI and mathematical AI. Even for true AI you’ll need a human to train it, so it may be 20-plus years to fully automate. There’s a lot of hype about AI, but machine learning can take algorithms and identify an outlier relative to a peer group and add value."
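The peer-group outlier approach Zerlang mentions can be sketched in a few lines of Python; the simple z-score test, threshold and example numbers below are assumptions chosen for clarity, not a description of LogPoint’s models.

    # Minimal sketch of the peer-group outlier idea: compare one user's activity
    # to a peer group and flag it only if it deviates strongly. The z-score test
    # and threshold are illustrative assumptions, not LogPoint's actual models.
    from statistics import mean, stdev

    def is_outlier(user_value: float, peer_values: list, threshold: float = 3.0) -> bool:
        # Flag user_value if it sits more than `threshold` standard deviations
        # from the peer group's mean.
        if len(peer_values) < 2:
            return False  # not enough peers to establish a baseline
        mu, sigma = mean(peer_values), stdev(peer_values)
        if sigma == 0:
            return user_value != mu
        return abs(user_value - mu) / sigma > threshold

    # Example: a user downloading 40 GB in a day while peers average around 2 GB
    # is flagged, so the analyst only ever sees the handful of such outliers.
    print(is_outlier(40.0, [1.5, 2.0, 2.5, 1.8, 2.2]))  # True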

Have added: "Artificial intelligence is such an exotic topic, whatever it really is, whereas we can see how machine learning can be applied now. It can find more complex threats in the data, and be used to direct responses and priorities to the most critical threats. There’s a lot more we can do with ML before getting to AI."

And how much concern should there be about attackers also using machine learning or AI? "It will be used on both sides of the fence – but it’s too expensive and complex for most circumstances, so it’s not really applicable for the broader market. Mid-sized companies want something that will work tomorrow," said Zerlang.

Talking of wanting something that will work tomorrow, SC asked about the issue of existing customers having to wait months for upgrade patches while new customers got the latest functionality immediately. Zerlang replied: "Regarding the upgrade path for existing users, our last upgrade, version 6.00, six months ago was a bumpy road as we needed a new technical foundation and had to do a massive upgrade."

Have adds: "We changed the entire underlying architecture, the operating system etc to prepare for the machine learning revolution. Uploading a small patch to an old system was possible but difficult, and we wanted all old systems to be upgradable. The patch took longer than the new version to install – three and a half months to upgrade. It was the first time we’d done that kind of change, but we don’t see the need for it again for the next decade – it was better to rip off the plaster now."
