Are over-confident employees to blame for phishing success?

News by Davey Winder

Is security awareness training producing overconfident employees who are more of a threat than untrained ones? Davey Winder investigates...

According to Jingguo Wang, an associate professor in the University of Texas at Arlington College of Business' Department of Information Systems and Operations Management, business folk are overconfident about their ability to spot phishing attacks.

Research conducted along with Yuan Li from Columbia College, Missouri and HR Rao from the University of Texas at San Antonio suggests that a majority believe they are smarter than the criminals sending the phishing emails, which is probably why so many end up falling for them.

Indeed, even one of the researchers nearly got caught out last year. Rao had recently sent a package via UPS, so when he received a phishing email using the ‘problem with a package' line to trick recipients into opening an attachment, he admits he came close to falling for it.

The study, entitled ‘Overconfidence in Phishing Email Detection' and published in the Journal of the Association for Information Systems, tested around 600 people by showing them a fifty-fifty mix of genuine phishing emails targeting financial institutions and legitimate business emails from the same organisations.

Wang told Phys.org that the survey showed a need for feedback mechanisms within training to regulate confidence if such errors were to be avoided.

SC Media has been finding out whether the security industry agrees that staff may be getting trained to miss, rather than spot, threats.

We started by talking to Sarah Janes, managing director at Layer8, a company which specialises in training, awareness and delivering a security culture change.

SC Media asked Janes whether phishing awareness training can always work, or whether it can create the kind of overconfident employees the research describes.

"Yes, but only if done in the right way" Janes says, continuing "take traditional phishing awareness; it will be one of two things, either several different example emails, some fake and some real, pointing out what you should look for; or alternatively, you get sent a fake phishing email and if you click through you get some sort of message telling you that it was illegitimate, but don't worry because this time it was just a test."

According to Janes, the problem comes when a real phishing email arrives that is so sophisticated in its appearance and content that people are liable to be duped even if they've been well trained.

"That's because phishing emails prey on people's emotions and behaviour triggered by an emotion is difficult to train for," Janes explains, adding "the type of emotion we're talking about here is not wanting to make the boss mad because you haven't done something on time or that they asked for, or the fear of a penalty for not re-registering a domain name on time or completing a VAT return."

With the mass of data available on the internet, it's easy for a hacker to target companies during busy periods, or to garner organisational news via social media to make phishing emails read authentically.

Janes has seen phishing training work by using a process called ‘the hacker's perspective', which flips the situation on its head and asks: if you were trying to hack your business, what would you do? Write the phishing email that would get clicked on.

"By getting people to think in this way they can identify the vulnerabilities," says Janes concluding, "understand their team's behaviours and practices and collaborate on best practice and self-checking."

Where traditional training helps, then, is in measurement: run a fake phishing campaign at the start, train in the normal way, then run another fake phishing campaign. Layer8 has seen reductions in click-throughs of up to 45 percent using this approach.
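By way of illustration, here is a minimal sketch, using hypothetical numbers rather than Layer8's actual data, of how such a before-and-after click-through reduction might be calculated:

```python
# Minimal sketch (illustrative numbers only): measuring the effect of training
# by comparing click-through rates between a baseline and a follow-up
# simulated phishing campaign.

def click_rate(clicks: int, emails_sent: int) -> float:
    """Fraction of simulated phishing emails that were clicked."""
    return clicks / emails_sent

# Hypothetical campaign results, chosen to illustrate the kind of figure cited.
baseline = click_rate(clicks=120, emails_sent=500)    # before training: 24%
follow_up = click_rate(clicks=66, emails_sent=500)    # after training: 13.2%

reduction = (baseline - follow_up) / baseline * 100   # relative reduction
print(f"Click-through reduction: {reduction:.0f}%")   # -> 45%
```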

We then spoke to Gunter Ollmann, CSO at Vectra Networks, who told us he wasn't a great fan of phishing training, for two reasons.

"There's always a better mousetrap" Gunter insisted, adding "an informed attacker has the ability to make an attack indistinguishable from regular traffic so you end up training staff to be suspicious of every email." And the problem with that is it leads to an overall loss of business and time as hyper-vigilant employees investigate non threats.

Secondly, Ollmann says, everyone has an off day. "An employee may be reading a suspicious email, hovering over the URL so they can inspect it as trained, but then accidentally knocks the mouse or touchscreen and activates the link, leading to a compromise," he warns. "Should that honest mistake be penalised?"

Ian Trump, global security lead at SolarWinds MSP, goes even further in his dislike of the kind of study that triggered this discussion in the first place. "I'm tired of studies that focus on one aspect of security and say users suck, anti-virus sucks or <insert technology> sucks," Trump says. "Without looking at any other mitigation technology, the study only proves that one solution used on its own sucks."

As Trump points out, it was inevitable that phishers would realise they could improve their chances of a payday by running spelling and grammar checks on their emails. And given enough time, effort, practice and skill, a cyber-criminal (or nation-state actor) can build an extremely legitimate-looking fraudulent email.

"So is it fair to heap the responsibility on the user?" Ian asks, concluding "it's disingenuous and unfair to study user confidence in detecting fraudulent emails without the technological layers in place to intercept the evil email in the first place. If your business security strategy relies on the user then you are not even close to mitigating the business risk of compromise..."
