Panic buying for sometimes unreal threats is a problem for firms but an opportunity for vendors. Psychology is all. Mark Mayne explains.
Security is an emotive subject, and that includes IT security. Although IT security technology is constantly evolving, many of our responses to the issue have nothing to do with the newest threat itself. Humans respond to threats and stress in complex ways, and years of research have produced theories to explain some of the stronger responses. These hard-wired responses are exploited by advertising and marketing experts, often not consciously. Could it be that IT security decisions are based on predictable psychological reactions, rather than cold, hard technological facts? Perhaps it's time to look into the psychology of it all.
Security is a basic human need, and has been recognised as such for some time. In 1943, US psychologist Abraham Maslow posited a five-layered structure of human need, beginning with essentials such as food, followed by security, family, esteem and self-actualisation. This model, Maslow's ‘hierarchy of needs', has since been used in understanding human motivation, management training and personal development.
However, ‘security' in reality is less easy to define. It is always a trade-off, a balance between never leaving the house, or connecting to the internet, and leaving all the doors and windows open when you do. This applies across the board, from the desktop to geo-politics.
Paul Hanley, senior manager, Deloitte security and privacy team, said: “Security is a very emotive subject, whether it is our home, our children or our bank account details. We all want what is best for our protection, although purchasing a tank to drive your children to school does not mean they will be protected from all evil. Likewise, securing your home with landmines is frowned upon in Surrey.”
Although we all make these decisions daily, we don't always have the expertise to make them intelligently. Flying seems more dangerous than driving, but accident rates prove otherwise. We often make decisions based on how we feel, rather than what the numbers say.
BT Counterpane CTO Bruce Schneier wrote: “If we make security trade-offs based on the feeling of security rather than the reality, we choose security that makes us feel more secure over security that actually makes us more secure.”
In 1979, psychologists Daniel Kahneman and Amos Tversky published a paper on a concept they called prospect theory, which later won Kahneman the 2002 Nobel prize in economics. Prospect theory contradicted the prevailing economic theory of the time, which held that when two choices carry the same expected value – whether framed as a gain or as a loss – people should pick between them in equal numbers, essentially because the odds are the same. Kahneman and Tversky discovered that this was untrue.
In an experiment, they took a roomful of subjects and divided them into two groups. One group was asked to choose between two alternatives: a sure gain of $500, or a 50 per cent chance of gaining $1,000. The other group was asked to choose between two alternatives: a sure loss of $500, or a 50 per cent chance of losing $1,000. When faced with a gain, about 85 per cent of people chose the sure smaller gain over the risky larger gain. But when faced with a loss, about 70 per cent chose the risky larger loss over the sure smaller loss.
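The point of the experiment is that, on paper, the pairs of options are identical: each risky gamble has the same expected value as its sure counterpart, so classical theory predicts indifference. A minimal sketch (illustrative only – the payoff figures come from the experiment above, the helper function is ours) makes the arithmetic explicit:

```python
# Expected value of a lottery given as (probability, payoff) pairs.
def expected_value(outcomes):
    return sum(p * v for p, v in outcomes)

# The four options from the Kahneman/Tversky experiment.
sure_gain  = [(1.0, 500)]                 # certain $500 gain
risky_gain = [(0.5, 1000), (0.5, 0)]      # 50% chance of $1,000, else nothing
sure_loss  = [(1.0, -500)]                # certain $500 loss
risky_loss = [(0.5, -1000), (0.5, 0)]     # 50% chance of losing $1,000

print(expected_value(sure_gain), expected_value(risky_gain))   # 500.0 500.0
print(expected_value(sure_loss), expected_value(risky_loss))   # -500.0 -500.0
```

Both pairs come out equal, yet people overwhelmingly prefer the sure thing for gains and the gamble for losses – which is exactly the asymmetry prospect theory describes.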
Nigel Nicholson, professor of organisational behaviour at London Business School, said: “We are amazingly loss-averse, and generally pay far more attention to loss than to gain. Product warranties are a good example of this. It's also part of the reason newspapers carry far more bad news than good.”
Security technology suffers from the same difficulty – a choice between a small definite loss (the cost of the product) and a large indefinite loss, caused by a successful attack on your network. Assuming other variables are equal, buyers would rather take the chance that the attack won't happen than suffer the sure loss that comes from purchasing the product. Security is a negative sell. Incidentally, this is why marketing messages often mention ‘discount' or ‘saving' – this frames the outlay as a gain, making people more likely to proceed.
Bruce Schneier continued: “Security sellers know this, even if they don't understand why, and continually try to frame their products in positive results. That's why you see slogans with the basic message, ‘We take care of security so you can focus on your business', or carefully crafted ROI models that demonstrate how profitable a security purchase can be. But these never seem to work.”
An obvious solution to this problem is to encourage fear, an extremely powerful primal emotion that overwhelms much of our rational decision-making ability. If people are truly scared, they will take almost any steps to make the feeling go away. This phenomenon is not confined to IT security; counter-terror efforts since September 11 have focused on it heavily. The colour indicator used in the US to denote level of danger is useless in reality, but it does induce general fear, impeding the ability to think rationally.
Howard Schmidt, president of the Information Security Forum (ISF), said: “All security professionals know that there are tens of new malware items produced every day, along with several vulnerability bulletins. This is something we can analyse and then act upon proportionately. I once gave a briefing to a highly placed executive about the security landscape. Over the next few weeks he contacted me about almost every bulletin and alert, asking what he should do – he was totally gripped by fear.”
Another example in the IT sphere was the bug-that-wasn't, Y2K. Originally posited by Canadian IT consultant Peter de Jager, the concept was hyped by the media to the point that governments reacted. The UK government ordered that the SAS and armed troops be on standby in case of national disaster. The pointlessness of this became clear after the event, when countries such as Italy that refused to take any action suffered no more IT chaos than those that had invested heavily in the bug clean-up.
The use of fear as a marketing tool has possibly been overused recently, so much so that some companies are taking the opposite tack in search of customers. One UK example is Sweet Dreams Security, which provides home security products with an unusual light-hearted edge. Razor wire is shaped like butterflies, while burglar alarms are disguised as flowers. Matthias Megyeri began Sweet Dreams to offset traditional fear-based marketing. “I began to understand the psychology of crime and fear. I also realised we need cuteness in our lives. Security has almost become a mental illness. The fear of crime has grown, so the security market has grown with it, but nobody is taking care of our mental well-being. There's too much fear, and it's damaging,” he said.
Of course, the use of fear as a sales tool assumes that the prospective buyer believes that ‘product x' will solve the problem. Martin O'Neal, managing director of security consultancy Corsaire, said: “As far as buyer psychology is concerned, perception is everything. A good example of this is anti-virus software. It is currently sold as a panacea for addressing malicious software, yet is acknowledged by most experts in the field as having little practical value. Just about every blue-chip corporate will be spending thousands of pounds every year on AV licensing. No-one wants to put their head above the parapet and choose an alternative. A thousand corporate buyers can't all be wrong, can they?”
Schneier agreed: “A lot of the things we take for granted should be questioned. For example, firewalls and anti-virus – do they really work, and do we need them? The media has a big role to play here. We see things distorted every day. A good example here is cyber-terror, almost entirely a media creation.”
O'Neal continued: “It's the same story with IDS and web application firewalls. There's rarely any value in them, but about 70 per cent of businesses have one or both installed. This is because businesses are buying the perception of security, rather than actual security. If the received wisdom is that a web app firewall is necessary, it is very hard for a CISO to stand in the way and state that it's useless and a waste of money – people don't want to get caught out. It's that culture of best practice, where unless you've got all the right boxes you're in trouble, even if those boxes are not configured properly.”
This last point is an example of ‘satisficing' (a portmanteau of ‘satisfy' and ‘suffice'), where, in a complex decision-making situation that is simply too hard to analyse accurately, we usually opt for the ‘best fit' answer – the one that looks like it will be good enough, rather than the actual best. This is often the reason computer users simply choose the standard configuration options in a system, rather than checking out every single alternative.
Another interesting concept is the ‘affect heuristic', first explored by psychology professor Paul Slovic. This particularly potent theory states that if a person answering a survey, for example, is asked an emotional question first, such as “how many dates have you been on recently?”, the following questions will be answered more emotionally than rationally. The affect heuristic is widely used in advertising, and especially in security sales, where threats are rarely directed at you personally, but rather at those you trust or feel loyalty towards. It is also why celebrities are often used in marketing and malware campaigns.
Howard Schmidt said: “Lao Tzu told us that we need to know our enemies in order to stand any chance of beating them. But we also need to know ourselves first. Why do so many intelligent people fall for really blatant phishing scams? There's been an awful lot of work done into the criminal psyche, but not all of this transfers directly across into the online world.”
Clearly, psychology plays a huge role in our understanding of security, and is a large part of the decision-making process at an individual level. So why has so little research been done on the interactions?
Bruce Schneier has a theory: “There's simply so little cross-discipline work going on in this area. The advertising field is putting in a lot of work, but this is all relatively new in a security context. Marketing professionals use a lot of the techniques, but unconsciously. They use them because they've worked in the past, not because they've done the research into why.
“Security professionals are totally uninformed about this, because it's not their specialism,” Schneier added. “The concepts explain so much about the industry that there should clearly be more work done. Staying siloed in our areas of expertise is not going to bring the answers that this area poses. Security is always about the same essential things, just the domains are different.”
DESIGNING FOR SECURITY
It's a truth universally acknowledged that human error is one of the biggest factors behind security lapses of all sorts. This thinking has led to a variety of research projects looking at the issues behind how people navigate software interfaces.
A 1999 project by Alma Whitten and JD Tygar on the usability of PGP encryption software found that most test subjects were unable to sign and encrypt an email in PGP within 90 minutes. Earlier usability studies had treated using the security software as the user's primary goal, when in reality security is secondary to the task the user is actually trying to accomplish – an assumption that skewed the research.
Another growing area is ‘gender HCI', which studies the differences between male and female patterns of software usage in order to eliminate gender bias from software tools. For example, women tend to use their peripheral vision more than men, so wider displays reduce bias. The two sexes also attempt to solve problems differently – in a spreadsheet-based environment, tests found that men were keener to play with different dropdowns and options, with women less so. However, women subjects were more thoughtful about the tinkering process.
ISF president Howard Schmidt said: “It's no secret that people need different training and awareness coaching, due to the fact that people all learn differently. This should not be forgotten in a security context. It's also worth remembering that computers learn too. Although two laptops can be identical when they leave the factory, after a year's use by two different people, the DNA of the system, all the applications and so on, will be totally different. One size doesn't fit all.”