There is a debate emerging in information-security circles that may become pivotal in the months ahead: is there value in training employees to become security-aware and, if so, can you maintain that state of awareness?
Some professionals are suggesting that such endeavours are likely to be nothing more than a trivial pursuit. This camp believes that only better, widely adopted technological solutions can protect people from themselves.
It's a fundamental question because, as the public becomes acutely aware of the basics of information security through the HMRC scandal (which must now be viewed as the defining moment for information security in the UK), the profession may need to accept that human error will always occur and factor it into corporate security strategies.
There has already been a shift in focus towards protecting what happens inside an organisation as much as outside, but what is emerging is that, while malicious behaviour is, in the short to mid term at least, predictable, employee behaviour is not.
This kind of friendly fire occurs when, for example, the stressed marketing executive with looming deadlines downloads a rogue file or uses a USB stick against best-practice guidelines. The very good reasons why they shouldn't do this, outlined at the last security awareness briefing, are forgotten or ignored. The "it'll-be-all-right" reflex kicks in. This happens all the time, and it is what occurred at HMRC.
Most human beings don't like being told what to do, and even the most progressive in-house training tends to be greeted with cynicism. Punishing employees for not "following the rules" can also be pointless if evidence of the failure of much wider social inhibitors such as ASBOs is anything to go by. People are employed for their subject expertise, whatever their level in the business, and they need to be able to get on with that.
If employees are to be encouraged to be security aware, then the carrot is often better than the stick. Perhaps a reward for good practice would be beneficial rather than punishment for "bad" behaviour.
So the technologists have a point over the humanists - the only way to prevent accidents is to put systems in place that simply and irrefutably stop them happening at the point of operation.
However, many would argue that we already have enough technology and IT expertise to prevent the most obvious errors - but that there is a failure of application by management, either for cost reasons or simple neglect.
It's worrying that government agencies, which have been showered with public money over the last decade, have used little of it to deploy information security systems that protect public data from the most basic human error.
Whatever the outcome of this debate, it is obvious from the outset that redundant technology allied with zero awareness is the worst possible scenario. The eventual findings of the forthcoming Poynter enquiry into HMRC will tell us just how close its management came to operating to that bleak benchmark.