Yet another politician has suggested that blocking the web can save us from terrorists. If only life were that simple.
As my friends and family will tell you, my reading list is somewhat eclectic, and a visit to my Amazon wish list can result in a trail of unusual recommendations for months to come. This is due to my interest in what my sister calls "dodgy" subjects, such as terrorism, nuclear weapons, espionage and the like.
I admit I should be careful which books I travel with these days, but usually my tastes just result in the occasional strange look and a good chance I'll be sat by myself on the train.
However, if Franco Frattini, the European Commissioner responsible for justice, freedom and security, has his way, I - along with many other legitimate researchers - could be in trouble. Frattini is the latest in a long line of well-meaning but sadly deluded politicians who believe there is a nice technical fix to stop criminals misusing the internet.
As reported by Reuters in September, he is proposing research into the use of technology fixes to "prevent people from using or searching dangerous words like bomb, kill, genocide or terrorism".
Leaving aside the fact that the very concept of a "dangerous word" is downright Orwellian, let's look at the lack of appreciation for the limits of technology, and the ongoing demonisation of the internet as the source of all modern evils.
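To see just how brittle such a scheme would be, consider a minimal sketch of the kind of keyword filter the proposal implies (entirely hypothetical; the word list is taken from Frattini's own examples). It blocks a perfectly legitimate historical query while letting trivially obfuscated ones sail straight through:

```python
# Hypothetical sketch of a naive "dangerous word" blocklist.
# The word list comes from Frattini's reported examples; everything
# else is illustrative, not a real system.
BLOCKED_WORDS = {"bomb", "kill", "genocide", "terrorism"}

def is_blocked(query: str) -> bool:
    """Return True if any blocklisted word appears in the query."""
    words = query.lower().split()
    return any(word in BLOCKED_WORDS for word in words)

# A legitimate research query is caught...
print(is_blocked("history of the genocide convention"))   # True

# ...while trivial obfuscation defeats the filter entirely.
print(is_blocked("b0mb making instructions"))             # False
print(is_blocked("bomb-making instructions"))             # False
```

The filter fails in both directions at once: it over-blocks innocent researchers and under-blocks anyone who bothers to swap a character or add a hyphen, which is precisely the pattern every real-world keyword-blocking scheme has exhibited.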
Firstly, it is deeply worrying to see such a senior person seriously suggesting that stopping access to web resources on topics such as bombs or genocide will be in the slightest bit effective. Yes, bomb-making instructions are widely available on the internet. The majority of them are reproductions of US Army publications and the Loompanics back catalogue; paper publications put into electronic format.
Would removing all the bomb-making instructions on the web reduce the terrorism threat? Doubtful. The Provisional IRA managed to detonate hundreds of devices between 1969 and 1997. According to this logic, presumably it was an early adopter of Arpanet, the internet's predecessor? Of course not. Any half-decent chemistry graduate can make homebrew explosives; the surprising point about most of the recent attacks is, thankfully, how poorly constructed the devices have been. Perhaps we should ban chemistry and, to be safe against the nuclear threat, physics textbooks too?
Then there is the classic problem of correlation versus causation. Have many terrorists and other criminals used the internet to research their methods? Quite probably. Does this mean that everyone researching "criminal" topics is a criminal? Not at all. Nearly every terrorist may well have searched for bomb recipes, but only a tiny fraction of the people searching for bomb recipes are terrorists; the inference simply does not run in both directions.
Applying this logic in the corporate world, it would be necessary to class most security professionals as computer criminals, which is hardly realistic. If you block such information from the general internet community it will just filter underground, so only the bad guys will have access. This is not a new debate, but one that seems to recur with annoying regularity.
However, suggesting quick, soundbite-friendly technical solutions to complex social problems is an easy win. The same approach has been promoted by some in the world of IT security, as if we could set up some sort of infosec "magic circle" and keep it all secret. Similar proposals have been tacked on to the Patriot Act in the US, which wipes out parts of the Bill of Rights.
The technical feasibility of such a scheme is also doubtful. A case in point is the relative ineffectiveness of China's "great firewall". Despite the ethically dubious cooperation of large computer suppliers with the Chinese government, activists inside the country still find ways around the blocking.
The web and its associated technologies certainly challenge traditional laws and enforcement practices. It is tempting to assume technology can be used to fix what it has broken. In reality, the solutions are far more elusive and the collateral damage of quick fixes considerable.
- Nick Barron is a security consultant. He can be contacted at firstname.lastname@example.org.