Wikis may seem like a great idea, but as Wikipedia has shown, you can't rely on everyone to share the community spirit.
Web publishing has changed a lot over the past few years, and the big shift has been that you now need almost no technical know-how to get your material online.
So, we have the YouTube video site, which hosts an amazingly eclectic selection of homebrew clips, from would-be celebrities' video diaries to terrorist propaganda. Then there's the whole web log phenomenon. There are countless “blogs”, ranging from those by angst-ridden teenagers to some incredibly useful technical ones from high-profile security professionals.
And there's the oh-so-cutely-named “wiki” (the name comes from the Hawaiian term for “fast”). This is a website where all of the content is provided by the users. Anyone is free to amend the content as they see fit, updating and editing it as appropriate. Wikis found early adopters in the software development community, as they provided a very convenient way to implement support forums.
Probably the best-known wiki today is www.wikipedia.org, a publicly maintained and open-access encyclopedia. Its proponents suggest it is better than the Encyclopaedia Britannica, claiming the same benefits as the open-source software development ideal. The idea is appealing: because anyone can edit the content to correct it, the quality will rapidly improve. People will delete incorrect entries and replace them with the right information. As web pioneer Paul Graham put it: “The good stuff spreads, and the bad gets ignored.”
Unfortunately, what arguably works for open-source software doesn't automatically apply to reference material. Almost by definition, people consulting Wikipedia don't know the answers; that's why they're looking them up. Differences of opinion pose a further problem: even in fairly simple cases, such as an author's date of birth, it's surprising how much disagreement there can be.
So what, you might ask, has this to do with security? Well, recently Wikipedia was abused by some malicious editors. The entry on the infamous Blaster worm was edited and a link inserted to a bogus patch file. In fairness, the bogus file did install the official Microsoft patch, but it also added a backdoor giving attackers access to the machine. The problem was spotted and rapidly corrected, but there's obvious scope for further abuse (for example, uploading an “infected” graphics file the next time a buffer overflow surfaces in an image-processing library).
What's interesting about this incident is that it didn't breach any security policy; the “attackers” weren't exploiting any holes, they were simply doing what Wikipedia is designed for. The action was no doubt malicious, but Wikipedia itself clearly identifies such “vandalism” as a problem in its own description (http://en.wikipedia.org/wiki/Wiki). The assurance that wikis are intended to make vandalism easy to repair, rather than to prevent it, will probably not reassure you.
Arguably this isn't even a computer security problem; it's more an issue of abuse of trust. There is a growing problem with people implicitly trusting what they read on the internet and, ironically, the most technically skilled can be the worst offenders. It is increasingly common for software developers to use the web as the first, and sometimes the only, port of call when trying to solve a problem.
This is not an easy problem to fix. Traditional printed media employ professional editorial staff who act as a valuable quality filter. Websites may or may not include this sort of control, and it's usually impossible to tell just by viewing the site. Wikipedia and other “community” resources are at particular risk of carrying malicious content.
A degree of healthy scepticism and some basic training in critical thinking are essential for anyone using the web as a research tool. I'm prepared to bet they are missing from most corporate induction courses, though.
Blind trust in any information source, whether online or not, is a dangerous thing.