Tricks of the infosec trade

Opinion by Ken Munro

Sending hackers on a wild goose chase, and ‘playing dead' in front of thieves, are brilliant ploys – but they're not enough.

If you found or stole a laptop in a bag, started it up and all that was displayed was “missing operating system” on a black screen, what would you assume? That the hard disk had failed?

Would you go to the trouble of forensic recovery of data? More likely, you would wipe the disk and start from scratch.

Yet one of the interesting features of TrueCrypt full disk encryption is exactly that – the bootloader screen can be customised to look like a dead operating system. Type in the password and the disk still decrypts and the OS loads. It reminded me of the old maxim of ‘security by obscurity', widely misused in the past but perhaps more relevant now.

Consider the same laptop, but this time it fires up with a branded encryption bootloader screen. It's unlikely to be easily cracked, so you rummage around the bag, find the owner's business card and try to social engineer the password from them by telephone. If you're a good social engineer, you'll have a fair chance of extracting it.

So a little bit of stealth and thought in the presentation of the bootloader reduces the chance of your data being pinched.

It's the same principle as the honeypot – it advertises itself as a vulnerable Windows Domain Controller, yet is actually a nice, secure Debian build. The hacker gives up as their exploits don't work, and in the process alerts the business to their presence. The honeypot code is available for free, and needs only a very basic system to run. Why wouldn't you put one on your network?
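A low-interaction honeypot really can be that simple. The sketch below is illustrative only – the `MiniHoneypot` class, the fake SMTP-style banner and the logging format are all my own invention, not any particular honeypot product – but it shows the core trick: accept connections, look like a service, and record who came knocking.

```python
import socket
from datetime import datetime, timezone

class MiniHoneypot:
    """A toy low-interaction honeypot: accepts TCP connections,
    presents a fake service banner, and logs every visitor."""

    def __init__(self, host="127.0.0.1", port=0,
                 banner=b"220 EXCH01 ESMTP Service ready\r\n"):
        # port=0 asks the OS for any free port; a real deployment
        # would pin this to something tempting like 25 or 445.
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind((host, port))
        self.sock.listen(5)
        self.port = self.sock.getsockname()[1]
        self.banner = banner
        self.hits = []  # (UTC timestamp, source address) pairs

    def serve_one(self):
        """Handle a single connection: log it, send the bait banner, hang up."""
        conn, addr = self.sock.accept()
        self.hits.append((datetime.now(timezone.utc).isoformat(), addr))
        conn.sendall(self.banner)  # look like a real mail server...
        conn.close()               # ...then quietly drop the connection
```

In practice the `hits` list would feed an alerting system rather than sit in memory – the whole point is that any connection to a box nobody should be talking to is worth waking someone up for.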

If you really want to have some fun misdirecting scripts and scanners (and confusing pen testers who rely on them too heavily), why not serve a custom error page for your web applications that includes a random error string? Pick one at random from a list of Oracle, MS SQL and MySQL errors, and maybe even a few odd framework errors and some 16-digit numbers that look like credit card PANs. To avoid confusing your users, make the text the same colour as the background or hide it in a comment. Obviously, make very sure your app is secure first, then sit back and laugh as you read your web logs.
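A decoy error page along those lines might be generated like this. This is a sketch under my own assumptions – the `DECOYS` list, the `fake_pan` helper and the page markup are all made up for illustration; the error strings mimic real database error formats but correspond to no real backend.

```python
import random

# Hypothetical decoy strings: plausible-looking database and framework
# errors. None of these reflect the real application stack.
DECOYS = [
    "ORA-00933: SQL command not properly ended",
    "You have an error in your SQL syntax; check the manual that "
    "corresponds to your MySQL server version",
    "Msg 102, Level 15, State 1: Incorrect syntax near 's'",
    "System.Data.SqlClient.SqlException: Unclosed quotation mark",
]

def fake_pan():
    """16 random digits grouped like a card number -- pure bait."""
    return " ".join(
        "".join(random.choice("0123456789") for _ in range(4))
        for _ in range(4)
    )

def decoy_error_page():
    """Build an error page whose visible text is bland, but which hides
    a random fake error string and a PAN-shaped number in an HTML
    comment -- invisible to users, irresistible to scanners."""
    bait = random.choice(DECOYS)
    return (
        "<html><body>"
        "<p>Sorry, something went wrong. Please try again later.</p>"
        f"<!-- {bait} {fake_pan()} -->"
        "</body></html>"
    )
```

A scanner that greps responses for `ORA-` or SQL syntax errors will flag the page as vulnerable and keep hammering away, all the while announcing itself in your logs.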

Annoyingly, some vendors don't make it easy to obscure their systems. Outlook Web Access can be a real pain to hide, hence it's almost always located at /owa or similar. Surely it would be wise for Microsoft to make it easy to place OWA in a different, non-default path – one less likely to be found by scripts and scanning tools?

Hiding an insecure system isn't the answer, though. Google will probably find it in the end if it's on the internet. I still chortle when I see yet another Google dork used to uncover multiple instances of a vulnerable web application. I saw one pop up in a popular open source e-commerce application a few weeks back: an unnoticed vulnerability leading to the disclosure of database connector strings.

Making your web server banners a little less verbose will help a bit too, though that's no excuse for not patching in the first place. And if you want to pass a PCI ASV scan, you probably don't want an Apache banner that says: “Server: Apache/2.0.19 OpenSSL/0.9.8 FrontPage/ PHP/5.3.8.”
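On Apache itself, the stock `ServerTokens` and `ServerSignature` directives will trim most of that banner back (the PHP and OpenSSL fragments come from their own modules and need their own settings, e.g. `expose_php = Off` in php.ini). A minimal httpd.conf fragment:

```
# Limit the Server header to "Server: Apache" -- no version,
# no module versions.
ServerTokens Prod

# Stop Apache appending its version line to error pages
# and directory listings.
ServerSignature Off
```

Neither directive makes the server any harder to exploit – they just stop it volunteering its patch level to every passing script.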

If you rely only on obscurity to protect your systems, then it's probably only a matter of time before someone stumbles across their vulnerabilities, or sufficient information about them leaks out.

Obscure your systems, but secure them first.

