‘Ghost’ plane hack could cause mid-air crash

News by Andy McCorkell

White hat hacker reveals ‘crying wolf’ exploit of a weakness in 1980s tech that could cause collisions while planes are on autopilot, through social engineering of IoT.

The potential to confuse pilots with fake alerts while duping a plane’s autopilot navigation system has been identified by penetration tester Ken Munro of Pen Test Partners.

The ‘Crying Wolf’ attack seeks to exploit vulnerabilities in the Traffic Alert and Collision Avoidance System (TCAS), which was first developed in the early 1980s.

It uses transponders on aircraft to communicate with other nearby planes about their distance, altitude, and where they are heading.

Normally, a TCAS alert would sound if two aircraft are on a collision course.
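At its heart, that judgement is time-based: TCAS estimates how many seconds remain until the closest point of approach, a quantity known as ‘tau’, and alerts when it drops below a threshold. The Python sketch below illustrates the idea; the 25-second threshold and the track fields are illustrative placeholders, not the certified, altitude-dependent sensitivity-level values a real TCAS unit applies.

```python
# Illustrative sketch of the time-based ("tau") test at the heart of
# TCAS-style collision logic. The threshold below is a placeholder,
# not a certified TCAS sensitivity-level value.
from dataclasses import dataclass

@dataclass
class Track:
    range_nm: float       # slant range to the intruder, nautical miles
    closure_kts: float    # closure rate, knots (positive = converging)
    alt_sep_ft: float     # current vertical separation, feet
    vert_rate_fpm: float  # vertical closure rate, ft/min (positive = closing)

RA_TAU_S = 25.0  # placeholder alert threshold, seconds

def tau_seconds(track: Track) -> float:
    """Time to closest point of approach: range divided by closure rate."""
    if track.closure_kts <= 0:
        return float("inf")  # diverging traffic never alerts
    return track.range_nm / track.closure_kts * 3600.0

def vertical_tau_seconds(track: Track) -> float:
    """Time until vertical separation is lost."""
    if track.vert_rate_fpm <= 0:
        return float("inf")
    return track.alt_sep_ft / track.vert_rate_fpm * 60.0

def on_collision_course(track: Track) -> bool:
    """Alert when both horizontal and vertical tau fall below the threshold."""
    return (tau_seconds(track) < RA_TAU_S
            and vertical_tau_seconds(track) < RA_TAU_S)

# An intruder 2 nm away closing at 400 kts, 500 ft below and climbing
# towards us at 2,000 fpm: tau works out at 18 s, so the alert fires.
print(on_collision_course(Track(2.0, 400.0, 500.0, 2000.0)))  # True
```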

But in a recent blog post, Munro says that in certain autopilot modes, especially on Airbus aircraft, the plane automatically follows the TCAS advisory and climbs or descends with no help from the pilot.

It has also been shown that it is possible to conjure up “fake” traffic that TCAS treats as real.

Pen Test Partners also investigated how planes on autopilot would respond in such scenarios.

In his blog, Munro wrote: “Creating real alerts is perfectly possible, but very dangerous and very illegal. We were therefore restricted to working on flight simulators.”

The simulator model had its limitations, he conceded, but added that, as they are used for approved training, it should deal with TCAS alerts in much the same way as real planes.

TCAS uses two types of secondary surveillance radar transponder to work out the height and location of a plane: “Mode S”, which carries a 24-bit aircraft address along with altitude and GPS-derived position data, and “Mode C”, which carries a four-digit transponder code and altitude information.

Mode S is easy to decode, and a cheap US$10 DVB USB dongle can be used to collect and plot the aircraft data.
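To give a feel for how low the bar is, here is a minimal decoding sketch using the open-source pyModeS library. The two raw frames are worked examples from that library’s documentation, standing in for data that would normally be captured live from an RTL-SDR/DVB-T dongle tuned to 1090MHz.

```python
# Minimal Mode S / ADS-B decoding sketch using the open-source pyModeS
# library (pip install pyModeS). The raw 112-bit frames below are worked
# examples from the pyModeS documentation; in practice they would arrive
# live from a cheap RTL-SDR/DVB-T dongle listening on 1090 MHz.
import pyModeS as pms

# An even/odd pair of ADS-B airborne position messages from one aircraft.
msg_even = "8D40621D58C382D690C8AC2863A7"
msg_odd = "8D40621D58C386435CC412692AD6"

print(pms.icao(msg_even))                          # 24-bit aircraft address
print(pms.adsb.altitude(msg_even))                 # barometric altitude, feet
print(pms.adsb.position(msg_even, msg_odd, 0, 1))  # decoded (lat, lon)
```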

‘Stacks’ of at least three fake or ‘ghost’ aircraft were needed to force a plane to climb at more than 3,000 feet per minute, he added.
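The toy sketch below, which is not Pen Test Partners’ actual simulator set-up, illustrates why the size of the stack matters: each additional ghost that keeps converging lets the advisory strengthen another step. The stepped climb rates are assumptions loosely based on the nominal TCAS ‘climb’ and ‘increase climb’ rates, and the escalation rule is purely illustrative.

```python
# Toy sketch (not Pen Test Partners' actual model) of how a stack of
# ghost intruders could ratchet up an autopilot-flown escape manoeuvre.
# The stepped rates are assumptions loosely based on nominal TCAS
# 'climb' and 'increase climb' advisory rates; the escalation rule
# itself is purely illustrative.

CLIMB_STEPS_FPM = [1500.0, 2500.0, 3500.0]  # initial RA, then strengthened RAs

def commanded_climb_fpm(active_ghosts: int) -> float:
    """Each unresolved ghost still converging strengthens the advisory."""
    if active_ghosts == 0:
        return 0.0
    step = min(active_ghosts, len(CLIMB_STEPS_FPM)) - 1
    return CLIMB_STEPS_FPM[step]

for ghosts in range(4):
    print(f"{ghosts} ghost(s): autopilot commanded to climb at "
          f"{commanded_climb_fpm(ghosts):.0f} fpm")
# Under these assumptions, only a stack of three ghosts pushes the
# commanded rate past 3,000 fpm, echoing the finding above.
```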

The most likely response from a pilot would be to turn off TCAS resolution advisories, because ‘ghost’ planes do not show up on radar, according to recent research from Oxford University.

But as Munro points out, disabling TCAS resolution advisories would leave pilots less able to deal with a legitimate TCAS alert, hence a form of ‘Crying Wolf’ attack.

Munro concludes: “We have shown that careful placing of fake aircraft through rogue transponder broadcasts can cause an aircraft under autopilot control to climb or descend towards legitimate traffic.

“Human pilots could also be directed to follow the same rogue resolution advisories, or confused into inadvertently disabling safety systems.”

Commenting on the vulnerability identified, Sarb Sembhi, CTO, CISO, Virtually Informed, told SC Media UK: "This blog post shows how traditional attack techniques can be used on systems based on rules, by abusing the rules to cause annoyance that leads to switching off a safety system. This attack is interesting from many perspectives: many recent exploits have targeted IP-based IoT systems such as cars, ships and cruisers. Here is an attack that is more like a social engineering attack on a technical system.

"Fail safe systems should be made to fail safely, not switched of for safety. That is a fundamental flaw in the logic of the practice by pilots that somehow has been accepted or ignored by the aviation authorities. The whole thing needs a rethink, as it is the sort of flaw you see exploited in the movies - not thinking for one minute that it is a real life danger."
