US GAO finds nukes are controlled by computers from 1970s

News by Roi Perez

The US Government Accountability Office has released a report showing that the Pentagon is controlling its nuclear arms with computers from the 1970s.

The United States Government Accountability Office has released a report showing the dire state of the US Government's IT infrastructure.

One of the revelations is that the Pentagon appears to be controlling its nuclear and ballistic arms using an IBM computer from the 1970s; its age shows in its reliance on eight-inch floppy disks that can store about 80KB each.

According to the GAO, the US Department of Defense's Strategic Automated Command and Control System (SACCS), “Coordinates the operational functions of the United States' nuclear forces, such as intercontinental ballistic missiles, nuclear bombers, and tanker support aircrafts. This system runs on an IBM Series/1 computer — a 1970s computing system — and uses 8-inch floppy disks.”

The office added: “The [Department of Defense] plans to update [SACCS's] data storage solutions, port expansion processors, portable terminals, and desktop terminals by the end of fiscal year 2017.”

Likewise, the US Treasury appears to be using a 56-year-old IBM mainframe running programs written in assembly code to process tax data, and currently has no plans to update its systems.

An eight-year-old IBM System z10 mainframe running COBOL for some of its personnel systems was found in use at the Department of Homeland Security, and the Department of Justice is using primitive tech to store prison inmate databases.

The report explains that "Federal legacy IT investments are becoming increasingly obsolete: many use outdated software languages and hardware parts that are unsupported", and goes on to say: "Federal IT investments have too frequently failed or incurred cost overruns and schedule slippages while contributing little to mission-related outcomes. The federal government has spent billions of dollars on failed and poorly performing IT investments which often suffered from ineffective management, such as project planning, requirements definition, and program oversight and governance."

The federal government currently spends US$ 60 billion (£41 billion) a year on keeping existing systems up and running. The GAO report found that this figure is forecast to rise by two percent by 2017, presumably continuing to consume the lion's share of a budget that could otherwise go on new systems.

Wieland Alge, VP & GM EMEA at Barracuda Networks spoke with SC and said that, “At first glance, it's easy to make fun of the Pentagon for relying on 1970s technology. If you look a bit closer, you'll actually see that many other large organisations are running similar legacy systems. The Pentagon has by all accounts been running a bulletproof, isolated system, which certainly appears to have been doing its job to full satisfaction.”

“Even today, we see many industrial environments using a similar setup. Sure, the industrial world isn't run on floppy disks, but there is still a lot of 1990s and early 2000s technology used to control plants and steer machines. The real dangers surface when these organisations try to connect legacy systems to networks, thereby exposing them to modern vulnerabilities.”

"Like a Brit setting off on a sunny summer holiday only to be scorched by the sun on day one, IT teams can't prepare for something that has been out of view for so long. The key to securing connected legacy and modern devices in Industry 4.0 is to seal the entire attack surface as quickly as possible."

Jonathan Sander, VP of product strategy at Lieberman Software spoke with SC and said that, “While some frame the use of these 1970s-era IBM computers as attempts at security by obscurity, it's possible to see it in a different light. What you have with these systems is a completely understood and predictable platform. It's 24/7 operations on a platform that has been tested in every conceivable way for four decades. How many systems made in the last few years can claim that? Security isn't the same beast when you're dealing with systems that are fully purpose-built. The computers controlling these missile silos aren't also there to run spreadsheets or play Flash games on the Internet. They are very boring, very specialised, and very reliable as a result. Yes, they are also obscure in the sense that there aren't many people who can operate, repair, or even understand them. Some will claim that results in a measure of security, but that debate is moot when you consider that the real security in these systems is that there are no ways to divert them from their one and only task. There is no system that hasn't been hardened, no software that hasn't been purged of vulnerability over the decades. That is, until someone finds one they missed. But the theory is not that they are secure because no one can understand them, but rather that they are secure because the few who understand these systems well have been making them more and more secure over a very long time.”

Peter Godden, VP of EMEA at Zerto: “Recent reports of the US Department of Defence still utilizing floppy disks - nearly 40-year-old technology - on critical systems that coordinate intercontinental ballistic missiles, nuclear bombers and tanker support aircraft are highly alarming, though not entirely surprising. Across industries, you see many organizations - especially those in highly regulated environments such as healthcare and financial services - spending large sums of money to simply maintain highly vulnerable, deficient systems such as tape-based backups. Advances in the area of business continuity and disaster recovery solutions are helping organizations with this 'antiquities' issue by not only bringing their data centres and critical data into the modern era, but improving flexibility to dynamically react to all forms of disasters and ensure uninterrupted operations that are critical to business success.”

Simon Crosby, co-founder and CTO at Bromium: “Unfortunately this news is not surprising. Systems that are supposedly secure, including SCADA systems, still use old systems where they are not permitted to upgrade their software because it might alter some critical timings. A major US airline still runs all of its check-in and airport software on Windows 2.1. Medical systems where the embedded software is supposed to be secure are all run on a 1990s version of Windows. All of these systems are desperately insecure. These are ancient systems that have been improved but are so impossibly slow to get anything changed because of the approvals process. These systems, including the US nuclear arsenal, are insecure by virtue of these long approvals processes and the inflexibility of the moving systems that are installed and working. It is all a painfully slow process of getting everything certified and approved, and is part of the larger general problem that both compliance and approvals for IT security are too slow and do not move fast enough. And you can bet that the case is almost exactly the same here in the UK.”

Bob Ertl, senior director of product management at Accellion: “The GAO's report on the government's archaic IT systems is alarming, but unfortunately not surprising. Layers of bureaucracy plus fierce competition for budget dollars are historically responsible for the public sector lagging behind in technology adoption. The problem with that is when you put off making technology upgrades, you put off making security upgrades. The massive data breach at the Office of Personnel Management highlights this very issue. While it remains to be seen whether the lessons learned from that breach will be applied, hopefully this report from the GAO provides additional context for just how dire the security situation is at the federal government level.”

Steve Armstrong, SANS Instructor and CEO of Logically Secure: “The nuclear firing chain - the process of taking a presidential launch approval and notification and sending the right message to the right place to fire the right missile at the right target - must be totally secure. That's not the network to get false positives on; thus the systems are bespoke. Read bespoke also as really, really expensive. The fact that they are running on old hardware is also not surprising, as spending on nuclear programs is not as high a priority as it once was. Furthermore, many systems in the 1970s were real-time processing - no multi-tasking here to be usurped or corrupted. Those computers would have been hand-built in-country with code written just to run the bespoke software to do the job. The code would have been manually reviewed by several staff and checked for trapdoors or other Easter eggs. Come forward nearly 50 years and the coders to support the software are rare, the hardware to maintain the capability is really rare (remember NASA buying parts on eBay?) and replacements are probably now being hand-built from schematics. Is it less secure than a modern system connected to the Internet coded by a new university graduate? No. Is it facing an uncertain future as to how it will be supported going forward? Yes. Is it insecure? That's probably classified. Many systems of that age have problems in their network stack, in the kernel of the OS or even in the programming logic. As to how exploitable it is, that depends on the exposure of the system to other networks; many pen test frameworks include 'try everything' options that could get lucky. The safest way to protect these old legacy systems is to wrap good security around them, prevent direct access and limit all access except to those most trusted and local to the site (no remote Indian support desk for these systems).”

