The TCG's publication of specifications for full-disk encryption points to the future of storage. By Jessica Twentyman.

Some day soon, storage disks with built-in encryption will be as ubiquitous as cars with built-in seat-belts. So say executives at disk manufacturer Seagate, which, as part of industry body the Trusted Computing Group (TCG), recently published final specifications for industry-wide full-disk encryption.

“Encryption will become part of the definition of what a storage device is,” the Seagate executives claim in a blog post. “Just like seat-belts, expect to see business using fully-encrypted storage in the future to help deal with the growing stream of sieve-like data thefts and losses across the business landscape.”

It's a bold prediction, but its realisation is still some way off. Certainly, it is too distant for the many information security professionals struggling with the thorny problem of securing sensitive data held in corporate network-attached storage (NAS) and storage area network (SAN) environments, not to mention countless server-based shared folders and other document management systems.

“In the rush to accommodate growing volumes of sensitive stored data, too few organisations have consulted IT security staff at a sufficiently early stage in the procurement decision,” says Mark Chaplin, a senior research consultant at the Information Security Forum (ISF). “What has tended to happen is that a company is sold a SAN, its storage administrators get it up and running and only then are security staff brought in to decide how best to protect the data at rest that sits on devices on these huge storage networks. In effect, IT security staff are asked to retrofit security to an implementation and that's no easy task.”

That approach won't impress the growing chorus of auditors, regulators, partners and customers who, fuelled by perfectly legitimate concerns over data loss, theft and inappropriate access, increasingly demand that organisations vouch for the full security of data at rest.

And while many organisations have used tape as their preferred storage medium, there's a reason why most are focusing on disk-based storage for the long term, says Lynn Collier, EMEA solutions director at Hitachi Data Systems. “The influx of new data volumes means that high-availability disk systems are the quickest and most effective way to store data, if still not the cheapest,” she says. “Disk-based storage is future-proofed. Its long-term benefits outweigh short-term procurement costs. Tape can degrade, and managing data deletion in a tape environment has its own challenges. The long-term reliability and longevity of disk-based systems make them a far more attractive option for critical information archives.”

But before jumping into large and complex disk encryption projects, IS professionals face difficult choices as they wrestle with a range of approaches, many based on proprietary technology, says Eric Ouellet, an analyst with IT market research company Gartner.

They should make those choices with care, he says: “Encryption can be used to enhance and benefit an organisation's security posture and resistance to threats and common risks. However, if deployed without adequate planning and understanding of the organisation's resources, existing controls and a clear approach to risk mitigation, the result can be that organisations are no better off than before applying encryption.”

Choices, choices
Built-in hard-disk encryption does make sense, Ouellet believes. Already, he says, there are a number of offerings from companies including Seagate, Hitachi and Toshiba, although most products don't yet comply with the latest TCG standards. Built-in hard-disk encryption offers three compelling advantages: scalability, managed complexity and cost.

However, retrofitting a large, centralised storage environment would involve replacing drives and may represent a significant cost, he warns. So built-in hard-disk encryption should be considered suitable only for new installations in organisations that hold significant volumes of sensitive data.

Fortunately, there are other options. One is appliance-based encryption, as demonstrated by storage vendor NetApp. In March, it announced that its DataFort and Lifetime Key Management products had attained Evaluation Assurance Level 4+ under the Common Criteria for Information Technology Security Evaluation, an international framework. Both technologies were acquired by NetApp in its 2005 purchase of storage encryption specialist Decru and are now built into NetApp's range of storage devices.

This kind of integration greatly simplifies encryption and decryption efforts and also centralises key management, without the performance impact typically associated with such endeavours, says John Rollason, product marketing manager for EMEA at NetApp. “Very, very secure encryption – as defined by the Common Criteria – is something customers are looking for,” he says. “Traditional security technologies focus on securing assets by protecting the perimeter, but that doesn't address secure storage, leaving data at rest vulnerable to attacks. Once these barriers are breached, data assets may be fully exposed – but not if they're encrypted. By making encryption easier, we're making storage security easier,” says Rollason.

Other approaches include hard-disk drive controller encryption, from suppliers such as PMC-Sierra and packaged within storage environment solutions from vendors such as EMC, HP and Sun; and switching fabric-based encryption, increasingly offered by networking companies such as Cisco and Brocade. Both approaches have been deployed in only a few production environments, warns Ouellet. The first is useful in document management environments where security professionals are concerned about the threat of hard-disk drive media theft; the second should be put on hold until “2008 road maps become 2009 deployable realities and the base of live customer deployments grows beyond mostly test sites and early adopters”.

Classification is key
The message is clear: encryption services that are built-in or integrated and require minimal administrative or user interaction tend to be the most effective.

But none of these technologies is likely to be successful in deployment without a strong policy of data classification, warns Stuart Okin, UK managing director of IT security specialist Comsec Consulting. “Organisations need to have a much clearer idea of the data they hold, its risk profile and what controls should be applied to it,” Okin says. “It may not be necessary to apply encryption to every piece of data in order to provide good security. Effective authentication and access control will be sufficient, in many cases, to limit access to sensitive data,” he adds.
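Okin's point can be made concrete with a short sketch. The Python below maps classification labels to minimum controls; the labels and control names are hypothetical, invented for illustration rather than drawn from any published scheme.

```python
# A minimal sketch of classification-driven controls. The labels and
# control names are hypothetical; a real scheme would be tailored to
# the organisation's own risk profile.

CONTROLS_BY_CLASSIFICATION = {
    "public":       {"encrypt_at_rest": False, "access_control": "none"},
    "internal":     {"encrypt_at_rest": False, "access_control": "authenticated users"},
    "confidential": {"encrypt_at_rest": True,  "access_control": "role-based"},
    "restricted":   {"encrypt_at_rest": True,  "access_control": "role-based, audited"},
}

def controls_for(label: str) -> dict:
    """Return the minimum controls for a label, failing closed on unknowns."""
    # Unclassified or unrecognised data gets the strictest treatment.
    return CONTROLS_BY_CLASSIFICATION.get(label, CONTROLS_BY_CLASSIFICATION["restricted"])

print(controls_for("internal"))    # authentication alone may suffice
print(controls_for("restricted"))  # encryption plus audited access
```

As the lookup suggests, encryption is reserved for the labels whose risk warrants it, which is precisely the economy Okin describes.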

Think in terms of an attacker, he advises, and act accordingly. “What are the primary assets in your storage systems that might be important from an attacker's perspective – whether it's intellectual property or customers' credit card details? What would be their motivation to attack and what might be their likely entry and exit points into targeted components?”

What an information security professional should end up with by going through this ‘threat modelling' process, Okin says, is a good idea of where their organisation's key vulnerabilities lie and, consequently, what actions the organisation needs to take.
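As a rough illustration of what the output of such an exercise might look like, the sketch below records assets, motivations and attack paths; the assets and paths shown are hypothetical, and a real model would be driven by the organisation's own architecture and asset inventory.

```python
# A minimal sketch of the threat-modelling walk-through Okin describes,
# using hypothetical assets and attack paths.

THREAT_MODEL = {
    "customer_card_details": {
        "attacker_motivation": "financial fraud",
        "entry_points": ["stolen drive", "compromised storage admin account"],
        "exit_points": ["bulk export over the network", "unencrypted backup tape"],
    },
    "intellectual_property": {
        "attacker_motivation": "industrial espionage",
        "entry_points": ["over-broad shared-folder permissions"],
        "exit_points": ["email", "removable media"],
    },
}

# The output of the exercise: for each asset, the paths that controls
# (encryption, access control, auditing) must cover.
for asset, profile in THREAT_MODEL.items():
    print(f"{asset}: entry via {profile['entry_points']}, "
          f"exit via {profile['exit_points']}")
```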

“Organisations that initiate data encryption projects and fail typically do so because they did not have an effective data classification programme,” agrees Ouellet.

He says they should also remember that individuals who have authorised access to a system or files are also – under most deployment scenarios – automatically granted access to the cryptographic keys needed to decrypt them.

A data classification scheme, Ouellet says, should only be “as simple or as complex as warranted by the associated risks and value of the data”. In other words, oversimplifying or overcomplicating a data classification policy will not provide adequate controls to mitigate the risks that face an organisation.

“The oversimplification of the classification scheme often leads to inappropriate sets of controls being applied to mitigate risks. However, an overly complex classification scheme usually means that the end-user will require extensive or enhanced training to apply it effectively,” Ouellet says.

Finally, there's another key area to bear in mind, he says, as companies move to an era of disk-based information archives. That issue is digital shredding – a relatively new concept in non-military and non-government organisations, but one that is now gaining traction. At a high level, it describes several product categories, encompassing both appliances and applications, that erase specific data using scrubbing and ‘zeroisation' techniques; it also covers the practice of encrypting data and subsequently erasing all copies and records of the cryptographic keys used. “The term ‘digital shredding' is very much a new concept that is rapidly becoming popular, particularly in environments planning on adopting hard-disk drives with built-in encryption capabilities,” Ouellet says.
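The crypto-erase variant lends itself to a simple sketch. The Python below, using the third-party cryptography package, shows the principle under the simplifying assumption that a single key variable stands in for every copy held by a key manager.

```python
# A minimal sketch of the crypto-erase form of digital shredding: encrypt
# the data, then destroy every copy of the key, leaving the ciphertext
# computationally unreadable wherever it still resides. Key handling here
# is illustrative only (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # in practice, held and audited in a key manager
record = Fernet(key).encrypt(b"sensitive customer record")

# Normal operation: an authorised key holder can decrypt.
assert Fernet(key).decrypt(record) == b"sensitive customer record"

# Digital shredding: destroy all copies and records of the key. Here that
# is one variable; in production it means every escrowed and backed-up copy
# in the key-management system.
del key

# The ciphertext may still sit on disk or tape, but without the key it is
# effectively unrecoverable.
```

The catch, as Ouellet's next point makes clear, is that the shredding is only as thorough as the accounting for key copies.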

“When considering digital shredding, organisations need to develop stringent access control and audit policies regarding cryptographic keys, to ensure that all copies are accounted for and destroyed during the shredding process; otherwise, the data may still be available to be read and may result in a disclosure incident similar to those associated with improper control of media containing sensitive information.” And that, after all, is an issue that no data-sensitive organisation can afford to ignore.

Trusted Computing Group sets the standard

As more and more data is stored digitally, the need for better security and, specifically, disk-based encryption, grows. While many hard drive makers have launched encryption systems based on proprietary technologies for their own products, the Trusted Computing Group (TCG) has set final specifications for industry-wide full-drive encryption.

The TCG says this will not only apply to regular hard drives, but also to solid state drives and similar products. With all the major hard drive manufacturers backing it, the new standard holds out the hope that manufacturing and deployment costs can be lowered, as the technology can be integrated directly into the disk.

The recent TCG announcement covered the publication of three standards for storage encryption. One is for PC hard drives (‘Opal'), one is for enterprise hard drives (the ‘Enterprise Security Subsystem Class Specification'), and one covers secure interoperability with storage interface standards such as SCSI and ATA. All of the large vendors, including Fujitsu, Hitachi, Seagate and Toshiba, will deliver hard drives that support these standards, and management software vendors such as Secude, Wave Systems and WinMagic are also on board. “Others will surely follow,” predicts Jon Oltsik, a senior analyst with market research firm Enterprise Strategy Group (ESG).

That's great news for information security professionals. It means that the days of implementing, deploying and managing add-on encryption software for end-user machines and enterprise storage disks may be coming to an end. Fitted with self-encrypting drives, systems – whether an executive's laptop or an enterprise storage array – will encrypt data as soon as they're deployed. And that, in turn, will mean that information security professionals no longer have to deal with proprietary hardware implementations and a lack of software management support.
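The mechanism behind self-encrypting drives is worth sketching. The following is a simplified, illustrative model of the key hierarchy such drives rely on: the data encryption key never leaves the drive and is stored only in wrapped form under a key derived from the user's credential. Real drives implement this in hardware with AES and a proper key-derivation function, so Fernet and SHA-256 here are stand-ins only, and the names are not the TCG specification's own.

```python
# A minimal sketch of the key-wrapping idea behind self-encrypting drives
# (pip install cryptography). Illustrative only; not the TCG mechanism.
import base64
import hashlib
from cryptography.fernet import Fernet

def wrapping_key(credential: bytes) -> bytes:
    """Derive a Fernet-compatible wrapping key from a credential (illustrative KDF)."""
    return base64.urlsafe_b64encode(hashlib.sha256(credential).digest())

dek = Fernet.generate_key()                      # generated inside the drive
wrapped_dek = Fernet(wrapping_key(b"user-pin")).encrypt(dek)

# At power-on the drive unwraps the DEK only for the right credential;
# erasing or rewrapping the stored DEK instantly 'shreds' the whole disk.
assert Fernet(wrapping_key(b"user-pin")).decrypt(wrapped_dek) == dek
```

Because only the small wrapped key, not the data, ever needs re-encrypting, credential changes and crypto-erasure are near-instant – one reason drive-level encryption scales so well.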

What do these new TCG standards mean? Oltsik has three predictions:

1. Software encryption is all but dead. Soon, most business laptops will be offered with encrypting hard drives at a nominal premium over a standard system. In three to five years, he predicts, every disk drive may be encryption-enabled as it rolls off the production line.

2. Information security professionals need to develop a plan. Many have no idea that TCG even exists, he says, but this is no longer acceptable. “Since laptops and desktop PCs will come with encryption ‘baked in', it is incumbent upon IT and endpoint management and security teams to create a plan for phasing in systems with self-encrypting drives and to phase out encryption software over time.”

3. Information security professionals should also expect to see encrypting drives in enterprise arrays. “This will take a bit more time, as demand for array-based encryption isn't nearly as high,” he says. However, every storage system produced by EMC, Fujitsu, Hitachi, HP and IBM may eventually follow this path.

“My suggestion,” says Oltsik, “is that IT and security decision-makers should come to terms with this as soon as possible. Your long-term information assurance strategy may depend on it.”

Implications of storage security in the cloud

For many organisations, the answer to tackling burgeoning storage volumes is to hand them over to another company using so-called ‘cloud computing' services. But it's time IT security professionals asked some serious questions of their cloud computing providers, and demanded serious, security-conscious answers, according to pan-industry security think-tank the Jericho Forum.

“We don't argue for a minute that there are good business drivers for using cloud services, especially in a downturn, where reduced cost and faster time-to-market are so important,” says Paul Simmonds, a Jericho Forum board member. “What we are challenging is the notion that the provider will handle the security of stored data to your satisfaction as a matter of course. You simply haven't got that guarantee.”

It's an issue that the Jericho Forum has placed high on its agenda for 2009, Simmonds says. He sees it as a “natural evolution” from the group's work on deperimeterisation and collaborative open architectures, which also focuses on computing that takes place outside the protected boundaries of the corporate infrastructure. The goal is to come up with a framework that enables companies to determine how cloud technologies can be used securely.

It's clearly a question that's troubling many. In a recent survey conducted by IT consultancy firm Avanade, respondents said, by a five-to-one ratio, that they trust existing internal systems over cloud-based systems, due to fears about security threats and loss of control over data.

The Jericho Forum's research, its members hope, should lead information security professionals – and their colleagues elsewhere in the business – to consider questions that they haven't previously dissected in the headlong rush to adopt cloud computing. When you repatriate data from a cloud provider, how can you be sure no trace of that data resides on the provider's own systems? What leaks might exist from the cloud service back into your own infrastructure? Does the provider adhere to the same physical, logical and personnel controls applied to your internal systems? What if the provider goes bust?

The results of the Forum's research, due to be unveiled this month, will tackle the high-level security aspects of cloud computing. They take the form of a 3D cube model that attempts to map out the key decisions companies will have to make when deciding which tasks – and data – can be handled in the cloud, which should be confined to internal systems, and how to tie data residing in both together securely. The model takes into account the huge variety of forms that cloud computing services can take – open or proprietary; perimeterised or deperimeterised; internal or external.
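The shape of such a model can be sketched by enumerating the three axes described above; the code structure below is illustrative, not the Forum's own formulation.

```python
# A minimal sketch of the three axes the article describes for the cube
# model; enumerating their combinations shows the model's shape.
from itertools import product

AXES = {
    "location":  ["internal", "external"],
    "ownership": ["proprietary", "open"],
    "boundary":  ["perimeterised", "deperimeterised"],
}

# Eight cells; a task's risk profile determines which cells are acceptable.
for combo in product(*AXES.values()):
    print(dict(zip(AXES, combo)))
```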

“People need to be aware that the cloud isn't just one thing,” explains Simmonds. “You can have internal, proprietary, perimeterised clouds, for example, or external, open and deperimeterised clouds. The trick will be in deciding which model fits the risk profile of your organisation, depending on the task at hand.” In the long term, he says, it may be necessary to tag data with metadata describing where it can – and can't – reside within the wider cloud model.
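That tagging idea might, in outline, look like the sketch below; the location labels and the policy check are hypothetical.

```python
# A minimal sketch of the data-tagging idea Simmonds raises: attach
# metadata recording where a piece of data may reside, and check the
# tag before placing it anywhere.
from dataclasses import dataclass

@dataclass
class TaggedData:
    payload: bytes
    allowed_locations: frozenset = frozenset()

def can_place(data: TaggedData, location: str) -> bool:
    """Refuse placement anywhere the tag does not explicitly permit."""
    return location in data.allowed_locations

payroll = TaggedData(b"...", frozenset({"internal-perimeterised"}))
print(can_place(payroll, "external-open-cloud"))     # False: keep in-house
print(can_place(payroll, "internal-perimeterised"))  # True
```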

In the meantime, no information security professional can afford to be complacent about cloud computing, according to a recent blog post by Amrit Williams, former Gartner analyst and now CTO of security management company BigFix. “When we allow services to be delivered by a third party, we lose all control over how they secure and maintain the health of their environments – and you simply can't enforce what you can't control,” he writes. “The ‘experts' will tell you otherwise, convince you that their model is 100 per cent secure and that you have nothing to fear; then again, those experts don't lose their jobs if you fail.”