Analysis: What will the next decade bring in the information security world?


What will the next decade bring? More of the same - or fresh, tough challenges, such as AR? Mark Mayne gazes into the crystal ball.

Predictions are a dangerous game. Bill Gates once forecast: “Two years from now, spam will be solved.” That was in 2004, and unfortunately spam still makes up more than 90 per cent of business email.

However, as we head for the second decade of the 21st century, are there any solid trends emerging? What is the future shape of IT security? Will we still be swamped in malware and spam or will newer, more potent threats emerge? Will the infosecurity vendor community be able to innovate to meet the security challenges of the coming decade?

Certainly technology is going to change significantly in the next ten years, as virtualisation, consumerisation and cloud services grow in importance. A recent Gartner report (Top Predictions for IT Organisations and Users, 2010 and Beyond: A New Balance) went as far as claiming that by 2012, 20 per cent of businesses will own no IT assets at all. Hardware will still be required, but if the ownership of hardware shifts to third parties, then there will be major changes to every facet of the hardware industry. For example, says Gartner, enterprise IT budgets could either be shrunk or reallocated to more strategic projects, while IT headcounts could be reduced or staff re-skilled to meet new requirements. Hardware distribution may also have to change radically.

Cloud technologies have been a buzzword in the industry for the past two years at least, and their popularity is almost assured due to the host of benefits to both customers and, critically, the vendors – for whom economies of scale and single code bases are attractive margin-enhancers. However, Tony Dyhouse, QinetiQ manager and director of cyber security at the Digital Systems Knowledge Transfer Network, urges caution: “Cloud services are very attractive, but not always desirable. The theory only works so long as one size fits all. If the service is more bespoke, then the cloud may not be the best place to put it.”

Dyhouse continued: “Cloud-based services are interesting. There has been intense interest from customers and vendors alike, but many questions are still unresolved. For example, once a business has outsourced its data to a shared server held by a third party, what happens if that server is seized by law enforcement? Removing data from the server, even data unrelated to the investigation, would not be allowed for forensic reasons, leading to problems. It's vital for enterprises looking at cloud services to do so with their eyes open, and to check the contract extremely carefully – who is responsible for data security in transit, and what happens in the event of a problem?”

Outsourcing is not new, and the need for enterprises to cut costs and gain 24/7 support has driven a boom in markets such as Malaysia and India. Gartner estimates that by 2012 Indian IT services firms will represent 20 per cent of cloud aggregators on the market. “The collective work from India-centric vendors represents an important segment of the market's cloud aggregators,” it says.

Sam Curry, vice president product management and strategy, RSA, said: “Roughly 16 per cent of the world's population live in India, which turns out almost two million IT graduates a year. India is also massively tied in with the West's high-tech industry and companies such as HCL, Wipro, Infosys and Tata are world giants. India is also passing laws on strong authentication, online banking, credit card processing, privacy and more. India is a vibrant player now in information security, and its impact will only continue to grow.”

The move from desktop and internal data storage towards cloud-based systems is also likely to meld with the rise of the mobile world. Gartner believes that by 2013, mobile phones will overtake PCs as the most common web access device worldwide. According to its PC installed base forecast, the total number of PCs in use will reach 1.78 billion units in 2013, but the installed base of smartphones and browser-equipped enhanced phones will exceed 1.82 billion units. By 2014, there will be a 90 per cent mobile penetration rate and 6.5 billion mobile connections.

Alan Goode, analyst, Goode Intelligence, said: “The move to the cloud will enable information to be accessed on a wider range of devices than today. The demand for more mobile communications will also have the effect of pushing much of our current security technology back into the cloud. Our predictions are that by 2013 80 per cent of smartphone users will be regularly accessing the internet, and much of this traffic will be via tailored apps, rather than browser-based.”

Gartner further predicts that mobile phone users will download more than eight billion applications during 2010, pushing spending in app stores to $6.2bn this year and to more than $29bn by 2013. Advertising-sponsored mobile applications are set to generate almost 25 per cent of mobile application stores' revenue by 2013.

Dyhouse said: “It's certainly difficult to predict with any certainty what will be happening in the infosec world in 2020 – there are likely to be a series of ‘step changes', such as technological innovations that will make threat predictions difficult. However, mobile will certainly be an interesting area. There's such a huge rush to market with new applications and even operating systems that you have to wonder how much rigorous secure code testing is going on. Of course, this situation is one that will have to change over the next ten years. The market demands fast development and short time-to-market to maintain margins, which is usually the opposite of secure.”

Goode continued: “Interestingly, we've found that enterprises still haven't come to terms with the current mobile security status quo. In our research, we found that 40 per cent of businesses still don't have a clear policy on mobile use. There's a huge blurring of corporate and personal data boundaries here, and this process of blurring is only just beginning. Of course, there are consumer, privately owned devices being used for corporate tasks, and there is increasing crossover in the use of social media accounts. Who has responsibility for what data and when is far from clear in many cases.”

Social media has become one of the great success stories of the internet, and increased development is inevitable. The next wave is likely to be a split between the business drive towards real-time working and collaboration via companies from Google to Cisco, and location-based mobile services on the consumer side.

Patrick Walsh, CTO of eSoft, agrees: “The future of IT security will centre on securing collaboration networks. One core reason for the migration to the cloud is the benefit of information and technology being available to anyone from anywhere. Instead of three people having different copies of one document, everyone works on one centralised copy and efficiencies are gained.

“This move towards the web as a platform for collaboration will continue and accelerate over the next ten years, with technologies such as Google Wave paving the way and creating fresh security challenges, such as the potential for real-time attacks. A person can post a Wave to their blog that contains a series of pictures from a trip and an anonymous visitor can post comments on the picture or post their own pictures back. What if those pictures contained an exploit for a vulnerability that can compromise a computer that just views the image, such as last year's MS08-052 vulnerability? In real-time, anyone viewing the page or tuned in to follow it would be compromised. Effectively, what this means is that security response times will need to shrink from hours to seconds and web security will continue to be the most important aspect of security for the foreseeable future.”

Security response times must fall, but many argue that the current internet structure is simply not fit for purpose. Because the building blocks of the web were designed to be robust rather than secure, the traceability of online communication is very low. A variety of proposals are in the pipeline to re-engineer the underlying protocols and create a more transparent, trustworthy environment.

Mary Landesman, senior security researcher, ScanSafe, said: “There is a lot of work behind the scenes to change the architecture of the internet, and in spite of the economic conditions, a lot of this work is continuing. The largely anonymous internet of today will give way in the next ten to 15 years to a more verifiable, reputational structure. It will be the passing of an era, a move out of the dark ages. I think that without this move, we will see user trust in the internet erode to the point where people will desert the web, which will eventually cause the entire concept of a free, inclusive domain to implode. There is the potential for another digital divide here, one not built purely on the availability of connectivity, but on having the credentials and security devices necessary to connect to a more trusted, higher quality environment.”

Interestingly, in addition to the evolution of security technologies, many feel that the security industry itself must also evolve. Dyhouse believes that the language used to describe security events must change. “Look at the way that we describe the mechanics of a data breach – the language is not accessible to outsiders. We've invented a difficult language and overlaid it with very niche terms and phrases. We need to consider the education of non-IT experts: it's just too complex at the moment, and that is counter-productive!”

Landesman agrees: “There's an overall acceptance that educating users is pointless, because they will still make mistakes, but I don't believe that. I think user education does improve security, and I also think that the security industry as a whole needs to take more social responsibility. There's still a tendency to mis-report attacks to the wider media for short-term marketing gains, but this can result in the bigger picture being lost. We must change these attitudes.”

It's not just external attitudes that will have to change. New technologies and pressures will mean new skillsets and a shift in job-role emphasis. An example of this evolution is already emerging, as SIEM technologies give researchers and managers a clearer overview of log data and its management. The result is growing demand from infosec employers for more analytical skillsets.
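To make that analytical shift concrete, here is a minimal sketch, in Python, of the sort of log correlation such a role might involve: counting failed logins per source address in a syslog-style auth log and flagging outliers. The file path, log format and threshold are illustrative assumptions, not a reference to any particular SIEM product.

```python
# Minimal sketch: flag source IPs with an unusual number of failed logins.
# The path, regex and threshold are illustrative assumptions.
import re
from collections import Counter

LOG_FILE = "/var/log/auth.log"   # hypothetical input file
THRESHOLD = 20                   # failed attempts before an IP is flagged

failed_login = re.compile(r"Failed password for .+ from (\d+\.\d+\.\d+\.\d+)")

counts = Counter()
with open(LOG_FILE) as log:
    for line in log:
        match = failed_login.search(line)
        if match:
            counts[match.group(1)] += 1

# Report sources that exceed the threshold, most active first.
for ip, hits in counts.most_common():
    if hits >= THRESHOLD:
        print(f"{ip}: {hits} failed logins - possible brute-force source")
```

In practice a SIEM correlates many such signals across systems; the point of the toy example is simply the kind of data-driven reasoning employers are starting to ask for.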

RSA's Curry continues: “We're already seeing this trend begin to bite. I recently met with a company who opened the conversation by saying that this was what was happening. IT and security are ultimately services to the business and must demonstrate in business terms the value of what they are doing. They also have to be able to expand the types of business directives and business policies that they can respond to – that means that IS professionals ultimately have to be (or put more positively, ‘can look forward to' being) the glue between the technical controls and the business drivers that the C-level cares about: increasing revenue, lowering cost, reducing risk, and so on.”

This trend is likely to encourage a divide between highly technical players and business-level executives, says Curry. “We will see a split among junior and senior IS professional job descriptions. Put another way, there's an opportunity for a clear progression of responsibility, skills and experience to make IS a discipline with entry-level, management, director and ‘chief' ranks that have real meaning beyond the number of people or boxes that they manage.”

Predicting how infosec will look in a decade is not a straightforward process, but it is certain that substantial changes are imminent. Fraud levels continue to rocket, while malware and spam volumes reach new records each month. As technology reaches further into our business and personal lives, it must become more accountable and secure, or it will fail as users desert it in the face of rising risk.

Eschelbeck's exciting times

Gerhard Eschelbeck has been in the security industry for more than 15 years, holding posts including CTO of Qualys before becoming CTO of Webroot in 2004. Prior to joining Qualys, Eschelbeck was senior vice president of engineering for security products at Network Associates, vice president of engineering of anti-virus products at McAfee Associates, and founder of IDS, a secure remote control company acquired by McAfee.

Eschelbeck told SC: “There has been a sea-change in IT working practices over the past ten years, and in the process we've become completely decentralised.

“There has certainly been a lot of hype over cloud computing, but virtual private clouds are the immediate future. They can be as secure as necessary for the enterprise in question, depending on the security model. I see a move back towards a more centralised approach to security and IT in general. This is the opposite of what will happen to the employees, who will become increasingly geographically distributed.

“There will also soon be very little difference between mobile devices and desktops, in my opinion. We're already aware of thin clients, but even this staging post will be supplanted by the browser as the OS – we're already seeing Google pushing this hard via its Chrome OS and browser.

“Interestingly, we as an industry are still very focused on security from a Windows perspective, and that's sure to change as this shift begins. This will also push security technology back into the cloud, resulting in a ‘clean pipe'-type situation.

“This is not only because cloud services offer benefits to both customer and vendor, but also because we are reaching the physical limits of current technology. Pushing updates out to millions of desktops just isn't practical, and even as pipes get wider and processors faster, the volume of updates, patches and so on that we are trying to push through them grows just as quickly. This isn't a situation that can continue for long.

“These changes will be far-reaching, and I think we'll start to see many of them begin to have an impact within the next three to five years, and really become mainstream thinking by the end of that time. It is highly likely that the web will remain the top vector for attacks, and the increasing targeting of ‘trusted' social networks of all types will continue.

“I can genuinely say I have never experienced a more exciting time in this industry than right now – we have been doing the same thing for the past ten years in many ways, but change is imminent.”

Augmented Reality: the face of the internet to come?

The web was invented in 1989 by Tim Berners-Lee at CERN. He wrote the first web client and server the following year. Since then, there has been considerable development, from the static 1990s web through the development of Flash and JavaScript, to the more dynamic Web 2.0. So how much difference might this coming decade make?

One thing is certain about the future of the internet, and that is its increasing mobility. The early mobile web was limited, and attempts to create compelling content were unsuccessful. However, smartphones have changed this completely, with the most recent able to run Flash content, stream media and perform secure transactions.

And the market for digitally enhanced reality, or Augmented Reality (AR), is only just emerging. The augmentation is generally in real time and in semantic context with the surrounding environment, such as sports scores shown on TV during a match. Mash-ups of Google Maps and a huge variety of data feeds, tied to GPS and the phone-mounted camera, are already available on smartphone platforms, and given that Samsung is beginning to preload AR browsers onto handsets, the future for such technology is bright. Recently, Wallpaper magazine released an AR-based iPhone app guide to London and other cities. And December 2009's Esquire included AR elements on its cover and inside – square stickers that, when held up to a web camera, triggered video segments.
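As a rough illustration of what such a mash-up involves under the hood, the sketch below (in Python, with made-up coordinates and the standard great-circle bearing formula) works out where a GPS-tagged point of interest should sit on the phone's screen, given the handset's position, compass heading and camera field of view. It is a toy model, not any vendor's AR API.

```python
# Toy AR overlay: project a GPS point of interest onto the camera view.
# All coordinates, headings and screen dimensions are illustrative.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(user_lat, user_lon, poi_lat, poi_lon, heading, fov=60, width=480):
    """Horizontal pixel position of the POI, or None if outside the camera's view."""
    offset = (bearing_deg(user_lat, user_lon, poi_lat, poi_lon) - heading + 180) % 360 - 180
    if abs(offset) > fov / 2:
        return None  # POI is outside the field of view
    return round((offset / fov + 0.5) * width)

# Example: a user near Trafalgar Square, phone pointing roughly north-east.
print(screen_x(51.5080, -0.1281, 51.5101, -0.1262, heading=30))
```

A real AR browser adds tilt, distance scaling and live data feeds on top of this basic geometry, but the principle of fusing GPS, compass and camera is the same.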

Analyst firm Juniper Research predicts that the annual number of mobile downloads featuring augmented reality content will rise from one million in 2009 to 400 million by 2014. AR enterprise apps are unlikely to launch before 2012 due to technological constraints, but they will be able to command a very high subscription price, the report continued.

Total annual revenues from AR-enabled apps will reach $732 million in 2014, said Juniper. Gartner agrees, stating that by 2015 context will be as influential to mobile consumer services and relationships as search engines are to the web.

Of course, the fixed ‘broadband' internet we know today will not be entirely abandoned. Years of consultations have resulted in undertakings to build a UK-wide super-fast broadband network over the coming years, partly funded by a ‘broadband tax' on landlines. On 8 January, prime minister Gordon Brown wrote in the Daily Telegraph: “We must assist broadband providers to move farther and faster; to bring super-fast connections to households and businesses in every corner of the country. That is why we have set out plans for £1 billion of extra investment to ensure that all regions of Britain – including those with sparse populations – are covered by 2017.”

While the most popular content is likely to evolve in unpredictable ways, it is certain that the internet itself will become pervasive. Entirely new devices such as tablet PCs are set to début this year from major manufacturers such as Apple, Microsoft and HP, with analyst firm Deloitte recently predicting that the new market will sell “tens of millions” of units in 2010 alone. Bandwidth will increase exponentially, and media provision will gradually migrate entirely online, while business processes will be increasingly automated.

