When it comes to information security, some things never change


In 1969, Management Today printed an exposé of business managers' naïve approach to information security. Here, we reprint the article, showing little has changed in 42 years.

Recently, in Sir George Williams University, Montreal, Canada, rioting students destroyed two computers valued at £675,000. They threw tapes, cards and files out of windows and succeeded in demolishing much of the information necessary for the installation to function. It was estimated that it would take three months to replace the equipment, and, more significantly, eight months to reconstitute the files, even allowing for the fact that a back-up set of essential files was unharmed…

In addition, the university claimed a loss of contracts valued at over £4,000 per month. In the same week, in the same city, a bomb was exploded in the Montreal Stock Exchange which severely damaged the computer system. It resulted in an eight-week return to blackboard and chalk techniques for displaying prices. In the UK, an airline suffered a serious loss of competitive position when the details of an expensive real-time computer system, in which many millions of pounds had been invested, were made available to competitors. The operation was performed in such a way that the airline was unable to take legal action against the offenders. In the US, the Borden Company reported a deficiency of over £1 million resulting from what appeared to be an error in converting part of the existing accounting system to a computer system. According to The Wall Street Journal, the error had remained undetected for two years…

A computer breakdown which lasted for six days revealed the complete inadequacy of a Midland company's stand-by arrangements. The firm with which the arrangements had been made some four years previously had expanded the configuration to the point where the two machines were now totally incompatible – and in any event the stand-by machine was loaded with work three shifts a day, seven days a week. And a programmer in an American bank managed to acquire quite a considerable sum of money for himself simply by programming the computer to bypass his account when reporting on overdrafts…
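The bank programmer's trick above needs nothing more than a single extra condition in a reporting program. The article gives no technical detail of the bank's actual system, so the following is a minimal modern sketch in Python; all account names and balances are invented for illustration:

```python
# Sketch of the overdraft-bypass fraud: a report that quietly skips
# one account. Every identifier and figure here is hypothetical.

FRAUDSTER_ACCOUNT = "ACC-1042"  # the programmer's own (invented) account id

def overdraft_report(accounts):
    """The honest version: return every account with a negative balance."""
    return [a for a in accounts if a["balance"] < 0]

def doctored_overdraft_report(accounts):
    """The same report, with one extra condition that hides a single
    account from every listing management ever sees."""
    return [
        a for a in accounts
        if a["balance"] < 0 and a["id"] != FRAUDSTER_ACCOUNT
    ]

accounts = [
    {"id": "ACC-1001", "balance": 250.0},
    {"id": "ACC-1042", "balance": -5000.0},  # the fraudster's overdraft
    {"id": "ACC-2077", "balance": -120.0},
]

honest = overdraft_report(accounts)              # lists two overdrawn accounts
doctored = doctored_overdraft_report(accounts)   # lists only one
```

The point the article makes is precisely that such a change is trivial to write and invisible to anyone who never reads the program itself.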

Ignorance at the top
The list of incidents is endless, and the tragedy is that until your own computer becomes involved, the chances of adequate precautions being taken are very remote. Why is this? The obvious answer is that managements are unaware of the risks they are running. Why are they unaware? Among the many reasons are, first, a false sense of security generated by the apparent technical complexity of computer systems; and second, top managements are often sadly lacking in any real understanding of computers. A one-week ‘introduction to computers' course, a quick conducted tour of an installation, and a demonstration of the machine's capabilities (often no more than a ludicrously rigged print-out) are generally considered sufficient for what they need.

In fact, this ‘instant pre-packed computer knowledge' does more harm than good. It adds to the misguided impression that computer systems, especially software, are so complex that they defy understanding by outsiders and must therefore be secure from misuse. Perhaps in the very early days of machine language programs there was a degree of justification for this belief. But today, with high-level programming languages and the extensive use of manufacturers' common packages, the security-through-complexity claim loses any validity it might have had.

Another cause of mistakes is failure to appreciate the true value of installations and their significance to the company's efficient operation. It is seldom realised that from day one, at least 50 per cent of an organisation's investment in computers is tied up in software, and that the longer the installation is in operation, the greater the software's value becomes.

Ignorance of this fact may help to explain why hardware is insured for its full face value, while software is allowed to sit precariously on the tops of desks just a few inches away from any passing briefcase.

As more and more of a company's manual processes are transferred to the computer, so the areas of designated responsibility narrow. Managers relieved of many mundane tasks are also relieved of the information previously available and necessary for making decisions, and are sometimes relieved of the need even to make decisions. Thus, at the end of the day, a responsibility load that was once spread over a large area involving a large number of people, all personally operating their own systems, is concentrated in one area and one small select group.

This significant, sometimes dramatic, redefining of responsibilities and duties oddly enough tends to produce a ‘Parkinson' type of reaction from top management. From exercising almost draconian control over normal business operations (‘three signatures are required for petty cash claims in excess of 2s 6d') and general security (‘only two people, you and I, are permitted to remove customers' record cards') they freely grant facilities for the select few to make – or defraud the company of – thousands of pounds, and proudly discuss with all and sundry the fact that their complete file of 20,000 customers' records can be copied in four minutes or printed in a quarter of an hour.

The need to get results is another source of error. Having committed themselves to a computer system and signed the cheque, there is an understandable desire on management's part to demonstrate that the money has been well spent. Unfortunately, results from computer projects tend to be very slow in appearing, and this gives rise to what can only be described as an obsession to get something, anything, working; then all reason and precaution are thrown to the wind as one crash program after another is mounted.

Eventually, something is made to work, by which time the installation's costs have doubled, anticipated benefits have not materialised, and the work programme is two years behind schedule. Possible consequences of this now almost standard situation are that the work completed is usually of a poor quality, riddled with errors and omissions (which in themselves could prove to be very costly in the long term), and that managements can only see endless years of crash action ahead of them.

Perhaps, in this situation, it is asking a bit too much of them to consider, firstly, the security and control of their installations, and secondly, to pay, in hard cash and/or sacrificed results, for something which they feel may ultimately only demonstrate their wisdom and forethought. However, the sacrifice of reason for the sake of suspect gains and possible long-term disaster is hardly a management principle to be condoned or encouraged.

What then should managements do? To begin with, there must be a change of attitude. There is an urgent need for managements to appreciate that computers are a serious security risk and offer an open invitation to anyone bent on theft, fraud or sabotage. This realisation should inspire an evaluation of the computer's true worth to the total operation of the company.

Finding the answer
The questions that should be asked are: how long could the company continue to function efficiently in the event of a complete computer breakdown (the exercise should be detailed down to the level of what effects would be felt if, for example, just File A were to be irretrievably lost); and how valuable would the company's records and systems be to competitors – or, put another way, how seriously would it damage the company if all or part of the information retained in the computer centre was to fall into the wrong hands?

Having established a realistic value, the next step is to estimate the extent of the company's exposure to security risks. This is probably the most difficult aspect of the whole exercise. With only limited knowledge of what computers are capable of doing, management is seldom able reasonably to question or appraise any of the activities found in, or emanating from, the computer centre.

Managers are therefore on extremely weak ground when deciding questions like: what to look for; who to appoint to conduct an investigation; how far-reaching should the investigation be; what recommendations to accept or reject; just how secure should an installation be, and against what; and how much time and money should be allocated for the purpose?

The first thing to appreciate is that by relying entirely upon their own staff, managements run the grave risk of more than defeating the desired objective. Their own staff are themselves a permanent, changing and major security hazard. By participating in a security exercise, they could well be introduced to previously unthought-of possibilities. Also, too detailed involvement of the people against whom security precautions are directed is not far short of handing out keys to the safe. There is therefore an obvious requirement for independent experts. This could mean calling in reputable computer or management consultants, and/or using the advisory services of firms specialising in security. Whatever approach is adopted, management should tread cautiously, for there is only a handful of people in the UK with real experience of the technical, operational and managerial aspects of computers as well as the peculiar security problems inherent in their use.

Once the balance between internal and external participation has been established, an investigation should be held to locate any malpractice or erroneous situations that might be present in existing and/or developing systems, and to pin-point risk areas where misuse could occur. Each installation has its own particular security problems, and the following is simply a list of areas that should be examined irrespective of special circumstances: physical security – i.e. locks, bars, grilles, filing and storage cabinets, hardware stand-by facilities, fire and flood precaution, insurance; personnel access – i.e. access to the total computer area, the computer room, the data preparation section, the computer, the tape library, the programmers' offices, files, documentation; and software controls – i.e. the built-in features for controlling the operating accuracy and validity of program systems and manual procedures.

The logical conclusion to an investigation is a report which indicates the company's exposure to security risks, and details what needs to be done and how much it will cost. What are the costs likely to be? Quite obviously, much depends upon individual circumstances. A company that has experienced a break in security is more likely to consider what needs to be done and make money available than one that has never been given any cause for concern. An organisation that runs the risk of going out of business if, say, its library of computer files were to be destroyed (e.g. a computer bureau) has more reason to install tight security than the company whose computer is relatively insignificant.

The following, then, is only a rough guide which considers neither special circumstances nor individual valuation. For simplicity's sake, the purchase price of computer hardware is used as the estimating factor – this reasonably reflects the volume and type of work handled, the complexity of systems, and the overall size of the installation. An investigation covering the three ‘obligatory' areas mentioned above could (and if done thoroughly, should) amount to about five per cent of the hardware costs. A figure very much less than this could be false economy.

The cost of designing and then implementing a security system depends largely upon how much software needs to be written or amended. In the unlikely event of no software work being required, the extra cost should not amount to much more than 3.5 per cent per annum of the hardware value.

It is worth mentioning that computers also offer facilities which in many cases are considerably more secure than existing arrangements. For example, the mere fact that computer information is stored in code, although a simple code, is a reliable and often cheaper safeguard against the petty, sometimes serious, misuse that occurs when the same information is retained in an easily recognisable form. Thus consideration of the computer as a security device is a worthwhile secondary exercise to the establishment of the security of the computer itself.

Managements are likely to reap additional side benefits from investigating and implementing computer security precautions. Possibly the most important of these is the increased knowledge they obtain. This enables them, sometimes for the first time, to exercise effective managerial control over the operation of their installations. Because of their necessary involvement, they are often able to make hitherto unthought-of contributions to their computer departments' technical, operational and managerial strategies. By obtaining detailed valuations, they are in a stronger position to exercise better cost control.

Within the next ten years, computer hardware, information systems, and methods of access to, transmitting, capturing, retrieving, processing and storing data, will alter out of all recognition. Also new and radical management techniques will emerge, making it necessary for managers to have in-depth experience of the use of computers as a tool of management. All accounting procedures will be directly or indirectly carried out by computers, and as much as 90 per cent of companies' information processes will depend to a greater or lesser extent on computers. Direct access to computer-stored information will be readily and necessarily available to large numbers of day-to-day operating staff, many of whom will be sited hundreds of miles from the actual computer. There will be an increase in the use of general systems and program packages.

Rise of the machines
From this the main conclusions are: that the computer's importance and influence will considerably increase and will expand into new areas; that companies will become increasingly dependent upon computer-based information systems; that access to computer-stored information will be freely available to many more people than at present; that the roles of managers and computer personnel will become more closely related; that many organisations will be using identical computer software systems, the details of which will be commonly known to many people; and that computer-based systems will be the single largest repositories of information, and that information thus retained will be a major factor in the efficient operations of a business.

The following extract, from a very successful American computer bureau's publicity brochure, illustrates the importance that at least one company places upon security: “We use security receptionists and guards at all times – day and night. Master files and programs are delivered from independent security storage centres just a half-hour before production runs start, and go back immediately the runs are satisfactorily completed. Before being given production okay, our programs and systems are independently audited to guarantee that in-error data will be picked up. No changes are effected without a re-audit of the complete system. Every operation on the computer is monitored as it happens on a teleprinter sited within our president's office. Last year we spent $80,000 on security alone.”

The need for such standards of security can be shown by one last example. About nine years ago, the management of one of the UK's largest companies was advised on the financial benefits to be achieved from centralising on one computer the accounting procedures of three virtually autonomous divisions. The advice was accepted entirely on the strength of the facts presented, i.e. financial savings. Systems and program work soon commenced. During intensive program testing, the chief programmer took home a print-out of Division A's customer record file.

While searching for possible errors, he noticed that one of Division A's customers was Division B, and that Division B paid the full purchase price for Division A's product, i.e. no discounts were applied, even though the quantities involved qualified for a maximum discount. Within a short time the chief programmer established: that it was company policy not to apply discounts to inter-divisional purchases; that divisions were free to place orders on any suppliers; and that he and Division B's chief buyer had a lot in common.

On the day the computer system went live (under the direct control of the chief programmer), a new company appeared on the computer files, buying from Division A and selling to Division B. Within 18 months the new company was buying two per cent of Division A's total production and supplying 95 per cent of Division B's requirements. At about this time, Division A's assistant marketing manager was unable to satisfy an urgent order for a customer and obligingly suggested that for this particular order they contact the new company. Three months later the assistant marketing manager had joined forces with the new company (while still remaining at his post, just as his two partners had at theirs) and this move was mainly responsible for trade between Division A and the new company trebling within six months.

When the fraud was eventually discovered, it was calculated that direct losses on sales between Division A and Division B were running at £30,000 a year, and loss of profits on lost contracts at £25,000 a year. The question of blame was unanswered because there were so many people involved and so many points where just a minor inquiry could and should have revealed the whole sinister business.

This article was written by G. F. Parker for Management Today in 1969

