On 27th June 1996 the Computing and Control Division of the Institution of Electrical Engineers hosted a colloquium in London under the title Information Security - Is IT Safe? The speakers were drawn from Government security services, MoD, the police, and security software houses, so those attending gained a useful insight into the `official' view of information security.
The keynote address was given by Andrew Saunders, Director of the Communications-Electronics Security Group (CESG) at Cheltenham. CESG is the UK national authority for cryptography and technical information security. Starting with definitions, IT Security was described as including confidentiality, integrity, and availability. The last of these is often overlooked, but denial-of-service attacks can be as damaging as data theft or data corruption. Balancing risk against benefit requires some careful thought, particularly in the area of threat analysis: there is no point in spending money on securing a system that is unlikely to be attacked or that is not important to the organisation. The key message from this talk was that security must be a combination of measures: personnel selection, operational procedures, physical security, and technical security.
Murray Donaldson of CESG described the progress being made towards common criteria for the evaluation of IT security products, devices, and systems. Starting with the US `Orange Book' in 1985, programmes in the US, Europe, and Canada each developed new sets of evaluation criteria. After much detailed discussion, these have been brought together into one (large) draft document which is now available for review and which will form the input to a future ISO standard.
The next speaker was Paul Fleury, Head of the Information Systems Security Group, which is responsible for the non-technical aspects of security policy in government. He described the Unified Incident Reporting and Alert Scheme (UNIRAS - no connection with the graphics package of the same name). This is the government equivalent of CERT: it issues IT Security alerts and briefings to government departments and contractors, and collates incident reports for threat analysis. An important feature is that it guarantees anonymity to anyone reporting an incident, so that the emphasis is on alerting others to potential risks and fixing problems rather than blaming the victims. A report is produced at the end of each financial year summarising the incidents. In 1995-6 the main categories were: virus infections (1775), theft (1136), hacking (801), and 369 incidents of other types. The effects were characterised as Integrity (33%), Confidentiality (22%), and Availability (45%). The main trends over the year were a large increase in virus reports and in the value of stolen IT equipment: £6.5M with six individual thefts valued over £100k each. These figures probably parallel the experience of non-government organisations, but comparisons are difficult to make because of different definitions. As an example, 98% of the `hacking' incidents were described as `legitimate users abusing their privileges' - this is certainly computer misuse, but may not be regarded as `hacking' in other communities.
Elizabeth France, the Data Protection Registrar, started her talk by explaining her role and powers and the fact that she is responsible directly to Parliament and not to Government. Apparently she often has to explain this to ministers and senior members of government agencies too! There are several anomalies in the Registrar's powers and obligations: for example, she can only enforce the law against a registered data user but has no power to investigate complaints against one who has avoided registration.
The existing Data Protection Act derives from an international agreement known as Treaty 108, and was introduced more to allow the UK to continue trading with other countries than to protect the rights of the citizen. The 1984 Act will have to be replaced by 1998 because the EU has issued a Directive on data protection which must be incorporated into national law. There are explicit references to Privacy (the existing law does not say anything about this), Registration, Consent, and Individual Rights. The Data Protection Principles are largely unchanged, and any organisation that already complies with the spirit of the existing law should find the new law quite easy to comply with.
Mrs France reiterated the point that staff training and awareness is a vital part of a security policy, and cited a case where old medical records were found being used as drawing paper by a Brownie pack! The Registrar's Office gets about 3000 complaints each year, 50% of which lead to investigations. Prosecutions are rare as the approach is to persuade the offending data users to comply with the law. Even so, a few cases do reach the courts each year.
Detective Inspector John Austen of the Metropolitan Police Computer Crime Unit is a well-known figure in computer security circles, having led the 8LGM investigation and several high-profile cases in more recent years. He started by describing an incident that took place on the night of 9th January 1995: an obscene image involving a young woman and a horse appeared at the end of a routine bulletin issued by a financial wire service. It was distributed automatically to many customers in several countries. The investigation was easy and a member of staff was soon arrested, but the wire service company lost sixteen major customers almost overnight - not because of the image itself, but because the customers lost faith in the company's security and thus would not trust the information they were buying. The story illustrated the knock-on effects of an apparently small incident and again underlined the importance of all-round attention to security (who vets your office cleaners?).
It is easy to forget that there are computers other than PCs and servers: consider all the computers embedded in complex equipment - they can be hacked too. In fact, 50% of the incidents reported to the Computer Crime Unit relate to private telephone exchanges. Phone Phreaking has been around much longer than computer hacking, and may still account for more financial loss. Equipment such as PABXs and building control systems is often managed by people with no knowledge of computer security, so hackers commonly find standard maintenance accounts with default passwords: enough to run up a hefty phone bill or bring your organisation to a grinding halt. Password management seems to be something that everyone is bad at: the Met tried one of the simpler bits of captured software against some police computers and achieved a 75% hit rate on some password files.
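The scale of the password problem is easy to demonstrate: a dictionary attack needs nothing more than a wordlist and knowledge of the hashing scheme. The sketch below is a toy illustration only - the usernames, passwords, and SHA-256 hashing are invented for the example (systems of the era used salted crypt(3), not SHA-256) - but the principle is exactly what the Met's captured software exploited:

```python
import hashlib

def hash_pw(pw):
    # Toy unsalted hash for illustration; real Unix password files
    # of the period used the salted crypt(3) scheme.
    return hashlib.sha256(pw.encode()).hexdigest()

# A captured "password file": user -> hash (all values invented).
shadow = {
    "alice": hash_pw("password"),
    "bob":   hash_pw("letmein"),
    "carol": hash_pw("x9$!qTz7"),   # one genuinely strong password
    "dave":  hash_pw("admin"),
}

# A small dictionary of common choices.
wordlist = ["123456", "password", "admin", "letmein", "qwerty"]

def dictionary_attack(shadow, wordlist):
    # Precompute hashes of every dictionary word, then look each
    # stored hash up; unsalted hashes make this a single pass.
    candidates = {hash_pw(w): w for w in wordlist}
    return {user: candidates[h]
            for user, h in shadow.items() if h in candidates}

cracked = dictionary_attack(shadow, wordlist)
print(f"Cracked {len(cracked)} of {len(shadow)} accounts: {sorted(cracked)}")
```

In this invented file three of the four accounts fall immediately, the same 75% hit rate the Met reported against real police password files.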
Poor security system design shows up in many areas: one recent arrest was a New Zealander who had been travelling in comfort for nine months armed simply with a bit of software to re-program credit cards. By taking advantage of the inadequate checks applied by point-of-sale terminals he could simply invent credit card accounts at will.
The talk concluded with a profile of the typical real intruder: not a spotty schoolkid in an attic hacking at night, but a (probably spotty) 16-35 year old well educated, frustrated (social / sexual / employment / you-name-it), sci-fi-addicted, vindictive and/or profit-oriented pawn. Pawn, that is, of the elite hackers - the very few really clever ones - and of the criminals or subversives with an interest in the hacker's `product'.
Bob Hill, the Ministry of Defence Project Manager for the Security in Open Systems Technology Demonstrator Programme, spoke about the MoD's wish to reduce its dependence on bespoke development and make use of more commercial off-the-shelf IT products. This was a session full of acronyms - references in the above sentence alone gave rise to MoD, SOS, TDP, and COTS! The essence of the programme is to show that a secure mail and directory service can be built using components from several manufacturers and that the same components can meet the needs of both government and others. The project claims a world first, in that it has enlisted Microsoft as prime contractor and persuaded it to co-operate with Digital, Nortel, Novell, EDS, SPYRUS, and Zoomit to produce the first demonstrator. Later this year, phase two will start to address the `pull' applications such as database query and WWW.
David Ferbrache of DRA is another familiar figure on the UK network security scene. He discussed the nature of the hacking threat to open networks, reminding the audience again about the risk from `social engineering' (Hello? This is the network support centre. We have found a problem with your files and we need your password to sort it out...) Moving on to technical vulnerabilities, the sheer size and complexity of modern systems is now beyond most people's understanding: a typical Unix system with X represents about 2.8M lines of code, and a Windows-95 system with a few desktop applications may well be more complex than that. It is easy for vulnerabilities to creep in: buffer overflows have been exploited in many programs, and race conditions are a lot more common than people expected too. Simple things like validation of program arguments are often overlooked, and programs are written using invalid assumptions (everyone will understand what we mean if we only use two digits for the year...). There is a lot of `Somebody Else's Problem' syndrome around, such as the database back-end processes that assume the front-end has done the security checks and are thus vulnerable to anyone who decides to talk directly to the back-end. Finally there are systems developed with no security at all - `it will only be used by our own staff' and `security will be added in the next version' are common ostrich-like statements.
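The two-digit-year assumption mentioned above makes a neat worked example of an invalid assumption baked into code. The function names and the pivot-year convention below are invented for illustration:

```python
# The classic two-digit-year assumption: "everyone will understand
# what we mean if we only use two digits for the year..."
def naive_expiry_check(expiry_yy, current_yy):
    # Compares the two-digit years as plain integers, which is
    # fine right up until the century rolls over.
    return expiry_yy >= current_yy

# In 1996 a card expiring in '99 correctly looks valid:
assert naive_expiry_check(99, 96)

# ...but a card genuinely expiring in '00 already looks expired
# in '96, and after 2000 every comparison is suspect:
assert not naive_expiry_check(0, 96)

# One common repair is a windowing rule: pick a pivot and map
# two-digit years into the century it implies (pivot of 50 is
# an assumed convention here, not a universal one).
def to_four_digit(yy, pivot=50):
    return 1900 + yy if yy >= pivot else 2000 + yy

assert to_four_digit(96) == 1996
assert to_four_digit(3) == 2003
```

The point is not the fix itself but that the original code contains no bug in the usual sense: it does exactly what was written, and fails only because the assumption behind it was never checked.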
The defence community takes security seriously, and regularly tests its systems by setting `tiger teams' to attack them. The results leave no room for complacency: of 38000 probes made last year, 24000 were successful to some extent, only 9000 of those were detected, and less than 100 of these `incidents' were reported. It does not take much effort to imagine the effect of the same probes in a less security-conscious community!
John Hughes of TIS(UK) spoke about countermeasures to protect against attacks from the Internet. Amid the cowboys-and-indians clipart this was a fairly standard `Firewalls for Managers' talk, though it also highlighted further problems created by export controls on cryptography and provoked a heated debate with a member of the audience on the relative merits of public-key and symmetric cryptosystems for key management.
Alex McIntosh of PC Security Ltd spoke on `Protection of Commercial Data and National Law Enforcement' - a tense combination as he was quick to admit. The Encryption Debate is raging in public in the US but is also alive and well in most other countries as a balance is sought between the perceived needs of government - particularly the law enforcement agencies - and those of other users of cryptography - business and individuals. Governments in many countries allow the use of encryption within national borders, but almost all control its export and some control its import. This makes life very difficult for multinational organisations and leads to some odd anomalies: the US recently relaxed the rules about exporting laptop computers containing encryption technology, but this benefit only applies to US nationals and is not extended to foreign nationals even if they work for US companies and are based in the US!
One major item in the encryption debate centres around key lengths: longer keys are much harder to attack than shorter ones. If the underlying cryptosystem is good then adding one bit to the key length should double the length of time taken to break the key. The US government currently takes the position that they will only allow 40-bit keys in freely-exportable cryptosystems, yet a report published in January 1996 suggests that businesses should be using at least 70 bit keys if the threat to them is from other businesses, and 75 bits if the threat is from national governments. A table was exhibited giving estimates of the time required for various types of attacker to break 40-bit and 56-bit keys. These ranged from five hours for a pedestrian hacker with $400 of hardware attacking a 40-bit key to 0.0002 seconds for an intelligence agency with $300M hardware. The story on 56-bit keys is not much better: the pedestrian hacker is set back by a 38-year attack time, but a professional applying $300k of special-purpose hardware could break the same key in three hours and the intelligence agency hardly has time to draw breath in the 12 seconds it would take with their country-sized appropriation.
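The figures in that table are internally consistent, and the doubling rule makes them easy to check. A minimal sketch (the helper function is invented for this example; only the 5-hour/40-bit starting point comes from the talk):

```python
HOURS_PER_YEAR = 24 * 365

def attack_time_hours(base_hours, base_bits, key_bits):
    # If one extra key bit doubles the search space, attack time
    # scales by 2 ** (extra bits) relative to a known baseline.
    return base_hours * 2 ** (key_bits - base_bits)

# Baseline from the exhibited table: the $400 "pedestrian hacker"
# needs about five hours against a 40-bit key.
base = 5.0

# 56-bit key: 16 extra bits, so 5 * 2**16 hours, i.e. roughly the
# 38-year figure quoted above.
years_56 = attack_time_hours(base, 40, 56) / HOURS_PER_YEAR
print(f"56-bit key: about {years_56:.1f} years")

# 70-bit key, the suggested business minimum, pushes the same
# attacker out to hundreds of thousands of years.
years_70 = attack_time_hours(base, 40, 70) / HOURS_PER_YEAR
print(f"70-bit key: about {years_70:,.0f} years")
```

The same scaling explains the other rows of the table: a better-funded attacker simply starts from a smaller baseline, but each added bit still doubles their cost.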
On 10th June 1996, the UK Government published a proposal for the licensing of `Trusted Third Parties' that would provide encryption services to business and individuals and which could be required to provide decryption keys to law enforcement agencies in certain circumstances. This proposal is rather different from the US attempts to enforce key escrow, and has some potential merits for organisations and individuals that need encryption but are not competent to build their own system. It is still unlikely to find favour with big business though, and the Shell oil company was used as an example of a business that has already thought through the issues. One very important issue is that of liability: Shell stated that not even the largest government could manage liability for the damage that could be caused by the loss of keys, and went on to define a corporate trust model along the lines of `We do not trust the government of the US. We do not trust the government of the UK....' Eventually Shell implemented their own strong encryption system including key management and key escrow, all operated from their headquarters in London. The work was done by PCSL, and interestingly they have recently obtained agreement from all the relevant US agencies that the same system is acceptable for use by Fortune 500 companies. This is another nail in the coffin of government key escrow, but it does act as a useful reminder that key escrow is an essential service: organisations do not want to lose access to their data if someone forgets a password or leaves the company. Similarly an encrypted will is not much use if the only person knowing the key is the one who has just died!
The final talk was given by Nigel Hickson of the Information Security Policy unit of the Department of Trade and Industry. The DTI is the UK body with control over encryption technology use and export so the audience were able to get authoritative answers to some important questions. The current position could be summarised as:
It is legal to use any sort of encryption inside the UK.
All export of encryption technology is controlled.
The details of the export controls are published through HMSO. The DTI enquiry unit (0171 215 5000) can put you in touch with experts if you need further assistance.
The main part of the talk was concerned with BS7799, the UK standard on information systems security. The standard is a recent one, and was largely developed by the DTI working with people from large companies. Several supporting booklets have now been published and are available from the DTI, in particular the Information Society Initiative security booklet (call 0345 15 2000) and the Information Security Policy Statement (call 0171 215 1399). There are also booklets on `Information Security and the Internet', `Computer Assurance Guidelines', and `The Business Manager's Guide to Information Security'. A process for accreditation to BS7799 standards is now being put together: this will be fairly flexible, as the Standard has 105 `controls' and it is up to each organisation to decide which are most appropriate to implement. BS7799 has also been proposed for `fast track' acceptance as an ISO standard though its chances of success are not clear.
Returning to the encryption issue, it was stated that there would be no new controls, that the concept of licensed trusted third parties was currently favoured, and that international working was recognised as important. The recent announcement provides a framework for policy and more detailed proposals can be expected later in 1996: this means that any comments on the existing proposal should be submitted quickly. The EU is apparently considering a `second information security decision' and an OECD expert group is working on global crypto guidelines.
This report is available on the Web at: http://http1.brunel.ac.uk:8080/~andrew/reviews/information-security-colloq.html
The full colloquium digest is available from the IEE under reference 96/151.
The IEE can be contacted by telephone on 0171 240 1871, and is on the Web at http://www.iee.org.uk/
The event was organised by IEE Professional Group C14 (Information Technology) which would welcome comments and suggestions for future meetings. The contact is Sarah Evans (0171 344 8423 and email@example.com).
Both export-control documents are available from HMSO (0171 873 9090) at £14.30 each.