Protecting a business is as much to do with ‘human factors’ as it is with the IT department, argues Paul Kearney, Head of Enterprise Risk Research at BT Group.
Much can be learned from history. Take, for example, the Trojan Horse – a contraption that was received as a gift during the siege of Troy but actually contained enemy soldiers. The trick worked for the invaders because the defenders let their greed and curiosity overcome their caution.
And then there was Archimedes – the engineer who, in the third century BC, produced machines to defeat the Romans by smashing their siege ladders as they were placed against the huge city walls of Syracuse. The solution worked for a while, until the citizens became overconfident and dropped their guard.
In both cases, it wasn’t the failure of the defences that led to defeat but the gullibility, naivety and complacency of the people who trusted them.
Winding on 2000 years or so, machines still challenge the vulnerabilities of organisations and it’s somehow appropriate that ‘Trojan’ has become the term used to describe one of today’s most feared types of computer malware.
Other threats include ‘phishing’ (where an e-mail message appears to be from a well-known and trusted organisation but isn’t), hacking, electronic fraud, electronic burglary using tiny USB devices, and the physical loss or theft of computer hardware – particularly laptops.
But regardless of the methods used, the real threat isn’t technology, but the human beings that mastermind the attack.
No organisation is immune from their attentions – in fact, the bigger the organisation, the more likely it is to attract criminals and pranksters driven by the buzz of seeing the results of their handiwork in media headlines.
Massive impact
The consequences of cyber attacks are significant, of course. Security breaches can have a massive impact on an organisation’s bottom line. The authoritative Information Security Breaches Survey says the worst incidents currently cost large businesses (with between 250 and 499 employees) between £90,000 and £170,000 and very large businesses (employing 500 or more) between £1 million and £2 million [i].
And cases of security breaches are legion. CDs containing personal data on about seven million families were lost in transit from Her Majesty’s Revenue and Customs (HMRC) and another government department [ii], and the Driver and Vehicle Licensing Agency mislaid a vast quantity of driving and vehicle licence details [iii]. A laptop containing sensitive defence data was stolen from the boot of a car in London [iv] and millions of customers in Britain and America were claimed to be at risk after credit and debit card records were stolen from retailer TJX’s computer systems [v]. A former informant of the US secret service has been accused of the latter crime, thought to be America’s biggest and most complex case of identity theft [vi].
The list goes on, and there is a constant battle between security professionals and cyber-criminals – organised or merely opportunist – who can always find a market for stolen data.
Security professionals, however, are often technologists, so their instinct is to look for technical solutions. Unfortunately, if those solutions aren’t well designed, people will make mistakes in using them or simply give up on them entirely. And if they aren’t efficient, they can end up hindering the very tasks they are supposed to protect.
There are even security specialists who believe that IT users are merely a nuisance and regard their colleagues as ‘vulnerabilities’ against which their systems must be protected using rigid rules and procedures. This ‘command and control’ mentality can result in unnecessary restrictions on employees going about their work with the perverse effect of reducing security as staff try to find ways around the blocks in their path.
Human characteristics can create weaknesses and loopholes criminals can exploit. Consider people’s natural desire to be helpful, for instance. If an outsider claims to be a colleague wanting help with something, individuals are inclined to help, opening the doors of their fortress as a result. They are just too trusting – unaware of the tricks that people can, and will, get up to. And even security professionals can easily fall into that trap.
Typically, somebody rings a helpdesk to say that they are working away from the office, really need to prepare something for an important meeting, but have forgotten their password. Impassioned pleas like this all too often result in passwords being given away.
Con trick
Such tactics are known as ‘social engineering’ but they are merely a new take on an old-fashioned con trick.
Another example on a more physical security level involves people putting on overalls and carrying a clipboard to blag themselves into buildings. Employees just assume they are members of the maintenance team – people who know what they’re doing. And if you look the part, you can get access to all sorts of things…
Some companies employ people who do such things on a ‘white hat’ basis – white hat being an analogy to the old cowboy films where the villain wears a black hat while the sheriff wears a white one. So a ‘white hat hacker’ is somebody who has the skills of a hacker but who is employed to identify vulnerabilities – a typical poacher turned gamekeeper.
We should be designing systems that make best use of the complementary characteristics of people and technology to strengthen security. Computers don’t get tired, and can prevent people making mistakes. But they only do what they are programmed to do, and that may not be enough. Humans tire more readily, exposing themselves to attack. But if they think something odd is going on that they don’t understand, they can use their common sense and report it.
However, there is a potential conflict of interest between productivity and security – if you’re in a rush and working to a deadline you might be tempted to circumvent security measures. Indeed, management can make this worse through productivity and sales incentives.
Organisations need to motivate and educate their employees so that they see security as part of their job and understand why they are being asked to adopt certain behaviours, rather than just being able to ‘tick the box’.
People – asset or vulnerability?
This approach is endorsed in the latest information security breaches survey carried out by PricewaterhouseCoopers for the UK’s Department for Business Enterprise and Regulatory Reform (BERR) [i].
Their report states: “Companies increasingly realise that their people, while their greatest asset, can be their greatest vulnerability and so need to be educated on security risks.”
The survey discovered that more than half of the UK companies screened had not carried out a formal security risk assessment and that 67 per cent did nothing to prevent confidential data leaving their premises on devices such as USB sticks.
Broader research by the European Network and Information Security Agency (ENISA) has warned that increased cyber-criminal activity is threatening the economic interests of the EU. The agency has called upon industry to collaborate to make the Internet a safer place to do business globally.
Given that people within an organisation can be the weakest link in terms of security, what can be done?
Education is a good start, but so is a new approach within IT departments: making software much easier to use securely, and developing a better understanding of human factors rather than relying totally on technology.
There are plenty of academics who specialise in human factors and human-computer interaction, and some companies have specialist labs that test systems for usability. But these efforts are mainly aimed at functional features – making things easier to use – rather than at the usability of security itself.
International standards such as ISO 27001 play a part too and lay down technical controls covering such things as usernames and passwords. But only one out of 133 controls covers human issues. In any event, 79 per cent of companies contacted for the BERR survey were unaware of the standard.
Best practice
Standards apart, there’s a need to achieve best practice in terms of user interface design and human factors to maximise security. Enterprises must take responsibility for these issues – it is in their own interests to share knowledge and work together to improve things.
Another positive move would be to embrace the mandatory incident reporting procedures that are commonplace in the aviation industry. Because these highlight not just actual accidents but near-misses, they provide a more accurate view of the situation – one that provides a sounder basis for future security decisions. California, for example, has made it mandatory for companies to report losses of personal information.
Unless action is taken on the human factors of security, the public’s confidence in e-commerce and anything else beginning with ‘e’ will be lost. If something isn’t done quickly, it could be a case of closing the stable door after the Trojan Horse has bolted.
BT is exhibiting at Infosecurity Europe 2009, the No. 1 industry event in Europe, held on 28th – 30th April in its new venue, Earl’s Court, London. The event provides an unrivalled free education programme, with exhibitors showcasing new and emerging technologies and offering practical and professional expertise. For further information please visit www.infosec.co.uk
Panel:
Human factors – human nature
Here are five ways organisations can reduce the human ‘vulnerabilities’ identified by the BERR report [i]:
- Create incentives to encourage secure behaviour in the same way as you do to boost productivity.
- Educate the workforce and enlist their support – they are part of the solution not part of the problem.
- Make the security features of software more useable.
- Beware of the ‘man in overalls’: treat unexpected visitors with suspicion. Don’t just hold the door open for someone; check they have a pass.
- Share knowledge to improve things collectively. Information on security breaches can be shared anonymously if required.
Courtesy: Infosecurity PR