Digital decisions: Understanding behaviours for safer cyber environments

DART

I was invited to participate in a panel discussion at a workshop on digital decision-making and risk-taking hosted by the Decision, Attitude, Risk & Thinking (DART) research group at Kingston Business School.

During the workshop, we addressed the human dimension in issues arising from increasing digital interconnectedness with a particular focus on cyber security risks and cyber safety in web-connected organisations.

We identified behavioural challenges in cyber security such as insider threats, phishing emails, security culture and achieving stakeholder buy-in. We also outlined an opportunity for further research that could tackle behavioural security risks inherent in the management of organisational information assets.

‘Wicked’ problems in information security

Incorporating security activities into the natural workflow of productive tasks makes it easier for people to adopt new technologies and ways of working, but it is not necessarily enough to guarantee that a particular security-usability issue will be solved. The reason is that such problems can be categorised as wicked.

Rittel and Webber, writing in ‘Policy Sciences’, define a wicked problem in the context of social policy planning as one that is challenging – if not impossible – to solve because stakeholder requirements are missing, poorly defined or inconsistent, and may morph over time, making an optimal solution hard to pin down.[1]

One cannot apply traditional methods to solving a wicked problem; a creative solution must be sought instead. One of these creative solutions could be to apply design thinking techniques.

Methods for design thinking include performing situational analysis, interviewing, creating user profiles, looking at other existing solutions, creating prototypes and mind mapping.

Plattner, Meinel and Leifer in ‘Design Thinking: Understand–Improve–Apply’ assert that there are four rules to design thinking, which can help security professionals better approach wicked problems:[2]

  1. The human rule: all design activity is ultimately social in nature.
  2. The ambiguity rule: design thinkers must preserve ambiguity.
  3. The redesign rule: all design is redesign.
  4. The tangibility rule: making ideas tangible always facilitates communication.

Security professionals should adopt these rules in order to develop secure and usable controls: engaging people, building on existing solutions and creating prototypes that allow feedback to be collected.

Although this enables the design of better security controls, the design thinking rules rarely provide insight into why the existing mechanism is failing.

When a problem occurs, we naturally tend to focus on the symptoms instead of identifying the root cause. In ‘Toyota Production System: Beyond Large-Scale Production’, Taiichi Ohno describes the Five Whys technique, which he developed within the Toyota production system as a systematic problem-solving tool for getting to the heart of a problem.

In one of his books, Ohno provides the following example of applying this technique when a machine stopped functioning:[3]

  1. Why did the machine stop? There was an overload and the fuse blew.
  2. Why was there an overload? The bearing was not sufficiently lubricated.
  3. Why was it not lubricated sufficiently? The lubrication pump was not pumping sufficiently.
  4. Why was it not pumping sufficiently? The shaft of the pump was worn and rattling.
  5. Why was the shaft worn out? There was no strainer attached and metal scrap got in.

Instead of stopping at the first reason for the malfunction – i.e. replacing the fuse or the pump shaft – repeating ‘why’ five times helps to uncover the underlying issue and prevent the problem from resurfacing.
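As a rough sketch, the chain of ‘whys’ can even be written down as a few lines of code that walk from a symptom to its root cause. The data structure and function below are purely illustrative – they paraphrase Ohno’s example rather than describe any real tool:

```python
# A minimal sketch of the Five Whys technique: follow a chain of causes,
# asking "why" at each step, until no further "why" can be answered.
# The chain below paraphrases Ohno's machine-stoppage example; the names
# and structure are hypothetical illustrations.

causes = {
    "machine stopped": "the fuse blew because of an overload",
    "the fuse blew because of an overload": "the bearing was not sufficiently lubricated",
    "the bearing was not sufficiently lubricated": "the lubrication pump was not pumping sufficiently",
    "the lubrication pump was not pumping sufficiently": "the pump shaft was worn and rattling",
    "the pump shaft was worn and rattling": "no strainer was attached, so metal scrap got in",
}


def five_whys(problem: str, causes: dict[str, str], max_depth: int = 5) -> str:
    """Follow the chain of causes up to max_depth times and return the root cause."""
    current = problem
    for i in range(max_depth):
        if current not in causes:
            break  # no further "why" can be answered
        current = causes[current]
        print(f"Why {i + 1}: because {current}")
    return current


root = five_whys("machine stopped", causes)
print(f"Root cause: {root}")  # -> no strainer was attached, so metal scrap got in
```

Fixing only the first answer (replacing the fuse) leaves the chain intact; the value of the exercise is in reaching the last link.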

Eric Ries, who adapted this technique to starting up a business in his book ‘The Lean Startup’,[4] points out that at “the root of every seemingly technical problem is actually a human problem.”

In Ohno’s example, the root cause turned out to be human error (an employee forgetting to attach a strainer) rather than the technical fault (a blown fuse) that was initially suspected. This is typical of most problems that security professionals face, no matter which industry they are in.

These techniques can help to address the core of the issue and build systems that are both usable and secure. This is not easy to achieve due to the nature of the problem. But, once implemented, such mechanisms can significantly improve the security culture in organisations.

 

References:

[1] Horst W. J. Rittel and Melvin M. Webber, “Dilemmas in a General Theory of Planning”, Policy Sciences, 4, 1973, 155–169.

[2] Hasso Plattner, Christoph Meinel and Larry J. Leifer, eds., Design Thinking: Understand–Improve–Apply, Springer Science & Business Media, 2010.

[3] Taiichi Ohno, Toyota Production System: Beyond Large-Scale Production, Productivity Press, 1988.

[4] Eric Ries, The Lean Startup, Crown Business, 2011.

Image by Paloma Baytelman https://www.flickr.com/photos/palomabaytelman/10299945186/in/photostream/

To find out more about the psychology behind information security, read Leron’s book, The Psychology of Information Security. Twitter: @le_rond


Productive Security

The majority of employees within an organisation are hired to execute specific jobs, such as marketing, managing projects, manufacturing goods or overseeing financial investment. Their main – sometimes only – priority will be to efficiently complete their core business activity, so information security will usually only be a secondary consideration. Consequently, employees will be reluctant to invest more than a limited amount of effort and time on such a secondary task that they rarely understand, and from which they perceive no benefit.

Research[1] suggests that when security mechanisms cause additional work, employees will favour non-compliant behaviour in order to complete their primary tasks quickly.

There is a lack of awareness among security managers[2] about the burden that security mechanisms impose on employees, because it is assumed that the users can easily accommodate the effort that security compliance requires. In reality, employees tend to experience a negative impact on their performance because they feel that these cumbersome security mechanisms drain both their time and their effort. The risk mitigation achieved through compliance, from their perspective, is not worth the disruption to their productivity. In extreme cases, the more urgent the delivery of the primary task is, the more appealing and justifiable non-compliance becomes, regardless of employees’ awareness of the risks.

When security mechanisms hinder or significantly slow down employees’ performance, they will cut corners, and reorganise and adjust their primary tasks in order to avoid them. This seems to be particularly prevalent in file sharing, especially when users are restricted by permissions, by data storage or transfer allowance, and by time-consuming protocols. People will usually work around the security mechanisms and resort to the readily available commercial alternatives, which may be insecure. From the employee’s perspective, the consequences of not completing a primary task are severe, as opposed to the ‘potential’ consequences of the risk associated with breaching security policies.
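To make this trade-off concrete, here is a deliberately simplified sketch of the decision an employee implicitly makes. The function and all figures are hypothetical illustrations, not results from the research cited above:

```python
# A toy model of the employee's compliance decision. It assumes (hypothetically)
# that people weigh the daily effort a control costs them against the loss they
# personally expect to avoid by complying. All numbers are invented.

def prefers_compliance(effort_minutes_per_day: float,
                       perceived_incident_probability: float,
                       perceived_personal_loss_minutes: float) -> bool:
    """True if the perceived benefit of complying outweighs its daily cost."""
    expected_avoided_loss = perceived_incident_probability * perceived_personal_loss_minutes
    return expected_avoided_loss >= effort_minutes_per_day


# A cumbersome control (15 minutes a day) against a risk the employee barely
# perceives (0.1% daily chance of losing four hours of work) loses out:
print(prefers_compliance(15, 0.001, 240))  # False: non-compliance looks "rational"

# A lightweight control (1 minute a day) combined with better risk awareness
# (a 1% perceived chance of the same loss) tips the balance the other way:
print(prefers_compliance(1, 0.01, 240))    # True: compliance now looks worthwhile
```

The point of the sketch is not the numbers themselves but the shape of the decision: if compliance costs more than the employee believes it saves them, workarounds will look like the sensible choice.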

If organisations continue to set equally high goals for both security and business productivity, they are essentially leaving it up to their employees to resolve potential conflicts between them. Employees will focus most of their time and effort on carrying out their primary tasks efficiently and in a timely manner, which means that their target will be to maximise their own benefit, as opposed to the company’s. It is therefore vital for organisations to find a balance between security and productivity, because when they fail to do so, they lead – or even force – their employees to resort to non-compliant behaviour. When companies are unable to recognise and correct security mechanisms and policies that affect performance, and when they exclusively reward their employees for productivity rather than security, they are effectively enabling and reinforcing non-compliant decision-making on the part of their employees.

Employees will only comply with security policies if they are motivated to do so: they must have the perception that compliant behaviour results in personal gain. People must be given the tools and the means to understand the potential risks associated with their roles, as well as the benefits of compliant behaviour, both to themselves and to the organisation. Once they are equipped with this information and awareness, they must be trusted to make their own decisions that can serve to mitigate risks at the organisational level.

References:

[1] Iacovos Kirlappos, Adam Beautement and M. Angela Sasse, “‘Comply or Die’ Is Dead: Long Live Security-Aware Principal Agents”, in Financial Cryptography and Data Security, Springer, 2013, 70–82.

[2] Leron Zinatullin, The Psychology of Information Security, IT Governance Publishing, 2016.

Photo by Nick Carter https://www.flickr.com/photos/8323834@N07/500995147/


Cake and Security

There is no doubt that security is necessary, but why is it so unpleasant to follow a security policy? Reminding yourself to stick to the rules feels like your partner telling you… to eat your salad. You know they are right, but anticipating the bland taste and mindless chewing that awaits you simply puts you off. You keep leaving it for tomorrow, so much so that you never get to it.

Cakes, on the other hand, are yummy, and indulging our craving for them requires no effort whatsoever. Nobody needs to force us to eat a piece.

In our day-to-day lives we prefer to do “cake” tasks without giving them a second thought. Things like storing confidential files on Dropbox or emailing them to our personal accounts… you know, taking a little bite here and there. It’s “only for today”, “no biggie”… This one-time thing is so harmless, it’s like a comfort snack. We might later feel guilty that we bypassed a few “salad” controls. Maybe we used our personal USB drive instead of a company-issued encrypted one, but at the end of the day… who cares? Who will notice? As long as there is no dramatic impact on our health, a bite here or a bite there won’t cause any harm.

Then one day we realise that it’s not all rosy. The result of our laziness or lack of willpower eventually rears its ugly head when the doctor makes us stand on the scales and takes a look at our blood pressure. Added to your partner’s words of wisdom is now the doctor’s warning of an unhealthy present and a bleak future – something that would sound very similar during the company’s security audit.

“You have got to eat more salad and lay off the cakes!”

To make matters worse, even with our best intentions to have the salad at the office cafeteria, we discover that the one available is practically inedible. Pretty much like finding that the company’s secure shared drive doesn’t have the necessary space to store our files or that the encrypted pen drive is not compatible with the client’s Mac.

So if there are chefs coming up with ways to make salads more appealing, what can security professionals do to help us, the employees, maintain our “security diet”?

They could aim to make security more like cake – effortless, even attractive – while keeping it as healthy as a salad. Sounds simple? Perhaps not, but they should invest in usability studies to make sure that the secure solution is also the easiest one to use. It might even mean inventing an entirely new culinary art: a salad that tastes like cake. And if they fail to realise just how unpalatable the salads are to begin with, we should let them know. Security professionals need employees’ support.

Organisations are like families: everyone has to stay healthy, otherwise when a single member gets sick, the whole family is at risk of getting sick as well, whether it be catching an infectious disease or adopting an unhealthy lifestyle. It’s like having the slimmest, fittest family member refrain from adding biscuits to the grocery list in order not to tempt the couch-potatoes. It’s a team effort. In order for a company to stay healthy, everyone has to keep a healthy lifestyle of eating salad regularly, even when it is not that pleasant.

The whole company needs to know that security is important for achieving its goals – not something that gets in the way – just as we should all know that a healthy diet of greens helps guarantee a sound body. Employees contribute to the efficient operation of the business when they comply with security policies. Not only does security ensure the confidentiality and integrity of information, but it also guarantees that the resources are available for employees to complete their primary tasks.

We need to realise that we all contribute to security – and that we can inflict serious damage on a company when we don’t comply with security policies, no matter how insignificant or harmless the lapses may seem. As employees, we are individually responsible for the organisation’s exposure to security risks, just as we are responsible for exposing ourselves to illness. Our behaviour and daily regime significantly shape our quality of life, and our practices shape the quality of our business.

The health of the company is everyone’s business. Let’s all eat our salad while helping the security specialists to come up with better tasting ones.

Leron Zinatullin is the author of The Psychology of Information Security.

Twitter: @le_rond


Poker and Security

Good poker players are known to perform well under pressure. They play their cards based on rigorous probability analysis and impact assessment. That sounds very much like the skill set a security professional might benefit from when managing information security risks.

What can security professionals learn from a game of cards? It turns out, quite a bit. Skilled poker players are very good at making educated guesses about opponents’ cards and predicting their next moves. Security professionals are likewise required to stay at the forefront of emerging threats and newly discovered vulnerabilities to anticipate the attackers’ next move.

At the beginning of a traditional Texas hold’em poker match, players are dealt only two cards (a hand). Based on this limited information, they have to evaluate the odds of winning and act accordingly. Players can either decide to stay in the game – in which case they have to pay a fee that contributes to the overall pot – or give up (fold). Security professionals also usually make decisions under a high degree of uncertainty. There are many ways they can treat risk: they can mitigate it by implementing the necessary controls, or they can avoid, transfer or accept it. The costs of these decisions vary as well.

Not all cards, however, are worth playing. Similarly, not all security countermeasures should be implemented. Sometimes it is more effective to fold your cards and accept the risk rather than pay for an expensive control. When the odds are right, a security professional can start a project to implement a security change that improves the company’s security posture.
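One way to read “the odds are right” is as a simple annualised cost-benefit check: implement the countermeasure only if the loss it is expected to prevent outweighs what it costs. The sketch below uses the classic annualised loss expectancy (ALE) idea with made-up figures; the function name and numbers are illustrative, not from any particular methodology in this post:

```python
# A back-of-the-envelope "odds" check for a security countermeasure, using the
# annualised loss expectancy (ALE) idea: ALE = incidents per year * loss per
# incident. All figures below are hypothetical.

def worth_implementing(incidents_per_year: float,
                       loss_per_incident: float,
                       risk_reduction: float,
                       annual_control_cost: float) -> bool:
    """True if the expected annual loss avoided exceeds the control's annual cost."""
    ale_before = incidents_per_year * loss_per_incident
    avoided_loss = ale_before * risk_reduction
    return avoided_loss > annual_control_cost


# A control that halves the likelihood of a 200,000-a-year exposure and costs
# 30,000 a year to run is a hand worth playing:
print(worth_implementing(2, 100_000, 0.5, 30_000))   # True -> implement the change

# The same control against a minor risk is a fold: accept the risk instead.
print(worth_implementing(0.1, 5_000, 0.5, 30_000))   # False -> fold
```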

As the game progresses and the first round of betting ends, the players are presented with a new piece of information. The poker term flop refers to the three additional cards that the dealer places on the table; these can be combined with each player’s hand to create a winning combination. When the cards are revealed, the player has the opportunity to reassess the situation and make a decision. In exactly the same way, changing market conditions or business requirements provide a moment to re-evaluate the business case for implementing a security countermeasure.

There is nothing wrong with terminating a security project. If a poker player had a strong hand at the beginning, but the flop shows that there is no point in continuing, it means that conditions have changed. Perhaps engaging key stakeholders revealed that a certain risk is not that critical and the implementation costs would be too high. Feel free to pass. It is much better to cancel a security project than to end up with a solution that is ineffective and costly.

However, if poker players are sure that they are right, they have to be ready to defend their hand. In security terms, that might mean convincing the board of the importance of a countermeasure on the basis of rigorous cost-benefit analysis. Security professionals can still lose the game and the company might still get breached, but at least they did everything in their power to proactively mitigate the risk.

It doesn’t matter whether poker players win or lose a particular hand, as long as they make sound decisions that bring the desired long-term results. Even the best poker player can’t win every hand. Similarly, security professionals can’t mitigate every security risk or implement every possible countermeasure. To stay in the game, it is important to develop and follow a security strategy that helps to protect against ever-evolving threats in a cost-effective way.

Images courtesy of Mister GC / FreeDigitalPhotos.net

Leron Zinatullin is the author of The Psychology of Information Security.

Twitter: @le_rond


Teaching Information Security Concepts at KPMG

I delivered a 1.5-day Information Security Concepts course at KPMG UK.

We covered a wide range of topics, including information security risk management, access control, and threat and vulnerability management.

According to the feedback I received after the course, the participants were able to understand the core security concepts much better and, more importantly, apply their knowledge in practice.

Leron is very engaging and interesting to listen to
Leron has the knowledge and he’s very effective making simple delivery of a complex topic
Leron is an effective communicator and explained everything that he was instructing on in a clear and concise manner

I will continue collaborating with the Learning and Development team to deliver this course to all new joiners to the Information Protection and Business Resilience team at KPMG.


Modelling Conflicts Presentation

Modelling conflicts – slides by neicher.