Sharing views on security culture

I was invited to talk about the human aspects of security at the CyberSecurity Talks & Networking event. The venue and format allowed the audience to participate and ask questions, and we had insightful discussions at the end of my talk. It’s always interesting to hear what challenges people face in various organisations, and how a few simple improvements can change the security culture for the better.

How to Create a Security Culture at the Workplace

October is National Cyber Security Awareness Month (NCSAM), which is designed to engage and educate public and private sector partners through events and initiatives that raise awareness about cybersecurity.

I’ve been asked to share my views on creating a security culture at the workplace with The State of Security.

I believe the goal is not to teach tricks, but to create a new culture that is accepted and understood by everyone. To do so effectively, messages need to be designed and delivered according to each type of employee: there is no such thing as a one-size-fits-all security campaign. Questions that must always be answered include: What are the benefits? Why does it matter, and why should I care? What impact do my actions have?

Security campaigns must discard scare tactics such as threatening employees with sanctions for breaches. Campaigns should be oriented towards the users’ goals and values, as well as the values of the organisation, such as professionalism and delivery.

A security campaign should emphasise that employees can cause serious damage to an organisation when they engage in non-compliant behaviour, even when that behaviour appears insignificant. They should understand that they bear some responsibility for the security of the organisation and its exposure to risk.

Furthermore, the entire organisation needs to perceive security as bringing value to the company, as opposed to being an obstacle preventing employees from doing their job. It is important for employees to understand that they contribute to the smooth and efficient operation of business processes when they follow recommended security practices, just as security enables the availability of resources that support these processes.


The Psychology of Information Security Culture

In order to reduce security risks within an enterprise, security professionals have traditionally attempted to guide employees towards compliance through security training. However, recurring incidents and persistent patterns of employee behaviour indicate that these measures are insufficient and largely ineffective.

Security training tends to focus on specific working practices and defined threat scenarios, leaving the understanding of security culture and its principles of behaviour untouched. A security culture should be regarded as a fundamental matter to address: if it is neglected, employees will not develop habitually secure behaviour or take the initiative to make better decisions when problems arise.

In my talk I will focus on how you can improve security culture in your organisation. I’ll discuss how you can:

  • Understand the root causes of a poor security culture within the workplace
  • Align a security programme with wider organisational objectives
  • Manage and communicate these changes within an organisation

The goal is not to teach tricks, but to create a new culture which is accepted and understood by everyone. Come join us at the Security Awareness Summit on 11 Nov for an amazing opportunity to learn from and share with each other. Activities include show-n-tell, Lightning Talks, video wars, group case studies and numerous networking activities. Learn more and register now for the Summit.


The Psychology of Information Security book reviews

I wrote about my book in a previous post. Here I would like to share what others have said about it.

So often information security is viewed as a technical discipline – a world of firewalls, anti-virus software, access controls and encryption. An opaque and enigmatic discipline which defies understanding, with a priesthood who often protect their profession with complex concepts, language and most of all secrecy.

Leron takes a practical, pragmatic and no-holds-barred approach to demystifying the topic. He reminds us that ultimately security depends on people – and that we all act in what we see as our rational self-interest – sometimes ill-informed, ill-judged, even downright perverse.

No approach to security can ever succeed without considering people – and as a profession we need to look beyond our computers to understand the business, the culture of the organisation – and most of all, how we can create a security environment which helps people feel free to actually do their job.
David Ferbrache OBE, FBCS
Technical Director, Cyber Security
KPMG UK

This is an easy-to-read, accessible and simple introduction to information security. The style is straightforward, and calls on a range of anecdotes to help the reader through what is often a complicated and hard-to-penetrate subject. Leron approaches the subject from a psychological angle and will appeal to those of both non-technical and technical backgrounds.
Dr David King
Visiting Fellow of Kellogg College
University of Oxford


Digital decisions: Understanding behaviours for safer cyber environments

I was invited to participate in a panel discussion at a workshop on digital decision-making and risk-taking hosted by the Decision, Attitude, Risk & Thinking (DART) research group at Kingston Business School.

During the workshop, we addressed the human dimension in issues arising from increasing digital interconnectedness with a particular focus on cyber security risks and cyber safety in web-connected organisations.

We identified behavioural challenges in cyber security such as insider threats, phishing emails, security culture and achieving stakeholder buy-in. We also outlined a potential further research opportunity which could tackle behavioural security risks inherent in the management of organisational information assets.


Building a security culture

Building on the connection between breaking security policies and cheating, let’s look at a study[1] that asked participants to solve 20 simple maths problems and promised 50 cents for each correct answer.

The participants were allowed to check their own answers and then shred the answer sheet, leaving no evidence of any potential cheating. Under these conditions, participants reported solving, on average, five more problems than under control conditions, where cheating was not possible.

The researchers then introduced David, a student tasked with raising his hand shortly after the experiment began and proclaiming that he had solved all the problems. Other participants were obviously shocked: it was clearly impossible to solve all the problems in only a few minutes. The experimenter, however, didn’t question his integrity and simply suggested that David shred the answer sheet and take all the money from the envelope.

Interestingly, other participants’ behaviour adapted as a result: they reported solving, on average, eight more problems than under control conditions.
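
To put the sums at stake into perspective, here is a minimal Python sketch of the payoff arithmetic, using the 50-cent reward and the average over-reporting figures quoted above. The baseline of seven honestly solved problems is an assumed figure for illustration only, not a number reported by the study.

    # Toy illustration of the payoff arithmetic in the matrix-task study [1].
    # The reward and the over-reporting averages come from the figures quoted
    # above; the baseline score is an assumption for illustration only.

    REWARD_PER_PROBLEM = 0.50   # 50 cents per reported correct answer
    TOTAL_PROBLEMS = 20
    BASELINE_SOLVED = 7         # assumed honest score

    conditions = {
        "control (cheating impossible)": 0,   # no over-reporting
        "shredder (cheating possible)": 5,    # ~5 extra problems claimed
        "shredder plus visible cheater": 8,   # ~8 extra problems claimed
    }

    for label, extra in conditions.items():
        claimed = min(BASELINE_SOLVED + extra, TOTAL_PROBLEMS)
        payout = claimed * REWARD_PER_PROBLEM
        print(f"{label}: {claimed} problems claimed, ${payout:.2f} paid out")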

Much like the broken windows theory mentioned in my previous blog, this demonstrates that unethical behaviour is contagious, as are acts of non-compliance. If employees in a company witness other people breaking security policies and not being punished, they are tempted to do the same. It becomes socially acceptable and normal. This is the root cause of poor security culture.

The good news is that the opposite holds true as well. That’s why security culture has to have strong senior management support. Leading by example is the key to changing the perception of security in the company: if employees see that the leadership team takes security seriously, they will follow.

So, security professionals should focus on how security is perceived. This point is outlined in three basic steps in the book The Social Animal, by David Brooks:[2]

  1. People perceive a situation.
  2. People estimate whether the action is in their long-term interest.
  3. People use willpower to take action.

He claims that, historically, people have focused mostly on the last two steps of this process. In the previous blog I argued that relying solely on willpower has a limited effect: willpower can be exercised like a muscle, but it is also prone to atrophy.

In regard to the second step of the decision-making process, the assumption is that people who are reminded of the potential negative consequences will choose not to take the action. Brooks, however, refers to ineffective HIV/AIDS awareness campaigns, which focused only on the negative consequences and ultimately failed to change people’s behaviour.

He also suggests that most diets fail because willpower and reason are not strong enough to confront impulsive desires: “You can tell people not to eat the French fry. You can give them pamphlets about the risks of obesity … In their nonhungry state, most people will vow not to eat it. But when their hungry self rises, their well-intentioned self fades, and they eat the French fry”.

This doesn’t only apply to dieting: when people want to get their job done and security gets in the way, they will circumvent it, regardless of the degree of risk they might expose the company to.

That is the reason for perception being the cornerstone of the decision-making process. Employees have to be taught to see security violations in a particular way that minimises the temptation to break policies.

In ‘Strangers to Ourselves’, Timothy Wilson claims, “One of the most enduring lessons of social psychology is that behaviour change often precedes changes in attitudes and feelings”.[3]

Security professionals should understand that there is no single event that alters users’ behaviour – changing security culture requires regular reinforcement, creating and sustaining habits.

Charles Duhigg, in his book The Power of Habit,[4] tells the story of Paul O’Neill, a CEO of the Aluminum Company of America (Alcoa) who was determined to make his enterprise the safest in the country. At first, people were confused that the newly appointed executive was not talking about profit margins or other finance-related metrics. They didn’t see the link between his ‘zero-injuries’ goal and the company’s performance. Despite that, Alcoa’s profits reached a historic high within a year of his announcement. When O’Neill retired, the company’s annual income was five times greater than it had been before his arrival. Moreover, it had become one of the safest companies in the world.

Duhigg explains this phenomenon by highlighting the importance of the “keystone habit”. Alcoa’s CEO identified safety as such a habit and focused solely on it.

O’Neill had a challenging goal to transform the company, but he couldn’t just tell people to change their behaviour. He said, “that’s not how the brain works. So I decided I was going to start by focusing on one thing. If I could start disrupting the habits around one thing, it would spread throughout the entire company.”

He recalled an incident when one of his workers died trying to fix a machine despite the safety procedures and warning signs. The CEO called an emergency meeting to understand what had caused this tragic event.

He took personal responsibility for the worker’s death, identifying numerous shortcomings in safety education. For example, the training programme didn’t highlight the fact that employees wouldn’t be blamed for machinery failure or the fact that they shouldn’t commence repair work before finding a manager.

As a result, the policies were updated and the employees were encouraged to suggest safety improvements. Workers, however, went a step further and started suggesting business improvements as well. Changing their behaviour around safety led to some innovative solutions, enhanced communication and increased profits for the company.

Security professionals should understand the importance of group dynamics and influences to build an effective security culture.

They should also remember that just as ‘broken windows’ encourage policy violations, changing one security habit can encourage better behaviour across the board.

References:

[1] Francesca Gino, Shahar Ayal and Dan Ariely, “Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel”, Psychological Science, 20(3), 2009, 393–398.

[2] David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement, Random House, 2011.

[3] Timothy Wilson, Strangers to Ourselves, Harvard University Press, 2004, 212.

[4] Charles Duhigg, The Power of Habit: Why We Do What We Do and How to Change, Random House, 2013.

To find out more about building a security culture, read Leron’s book, The Psychology of Information Security. Twitter: @le_rond


The root causes of a poor security culture within the workplace

Demonstrating to employees that security is there to make their life easier, not harder, is the first step in developing a sound security culture. But before we discuss the actual steps to improve it, let’s first understand the root causes of poor security culture.

Security professionals must understand that bad habits and behaviours tend to be contagious. Malcolm Gladwell, in his book The Tipping Point,[1] discusses the conditions that allow some ideas or behaviours to “spread like viruses”. He refers to the broken windows theory to illustrate the power of context. This theory advocates stopping smaller crimes by maintaining the environment in order to prevent bigger ones. The claim goes that a broken window left unrepaired for several days in a neighbourhood will trigger more vandalism: the small defect signals a lack of care and attention for the property, which in turn implies that crime will go unpunished.

Gladwell describes the efforts of George Kelling, who employed the theory to fight vandalism on the New York City subway system. Kelling argued that cleaning up graffiti on the trains would prevent further vandalism, and Gladwell concludes that this several-year-long effort resulted in a dramatically reduced crime rate.

Despite ongoing debate regarding the causes of the 1990s crime rate reduction in the US, the broken windows theory can be applied in an information security context.

Security professionals should remember that minor policy violations tend to lead to bigger ones, eroding the company’s security culture.

The psychology of human behaviour should be considered as well

Sometimes people are not motivated to comply with a security policy because they simply don’t see the financial impact of violating it.

Dan Ariely, in his book The Honest Truth about Dishonesty,[2] tries to understand why people break the rules. Among other experiments, he describes a survey conducted among golf players to determine whether they would be tempted to move the ball into a more advantageous position and, if so, which method they would choose. The golfers were offered three different options: they could use their club, use their shoe or simply pick the ball up with their hands.

Although all of these options break the rules, the survey was designed this way to determine whether one method of cheating is more psychologically acceptable than the others. The results of the study demonstrated that moving the ball with a club was the most common choice, followed by the shoe and, finally, the hand. It turns out that physical and psychological distance from the ‘immoral’ action makes people more likely to act dishonestly.

It is important to understand that the ‘distance’ described in this experiment is merely psychological. It doesn’t change the nature of the action.

In a security context, employees will usually be reluctant to steal confidential information outright, just as golfers will refrain from picking up the ball with their hands, because that would make them directly involved in the unethical behaviour. However, employees might download a peer-to-peer file-sharing application to listen to music at work, as the impact of this action is less obvious. Yet this can lead to far bigger losses if confidential information is stolen from the corporate network as a result.

Security professionals can use this finding to remind employees of the true meaning of their actions. Breaking security policy does not seem to have a direct financial impact on the company – there is usually no perceived loss, so it is easy for employees to engage in such behaviour. Highlighting this link and demonstrating the correlation between policy violations and the business’s ability to generate revenue could help employees understand the consequences of non-compliance.

References:

[1] Malcolm Gladwell, The Tipping Point: How Little Things Can Make a Big Difference, Little, Brown, 2006.

[2] Dan Ariely, The Honest Truth about Dishonesty, Harper, 2013.

Image by txmx 2 https://flic.kr/p/pFqvpD

To find out more about the behaviours behind information security, read Leron’s book, The Psychology of Information Security. Twitter: @le_rond