I’ve been asked to share my views on creating a security culture at the workplace with The State of Security.
I believe the goal is not to teach tricks, but to create a new culture which is accepted and understood by everyone. To do so effectively, messages need to be designed and delivered according to each type of employee: there is no such thing as a one-size-fits-all security campaign. Questions that must always be answered include: What are the benefits? Why does it matter, and why should I care? What impact do my actions have?
Security campaigns must discard scare tactics such as threatening employees with sanctions for breaches. Campaigns should be oriented towards the users’ goals and values, as well as the values of the organisation, such as professionalism and delivery.
A security campaign should emphasise that employees can cause serious damage to an organisation when they engage in non-compliant behaviour, even in ways that appear insignificant. They should understand that they bear some responsibility for the security of the organisation and its exposure to risk.
Furthermore, the entire organisation needs to perceive security as bringing value to the company, as opposed to being an obstacle preventing employees from doing their job. It is important for employees to understand that they contribute to the smooth and efficient operation of business processes when they follow recommended security practices, just as security enables the availability of resources that support these processes.
In order to reduce security risks within an enterprise, security professionals have traditionally attempted to guide employees towards compliance through security training. However, recurring problems and employee behaviour in this arena indicate that these measures are insufficient and rather ineffective.
Security training tends to focus on specific working practices and defined threat scenarios, leaving the understanding of security culture and its specific principles of behaviour untouched. A security culture should be regarded as a fundamental matter to address. If neglected, employees will not develop habitually secure behaviour or take the initiative to make better decisions when problems arise.
In my talk I will focus on how you can improve security culture in your organisation. I’ll discuss how you can:
- Understand the root causes of a poor security culture within the workplace
- Align a security programme with wider organisational objectives
- Manage and communicate these changes within an organisation
The goal is not to teach tricks, but to create a new culture which is accepted and understood by everyone. Come join us at the Security Awareness Summit on 11 Nov for an amazing opportunity to learn from and share with each other. Activities include show-n-tell, 306 Lightning Talks, video wars, group case studies and numerous networking activities. Learn more and register now for the Summit.
I wrote about the games you can play to enhance your privacy and cyber security knowledge. We also talked about gamification in the security context. But how do we apply this knowledge to “gamify” security awareness efforts in your organisation?
A company I’ve recently been working with has been experimenting with their security awareness programme; in particular, they’ve designed posters to remind employees of potentially risky behaviours. They placed these posters in the areas where violations could occur: near the confidential bins or printers. They invested in a memorable design and created funny-looking creatures people can relate to. For example, they had something resembling an angry Twitter bird to emphasise the fact that employees should be mindful of what they share on social media. Other examples included monsters on the lookout for confidential data.
I liked the idea and I saw employees discussing the posters shortly after they were released. But what if we wanted to take this a step further? What if people could not only look at the posters but also engage with them?
The recently released and hugely popular Pokemon Go app gives us an example of how this could be done. In the game, players are encouraged to explore the real world around them and catch creatures that appear on the map. The game uses augmented reality to make the experience of catching Pokemon a lot more fun.
The app developers used classic game design elements in this game:
- There’s a ton of items to be collected, like stardust, pokeballs, various potions and eggs.
- You get frequent rewards and feedback on your progress.
- The game is very social in nature and players are encouraged to engage with each other.
- There are leaderboards and there is a chance to get your name displayed in a gym – a place where Pokemon battles take place.
How can some of the ideas from this game be applied to a security awareness programme?
What if we take the monsters from the company’s posters above and make them more engaging? It only takes a small financial investment to attach a QR code to a monster, so an employee could get immediate access to the relevant section in the security policy. Or how about giving employees a quick quiz and, if answered correctly, reward them with bonus points?
These points could be also collected for accomplishing other tasks. Your employee volunteered to participate in a security awareness presentation with her story? 100 points! Attended a lunch and learn session? How about 20 points? Reported a phishing email? Stopped a tailgater? There are many ways people can demonstrate their involvement in a security awareness programme.
As long as participation is voluntary, there are clear objectives and rules, feedback is readily available and rewards are desirable, we’ve got a chance to change security culture for the better!
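To make the mechanics concrete, here is a minimal sketch of how such a points scheme might be tracked. The task names and point values are illustrative only (loosely based on the examples above), not taken from any real programme:

```python
# Minimal sketch of a security-awareness points ledger.
# Task names and point values are illustrative assumptions.

POINTS = {
    "presentation_story": 100,  # shared a story in an awareness presentation
    "lunch_and_learn": 20,      # attended a lunch and learn session
    "reported_phishing": 50,    # reported a phishing email
    "stopped_tailgater": 50,    # stopped a tailgater
    "poster_quiz": 10,          # answered a poster QR-code quiz correctly
}

def award(ledger, employee, task):
    """Add the points for a completed task to an employee's running total."""
    if task not in POINTS:
        raise ValueError(f"unknown task: {task}")
    ledger[employee] = ledger.get(employee, 0) + POINTS[task]
    return ledger[employee]

def leaderboard(ledger, top=3):
    """Return the highest-scoring employees, mirroring the game's leaderboards."""
    return sorted(ledger.items(), key=lambda kv: kv[1], reverse=True)[:top]

ledger = {}
award(ledger, "alice", "presentation_story")
award(ledger, "alice", "reported_phishing")
award(ledger, "bob", "lunch_and_learn")
print(leaderboard(ledger))  # alice leads with 150 points
```

The design mirrors the Pokemon Go elements discussed above: frequent rewards (each `award` call gives immediate feedback) and a visible leaderboard to encourage friendly competition.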
“So often information security is viewed as a technical discipline – a world of firewalls, anti-virus software, access controls and encryption. An opaque and enigmatic discipline which defies understanding, with a priesthood who often protect their profession with complex concepts, language and most of all secrecy.
Leron takes a practical, pragmatic and no-holds barred approach to demystifying the topic. He reminds us that ultimately security depends on people – and that we all act in what we see as our rational self-interest – sometimes ill-informed, ill-judged, even downright perverse.
No approach to security can ever succeed without considering people – and as a profession we need to look beyond our computers to understand the business, the culture of the organisation – and most of all, how we can create a security environment which helps people feel free to actually do their job.”
David Ferbrache OBE, FBCS
Technical Director, Cyber Security
“This is an easy-to-read, accessible and simple introduction to information security. The style is straightforward, and calls on a range of anecdotes to help the reader through what is often a complicated and hard to penetrate subject. Leron approaches the subject from a psychological angle and will be appealing to both those of a non-technical and a technical background.”
Dr David King
Visiting Fellow of Kellogg College
University of Oxford
I was invited to participate in a panel discussion at a workshop on digital decision-making and risk-taking hosted by the Decision, Attitude, Risk & Thinking (DART) research group at Kingston Business School.
During the workshop, we addressed the human dimension in issues arising from increasing digital interconnectedness with a particular focus on cyber security risks and cyber safety in web-connected organisations.
We identified behavioural challenges in cyber security such as insider threats, phishing emails, security culture and achieving stakeholder buy-in. We also outlined a potential further research opportunity which could tackle behavioural security risks inherent in the management of organisational information assets.
Building on the connection between breaking security policies and cheating, let’s look at a study that asked participants to solve 20 simple maths problems and promised 50 cents for each correct answer.
The participants were allowed to check their own answers and then shred the answer sheet, leaving no evidence of any potential cheating. The results demonstrated that participants reported solving, on average, five more problems than under conditions where cheating was not possible (i.e. control conditions).
The researchers then introduced David – a student who was tasked to raise his hand shortly after the experiment began and proclaim that he had solved all the problems. Other participants were obviously shocked by such a statement. It was clearly impossible to solve all the problems in only a few minutes. The experimenter, however, didn’t question his integrity and suggested that David should shred the answer sheet and take all the money from the envelope.
Interestingly, other participants’ behaviour adapted as a result. They reported solving on average eight more problems than under control conditions.
Much like the broken windows theory mentioned in my previous blog, this demonstrates that unethical behaviour is contagious, as are acts of non-compliance. If employees in a company witness other people breaking security policies and not being punished, they are tempted to do the same. It becomes socially acceptable and normal. This is the root cause of poor security culture.
The good news is that the opposite holds true as well. That’s why security culture has to have strong senior management support. Leading by example is the key to changing the perception of security in the company: if employees see that the leadership team takes security seriously, they will follow.
So, security professionals should focus on how security is perceived. This point is outlined in three basic steps in the book The Social Animal, by David Brooks:
- People perceive a situation.
- People estimate if the action is in their long-term interest.
- People use willpower to take action.
He claims that, historically, people were mostly focused on the last two steps of this process. In the previous blog I argued that relying solely on willpower has a limited effect. Willpower can be exercised like a muscle, but it is also prone to atrophy.
In regard to the second step of the decision-making process, the assumption is that if people are reminded of the potential negative consequences, they will be less likely to take the action. Brooks, however, refers to ineffective HIV/AIDS awareness campaigns, which focused only on the negative consequences and ultimately failed to change people’s behaviour.
He also suggests that most diets fail because willpower and reason are not strong enough to confront impulsive desires: “You can tell people not to eat the French fry. You can give them pamphlets about the risks of obesity … In their nonhungry state, most people will vow not to eat it. But when their hungry self rises, their well-intentioned self fades, and they eat the French fry”.
This doesn’t only apply to dieting: when people want to get their job done and security gets in the way, they will circumvent it, regardless of the degree of risk they might expose the company to.
That is the reason for perception being the cornerstone of the decision-making process. Employees have to be taught to see security violations in a particular way that minimises the temptation to break policies.
In ‘Strangers to Ourselves’, Timothy Wilson claims, “One of the most enduring lessons of social psychology is that behaviour change often precedes changes in attitudes and feelings”.
Security professionals should understand that there is no single event that alters users’ behaviour – changing security culture requires regular reinforcement, creating and sustaining habits.
Charles Duhigg, in his book The Power of Habit, tells a story about Paul O’Neill, the CEO of the Aluminum Company of America (Alcoa) who was determined to make his enterprise the safest in the country. At first, people were confused that the newly appointed executive was not talking about profit margins or other finance-related metrics. They didn’t see the link between his ‘zero-injuries’ goal and the company’s performance. Despite that, Alcoa’s profits reached a historic high within a year of his announcement. When O’Neill retired, the company’s annual income was five times greater than it had been before his arrival. Moreover, it became one of the safest companies in the world.
Duhigg explains this phenomenon by highlighting the importance of the “keystone habit”. Alcoa’s CEO identified safety as such a habit and focused solely on it.
O’Neill had a challenging goal to transform the company, but he couldn’t just tell people to change their behaviour. He said, “that’s not how the brain works. So I decided I was going to start by focusing on one thing. If I could start disrupting the habits around one thing, it would spread throughout the entire company.”
He recalled an incident when one of his workers died trying to fix a machine despite the safety procedures and warning signs. The CEO called an emergency meeting to understand what had caused this tragic event.
He took personal responsibility for the worker’s death, identifying numerous shortcomings in safety education. For example, the training programme didn’t highlight the fact that employees wouldn’t be blamed for machinery failure or the fact that they shouldn’t commence repair work before finding a manager.
As a result, the policies were updated and the employees were encouraged to suggest safety improvements. Workers, however, went a step further and started suggesting business improvements as well. Changing their behaviour around safety led to some innovative solutions, enhanced communication and increased profits for the company.
Security professionals should understand the importance of group dynamics and influences to build an effective security culture.
They should also remember that just as ‘broken windows’ encourage policy violations, changing one security habit can encourage better behaviour across the board.
Francesca Gino, Shahar Ayal and Dan Ariely, “Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel”, Psychological Science, 20(3), 2009, 393–398.
David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement, Random House, 2011.
Timothy Wilson, Strangers to Ourselves, Harvard University Press, 2004, 212.
Charles Duhigg, The Power of Habit: Why We Do What We Do and How to Change, Random House, 2013.