Building on the connection between breaking security policies and cheating, let’s look at a study[1] that asked participants to solve 20 simple maths problems and promised 50 cents for each correct answer.
The participants were allowed to check their own answers and then shred the answer sheet, leaving no evidence of any potential cheating. Under these conditions, participants reported solving, on average, five more problems than in the control condition, where cheating was not possible.
The researchers then introduced David – a student working with the researchers, who was instructed to raise his hand shortly after the experiment began and proclaim that he had solved all the problems. The other participants were understandably shocked: it was clearly impossible to solve all the problems in only a few minutes. The experimenter, however, didn’t question David’s integrity and simply suggested that he shred his answer sheet and take all the money from the envelope.
Interestingly, the other participants’ behaviour adapted as a result: they reported solving, on average, eight more problems than in the control condition.
Much like the broken windows theory mentioned in my previous blog, this demonstrates that unethical behaviour is contagious – and so are acts of non-compliance. If employees witness others breaking security policies without being punished, they are tempted to do the same. Non-compliance becomes socially acceptable and normal. This is the root cause of poor security culture.
The good news is that the opposite holds true as well. That’s why a security culture needs strong senior management support. Leading by example is the key to changing the perception of security in the company: if employees see that the leadership team takes security seriously, they will follow.
So, security professionals should focus on how security is perceived. In his book The Social Animal, David Brooks outlines the decision-making process in three basic steps:[2]
- People perceive a situation.
- People estimate whether the action is in their long-term interest.
- People use willpower to take action.
He claims that, historically, attempts to change behaviour focused mostly on the last two steps of this process. In the previous blog I argued that relying solely on willpower has a limited effect: willpower can be exercised like a muscle, but it is also prone to atrophy.
As for the second step of the decision-making process, the assumption is that if people are reminded of the potential negative consequences, they will be less likely to take the action. Brooks counters this by pointing to ineffective HIV/AIDS awareness campaigns, which focused only on the negative consequences and ultimately failed to change people’s behaviour.
He also suggests that most diets fail because willpower and reason are not strong enough to confront impulsive desires: “You can tell people not to eat the French fry. You can give them pamphlets about the risks of obesity … In their nonhungry state, most people will vow not to eat it. But when their hungry self rises, their well-intentioned self fades, and they eat the French fry”.
This doesn’t only apply to dieting: when people want to get their job done and security gets in the way, they will circumvent it, regardless of the degree of risk they might expose the company to.
That is why perception is the cornerstone of the decision-making process. Employees have to be taught to perceive security in a way that minimises the temptation to break policies.
In Strangers to Ourselves, Timothy Wilson claims, “One of the most enduring lessons of social psychology is that behaviour change often precedes changes in attitudes and feelings”.[3]
Security professionals should understand that no single event will alter users’ behaviour – changing security culture requires regular reinforcement to create and sustain habits.
Charles Duhigg, in his book The Power of Habit,[4] tells the story of Paul O’Neill, the CEO of the Aluminum Company of America (Alcoa), who was determined to make his enterprise the safest in the country. At first, people were confused that the newly appointed executive was not talking about profit margins or other financial metrics; they didn’t see the link between his ‘zero-injuries’ goal and the company’s performance. Yet Alcoa’s profits reached a record high within a year of his announcement. By the time O’Neill retired, the company’s annual income was five times greater than before his arrival, and Alcoa had become one of the safest companies in the world.
Duhigg explains this phenomenon by highlighting the importance of the “keystone habit”. Alcoa’s CEO identified safety as such a habit and focused solely on it.
O’Neill had set himself the challenging goal of transforming the company, but he couldn’t simply order people to change their behaviour. “That’s not how the brain works,” he said. “So I decided I was going to start by focusing on one thing. If I could start disrupting the habits around one thing, it would spread throughout the entire company.”
He recalled an incident when one of his workers died trying to fix a machine despite the safety procedures and warning signs. The CEO called an emergency meeting to understand what had caused this tragic event.
He took personal responsibility for the worker’s death and identified numerous shortcomings in safety education. For example, the training programme didn’t make clear that employees wouldn’t be blamed for machinery failure, or that they shouldn’t start repair work before finding a manager.
As a result, the policies were updated and the employees were encouraged to suggest safety improvements. Workers, however, went a step further and started suggesting business improvements as well. Changing their behaviour around safety led to some innovative solutions, enhanced communication and increased profits for the company.
Security professionals should understand the importance of group dynamics and social influence in building an effective security culture.
They should also remember that just as ‘broken windows’ encourage policy violations, changing one security habit can encourage better behaviour across the board.
References:
[1] Francesca Gino, Shahar Ayal and Dan Ariely, “Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel”, Psychological Science, 20(3), 2009, 393–398.
[2] David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement, Random House, 2011.
[3] Timothy Wilson, Strangers to Ourselves, Harvard University Press, 2004, 212.
[4] Charles Duhigg, The Power of Habit: Why We Do What We Do and How to Change, Random House, 2013.
To find out more about building a security culture, read Leron’s book, The Psychology of Information Security. Twitter: @le_rond