Design security for people: how behavioural science beats friction

Security failures are rarely a technology problem alone. They’re socio-technical failures: mismatches between how controls are designed and how people actually work under pressure. If you want resilient organisations, start by redesigning security so it fits human cognition, incentives and workflows. Then measure and improve it.

Think like a behavioural engineer

Apply simple behavioural-science tools to reduce errors and increase adoption:

  • Defaults beat persuasion. Make the secure choice the path of least resistance: automatic updates, default multi-factor authentication, managed device profiles, single sign-on with conditional access. Defaults change behaviour at scale without relying on willpower (see the sketch after this list).
  • Reduce friction where it matters. Map high-risk workflows (sales demos, incident response, customer support) and remove unnecessary steps that push people toward risky workarounds (like using unapproved software). Where friction is unavoidable, provide fast, well-documented alternatives.
  • Nudge, don’t nag. Use contextual micro-prompts (like in-app reminders) at the moment of decision rather than one-off training. Framing matters: emphasise how a control helps the person do their job, not just what it prevents.
  • Commitment and incentives. Encourage teams to publicly adopt small security commitments (e.g. “we report suspicious emails”) and recognise them. Social proof is powerful – people emulate peers more than policies.
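
To make the defaults point concrete, here is a minimal Python sketch of a deny-by-default access check. The AccessRequest fields and sensitivity labels are hypothetical illustrations, not any vendor's conditional-access API; the point is that the secure outcome is what users get without opting in to anything.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    mfa_passed: bool       # did the user complete multi-factor authentication?
    device_managed: bool   # is the device enrolled in management?
    app_sensitivity: str   # "low" or "high" -- hypothetical labels

def evaluate(req: AccessRequest) -> str:
    """Deny by default: exceptions are explicit, not a matter of user willpower."""
    if not req.mfa_passed:
        return "deny: MFA is the default, not an opt-in"
    if not req.device_managed and req.app_sensitivity == "high":
        return "deny: unmanaged devices are blocked from sensitive apps"
    return "allow"

print(evaluate(AccessRequest("alex", mfa_passed=True, device_managed=False, app_sensitivity="high")))
```

Notice that the safe branch is the one that requires no decision from the user; the policy only loosens the default when it sees explicit evidence.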

Build trust, not fear

A reporting culture requires psychological safety.

  • Adopt blameless post-incident reviews for honest mistakes; separate malice investigations from learning reviews.
     
  • Be transparent: explain why rules exist, how they are enforced and what happens after a report.
     
  • Lead by example: executives and managers must follow the rules visibly. Norms are set from the top.

Practical programme components

  1. Security champion network. One trained representative per team. Responsibilities: localising guidance, triaging near-misses and feeding back usability problems to the security team.
     
  2. Lightweight feedback loops. Short surveys, near-miss logs and regular champion roundtables to capture usability issues and unearth workarounds.
     
  3. Blameless playbooks. Clear incident reporting channels, response expectations, and public, learning-oriented postmortems.
     
  4. Measure what matters. Track metrics tied to risk and behaviour.
     

Metrics that inform action (not vanity)

Stop counting clicks and start tracking signals that show cultural change and risk reduction:

  • Reporting latency: median time from detection to report. Increasing latency can indicate reduced psychological safety (fear of blame), friction in the reporting path (hard-to-find button) or gaps in frontline detection capability. A drop in latency after a campaign usually signals improved awareness or lowered friction.

Always interpret in context: rising near-miss reports with falling latency can be positive (visibility improving). Review volume and type alongside latency before deciding.
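
As a worked example, here is a minimal Python sketch of the latency calculation. The timestamps are invented for illustration and would come from your incident log in practice.

```python
from datetime import datetime
from statistics import median

# Hypothetical (detected_at, reported_at) pairs from an incident log.
events = [
    ("2024-03-01 09:10", "2024-03-01 09:25"),
    ("2024-03-02 14:00", "2024-03-02 16:40"),
    ("2024-03-05 11:05", "2024-03-05 11:20"),
]

fmt = "%Y-%m-%d %H:%M"
latencies_min = [
    (datetime.strptime(reported, fmt) - datetime.strptime(detected, fmt)).total_seconds() / 60
    for detected, reported in events
]

# The median resists a few very late reports skewing the picture, unlike the mean.
print(f"median reporting latency: {median(latencies_min):.0f} minutes")
```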

  • Inquiry rate: the number of proactive security inquiries per period (help requests, pre-deployment checks, risk questions). An increase usually signals growing trust and willingness to engage with security; a sustained fall may indicate rising friction, unresponsiveness or fear.

If the rate rises sharply with no matching reduction in incidents, check whether confusion is driving the questions (update the docs) or whether new features genuinely need security approvals (streamline the process).
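
A minimal sketch of how you might watch for that sustained fall, assuming month labels exported from a ticketing system; the counts and the 60% threshold are illustrative, not a standard.

```python
from collections import Counter

# Hypothetical month labels, one entry per proactive inquiry.
inquiries = ["2024-01"] * 8 + ["2024-02"] * 7 + ["2024-03"] * 3

monthly = [count for _, count in sorted(Counter(inquiries).items())]
baseline = sum(monthly[:-1]) / len(monthly[:-1])  # trailing average of earlier months

# React to a sustained fall against the baseline, not single-month noise.
if monthly[-1] < 0.6 * baseline:
    print(f"inquiry rate down: {monthly[-1]} vs baseline {baseline:.1f}; investigate friction or fear")
```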

  • Confidence and impact: employees’ reported confidence to perform required security tasks (backups, secure file sharing, suspicious email reporting) and their belief that those actions produce practical organisational outcomes (risk reduction, follow-up action, leadership support).

An increase may signal stronger capability and a growing sense that security actions are effective; a decrease may point to skills gaps, tooling or access friction, or a perception that actions don't lead to change.

Metrics should prompt decisions (e.g., simplify guidance if dwell time on key security pages is low, fund an automated patching project if mean time to remediate is unacceptable), not decorate slide decks.

Experiment, measure, repeat

Treat culture change like product development: hypothesis → experiment → measure → adjust. Run small pilots (one business unit, one workflow), measure impact on behaviour and operational outcomes, then scale the successful patterns.
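
As a sketch of what "measure impact" can look like, here is a before-and-after comparison of median reporting latency for a single pilot. The numbers and the one-click reporting button are hypothetical, and the success threshold is something you would agree up front, not a standard value.

```python
from statistics import median

# Hypothetical latencies (minutes) before and during a pilot of a
# one-click "report phish" button in one business unit.
baseline_min = [45, 120, 30, 240, 60, 90]
pilot_min = [20, 35, 15, 50, 25, 40]

change = median(pilot_min) - median(baseline_min)
print(f"median latency change: {change:+.0f} minutes")

# Agreeing the success criterion before the pilot keeps it honest.
if change <= -15:
    print("pattern worth scaling to the next business unit")
```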

Things you can try this month

  • Map 3 high-risk workflows and design safer fast paths.
  • Stand up a security champion pilot in two teams.
  • Change one reporting process to be blameless and measure reporting latency.
  • Implement or verify secure defaults for identity and patching.
  • Define 3 meaningful metrics and publish baseline values.
     

When security becomes the way people naturally work, supported by defaults, fast safe paths and a culture that rewards reporting and improvement, it stops being an obstacle and becomes an enabler. That’s the real return on investment: fewer crises, faster recovery and the confidence to innovate securely.

If you’d like to learn more, check out the second edition of The Psychology of Information Security for practical guidance on building a positive security culture.

Security is a social design problem, not a tech one

I’m super proud to have written this book. It’s the much improved second edition – and I can’t wait to hear what you think about it.

https://amzn.asia/d/9lr6Sd9

Please leave an Amazon review if you can – this really helps beat the algorithm, and is much appreciated!

Building resilience and sustainable performance

I had the pleasure of sharing some practical lessons on building resilience at the Cybersecurity Summit.

I touched on sustainable performance strategies and the importance of body, emotions, mind and purpose in preventing burnout.

Protecting systems starts with protecting the people who run them.

Volunteering as a telephone crisis supporter

The festive period can bring joy, but it can also be a time of loneliness and stress, which is why it’s so important to check in with ourselves and others.

One way I’ve had the chance to contribute is through volunteering as a telephone crisis supporter with Lifeline Australia. I’ve been answering calls from people who may be facing one of the toughest moments of their lives. Every conversation reinforces the power of simply being there for someone when they need it most.

One of the most moving parts of this role is hearing the shift in a caller’s voice – from distress to a sense of calm – because they feel heard, supported and not alone. It’s a small moment that can make a big difference.

As we head into the holidays, remember that you’re not alone either. If you’re struggling, reach out – whether to a friend, family member or a service like Lifeline. And if you’re looking for a meaningful way to give back, I can’t recommend volunteering with Lifeline enough. It’s been one of the most rewarding experiences of my life.

Take care of yourself and those around you this holiday season. Let’s make kindness, connection and understanding the greatest gifts we give.

Developing effective negotiation skills

Negotiation is a core skill that can make or break your success as a CISO.

While technical expertise is important, it’s equally critical to recognise the value of negotiation skills in cyber security leadership. By developing and applying strong negotiation skills, you’ll be better equipped to lead your organisation in an increasingly complex and challenging cyber security landscape.

I recently completed a negotiation workshop run by Filip Hron and highly recommend both him as a facilitator and his book ‘Negotiations Evolved’. I particularly appreciate his focus on ethics and value creation.

In this blog, I outline how some of these skills can be applied in a cybersecurity context.


Trust in People: Macquarie University Cyber Security Industry Workshop

I was invited to share my thoughts on human-centric security at the Macquarie University Cyber Security Industry Workshop.

Drawing on insights from The Psychology of Information Security and my experience in the field, I outlined some of the reasons for friction between security and business productivity and suggested a practical approach to building a better security culture in organisations.

It was great to be able to contribute to the collaboration between the industry, government and academia on this topic.

Behavioural science in cyber security

Why your staff ignore security policies and what to do about it.               

Dale Carnegie’s 1936 bestselling self-help book How To Win Friends And Influence People is one of those titles that sits unloved and unread on most people’s bookshelves. But dust off its cover and crack open its spine, and you’ll find lessons and anecdotes that are relevant to the challenges associated with shaping people’s behaviour when it comes to cyber security.

In one chapter, Carnegie tells the story of George B. Johnson, from Oklahoma, who worked for a local engineering company. Johnson’s role required him to ensure that other employees abided by the organisation’s health and safety policies. Among other things, he was responsible for making sure workers wore their hard hats on the factory floor.

His strategy was as follows: if he spotted someone not following the company’s policy, he would approach them, admonish them, quote the regulation at them and insist on compliance. And it worked, albeit briefly. The employee would put on their hard hat, and as soon as Johnson left the room, they would just as quickly remove it.

So he tried something different: empathy. Rather than addressing them from a position of authority, Johnson spoke to his colleagues almost as though he were their friend and expressed a genuine interest in their comfort. He asked whether the hats were uncomfortable to wear, and whether that was why people didn’t wear them on the job.

Instead of reciting the rules chapter and verse, he simply mentioned that it was in employees’ best interest to wear their helmets, because they were designed to prevent workplace injuries.

This shift in approach bore fruit, and workers felt more inclined to comply with the rules. Moreover, Johnson observed that employees were less resentful of management.

The parallels between cyber security and George B. Johnson’s battle to ensure health-and-safety compliance are immediately obvious. Our jobs require us to adequately address the security risks that threaten the organisations we work for. To be successful at this, it’s important to ensure that everyone appreciates the value of security — not just engineers, developers, security specialists, and other related roles.

This isn’t easy. On the one hand, failing to implement security controls can result in an organisation facing significant losses. On the other, badly implemented security mechanisms can be worse: they obstruct employee productivity or foster a culture in which security is resented.

To ensure widespread adoption of secure behaviour, security policies and control implementations not only have to accommodate the needs of those who use them, but must also be economically attractive to the organisation. To realise this, there are three factors we need to consider: motivation, design and culture.


How to Create a Security Culture at the Workplace

October is National Cyber Security Awareness Month (NCSAM), which is designed to engage and educate public- and private-sector partners through events and initiatives that raise awareness about cybersecurity.

I’ve been asked to share my views on creating a security culture at the workplace with The State of Security.

I believe the goal is not to teach tricks, but to create a new culture that is accepted and understood by everyone. To do so effectively, messages need to be designed and delivered according to each type of employee: there is no such thing as a one-size-fits-all security campaign. Questions that must always be answered include: What are the benefits? Why does it matter, and why should I care? What impact do my actions have?

Security campaigns must discard scare tactics such as threatening employees with sanctions for breaches. Campaigns should be oriented towards the users’ goals and values, as well as the values of the organisation, such as professionalism and delivery.

A security campaign should emphasise that employees can cause serious damage to an organisation through non-compliant behaviour, even when it appears insignificant. They should understand that they bear some responsibility for the security of the organisation and its exposure to risk.

Furthermore, the entire organisation needs to perceive security as bringing value to the company, as opposed to being an obstacle preventing employees from doing their job. It is important for employees to understand that they contribute to the smooth and efficient operation of business processes when they follow recommended security practices, just as security enables the availability of resources that support these processes.

The Psychology of Information Security book reviews


I wrote about my book in the previous post. Here I would like to share what others have to say about it.

So often information security is viewed as a technical discipline – a world of firewalls, anti-virus software, access controls and encryption. An opaque and enigmatic discipline which defies understanding, with a priesthood who often protect their profession with complex concepts, language and most of all secrecy.

Leron takes a practical, pragmatic and no-holds barred approach to demystifying the topic. He reminds us that ultimately security depends on people – and that we all act in what we see as our rational self-interest – sometimes ill-informed, ill-judged, even downright perverse.

No approach to security can ever succeed without considering people – and as a profession we need to look beyond our computers to understand the business, the culture of the organisation – and most of all, how we can create a security environment which helps people feel free to actually do their job.
David Ferbrache OBE, FBCS
Technical Director, Cyber Security
KPMG UK

This is an easy-to-read, accessible and simple introduction to information security. The style is straightforward, and calls on a range of anecdotes to help the reader through what is often a complicated and hard-to-penetrate subject. Leron approaches the subject from a psychological angle and will be appealing to both those of a non-technical and a technical background.
Dr David King
Visiting Fellow of Kellogg College
University of Oxford
