Design security for people: how behavioural science beats friction

Security failures are rarely a technology problem alone. They’re socio-technical failures: mismatches between how controls are designed and how people actually work under pressure. If you want resilient organisations, start by redesigning security so it fits human cognition, incentives and workflows. Then measure and improve it.

Think like a behavioural engineer

Apply simple behavioural-science tools to reduce errors and increase adoption:

  • Defaults beat persuasion. Make the secure choice the path of least resistance: automatic updates, default multi-factor authentication, managed device profiles, single sign-on with conditional access (see the sketch after this list). Defaults change behaviour at scale without relying on willpower.
  • Reduce friction where it matters. Map high-risk workflows (sales demos, incident response, customer support) and remove unnecessary steps that push people toward risky workarounds (like using unapproved software). Where friction is unavoidable, provide fast, well-documented alternatives.
  • Nudge, don’t nag. Use contextual micro-prompts (like in-app reminders) at the moment of decision rather than one-off training. Framing matters: emphasise how a control helps the person do their job, not just what it prevents.
  • Commitment and incentives. Encourage teams to publicly adopt small security commitments (e.g. “we report suspicious emails”) and recognise them. Social proof is powerful – people emulate peers more than policies.
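
To make the defaults point concrete, here is a minimal policy-as-code sketch. It is illustrative only: `SignInContext` and `evaluate_access` are invented names, not any vendor’s conditional-access API, and real policies would live in your identity provider.

```python
from dataclasses import dataclass

@dataclass
class SignInContext:
    """Hypothetical sign-in attributes a conditional-access check might see."""
    user: str
    device_managed: bool  # device enrolled in a managed profile
    mfa_enrolled: bool    # user has set up multi-factor authentication
    app_risk: str         # "low" or "high"; e.g. admin consoles are "high"

def evaluate_access(ctx: SignInContext) -> str:
    """Secure by default: the easiest path is the safe one; exceptions are explicit."""
    if not ctx.mfa_enrolled:
        return "enrol_mfa"  # default everyone into MFA at first sign-in
    if ctx.app_risk == "high" and not ctx.device_managed:
        return "deny"       # high-risk apps only from managed devices
    return "allow"          # no extra friction on the common, safe path

print(evaluate_access(SignInContext("amy", device_managed=True,
                                    mfa_enrolled=True, app_risk="low")))  # allow
```

The design point: the "allow" branch, the one people hit most often, asks nothing of the user’s willpower; the secure configuration is simply where the path of least resistance leads.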

Build trust, not fear

A reporting culture requires psychological safety.

  • Adopt blameless post-incident reviews for honest mistakes; separate malice investigations from learning reviews.
     
  • Be transparent: explain why rules exist, how they are enforced and what happens after a report.
     
  • Lead by example: executives and managers must follow the rules visibly. Norms are set from the top.

Practical programme components

  1. Security champion network. One trained representative per team. Responsibilities: localising guidance, triaging near-misses and feeding back usability problems to the security team.
     
  2. Lightweight feedback loops. Short surveys, near-miss logs and regular champion roundtables to capture usability issues and unearth workarounds.
     
  3. Blameless playbooks. Clear incident reporting channels, response expectations, and public, learning-oriented postmortems.
     
  4. Measure what matters. Track metrics tied to risk and behaviour.
     

Metrics that inform action (not vanity)

Stop counting clicks and start tracking signals that show cultural change and risk reduction:

  • Reporting latency: median time from detection to report. Increasing latency can indicate reduced psychological safety (fear of blame), friction in the reporting path (a hard-to-find button) or gaps in frontline detection capability. A drop in latency after a campaign usually signals improved awareness or lowered friction.

Always interpret in context: rising near-miss reports with falling latency can be positive (visibility improving). Review volume and type alongside latency before deciding.

  • Inquiry rate: median number of proactive security inquiries per period (help requests, pre-deployment checks, risk questions). An increase usually signals growing trust and willingness to engage with security; a sustained fall may indicate rising friction, unresponsiveness or fear.

If the rate rises sharply with no matching incident reduction, validate whether confusion is driving the questions (update the docs) or whether new features need security approvals (streamline the process).

  • Confidence and impact: employees’ reported confidence to perform required security tasks (backups, secure file sharing, suspicious email reporting) and their belief that those actions produce practical organisational outcomes (risk reduction, follow-up action, leadership support).

An increase may signal stronger capability and perceived efficacy of security actions, while a decrease points to skills gaps, tooling or access friction, or a perception that actions don’t lead to change.

Metrics should prompt decisions (e.g., simplify guidance if dwell time on key security pages is low, fund an automated patching project if mean time to remediate is unacceptable), not decorate slide decks.
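
As a rough illustration of how these signals could be computed, here is a minimal sketch over a simple event log. The record shape (`ReportEvent`) and the field names are assumptions made for the example, not any particular tool’s schema; in practice the data would come from ticketing or SIEM exports.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median

@dataclass
class ReportEvent:
    """One near-miss or incident report, as a team might log it (illustrative)."""
    detected_at: datetime  # when the person first noticed something odd
    reported_at: datetime  # when it reached the security team

def reporting_latency_hours(events: list[ReportEvent]) -> float:
    """Median detection-to-report time; rising values may hint at fear or friction."""
    return median(
        (e.reported_at - e.detected_at).total_seconds() / 3600 for e in events
    )

def inquiry_rate(inquiries_per_period: list[int]) -> float:
    """Median proactive inquiries (help requests, pre-deployment checks) per period."""
    return median(inquiries_per_period)

events = [
    ReportEvent(datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 10, 30)),
    ReportEvent(datetime(2024, 5, 2, 14, 0), datetime(2024, 5, 3, 9, 0)),
]
print(f"median reporting latency: {reporting_latency_hours(events):.2f}h")  # 10.25h
print(f"median inquiry rate: {inquiry_rate([4, 7, 5, 9])}")                 # 6.0
```

Trended per team and per quarter, and read alongside report volume and type as noted above, numbers like these prompt decisions rather than decorate slides.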

Experiment, measure, repeat

Treat culture change like product development: hypothesis → experiment → measure → adjust. Run small pilots (one business unit, one workflow), measure impact on behaviour and operational outcomes, then scale the successful patterns.
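
As a hedged sketch of the measure step, reusing the latency metric above: compare a pilot unit against a control group on median reporting latency. The decision threshold and the data below are invented for illustration; a real read-out would need more samples and a proper significance test.

```python
from statistics import median

def pilot_readout(pilot_hours: list[float], control_hours: list[float]) -> str:
    """Crude decision rule: scale the pilot only if median latency clearly improved."""
    delta = median(control_hours) - median(pilot_hours)
    if delta > 1.0:  # illustrative threshold: at least 1h median improvement
        return f"scale: pilot reports {delta:.1f}h faster at the median"
    return "adjust: no clear improvement, revisit the hypothesis"

# Invented reporting latencies (hours) for a blameless-reporting pilot vs. control
print(pilot_readout(pilot_hours=[0.5, 1.2, 2.0, 0.8],
                    control_hours=[3.0, 5.5, 2.5, 8.0]))
# -> scale: pilot reports 3.2h faster at the median
```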

Things you can try this month

  • Map 3 high-risk workflows and design safer fast paths.
  • Stand up a security champion pilot in two teams.
  • Change one reporting process to be blameless and measure reporting latency.
  • Implement or verify secure defaults for identity and patching.
  • Define 3 meaningful metrics and publish baseline values.
     

When security becomes the way people naturally work – supported by defaults, fast, safe paths and a culture that rewards reporting and improvement – it stops being an obstacle and becomes an enabler. That’s the real return on investment: fewer crises, faster recovery and the confidence to innovate securely.

If you’d like to learn more, check out the second edition of The Psychology of Information Security for more practical guidance on building a positive security culture.

Security is a social design problem, not a tech one

I’m super proud to have written this book. It’s the much-improved second edition – and I can’t wait to hear what you think about it.

https://amzn.asia/d/9lr6Sd9

Please leave an Amazon review if you can – this really helps beat the algorithm, and is much appreciated!

Navigating the endless sea of threats

Cyber security is a relentless race to keep pace with evolving threats, where staying ahead isn’t always possible. Advancing cyber maturity demands more than just reactive measures – it requires proactive strategies, cultural alignment and a deep understanding of emerging risks.

I had an opportunity to share my thoughts on staying informed about threats, defining cyber maturity and aligning security metrics with business goals with Corinium’s Maddie Abe, ahead of my appearance as a speaker at CISO Sydney next month.


Reflecting on a transformative week in Dubbo

I just spent an incredible week immersed in Aboriginal culture, where I had the privilege of working shoulder to shoulder with First Nations organisations as part of my AGSM Executive MBA journey.

This experiential learning project allowed me to take the academic knowledge from all my previous MBA courses and apply it in real-world contexts. What a great way to wrap up the program!

It was also an opportunity to deliver the final client presentation to Indigiearth, a 100% Aboriginal-owned native foods business, concluding the capstone strategic consulting engagement we’ve been working on this term.

Learning directly from Elders and community members enriched my understanding of Aboriginal traditions, values and the profound connection to land that underpins Indigenous enterprises. I’m proud to have been a part of this journey, bringing together cultural respect and strategic vision.

Collaborating with the enemy: key lessons for cyber security

In cyber security, collaboration is essential. With growing complexity in the threat landscape, leaders often find themselves working with parties they may not fully align with – whether internal teams, external stakeholders or even rival firms.

Adam Kahane’s book Collaborating with the Enemy: How to Work with People You Don’t Agree with or Like or Trust outlines principles for collaborating effectively, especially in challenging environments where trust and agreement are minimal. Kahane’s “stretch collaboration” approach can transform the way cyber security leaders address conflicts and turn rivals into partners to meet critical security goals. In this blog, I’ll share my key takeaways.


Inclusion and accessibility: shaping culture and driving business outcomes

I’m grateful to have had an opportunity to continue to learn and contribute to the important discussion on building a culture of diversity, inclusion and accessibility in cyber security.

I like being on panels like this because it gives me an opportunity to share my views and continue to educate myself not only through research but also through lived experiences.

I believe shaping an inclusive culture begins with creating awareness of the barriers to diversity and inclusion. Accessibility is an important consideration: testing new systems and processes with people with accessibility needs is key to discovering where issues may exist.

The best way to make security more accessible is to engage with the people who interact with it. Treating usability and accessibility alongside other security requirements, rather than as a separate item, helps ensure they are built in from the start.

Trust in People: Macquarie University Cyber Security Industry Workshop

I’ve been invited to share my thoughts on human-centric security at the Macquarie University Cyber Security Industry Workshop.

Drawing on insights from The Psychology of Information Security and my experience in the field, I outlined some of the reasons for friction between security and business productivity and suggested a practical approach to building a better security culture in organisations.

It was great to be able to contribute to the collaboration between industry, government and academia on this topic.

Ethical cyber security leadership

Picture an easy Sunday morning. It’s sunny and quiet, with only birds chirping outside. You make yourself a cup of coffee and sit on the sofa to catch up on what’s happening in the world. You open your favourite news site and there it is – the first story of the day, in large font.

Breaking news: massive data breach! It’s your company in the headline.

This is the modern reality: cyber attacks are becoming increasingly common, and it’s no longer a matter of if but when.

How do you manage this PR nightmare? What do you tell the media? Can you regain the trust of your customers and partners?

These are not the questions you want to be thinking about in the middle of a crisis. The real story begins way before that. It starts with responsible data management practices and securing people’s information.


Collaborating with the Optus Macquarie University Cyber Security Hub

I recently had a chance to collaborate with researchers at the Optus Macquarie University Cyber Security Hub. Their interdisciplinary approach brings together industry practitioners and academics from a variety of backgrounds to tackle the most pressing cyber security challenges our society and businesses face today.

Both academia and industry practitioners can and should learn from each other. Industry can guide problem definition and provide access to data, but it can also learn to apply the scientific method and test its hypotheses. We often assume the solutions we implement lead to risk reduction, but how this is measured is not always clear. Designing experiments and using research techniques can help bring the necessary rigour when delivering and assessing outcomes.

I had an opportunity to work on some exciting projects: helping to build an AI-powered cyber resilience simulator, developing a phone scam detection capability and investigating the role of human psychology in improving authentication protocols. I deepened my understanding of modern machine learning techniques, such as topic extraction and emotion analysis, and how they can be applied to solve real-world problems. I also had the privilege of contributing to a research publication presenting our findings, so watch this space for some updates next year.