Trust in People: Macquarie University Cyber Security Industry Workshop

I’ve been invited to share my thoughts on human-centric security at the Macquarie University Cyber Security Industry Workshop.

Drawing on insights from The Psychology of Information Security and my experience in the field, I outlined some of the reasons for friction between security and business productivity and suggested a practical approach to building a better security culture in organisations.

It was great to be able to contribute to the collaboration between the industry, government and academia on this topic.


Ethical cyber security leadership

Picture an easy Sunday morning. It’s sunny and quiet, with only birds chirping outside. You make yourself a cup of coffee and sit on the sofa to catch up on what’s happening in the world. You open your favourite news site and there it is – the first story of the day in large font.

Breaking news: massive data breach! It’s your company in the headline.

This is the modern reality: cyber attacks are becoming increasingly common, and it’s no longer a matter of if but when.

How do you manage this PR nightmare? What do you tell the media? Can you regain the trust of your customers and partners?

These are not the questions you want to be thinking about in the middle of a crisis. The real story begins way before that. It starts with responsible data management practices and securing people’s information.


Collaborating with the Optus Macquarie University Cyber Security Hub

I recently had a chance to collaborate with researchers at the Optus Macquarie University Cyber Security Hub. Their interdisciplinary approach brings together industry practitioners and academics from a variety of backgrounds to tackle the most pressing cyber security challenges our society and businesses face today.

Both academia and industry practitioners can and should learn from each other. The industry can guide problem definition and provide access to data, but it can also learn to apply the scientific method and test its hypotheses. We often assume that the solutions we implement lead to risk reduction, but how this is measured is not always clear. Designing experiments and using research techniques can help bring the necessary rigour when delivering and assessing outcomes.

I had the opportunity to work on some exciting projects: helping build an AI-powered cyber resilience simulator, developing a phone scam detection capability and investigating the role of human psychology in improving authentication protocols. I deepened my understanding of modern machine learning techniques like topic extraction and emotion analysis and how they can be applied to solve real-world problems. I also had the privilege of contributing to a research publication presenting our findings, so watch this space for updates next year.
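For readers curious what topic extraction looks like in practice, here is a minimal sketch using scikit-learn. The transcripts, parameters and outputs are purely illustrative and not drawn from the Hub’s actual projects.

```python
# Minimal topic-extraction sketch (illustrative only, not from the Hub's projects).
# Requires scikit-learn: pip install scikit-learn
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical snippets of scam-call transcripts
transcripts = [
    "your bank account has been compromised please confirm your password",
    "you owe unpaid tax call this number immediately to avoid arrest",
    "confirm your account details or your card will be blocked",
    "the tax office has issued a warrant pay the outstanding amount now",
]

# Bag-of-words representation of the transcripts
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(transcripts)

# Fit a small LDA model to surface two latent "topics"
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words per topic
words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Topic {i}: {', '.join(top)}")
```

On a real corpus of transcripts, clusters of words like these can hint at recurring scam themes worth investigating further.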

The complexity of communication

As someone who has worked for both large multinationals and small tech startups, I’m often asked whether the scale of the organisation matters when building security culture.

I think it does. Managing stakeholders and communication gets increasingly complex in larger organisations. In fact, the number of possible communication paths grows roughly with the square of the number of stakeholders, so every new person added to the network increases the coordination overhead dramatically.
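The usual way to count this is the pairwise formula n(n-1)/2. A quick, purely illustrative calculation shows how fast it grows:

```python
# Pairwise communication paths between n stakeholders: n * (n - 1) / 2
def communication_paths(n: int) -> int:
    return n * (n - 1) // 2

for n in (5, 10, 20, 50):
    print(f"{n} stakeholders -> {communication_paths(n)} possible communication paths")
# 5 -> 10, 10 -> 45, 20 -> 190, 50 -> 1225
```

Going from a five-person startup to a fifty-person stakeholder network multiplies the possible conversations more than a hundredfold, which is why communication that “just happens” in a small company needs deliberate design in a large one.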

I’ve had the privilege of advising a number of smaller companies at the beginning of their journey, and I must admit it’s much more effective to embed secure behaviours from the start. We talk about security by design in the context of technical controls – it’s no different with security culture.

While working as a consultant, I helped large corporations with that challenge too. The key is to start small and focus on the behaviours you want to influence, keeping stakeholder engagement in mind. Active listening, empathy and rapport building are essential – just rolling out an eLearning module is unlikely to be effective.

Book signing

I’ve been asked to sign a large order of my book The Psychology of Information Security and hope that people who receive a copy will appreciate the personal touch!

I wrote this book to help security professionals, and people interested in a career in cyber security, do their jobs better. Not only do we need to help manage cyber security risks, but we also need to communicate effectively in order to be successful. To achieve this, I suggest starting by understanding the wider organisational context of what we are protecting and why.

Communicating often and across functions is essential when developing and implementing a security programme to mitigate identified risks. In the book, I discuss how to engage with colleagues to factor in their experiences and insights to shape security mechanisms around their daily roles and responsibilities. I also recommend orienting security education activities towards the goals and values of individual team members, as well as the values of the organisation.

I also warn against imposing too much security on the business. At the end of the day, the company needs to achieve its business objectives and innovate, albeit securely. The aim should be to educate people about security risks and help colleagues make the right decisions, showing that security is not only important to keep the company afloat or meet a compliance requirement but that it can also be a business enabler. This helps demonstrate to the Board that security contributes to the overall success of the organisation by elevating trust and amplifying the brand message, which in turn leads to happier customers.

Free security awareness training for your staff

NCSC Stay Safe Online

Who needs to buy e-learning modules for employee security awareness programmes when the NCSC has kindly made its training available for free?

NCSC’s Top Tips For Staff includes online videos (which can also be embedded in your own learning management system), a knowledge check and an infographic.

It’s a quick and easy way to get started on the journey of building a security culture in your company and to meet some compliance requirements. This can be especially helpful for startups and non-profits with limited budgets.

Can AI help improve security culture?

I’ve been exploring current applications of machine learning techniques to cyber security. Although there are some strong use cases in the areas of log analysis and malware detection, I couldn’t find the same quantity of research on applying AI to the human side of cyber security.

Can AI be used to support the decision-making process when developing cyber threat prevention mechanisms in organisations and influence user behaviour towards safer choices? Can modelling adversarial scenarios help us better understand and protect against social engineering attacks?

To answer these questions, a multidisciplinary perspective should be adopted, with technologists and psychologists working together with industry and government partners.

While designing such mechanisms, consideration should be given to the fact that many interventions can be perceived by users as negatively impacting their productivity, since they demand additional effort on security and privacy activities not necessarily related to users’ primary tasks [1, 2].

A number of researchers apply principles from behavioural economics to identify cyber security “nudges” (e.g. [3], [4]) or visualisations [5, 6]. These help users make better decisions and minimise perceived effort by moving them away from their default position. The approach is already being applied in the privacy area, for example to reduce Facebook sharing [7] and improve smartphone privacy settings [8], and such interventions are increasingly used at the point of installing mobile applications [9].

The proposed socio-technical approach to reducing cyber threats aims to support the development of responsible and trustworthy people-centred AI solutions that can use data whilst maintaining personal privacy.

A combination of supervised and unsupervised learning techniques is already being employed to predict new threats and malware based on existing patterns. Machine learning techniques can also be used to monitor system and human activity to detect potentially malicious deviations.
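As a purely illustrative sketch (not a description of any particular product), here is how an unsupervised technique such as an isolation forest could flag unusual user activity. The features and data below are made up.

```python
# Minimal sketch of unsupervised anomaly detection on user-activity features
# (illustrative only; feature names and data are hypothetical).
# Requires numpy and scikit-learn.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical features per login session: [hour of day, MB downloaded, failed logins]
normal_sessions = np.column_stack([
    rng.normal(10, 2, 500),   # mostly office hours
    rng.normal(50, 15, 500),  # typical download volume
    rng.poisson(0.2, 500),    # rare failed logins
])

# Learn what "normal" activity looks like
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_sessions)

# Score a suspicious-looking session: 3 a.m., large download, repeated failures
suspicious = np.array([[3, 900, 6]])
print(model.predict(suspicious))  # -1 flags a potential malicious deviation
```

In practice the interesting work is not the model itself but deciding which behaviours to monitor, how to handle false positives and how to present alerts so that people can act on them.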

Building adversarial models, designing empirical studies and running experiments (e.g. using Amazon’s Mechanical Turk) can help better measure the effectiveness of attackers’ techniques and develop better defence mechanisms. I believe there is a need to explore opportunities to utilise machine learning to aid the human decision-making process whereby people are supported by, and work together with, AI to better defend against cyber attacks.

We should draw upon participatory co-design and follow a people-centred approach so that relevant stakeholders are engaged in the process. This can help develop personalised and contextualised solutions, crucial to addressing ethical, legal and social challenges that cannot be solved with AI automation alone.


How to be a trusted advisor

Being a security leader is first and foremost acting as a trusted advisor to the business. This includes understanding its objectives and aligning your efforts to support and enable delivery on the wider strategy.

It is also about articulating cyber risks and opportunities and working with the executive team on managing them. This doesn’t mean, however, that your role is to highlight security weaknesses and leave it to the board to figure it all out. Instead, being someone they can turn to for advice is the best way to influence the direction and make the organisation more resilient in combating cyber threats.

For your advice to be effective, you first need to earn the right to offer it. One of the best books I’ve read on the subject is The Trusted Advisor by David H. Maister. It’s not a new book and it’s written from the perspective of a professional services firm but that doesn’t mean the lessons from it can’t be applied in the security context. It covers the mindset, attributes and principles of a trusted advisor.

Unsurprisingly, the major focus of this work is on developing trust. The author summarises his views on this subject in the trust equation:

Trust = (Credibility + Reliability + Intimacy) / Self-Orientation

It’s a simple yet powerful representation of what contributes to and hinders the trust building process.
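As a toy illustration only – Maister doesn’t prescribe any particular scale – here is how the equation plays out with some made-up scores:

```python
# Toy illustration of Maister's trust equation (the 1-10 scales are arbitrary).
def trust_score(credibility, reliability, intimacy, self_orientation):
    return (credibility + reliability + intimacy) / self_orientation

# A highly credentialed advisor who is mostly focused on being right...
print(trust_score(credibility=9, reliability=8, intimacy=4, self_orientation=8))  # ~2.6
# ...versus a slightly less credentialed one who puts the client first.
print(trust_score(credibility=7, reliability=8, intimacy=7, self_orientation=3))  # ~7.3
```

The point of the equation is not the arithmetic but the reminder that self-orientation sits in the denominator: the more the conversation is about you, the less you are trusted.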

It’s hard to trust someone’s recommendations when they don’t put our interests first and instead are preoccupied with being right or jump to solutions without fully understanding the problem.

Equally, as important as credibility is, a long list of professional qualifications and previous experience is not, on its own, enough to make you trustworthy. Having courage and integrity, following through on your promises and listening actively, among other things, are key. In the words of Maister, “it is not enough to be right, you must also be helpful”.