On this #WorldBookDay my publisher named me Author of the Month and is kindly offering a 15% discount on The Psychology of Information Security with the code Leron15.
They also wrote a nice blog about it.
In the case of cyber security, this begins with understanding why current security practices might not be effective and why people often find workarounds rather than follow security processes.
Security professionals have access to amounts of data never seen before. Antivirus software, firewalls, data loss prevention solutions – they all generate a staggering number of alerts.
Security operations centres and the underlying SIEM technology allow us to aggregate, correlate and make sense of these vast troves of data. We can create dashboards and metrics that might look slick and even be useful to security teams, but does such data add value to business stakeholders? Does it tell a story to the Board?
I’ve been asked to sign a large order of my book The Psychology of Information Security and hope that people who receive a copy will appreciate the personal touch!
I wrote this book to help security professionals and people who are interested in a career in cyber security to do their job better. Not only do we need to help manage cyber security risks, but also communicate effectively in order to be successful. To achieve this, I suggest starting by understanding the wider organisational context of what we are protecting and why.
Communicating often and across functions is essential when developing and implementing a security programme to mitigate identified risks. In the book, I discuss how to engage with colleagues to factor in their experiences and insights to shape security mechanisms around their daily roles and responsibilities. I also recommend orienting security education activities towards the goals and values of individual team members, as well as the values of the organisation.
I also warn against imposing too much security on the business. At the end of the day, the company needs to achieve its business objectives and innovate, albeit securely. The aim should be to educate people about security risks and help colleagues make the right decisions, showing that security is not only important to keep the company afloat or meet a compliance requirement but that it can also be a business enabler. This helps demonstrate to the Board that security contributes to the overall success of the organisation by elevating trust and amplifying the brand message, which in turn leads to happier customers.
While in lockdown in the UK, I was reflecting on the many countries I have had the chance to work in and travel to in my career so far.
Following up on my previous blog on lessons I learned from working in cyber security across multiple sectors, geography is another lens I can apply when thinking about the diversity of cyber security projects I have had the chance to contribute to and the people I have worked with.
It was a pleasure to serve clients and collaborate with colleagues from all over the world and this really gave me a global perspective on cyber security challenges across continents.
I was fortunate to visit more places than are shown on this map; the ones I selected here were the most memorable and challenging from a project perspective.
There are still plenty of countries I have yet to explore, so I’m eagerly awaiting the time when it’s safe to travel again.
I have been fortunate to help and collaborate with a wide variety of organisations during my cyber security career to date. These companies range from large multinationals that are household names to small tech startups that you probably haven’t even heard of.
Although the regulatory landscape, security maturity and key risks often vary dramatically between industries, there are common themes that both an upstart FinTech and an energy giant can benefit from.
Being able to see what works, for example, in the world of Operational Technology and apply some of the learnings to an insurance company and vice versa can bring a fresh perspective and result in unique solutions that can be easily overlooked in traditional sector-specific paradigms. Identifying these synergies and collaboration opportunities between organisations of different sizes, industries, cultures and technological stacks has allowed me to better understand specific issues, challenge the conventional thinking and tailor my advice to fit the overall strategy of a given organisation for best results.
I recently passed the AZ-500: Microsoft Azure Security Technologies exam and earned the
Microsoft Certified: Azure Security Engineer Associate credential. In this blog I would like to share some tips that will help you prepare and ace it too.
Who needs to buy e-learning modules for employee security awareness programmes when NCSC kindly made available their training for free?
NCSC’s Top Tips For Staff includes online videos (which can also be embedded in your own learning management system), a knowledge check and an infographic.
It’s a quick and easy way to get started on the journey of building a security culture in your company and meeting some compliance requirements. This can be especially helpful for startups and non-profits with limited budgets.
I’ve been exploring the current application of machine learning techniques to cybersecurity. Although there are some strong use cases in the areas of log analysis and malware detection, I couldn’t find the same quantity of research on applying AI to the human side of cybersecurity.
Can AI be used to support the decision-making process when developing cyber threat prevention mechanisms in organisations and influence user behaviour towards safer choices? Can modelling adversarial scenarios help us better understand and protect against social engineering attacks?
To answer these questions, a multidisciplinary perspective should be adopted with technologists and psychologists working together with industry and government partners.
While designing such mechanisms, consideration should be given to the fact that many interventions can be perceived by users as negatively impacting their productivity, as they demand additional effort to be spent on security and privacy activities not necessarily related to their primary activities [1, 2].
A number of researchers apply principles from behavioural economics to design cyber security “nudges” or visualisations [5, 6]. This approach helps users make better decisions and minimises perceived effort by moving them away from their default position. The method is already being applied in the privacy area, for example to reduce oversharing on Facebook and to improve smartphone privacy settings, and such nudges are increasingly used as interventions, particularly at the point of installing mobile applications.
The proposed socio-technical approach to the reduction of cyber threats aims to account for the development of responsible and trustworthy people-centred AI solutions that can use data whilst maintaining personal privacy.
A combination of supervised and unsupervised learning techniques is already being employed to predict new threats and malware based on existing patterns. Machine learning techniques can be used to monitor system and human activity to detect potential malicious deviations.
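To make the unsupervised side of this concrete, here is a minimal sketch of baselining human activity and flagging deviations. It uses a simple z-score over hourly login counts rather than a full machine learning pipeline; the sample data and the threshold are illustrative assumptions, not taken from any real deployment:

```python
import statistics

def flag_anomalies(event_counts, threshold=2.5):
    """Return indices of counts that deviate strongly from the baseline.

    A count is flagged when its z-score (distance from the mean,
    measured in standard deviations) exceeds the threshold.
    """
    mean = statistics.mean(event_counts)
    stdev = statistics.stdev(event_counts)
    if stdev == 0:
        return []  # perfectly uniform activity, nothing stands out
    return [i for i, c in enumerate(event_counts)
            if abs(c - mean) / stdev > threshold]

# A baseline of roughly 20 logins per hour, with one burst at index 5
logins = [19, 21, 20, 18, 22, 95, 20, 19, 21, 20]
print(flag_anomalies(logins))  # → [5]
```

In practice the same idea is applied over richer features and learned models, but the principle is the same: establish what normal looks like, then surface the deviations for a human analyst to judge.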
Building adversarial models, designing empirical studies and running experiments (e.g. using Amazon’s Mechanical Turk) can help better measure the effectiveness of attackers’ techniques and develop better defence mechanisms. I believe there is a need to explore opportunities to utilise machine learning to aid the human decision-making process whereby people are supported by, and work together with, AI to better defend against cyber attacks.
We should draw upon participatory co-design and follow a people-centred approach so that relevant stakeholders are engaged in the process. This can help develop personalised and contextualised solutions, crucial to addressing ethical, legal and social challenges that cannot be solved with AI automation alone.
Being a security leader is first and foremost acting as a trusted advisor to the business. This includes understanding its objectives and aligning your efforts to support and enable delivery on the wider strategy.
It is also about articulating cyber risks and opportunities and working with the executive team on managing them. This doesn’t mean, however, that your role is to highlight security weaknesses and leave it to the board to figure it all out. Instead, being someone they can turn to for advice is the best way to influence the direction and make the organisation more resilient in combating cyber threats.
For your advice to be effective, you first need to earn the right to offer it. One of the best books I’ve read on the subject is The Trusted Advisor by David H. Maister. It’s not a new book and it’s written from the perspective of a professional services firm, but that doesn’t mean its lessons can’t be applied in a security context. It covers the mindset, attributes and principles of a trusted advisor.
Unsurprisingly, the major focus of this work is on developing trust. The author summarises his views on this subject in the trust equation:
Trust = (Credibility + Reliability + Intimacy) / Self-Orientation
It’s a simple yet powerful representation of what contributes to and hinders the trust building process.
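To see why the equation places self-orientation in the denominator, here is a toy illustration; the 1-to-10 scoring scale and the example values are my own assumptions, not part of Maister’s book:

```python
def trust_score(credibility, reliability, intimacy, self_orientation):
    """Maister's trust equation: self-orientation divides, so even a
    highly credible advisor erodes trust by focusing on themselves."""
    return (credibility + reliability + intimacy) / self_orientation

# Two advisors with identical credentials (scored 1-10); the second
# is focused on the client rather than on themselves
print(trust_score(8, 8, 6, 8))  # self-absorbed: 2.75
print(trust_score(8, 8, 6, 2))  # client-focused: 11.0
```

The numbers themselves don’t matter; the point is that lowering self-orientation moves trust far more than polishing any single strength in the numerator.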
It’s hard to trust someone’s recommendations when they don’t put our interests first and instead are preoccupied with being right or jump to solutions without fully understanding the problem.
Equally, as important as credibility is, a long list of professional qualifications and previous experience is not on its own sufficient to be trustworthy. Having courage and integrity, following through on your promises and listening actively, among other things, are key. In the words of Maister, “it is not enough to be right, you must also be helpful”.