The evolution of cyber security – Defence Online

I wrote for Defence Online on transparency in cyber security. Feel free to check out the article on their website.


[Podcast] Interview about The Psychology of Information Security

In an interview with IT Governance Publishing about my book, I discussed the current challenges in building a security culture and approaches to tackling them. I also talked about why I do what I do. I hope you enjoy it.


What can a US Army General teach us about security?

General Douglas MacArthur said “never give an order that can’t be obeyed”. This is sound advice, as doing so can diminish the commander’s authority. If people want to do what you are asking them to do, but can’t, they will doubt your judgement in the future.

Although most of us operate in commercial organisations rather than the US Army, there are lessons to be learned from this.

Security professionals don’t need to rally their troops and rarely operate in command-and-control environments. Their role has largely shifted to that of an advisor to the business on managing cyber risk. Yet all too often the advice they give is misguided. In an effort to protect the business, they sometimes fail to grasp the wider context in which it operates. More importantly, they rarely consider their colleagues who will have to follow their guidance.

Angela Sasse gives a brilliant example of this when she talks about phishing. Security professionals expect people to be able to identify a phishing email in order to keep the company secure. Through numerous awareness sessions they tell them how dangerous it is to click on a link in a phishing email.

Although it makes sense to some extent, it’s not helpful to expect people to be able to recognise a phishing email 100% of the time. In fact, many information security professionals might struggle to make that distinction themselves, especially when it comes to more sophisticated cases of spear phishing. So how can we expect people who are not information security specialists to measure up?

To make matters worse, most modern enterprises depend on email with links to be productive. It is considered normal, and part of business as usual, to receive an email and click on the link in it. I heard of a scenario where a company hired an external agency and paid good money to survey their employees. Despite advance warnings, engagement with the survey suffered because people kept reporting these external emails as “phishing attempts”. The communications team was not pleased, and that certainly didn’t help establish a productive relationship with the security team.

The bottom line is that if your defences depend on people not clicking on links, you can do better than that. The aim is not to punish people when they make a mistake, but to build trust. The security team should therefore be there to support people and recognise their challenges rather than police them.

After all, when someone does eventually click on a malicious link, it’s much better if they pick up the phone to the security team and admit their mistake rather than hope it doesn’t get noticed. Not only does this speed up incident response, it fosters the role of the security professional as a business enabler, rather than a commander who keeps giving orders that can’t be obeyed.


Vulnerability scanning gone bad

Security teams often have good intentions when they want to improve the security posture of a company by introducing new tools.

In one organisation, for example, they wanted to mitigate the risk of application vulnerabilities being exploited and decided to deploy a code-scanning tool. This would ensure that applications were tested for exploitable flaws before release. A great idea, but the uptake of the tool was surprisingly low, and its introduction created a lot of friction.

On closer examination, it turned out that this was primarily due to poor communication with the development teams that would need to use the tool. The impacted teams weren’t sufficiently trained in its use, and there wasn’t enough support from management to drive adoption.

Development teams work to tight timelines and budgets in order to meet business objectives. Anything that might disrupt either is viewed with caution.

As a result, applications that should have had their code scanned either weren’t scanned at all, or were scanned at a much later stage of the development cycle. Scanning was not incorporated into the DevOps pipeline; instead, scans were run as a manual check before release into production. Not only did the risk of shipping applications with flaws remain largely unchanged, the whole process of delivering working software was prolonged.
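
For contrast, here is a minimal sketch of what “incorporated into the DevOps pipeline” might look like: a gate that runs the scanner on every build and fails it only on severities the development teams have agreed to block. The scan-tool command, its flags and its JSON output format are hypothetical stand-ins for whichever scanner is actually in use.

```python
#!/usr/bin/env python3
"""Sketch of a CI gate that runs a code scan on every build.

`scan-tool`, its flags and its JSON output are hypothetical
placeholders for whatever scanner the team has adopted.
"""
import json
import subprocess
import sys

# Severities that block a release, agreed with the development teams
BLOCKING_SEVERITIES = {"critical", "high"}


def run_scan(path: str) -> list:
    """Invoke the (hypothetical) scanner and parse its JSON findings."""
    result = subprocess.run(
        ["scan-tool", "--format", "json", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout).get("findings", [])


def main() -> int:
    findings = run_scan(".")
    blocking = [f for f in findings if f.get("severity") in BLOCKING_SEVERITIES]
    for finding in blocking:
        print(f"{finding['severity'].upper()}: {finding['title']} in {finding['file']}")
    # Fail the build only on the agreed severities, so routine
    # development work isn't blocked by low-priority noise.
    return 1 if blocking else 0


if __name__ == "__main__":
    sys.exit(main())
```

The point of the design is that the check is automatic, runs on every build rather than as a pre-release ceremony, and blocks only on thresholds the teams have agreed to, so it doesn’t become another reason to route around security.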

These new applications were being delivered to facilitate revenue growth or to streamline existing processes to reduce cost and complexity. The impact on the business was that the new functionality it was expecting took longer to materialise, resulting in user frustration.

What can you do to prevent such situations from happening? Here are a few recommendations:

  1. Communicate frequently and at the right level. Communication must start at the top of an organisation and work its way down, so that priorities and expectations can be aligned. A person may need to hear the same message multiple times before they take action.
  2. Articulate the benefits. Security and risk teams need to ensure they position any new processes or tools in a way that highlights the benefits to each stakeholder group.
  3. Provide clear steps. In order to ensure the change is successful, security professionals should clearly outline the steps for how to start realising these benefits.

Communicating and providing support on new security policies, tools and practices to impacted teams is absolutely critical. This is especially important in large organisations with many stakeholder groups spread across multiple geographies. Always keep people in mind when introducing a change, even if it’s a change for the better.

Image by Hugo Chinaglia


Author of the month for January 2019

IT Governance Publishing named me the author of the month and kindly provided a 20% discount on my book.

There’s an interview available in the form of a podcast, where I discuss the most significant challenges related to change management and organisational culture, the common causes of a poor security culture, and my advice for improving the information security culture in your organisation.

ITGP also made one of the chapters of the audio version of my book available for free – I hope you enjoy it!


The Psychology of Information Security is now an audiobook too!

Thanks to my publisher, my book is now available in audio format. It’s narrated by Peter Silverleaf, who has done a great job as always.

If you would rather listen while driving, exercising or commuting, this version is for you. The book has intentionally been kept to the point, which means you can finish the audio in slightly over two hours. The fact that it costs the equivalent of two cups of coffee is an added benefit.

You can get it for free on Audible as part of their introductory offer (you can listen to the sample there too), through Apple iTunes or download it in the MP3 format on my publisher’s website.

I know I’m slightly biased here, but I highly recommend it!


Cyber Security: Law and Guidance

I’m proud to be one of the contributors to the newly published Cyber Security: Law and Guidance book.

Although the primary focus of this book is on cyber security law and data protection, no discussion is complete without mentioning who all these measures aim to protect: the people.

I draw on my research and practical experience to present the case for a new approach to cyber security and data protection that places people at its core.

Check it out!


Behavioural science in cyber security

Why your staff ignore security policies and what to do about it.

Dale Carnegie’s 1936 bestselling self-help book How To Win Friends And Influence People is one of those titles that sits unloved and unread on most people’s bookshelves. But dust off its cover and crack open its spine, and you’ll find lessons and anecdotes that are relevant to the challenges associated with shaping people’s behaviour when it comes to cyber security.

In one chapter, Carnegie tells the story of George B. Johnson from Oklahoma, who worked for a local engineering company. Johnson’s role required him to ensure that other employees abided by the organisation’s health and safety policies. Among other things, he was responsible for making sure they wore their hard hats when working on the factory floor.

His strategy was as follows: if he spotted someone not following the company’s policy, he would approach them, admonish them, quote the regulation at them, and insist on compliance. And it worked, albeit briefly. The employee would put on their hard hat, and as soon as Johnson left the room, they would just as quickly remove it. So he tried something different: empathy. Rather than addressing his colleagues from a position of authority, Johnson spoke to them almost as though he were a friend, expressing a genuine interest in their comfort. He asked whether the hats were uncomfortable to wear, and whether that was why people didn’t wear them on the job.

Instead of reciting the rules chapter and verse, he simply mentioned that it was in employees’ best interests to wear their helmets, because they were designed to prevent workplace injuries.

This shift in approach bore fruit, and workers felt more inclined to comply with the rules. Moreover, Johnson observed that employees were less resentful of management.

The parallels between cyber security and George B. Johnson’s battle to ensure health-and-safety compliance are immediately obvious. Our jobs require us to adequately address the security risks that threaten the organisations we work for. To be successful at this, it’s important to ensure that everyone appreciates the value of security — not just engineers, developers, security specialists, and other related roles.

This isn’t easy. On the one hand, failing to implement security controls can result in an organisation facing significant losses. On the other, badly implemented security mechanisms can be worse: they can obstruct employee productivity or foster a culture in which security is resented.

To ensure widespread adoption of secure behaviour, security policy and control implementations not only have to accommodate the needs of those that use them, but they also must be economically attractive to the organisation. To realise this, there are three factors we need to consider: motivation, design, and culture.



Of people and security: To build a working security culture, focus first on empathy and communication

A security department may sometimes be referred to by executives as the ‘Business Prevention Department’. Cyber security professionals, eager to minimise potential risks, can put controls in place that may stifle productivity and innovation.

Cyber security professionals are often all too aware of what the business shouldn’t do, and forget to mention what it should be doing instead. OK, USB ports are now blocked, but have we provided people with an alternative way to share files securely? Yes, we might have mitigated the risk of introducing malware through a flash drive, but have we considered the wider impact on the ability of employees to perform their core business activities and, in turn, on the overall profitability of the company?

Instead of saying ‘No’ to everything, let’s try to understand the business context of what we are trying to protect and why. Because that’s what actually matters and is absolutely key when designing security solutions that work.

People often think that security is the opposite of usability. In reality, the two can reinforce each other. Design and security can coexist by defining constructive and destructive behaviours: what people should and shouldn’t do. Effective design, therefore, streamlines constructive behaviours while making risky ones harder to accomplish.
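
As a toy illustration of that principle, consider a file-sharing helper whose default is the constructive behaviour (a private, short-lived link), while the risky behaviour remains possible but requires a deliberate, self-describing opt-in. The service, function names and URL below are hypothetical; this is a sketch of the design idea, not any particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class ShareLink:
    url: str
    expires_at: datetime | None  # None means the link never expires


def create_share_link(file_id: str, *, public_never_expires: bool = False) -> ShareLink:
    """Create a link for sharing a file.

    The constructive behaviour (a short-lived link) is the
    one-argument happy path; the risky one needs an explicit flag
    that is hard to miss in code review.
    """
    if public_never_expires:
        # Still possible, but the caller has to say it out loud.
        return ShareLink(url=f"https://files.example.com/{file_id}", expires_at=None)
    return ShareLink(
        url=f"https://files.example.com/{file_id}",
        expires_at=datetime.now(timezone.utc) + timedelta(days=7),
    )


# Easy and safe by default:
#   create_share_link("q3-report")
# Harder, and visible to reviewers:
#   create_share_link("q3-report", public_never_expires=True)
```

The secure option costs the user nothing, while the risky one carries just enough friction to prompt a second thought.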

To do this effectively, security has to be a vocal influence in the design process, and not an afterthought. But it can only regain this influence if the value to the people and business is first demonstrated.

Wondering why your security policies don’t work? Ask your staff! Empathy, communication and collaboration are vital to building a culture of security. Security professionals need to shift their role from that of a policeman enforcing policy from the top down through sanctions to that of a partner who is empathetic to business needs and takes the time to understand them.

Security mechanisms should be shaped around the day-to-day working lives of employees, and not the other way around. The best way to do this is to engage with employees and factor their unique experiences and insights into the design process. The aim should be to correct the misconceptions, misunderstandings and faulty decision-making processes that result in non-compliant behaviour.

Changing culture is not easy and will take time, but it is possible. Check out my book to find out more about developing an effective business-oriented security programme and improving the security culture in your organisation.


DevOps and Operational Technology: a security perspective

I have worked in the Operational Technology (OT) environment for years, predominantly at major oil and gas companies. And yes, we all know that this space can move quite slowly! Companies traditionally employ a waterfall model to manage projects, with rigid stage gates and extensive planning and design phases followed by lengthy implementation or development.

It has historically been difficult to adopt more agile approaches in such an environment, for various reasons. For example, I’ve developed architecture blueprints to refresh the industrial control assets of a gas and electricity distribution network provider in the UK on a timeline of seven years. It felt very much like a construction project to me, which is quite different from software development culture, typically all about experimenting and failing fast. I’m not sure about you, but I would not like our power grid to fail fast in the name of agility. The difference in culture is justified: we need to prioritise safety and rigour when it comes to industrial control systems, as the impact of a potential mistake can cost more than a few days’ worth of development effort – it can cost human lives.

The stakes are not as high when we talk about software development. I’ve spent the past several months at one of the biggest dot-coms in Europe, and it was interesting to compare and contrast its agile approach with the more traditional OT space I’ve spent most of my career in. These two worlds couldn’t be more different.

I arrived at a surprising conclusion, though: they are both slow when it comes to security, but for different reasons.

Agile, and Scrum in particular, is great on paper, but it’s quite challenging when it comes to security.

Agile works well when small teams are focused on developing products, but I found it quite hard to introduce a security culture in such an environment. Security is often just not a priority.

Teams mostly focus on what they perceive as a business priority. It is standard practice there to define OKRs – Objectives and Key Results – and teams are then measured on how well they achieve them. So if they’ve met 70% of their OKRs, they’ve had a good quarter. Guess what: security always ends up in the bottom 30%, and security-related backlog items get de-prioritised.

DevOps works well for product improvement, but it can be quite bad for security. For instance, when a new API or a new security process is introduced, it has to touch a lot of teams, which can be a stakeholder-management nightmare in such an environment. A security product has to be shoehorned across multiple DevOps teams, where every team has its own set of OKRs, resulting in natural resistance to collaboration.

In a way, both OT and DevOps move slowly when it comes to security. But what do you do about it?

The answer might lie in setting the tone from the top and making sure that everyone is responsible for security, which I’ve discussed in a series of articles on security culture on this blog and in my book The Psychology of Information Security.

How about running your security team like a DevOps team? When it comes to Agile, minimising friction for developers is the name of the game: incorporate your security checks into the deployment process, run automated vulnerability scans, implement continuous control monitoring, describe your security controls in a way developers understand (e.g. as user stories), and so on.
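
As a minimal sketch of the continuous control monitoring idea, each control can be expressed as a small boolean check, run on a schedule, with results emitted as data rather than as one-off audit findings. The single check below (no legacy telnet listener) is an illustrative placeholder; real controls would cover configuration baselines, patch levels, exposed storage and the like.

```python
"""Minimal sketch of continuous control monitoring: every control is a
small check function; a scheduler runs them and emits results as data.
The telnet check is a placeholder for real, agreed controls."""
import socket
from datetime import datetime, timezone
from typing import Callable, Dict, List


def telnet_port_closed(host: str = "localhost", port: int = 23) -> bool:
    """Control: the legacy telnet port should not be listening."""
    try:
        with socket.create_connection((host, port), timeout=1):
            return False  # something answered, so the control failed
    except OSError:
        return True


CONTROLS: Dict[str, Callable[[], bool]] = {
    "no-telnet-listener": telnet_port_closed,
    # add further checks: config baselines, patch levels, open buckets...
}


def run_controls() -> List[dict]:
    """Run every registered control and record the outcome as data."""
    return [
        {
            "control": name,
            "passed": check(),
            "checked_at": datetime.now(timezone.utc).isoformat(),
        }
        for name, check in CONTROLS.items()
    ]


if __name__ == "__main__":
    for result in run_controls():
        status = "PASS" if result["passed"] else "FAIL"
        print(f"{status} {result['control']} at {result['checked_at']}")
```

Because the output is structured data, it can feed a dashboard or a ticket queue, which leads directly to the next point.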

Most importantly, gather and analyse data to improve. Where is security failing? Where is it clashing with the business process? What does the business actually need here? Is security helping or impeding? Answering these questions is the first step to understanding where security can add value to the business regardless of the environment: Agile or OT.

Image by Kurt Kurasaki.