I was asked to deliver a keynote in Germany at the Security Transparent conference. Of course, I agreed. Transparency in security is one of the topics that is very close to my heart and I wish professionals in the industry not only talked about it more, but also applied it in practice.
Back in the old days, security through obscurity was one of the many defence layers security professionals employed to protect against attackers. On the surface, it’s hard to argue with such logic: the less the adversary knows about our systems, the less likely they are to find a vulnerability that can be exploited.
There are some disadvantages to this approach, however. For one, you now need to tightly control access to restricted information about the system to limit the possibility of leaking sensitive details of its design. But this also limits the scope for testing: if only a handful of people are allowed to inspect the system for security flaws, the chances of actually discovering them are greatly reduced, especially when it comes to complex systems. Cryptographers were among the first to realise this. One of Kerckhoffs’s principles states that “a cryptosystem should be secure even if everything about the system, except the key, is public knowledge”.
Modern encryption algorithms are not only completely open to the public, exposing them to intense scrutiny, but have often been developed in public as well, as was the case with AES, which was selected through an open competition. If a vendor boasts about using their own proprietary encryption algorithm, I suggest giving them a wide berth.
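Kerckhoffs’s principle is easy to demonstrate in code. The sketch below (an illustrative example of mine, not something from the conference) uses HMAC-SHA256 from Python’s standard library: the algorithm is entirely public and has survived decades of scrutiny, and all of the security rests in the secret key.

```python
import hashlib
import hmac
import secrets

# The algorithm (HMAC-SHA256) is completely public;
# only this randomly generated 256-bit key is secret.
key = secrets.token_bytes(32)

def sign(message: bytes) -> str:
    """Authenticate a message with a public, well-studied algorithm."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"transfer 100 EUR")
assert verify(b"transfer 100 EUR", tag)        # genuine message accepted
assert not verify(b"transfer 999 EUR", tag)    # tampered message rejected
```

An attacker who knows every line of this code still cannot forge a valid tag without the key, which is exactly the property a proprietary, unreviewed algorithm cannot credibly promise.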
Cryptography aside, you can approach transparency from many different angles: the way you handle personal data, respond to a security incident or work with your partners and suppliers. All of these and many more deserve the attention of the security community. We need to move away from ambiguous privacy policies and the desire to save face by not disclosing a security breach affecting our customers or downplaying its impact.
The way you communicate internally and externally while enacting these changes within an organisation matters a lot, which is why I focused on this communication element while presenting at Security Transparent 2019. I also talked about friction between security and productivity and the need for better alignment between security and the business.
I shared some stories from behavioural economics, criminology and social psychology to demonstrate that challenges we are facing in information security are not always unique – we can often look at other seemingly unrelated fields to borrow and adjust what works for them. Applying lessons learned from other disciplines when it comes to transparency and understanding people is essential when designing security that works, especially if your aim is to move beyond compliance and be an enabler to the business.
Remember, people are employed to do a particular job: unless you’re hired as an information security specialist, your job is not to be an expert in security. In fact, badly designed and implemented security controls can prevent you from doing your job effectively by reducing your productivity.
After all, even Kerckhoffs recognised the importance of context and the fatigue that security can place on people. One of his lesser-known principles states that “given the circumstances in which it is to be used, the system must be easy to use and should not be stressful to use or require its users to know and comply with a long list of rules”. He was a wise man indeed.
Customers are becoming increasingly aware of their rights when it comes to data privacy and they expect companies to safeguard the data they entrust to them. With the introduction of GDPR, a lot of companies had to think about privacy for the first time.
I’ve been invited to share my views on innovating in the age of GDPR as part of the Cloud and Cyber Security Expo in London.
When I was preparing for this panel, I tried to understand why this was even a topic to begin with. Why should innovation stop? If your business model is threatened by the GDPR, then you are clearly doing something wrong: it means your business model relied on exploiting consumers.
But when I thought about it a bit more, I realised that there are costs to demonstrating compliance to the regulator that a company would also have to account for. This is arguably more easily achieved by bigger companies with established compliance teams than by smaller startups, so it can serve as a barrier to entry. Geography also plays a role here. What if a tech firm starts in the US or India, for example, where the regulatory regime is more relaxed when it comes to protecting customer data, and then expands to Europe when it can afford it? At first glance, at least, companies starting up in Europe are at a disadvantage, as they face potential regulatory scrutiny from day one.
How big of a problem is this? I’ve been reading about people complaining that you need fancy lawyers who understand technology to address this challenge. I would argue, however, that fancy lawyers are only required when you are doing shady stuff with customer data. Smaller companies that are just starting up have another advantage on their side: they are new. This means they don’t have to go back and retrospectively purge legacy systems of data collected over the years, potentially breaking the business logic in interdependent systems. Instead, they start with a clean slate and have an opportunity to build privacy into their product and core business processes (privacy by design).
Risk may increase as the company grows and collects more data, but I find that this risk-based approach is often missing. The implementation of your privacy programme will depend on your risk profile and appetite, and the level of risk will vary with the type and amount of data you collect. For example, a bank can receive thousands of subject access requests per month, while a small B2B company might receive one a year. Their privacy programmes will therefore look vastly different: the bank might invest in technology-enabled automation, while the small company might outsource its subject request process. It is important to note, however, that risk can’t be fully outsourced: the company still ultimately owns it.
The market is moving towards technology-enabled privacy processes: automating privacy impact assessments, responding to customer requests, managing and responding to incidents, etc.
I also see the focus shifting from regulatory-driven privacy compliance to a broader data strategy. Companies are increasingly interested in understanding how they can use data as an asset rather than a liability. They are looking for ways to effectively manage marketing consents and opt-outs and to give power and control back to the customer, for example by creating preference centres.
Privacy is more about the philosophy of handling personal data than specific technology tricks. This mindset in itself can lead to innovation rather than stifling it. How can you solve a customer’s problem while collecting the minimum amount of personal data? Can it be anonymised? Think of personal data as toxic waste: sure, it can be handled, but only with extreme care.
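To make the data-minimisation mindset concrete, here is a minimal sketch of my own (not from the panel): pseudonymising an identifier with a keyed hash, so a system can still count or deduplicate users without ever storing the raw personal data. The secret “pepper” and the function name are illustrative assumptions.

```python
import hashlib
import hmac
import secrets

# Secret "pepper", assumed to be stored separately from the data store
# (e.g. in a secrets manager), never alongside the pseudonyms.
PEPPER = secrets.token_bytes(32)

def pseudonymise(email: str) -> str:
    """Replace a direct identifier with a keyed hash.

    Analytics can still recognise the same user across records,
    but the raw address is never stored, and without the pepper
    the pseudonym cannot be reversed by guessing likely emails.
    """
    return hmac.new(PEPPER, email.strip().lower().encode(), hashlib.sha256).hexdigest()

# The same person always maps to the same pseudonym...
assert pseudonymise("Alice@example.com") == pseudonymise("alice@example.com")
# ...while different people map to different ones.
assert pseudonymise("alice@example.com") != pseudonymise("bob@example.com")
```

Note that keyed pseudonymisation is not full anonymisation: under the GDPR, pseudonymised data is still personal data, but it shrinks the blast radius of a breach, which is exactly the toxic-waste handling the paragraph above describes.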
It was another fantastic event by SANS. This time, apart from a regular line up of great speakers, there were some interactive workshops.
Javvad Malik facilitated one of them and challenged the participants to create their own awareness videos.
It felt like we covered the entire production cycle in under two hours: we talked about brainstorming, scripting, filming styles, editing and much more! But the most important part was putting the ideas into practice, and we actually got to create our own security awareness videos.
The audience was split into several groups, each tasked with producing an engaging clip with only one requirement: it shouldn’t be boring.
Javvad’s tips certainly helped and with a bit of humour, my team’s video won the first prize!
I spent last week in Vienna at the annual intergovernmental conference focused on protecting critical energy infrastructure.
The first two days were dedicated to the issues of security and diplomacy.
A number of panel discussions, talks and workshops covered the following topics:
- Implementing the EU strategy for safe, open and secure cyberspace
- Cyber-threats to critical energy infrastructure
- Operational resilience
- Reducing the risks of conflicts stemming from the use of cyber-capabilities
- Cyber-diplomacy: developing capacity and trust between states
For the rest of the conference we moved from the Diplomatic Academy of Vienna to Tech Gate, a science and technology park and home to a number of local cyber startups.
We discussed trends in technology and cyber security, and participated in a Cyber Range simulation tutorial and a scenario-based exercise on policy development to address the growing cyber-threat to the energy sector.
AIT Austrian Institute of Technology, together with WKO Austrian Economic Chambers, ASW Austrian Defence and Security Industry, and the Austrian Cyber Security Cluster, hosted a technology exhibition of the latest solutions and products as well as R&D projects.
Participants had an opportunity to see state-of-the-art, next-generation solutions and meet key experts in the field of cyber security for protecting critical infrastructure against cyber-crime and terrorism.
Talks continued throughout the week with topics covering:
- Securing the energy economy: oil, gas, electricity and nuclear
- Emerging and future threats to digitalised energy systems
- Cyber security standards in critical energy infrastructure
- Public sector, industry and research cooperation in cyber security
- Securing critical energy infrastructures by understanding global energy markets
The last day focused on innovation and securing emerging technologies. The CIO of the City of Vienna delivered an insightful presentation on cities and the security implications of digitalisation. A closing panel discussed projected trends and emerging areas of technology, approaches and methods for verifying and securing new technologies, and the future of the cyber threat.
It’s the second year I’m attending the IoT Security Foundation conference and it continues to be a great event.
Strategic and technical tracks ran in parallel with vendor showcases, which meant there was something interesting for everyone.
It’s great to see industry practitioners and academics coming together to discuss the ethics of IoT, challenges with design and development and the direction of travel of security.
Some of the recorded talks are available on the IoTSF website.
Best practice guidance on vulnerability disclosure and connected consumer products, as well as a security compliance framework, is available to download.
I’ve been nominated for a Security Serious Unsung Hero award in the Best Educator category. This will be awarded to a professor, lecturer or teacher who leads by example to inspire and motivate the next generation of cyber security professionals. I’m humbled to be considered. Thank you!
Join me at the event.
I’ve been invited to talk about human aspects of security at the CyberSecurity Talks & Networking event. The venue and the format allowed the audience to participate and ask questions and we had insightful discussions at the end of my talk. It’s always interesting to hear what challenges people face in various organisations and how a few simple improvements can change the security culture for the better.