At King & Wood Mallesons’ recent Digital Future Summit, Tech Law and Telecommunications, Media and Entertainment Sector lead Cheng Lim discussed cyber risk and response with a panel - Cyber Security Cooperative Research Centre CEO Rachael Falk, former Telstra CEO and Cyber Security Industry Advisory Committee Chair Andy Penn, and experienced company directors Catherine Brenner and Mike Hawker AM.
Below are excerpts from their conversation, edited for length and readability. Links to watch or listen to their full conversation can be found below.
---
Cheng Lim: Rachael, Andy, Catherine and Mike, thank you for joining us. As you view today's cybersecurity environment, what is one thing either government or organisations could do that you think might make a real difference?
Rachael Falk: It's not just a technical problem or a cultural problem. If I could wave a magic wand today, it would be to see minimum cybersecurity standards in place for all companies, but certainly SOCI (Security of Critical Infrastructure) companies, to bring cybersecurity standards up to a level where we would potentially see fewer intrusions and fewer cyber incidents. But I say that reluctantly, too, because culture plays a huge part, as does having the right tools.
Andy Penn: It's not actually a tech problem. It's a business problem. It's a business risk like any other, and there's no reason why boards and management can't get their heads around the issues. Conceptually, it's actually not that complicated. The implementation, the execution and the defences obviously take a lot of work. But I do think boards can immerse themselves in this topic and get their heads around what the cybersecurity risks are, and what they need to do to protect themselves from them. Like any risk, you're never going to eradicate it completely. But our job is to take all the reasonable, practical steps that we can, and you've got to understand the topic to have the competence to do that and to ask the right questions.
Catherine Brenner: Promoting and developing a culture of cyber resilience. That's a mindset and a set of practices that are built and maintained, which is really not dissimilar to the way we approach workplace health and safety risk. It's having a common language, personal responsibility, ownership, and a culture of speaking up and rewarding transparency. With that goes regular training, clear roles and responsibilities, and response plans that are practised, current, adaptable and strong, all with regular, consistent governance. So really, like so many other things, it's about the culture.
Mike Hawker: Your data is your lifeline: don't lose it. Think about it from a strategic point of view, which culturally, I think, will help everyone improve the quality of their cybersecurity infrastructure and the security of their data. On the flip side, I think that for critical infrastructure, the government has a lot more capability in this space than a lot of organisations do. It would be helpful for governments to work with organisations to try and reduce the risk, or in the event of a major crisis, to give you some advice on how to manage it, because there's a lot of complexity in regulated organisations where you might have different rules from different regulators, for instance, on how long you have to hold your data.
The importance of investing in self-defence, training & frameworks
Cheng Lim: Andy, as ex-CEO of Telstra and someone clearly passionate about cyber, what are the key things boards and companies can do to protect themselves?
Andy Penn: There are two lenses that I would apply to my business and my operations if I was on a board right now, and they will capture 95% of what you need to worry about. One, I would do an inventory of every computer system that exists across the organisation - every discrete computer system. I would then apply the lens of the Essential Eight, which is effectively a government framework that asks, in relation to each system: Are we properly patching it? Have we got proper user access controls? Have we got multi-factor authentication, et cetera? There are about eight basic cyber principles. Ask the question: where are we at in relation to that Essential Eight, and to the extent we're falling short, what is our plan to lift it? That would take you a long way towards feeling a lot more confident.
The other lens I would apply is in relation to the data that I hold as the company. Can you answer these five questions in relation to any data that you hold in the company? They're questions such as: What is the data? Where is it? How is it protected? Who has access to it? Now, just imagine, as a board of directors, if you had an inventory of all of your systems, how they were protected and how they stacked up against the Essential Eight, and you had an inventory of all of the aspects, databases, components or cohorts of data that you had - where it is, who's protecting it, who's got access to it. That would actually take you a very, very long way down the path to a more confident way of managing your cyber risk.
Cheng Lim: Rachael, how do you think boards can actually assess what good looks like?
Rachael Falk: To Andy's point, spend more time, because good isn't just 'can we go down the list and are we complying with the NIST framework (Note: a US-developed cybersecurity framework) or whatever?' It's got to be: 'OK, have we done an audit? Where's our CISO or CSO? Can you give me confidence that everyone, from the board right down to the person at the most junior level of the organisation, understands that this is our risk appetite?' You can't eliminate this risk completely. Get a third-party audit, and actually test every important aspect of your organisation.
Cheng Lim: Catherine, you’re chair of Australian Payments Plus and you're on the Board at Scentre, two organisations that sit at the heart of where we shop and how we pay for it. What keeps you awake at night on cyber?
Catherine Brenner: I quite like the framework that talks about soft defence, passive defence and active defence. What does that mean? Soft defence is really the people part - the training and a cyber-resilient culture: don't click on the phishing email; don't let someone tailgate in, thinking you're being friendly because they've lost their pass; and if something doesn't seem right, or you think you might have made an error, speak up really quickly. Then there are the passive defence elements that go into the recipe for getting a good night's sleep. That's the sort of stuff Andy was talking about - what I think of as network hygiene. A little bit of training probably sits on top of that as well, but really investing in the cyber-resilient culture is the active defence part.
The value of engaging external experts to assist with audits and compliance
Cheng Lim: Mike, how do you think boards can better equip themselves to do this? What do you think about boards needing to have members who have technology expertise?
Mike Hawker: I think you're better off having people come in who are experts in different parts of that field, to help you navigate and understand what is going on: how you need to set up your data, how you democratise it, all the elements of collection, and putting the capabilities around it to meet all your requirements - from a customer point of view, how you use it internally, and from a regulatory point of view. Regulations are really just good processes to give you better governance of your data. In terms of the board mix, I don't think you're going to have one single specialist; you've got to have a collection of people coming in and bringing different perspectives from different angles - the threat vectors, how to manage data, and the various components in the front end, the middle layer or the back end. And then, clearly, you've got to test all this stuff. Work out which of the data you hold would be the most critical if you lost it, and work from there. I do think you can materially change the risk if you can find the critical data, have it encrypted and have some processes around being able to access it. It's getting that mindset moving. I'm not saying you don't do the compliance, but try to get to where people actually understand why it's important and how their role may or may not affect somebody's ability to come and steal the data from you.
Information sharing, and the role of Government
Cheng Lim: Andy, have you got any insights you can share with us about what government's thinking about in terms of protecting the nation from cyber threats?
Andy Penn: I've made this point to government: be careful throwing Optus or anybody else under the bus. There is no organisation in this country that holds more customer data than federal and state governments. Government needs to play a role in hardening its own systems. Another area is what I would describe as national programmes that benefit everybody - things like threat sharing, skills building, awareness building, digital identity programmes, data sovereignty programmes and government procurement programmes. This is stuff that government can do that filters through the whole economy. It's largely in the current strategy, but I think it could all be substantially beefed up and accelerated, particularly given the rate of digital adoption and the rate of increase in malicious activity.
How to respond to a ransom attack
Cheng Lim: Let's imagine you're directors of a company which has been hacked, and services have been disrupted and data stolen and a ransom demanded. How would you approach this issue? Starting with you, Catherine, what are the key considerations and what advice would you be asking for and who from?
Catherine Brenner: I'd be looking to have cyber incident response experts, legal expertise and external crisis communications people around the table providing advice. Hopefully, we'd already have them in place, to enable a much more rapid response, to have had the insurers sign off on them and be comfortable with them, and also to have involved them in simulations. The sorts of questions you would be asking them, and looking for their advice on, concern severity and the impact on business operations, reputation and key stakeholders. There might be a possible loss of human life or another catastrophic outcome, which could be imminent, to weigh against the risks of non-restoration of data or public disclosure, future attacks, being in breach of laws such as those on terrorism financing, and your insurance conditions, with the overlay of ethical considerations. We need a whole lot of information, and advice, off the back of incomplete and changing information. It all feeds into the task of decision-making in what could well be a no-win situation.
Mike Hawker: I do think it becomes quite an interesting issue for directors: do you pay or not pay a ransom? Everyone wants to say, 'you don't pay'. But if you're a hospital, and you've got operating theatres and people in theatre, the duty of care to the health of the person versus paying someone who may be illicit is an interesting question. How do you measure your customer duty, legally, if you're stuck in the middle of this? That's a difficult issue to deal with. It would help if there was some discussion about that.
The importance of rigorous testing
Mike Hawker: This decision-making is not black and white, and that's why you have experts in the field to help navigate some of the greyness: experts who can tell you who the threat actor is, experts on your legal responsibilities, on what you can and can't do, and on your obligations from a customer point of view. But I think the most critical thing is that you need to have this tested regularly. It's no good making that decision on the day something happens, because it's way too late. You need to have that organised, and you don't want too many people involved who haven't been through those processes, because through practice you get better organised at who does what and at quick decision-making.
Catherine Brenner: Scenario planning, testing and simulations are really important. Unless you're feeling pressure and stress in those scenarios and simulations, they're not worthwhile. You need to feel sick. You need to feel the sweat beads, because really the scenario is about testing the ability of the team and individuals to work under pressure, with incomplete and changing information, particularly because these incidents can very quickly move beyond the cyber realm.
The Communications challenge
Andy Penn: You need to be out there on the front foot, communicating, telling people what you do know and being honest about what you don't know. That was always my approach: 'I can tell you this, but I don't know that. But you'll be the first to know when I've got more information.' And journalists will say, 'Well, can you guarantee it won't happen again?' 'No, I can't guarantee that. But I can guarantee that I've got every appropriate resource in the company doing everything they possibly can.' So my first principle is: be on the front foot with the communication. And don't fear not having all the facts at your fingertips. It's a difficult, awkward, unsettling situation to be in, but go out, open up and take it head on.
Catherine Brenner: Communication is absolutely critical. What can we say, given all of this and how much we don't know, to our staff, our customers, suppliers, investors, law enforcement, the media, regulators - all of the different stakeholders? And who is going to be responsible for communicating to each of those stakeholders?
Andy Penn: Initially, you'll feel like the victim. But if you allow that to manifest itself in your communications, that's going to run badly for you. You have to put the customer at the heart of everything, and recognise that they're the ones who are vulnerable now. And I would always be advising to be more open rather than less open. That's because the person who's best positioned to make a judgement is the customer whose data it is. And so I believe that if a company develops a reasonable basis to believe that personal information relating to its customers may have been leaked, it has an obligation to tell those customers as soon as is practicable.
Mike Hawker: You can't have the view that this is not going to happen to our company.
Rachael Falk: Absolutely. If you're running a business and you're connected to the internet, even with one computer, you're at risk.