Emily Wearmouth [00:00:01] Hello and welcome to today's episode of Security Visionaries, a podcast for anyone in the cyber, data or adjacent industries. Today, we're going to be talking about risky business. And I don't mean Tom Cruise's mediocre rom-com, I mean cyber risk. And more specifically, we're going to be discussing risk appetite, risk assessment and risk communication. Now, I know I always tell you I'm excited about my guests, but today those guests include the man who quite literally wrote the book about risk measurement and management, as well as a man who's been responsible for the risk programs at a couple of pretty huge international banks. Let me introduce them. First up, Jack Freund. Over the course of his 25-year career in technology and risk, Jack has become a leading voice in cyber risk measurement and management. He's the co-author of the award-winning book on risk quantification, Measuring and Managing Information Risk: A FAIR Approach, and holds a doctorate in Information Systems. So welcome, Jack.
Jack Freund [00:00:58] Thank you Emily, happy to be here.
Emily Wearmouth [00:01:00] Alongside Jack, I'm also really pleased that we have David Fairman here to give us his sage perspective. David is an experienced CSO, strategic advisor, investor, and coach. He has extensive experience in the global financial services sector, is known for his innovative thinking, and in 2015 he was named as one of the top ten CISOs to know. So I am very proud, therefore, to know him. Welcome, David.
David Fairman [00:01:23] Emily, thanks very much for having me on the program today.
Emily Wearmouth [00:01:26] Right. I'm going to dive straight in. The first thing I want to ask you both is why is it important to understand not just your organizational risk level, but also your organizational risk appetite or comfort level? Shall we start with you, David?
David Fairman [00:01:41] Yeah, sure. So risk management is an interesting thing. I like to define risk management as applying your finite resources to your highest areas of risk. What that means is you are never going to address and remediate or eliminate all risk. So I think the risk appetite piece, or defining risk appetite and understanding what that looks like for your organization, really helps you to make some of those definitive decisions. Like I say, you're not going to be able to close out or eliminate all risk. You're going to need to make some priority decisions as to how you allocate those finite resources. Understanding what risks your organization is facing, and then understanding what your appetite is to experience a loss due to those risk scenarios, allows an organization to make more informed decisions and to take action to close out or minimize. And maybe that's the word we want to talk about: minimize. It's not necessarily about eliminating completely. It's about taking action to bring those risk loss scenarios within that risk appetite, within that range of what the organization is willing to experience due to a risk scenario or risk loss scenario. So defining that risk appetite improves and ensures much better decision making, and allows the organization to operate within some controlled boundaries.
Emily Wearmouth [00:03:15] Jack, what are your thoughts? And I suppose as a supplementary question, how many companies actually understand what their risk appetite is?
Jack Freund [00:03:22] Yeah, it's a great question. And David's right, it really is about prioritization. And I think back to the time I've spent communicating quantified cyber risk to different organizations, whether I was internal to that organization or, you know, helping other organizations do the same thing, and I offer the same advice: it's novel the first time you display a loss exceedance curve and a distribution of loss potential in terms of quantified risk. But inevitably, maybe in that same meeting, maybe in the next meeting, you're going to get asked the question: so what? What do I do with this? How do I operationalize this? And that's really where the appetite part of it comes into play. It's a matter of assigning a priority label that says this is unacceptable and we have to do something about it, a.k.a. allocate scarce resources to bring it back within tolerance, or we monitor it, or we accept it, and we sort of manage it that way. And that's really what that threshold represents: the boundary between doing something and not doing something. So it's really valuable. I don't think a lot of firms have really thought through what appetite means in terms of operationalization. I think a lot of firms probably still, you know, say something to the effect of a very fluffy "we don't accept any high risk scenarios." And, you know, operating in a competitive marketplace, selling products and services, is a high risk scenario for any company. So that's not enough, really. You have to find another threshold to define how much is enough and how much is not enough.
David Fairman [00:04:59] And so Jack, can I double click on that for a minute? I'd love your take on this, because I think you highlight a really good point. You know, a lot of organizations might make some of these very high level statements in terms of risk appetite. So, you know, we will not accept any high risks. But what does that actually really mean? How do we make that a little bit more real, a little bit more tangible? I come from a banking background, so the natural categories I start to think about are customer impact, financial impact, regulatory impact. What else would we have? Employee impact, as an example. So maybe put some boundaries around that: anything that impacts more than 10% of customers, anything that impacts more than 10% of employees, anything that results in a financial loss or loss of revenue of X amount of dollars, anything that results in regulatory action in some way, shape or form. So I think defining those sorts of categories that are relevant to your organization really helps make these risk appetite statements real and more usable.
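To make the kind of appetite statements David describes more concrete, here is a minimal sketch in Python of impact-category thresholds being checked against a candidate risk scenario. The category names and threshold values are assumptions echoing the examples above, not part of any specific framework, and a real organization would define its own.

```python
from dataclasses import dataclass

# Illustrative risk appetite thresholds per impact category (assumed values,
# echoing the examples in the conversation; real statements are organization-specific).
APPETITE = {
    "customers_impacted_pct": 10.0,   # no more than 10% of customers impacted
    "employees_impacted_pct": 10.0,   # no more than 10% of employees impacted
    "financial_loss_usd": 5_000_000,  # hypothetical dollar threshold
    "regulatory_action": False,       # no tolerance for regulatory action
}

@dataclass
class Scenario:
    name: str
    customers_impacted_pct: float
    employees_impacted_pct: float
    financial_loss_usd: float
    regulatory_action: bool

def breaches_appetite(s: Scenario) -> list[str]:
    """Return the appetite categories this scenario would breach."""
    breaches = []
    if s.customers_impacted_pct > APPETITE["customers_impacted_pct"]:
        breaches.append("customer impact")
    if s.employees_impacted_pct > APPETITE["employees_impacted_pct"]:
        breaches.append("employee impact")
    if s.financial_loss_usd > APPETITE["financial_loss_usd"]:
        breaches.append("financial impact")
    if s.regulatory_action and not APPETITE["regulatory_action"]:
        breaches.append("regulatory impact")
    return breaches

if __name__ == "__main__":
    ransomware = Scenario("ransomware outage", 12.0, 3.0, 8_000_000, False)
    print(ransomware.name, "breaches:", breaches_appetite(ransomware))
    # -> ransomware outage breaches: ['customer impact', 'financial impact']
```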
Jack Freund [00:06:08] Yeah, that's great insight, and I agree. You name some of the typical financial services categories; I've always been a fan of the Basel II operational risk categories. I think they're applicable whether you're in finance or not. And I really think connecting the quantified risk to those categories is important. So, for example, say you're thinking about the operational losses associated with ransomware. There's a range of possible outcomes, and you're trying to decide where on that distribution is an acceptable amount of loss. And you sort of drop a point and, you know, help people decide where they think that is. But let's say you get agreement that there's a line after which, you know, we don't accept more than a 10% chance of losing $100 million, or something like that. Once you find that line, then you have to identify all the risk triggers that can cause that scenario to happen. And this is where you really connect it. It's different, I think, in cyber risk than in other operational risk disciplines, because there are so many things that could go wrong. So you have, you know, these entire catalogs full of varieties of attack methods and controls and the combinations thereof that can trigger these things to happen. And that's why I think the operationalization happens: we're able to communicate, okay, so maturity on these process controls, or coverage on these technology-based controls, which is just, you know, deploying endpoint protections. You know, we're missing all of these things, so improving those gives you the opportunity to reduce the probability of experiencing these losses that we've already determined are unacceptable. I think that's where we translate it: the connection between, here's how bad things could be, and here's a list of things that we need to fix in order to make that happen.
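To show what it can look like to "drop a point" on a loss distribution and test it against an appetite such as "no more than a 10% chance of losing $100 million," here is a minimal Monte Carlo sketch, assuming a Poisson event frequency and lognormal per-event losses (a common pairing in FAIR-style quantification). The parameter values and variable names are illustrative assumptions, not calibrated figures.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed inputs for a single scenario (e.g., ransomware): event frequency as a
# Poisson rate per year, and per-event loss magnitude as a lognormal distribution.
# All parameter values here are illustrative, not calibrated.
EVENTS_PER_YEAR = 0.8
LOSS_MEDIAN = 20e6          # median loss per event, in dollars
LOSS_SIGMA = 1.2            # lognormal shape parameter
TRIALS = 100_000

def simulate_annual_losses(trials: int = TRIALS) -> np.ndarray:
    """Monte Carlo simulation of total annual loss across many simulated years."""
    event_counts = rng.poisson(EVENTS_PER_YEAR, size=trials)
    totals = np.zeros(trials)
    for i, n in enumerate(event_counts):
        if n:
            totals[i] = rng.lognormal(np.log(LOSS_MEDIAN), LOSS_SIGMA, size=n).sum()
    return totals

losses = simulate_annual_losses()

# Loss exceedance: probability that annual loss exceeds a given threshold.
threshold = 100e6
prob_exceed = (losses > threshold).mean()

# Appetite check in the spirit of "no more than a 10% chance of losing $100M".
appetite = 0.10
print(f"P(annual loss > ${threshold/1e6:.0f}M) = {prob_exceed:.1%}")
print("Within appetite" if prob_exceed <= appetite else "Outside appetite: prioritize treatment")
```

Sweeping the threshold across a range of values and plotting the exceedance probability at each point gives the loss exceedance curve Jack refers to.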
Emily Wearmouth [00:08:01] We're having this conversation in 2023, and I know you wrote the book a little while ago, Jack, and I wondered how things have changed, in particular in the last five years. You know, we've seen huge economic, geopolitical and societal changes over the last five years; have they had an impact on some of the foundational assumptions that business leaders are basing risk decisions upon? I would imagine, for instance, that in 2019 a foundational assumption was that the workforce was in the office, as a really straightforward one. You know, how much have these assumptions changed?
Jack Freund [00:08:33] Well, I think that's a great example of how dynamic it is working in the risk space. The book's nearly ten years old, and we're getting ready to release the second edition here in calendar year 2024, and we definitely made some updates. You know, there are some more systemic things that have certainly changed. And I think you're right; I think we've all become much more aware of how interconnected everybody is, and that brings with it benefits, clearly, and it also brings with it some risks that we need to account for. So, you know, disruptions in Ukraine, disruptions in Israel and all these other sorts of things that have been happening really highlight how interconnected we are from a supply chain perspective. You know, I began my career in telecom manufacturing in the late nineties, and we talked about supply chain all the time for manufacturing companies. As we move towards digital transformation, where every company becomes a technology company, we don't think much about that, but we're starting to. I think that's going to cause significant shifts in the way that we assess risk and that we think about identifying risk. And it's called for a lot of really interesting programs, such as software bills of materials, because right now I don't know all the software that I'm relying upon to do these things. And, you know, the third-party aspects of everybody's business are becoming much more heightened and much more important.
Emily Wearmouth [00:09:56] My ears pricked up at that third-party risk point. That's certainly something we're seeing discussed a lot more. Is that something that organizations generally do a good job of planning for, or do you move into a zone where a lack of visibility causes problems for risk planning?
David Fairman [00:10:12] You know, I think the third-party risk topic has evolved significantly, particularly since the pandemic in 2020, when it became more than just third parties: we started to see disruption of services caused by upstream dependencies, by fourth parties, fifth parties, sixth parties. When it comes to supply chain risk management, I can't think of many conversations with my peers or with practitioners I talk to where we don't talk about supply chain risk. I think it's one of the systemic issues that we as organizations today are facing. Coming back to Jack's point, we're in a very interconnected world. We're in a digital society today, and the interconnectedness and interdependencies on so many other organizations are, you know, greater than ever before, I think. So I think what we are starting to see is that organizations recognize that supply chain is a significant concern, a significant risk. To your point around whether we lose visibility: yes, I think we do. I think we're trying really hard as a practice, as a profession, collaborating with our risk partners, to understand how we get our hands around this in a better way. Think about cloud services and SaaS services, and there are some really prime examples: what is it that we're using? What does our business process look like, and who are we dependent on? And who are those SaaS service providers dependent on? If you're using a service provider which is actually hosting this service on AWS or Microsoft, what levels of assurance do you have, and how comfortable are you? Do you know who they're depending on? Is it someone that is reputable, or is it just another cloud service that might not have the infrastructure needed to support their application? I think these are the challenges, and these are the questions we're asking. I think these are the challenges we're struggling to solve. Supply chain risk, to me, is absolutely a board level topic.
Emily Wearmouth [00:12:16] Do you think that as risk potentially becomes more complex, getting a picture of risk becomes more complex too? We've got new waves of risk coming in, and it becomes harder to predict and capture risk. Does that affect people's psychology to make them more cautious, or does it turn them into more comfortable risk takers? And I know I'm asking you deep questions that a psychologist probably ought to have joined us in the conversation to address. But I'm wondering, from your experience in the conversations you have with boards and with business leaders, does that lack of control tend to make people more blasé about the risk, or does it make them more paranoid?
Jack Freund [00:12:56] I think it makes them more aware that a thing is happening, and they have to ask themselves, is this a thing that we need to be concerned about too? I recall one of Bruce Schneier's early books, where he talked about how the reason a thing is in the news is because it doesn't happen all the time; that's what makes it interesting for us to read. Stuff that happens all the time, our minds automatically sort of ignore, you know? So it's sort of novel for risk people to find a new thing we need to worry about, because, as I like to joke, on any sunny day we can find a cloud, right? That's sort of what we do in risk management: we take all these business plans, these great initiatives, and we have to find the bad things that can happen that can stop them. That's how you're successful in risk management, you know, sort of being a goth kid that way, right? But when you get to that, you have to also weigh the emotional reaction against the more logical, mathematical side of it. And that's where risk quantification is really kind of important. A thing can seem like it's happening all the time because there are news articles about it. So it's really important to acknowledge that a thing happened, otherwise you seem insensitive, and to deal with that psychological side of it, but also to find a more evidence-based, data-based view of why this thing is happening and whether it's happening very often.
Emily Wearmouth [00:14:18] Is there a risk, David, with this sort of high profile media coverage that we see of data loss incidents, and, you know, we can all hear from your accent that you're probably down Australia way and there were a couple of quite high profile incidents last year, does it create confusion within executive leadership teams about the difference between possibility and probability of these incidents?
David Fairman [00:14:42] Look, I think what it's really done is driven an awareness and a heightened level of discussion at the executive and board level around the fact that these things are real. This could happen to us. What do we need to do to address it? Have we been investing enough? Do we have the right people and capabilities within the organization? Does it confuse them? I think there's been a level of confusion, and a level of, I don't know if I want to use the word surprise, but let's call it surprise, for them to realize that actually this is real and we do need to take it seriously. And what it's done is driven a lot more proactive discussion. And I've seen some really good outcomes. You know, I hate to say it, but never let a good incident go to waste, right? I've seen some really good outcomes across a number of large corporations in APAC and across the globe because of this, because of what we've seen in our cybersecurity risk landscape over the past few years, this heightened volume of attacks and real material impact being experienced. I think it's actually helped us as security and risk practitioners to drive that conversation and get more focus. So has it confused them? I think they were confused and maybe a little bit surprised at the start, but what it's done is create those conversations and drive much more clarity, and there's a much better understanding of it today than there was probably three, four, five years ago.
Jack Freund [00:16:17] Yeah, it's hard because on the one hand, you're right, you know, as a security person you never want to see a bad thing happen. But at the same time, it also helps validate the mission that you have and the narrative you've been telling business leaders for years and decades: hey, look, bad things can happen, and they can cause real harm. And now we're in a different space because of that interconnectedness that we started talking about earlier, where, you know, everyone's so aware of it. The 24 hour news cycle talks about it and people are concerned about it. So we have to make sure that, you know, we have communicated all the bad things that can happen. And yeah, I mean, sometimes you get a little charge when a thing actually happens, you know, that "I told you so" kind of thing. But it's never, you know, dancing on graves. It's always, yes, now let's go and solve this problem and make sure that never happens again.
Emily Wearmouth [00:17:07] Right. I want to talk about trust, partly because I've come up with a really convoluted analogy and I want to make use of it. So here we go. I would be disinclined to go on a jungle rope course if I didn't trust the activity centre's owners to undertake effective risk management. The perceived shortcomings of their risk process, I would argue, are potentially as likely to stop me climbing to the top of their course as a visible frayed rope might. So my own identification of the risk weighs in alongside my belief that they can properly assess risk. And so I wonder, if we follow my analogy through, how important is it that we trust the people and the processes of risk assessment? And does that change the risk appetite, how prepared we are to take on risk if we trust the people that we're asking to manage it? Does it have an impact within an organization, rather than a rope course setting?
Jack Freund [00:17:57] I love this metaphor, it's amazing, and I think it really strikes home on a lot of interesting topics. This notion of trust is really important in modern society; none of us create or do all the things that are required of us to live and operate today. So we're always trusting somebody else to do those kinds of things. And, you know, that trust comes at a cost, and that cost is that we have to rely upon either those other parts of the organization, or we have to rely upon a third party that has validated that things are correct. Right? So you have to trust that the health inspectors have looked at the restaurant and that they're doing the right thing, so it's safe to eat there. You don't have the time or the inclination to, you know, walk through the kitchen and make sure that there's no raw food on the cutting board where they're cutting vegetables, for example, those types of things. In cyber risk management, we find ourselves equally as distanced from the reality of a thing, especially when it comes to third-party management, where you have to operate on this trust-but-verify kind of model that we've repeated ad nauseam. And that goes, too, to the risk scenario part of this: are you identifying and managing the right things? Like you said, are you looking at the ropes course the same way that somebody whose life depends on it would? And, you know, some organizations have built these programs; when you think about the book I wrote ten years ago, they would build out these risk programs internally. And one big shift that's happened in the last decade: there was very little data back then, and even more so in the years leading up to us writing that book there was just a huge dearth of data available on cyber incidents. So you'd find yourself creating your own distributions, your own ratings tables, for your own organization. I would present these and then I'd always get questions like, well, what does it look like for our peers, and how do we benchmark against them? And you can never get the same type of data about your peers as you can about yourself. But now we're in a different position. One of the big shifts has been the availability of good data, prompted in large measure by the cyber insurance industry and the claims datasets available there. So now we can look at these and say, well, assuming that we have the same industry profile as our peers, we have the opportunity to model after that. And this is really how actuaries do the same kind of thing in their modeling: if you're a 16-year-old in a sports car, you're going to pay more whether you're a good driver or not, because the other 16-year-olds made it bad for you. Those are the kinds of things that I think modern risk assessments are trying to take into account, to account for this shifting landscape.
David Fairman [00:20:53] No, I like your analogy, Emily, and I think you're looking at that from the consumer behavior perspective: the consumer perception and how they make choices about which services, or in your case jungle rope activities, and which organizations they're going to use for that type of event. I think about it from a business context and from a service provider context. If we think about security being a primary enabler of frictionless business operations, and that's what we're talking about these days, how do we make it easier and more frictionless for the consumer, and if we recognize that cyber resilience is core to that user experience, then I think the cybersecurity program really starts to become a differentiator and competitive advantage for organizations today. And how do we get to a point where we have a really effective, well understood cyber resilience capability? A key ingredient to that is understanding our risk profile and understanding our value at risk, if you like, within our organization. Because if that's what we're aiming for, for that customer experience, to be able to have that competitive advantage, we need to make sure that all the different risk components of cyber have been addressed within risk appetite to achieve that outcome. And I think it becomes a very easy dovetail for organizations to think about how do I get the recognition from my consumers to make them want to use me as a service provider. So I think you're absolutely right. This risk management program and the ability to do it well becomes a decision point, and if you can market that, it becomes a decision point for consumers to make more informed decisions and for the organization to be more competitive.
Emily Wearmouth [00:23:04] I'm really chuffed you both like my analogy, particularly Jack. There's an analogy in your book about a tire dropping off a cliff, and I love that one too, so while we're in the business of complimenting analogies, I thought I would pass that one on as well. Everybody, read the book, it's very good. Right, I've got another question, and I ought to declare a bit of a bias here because I've built my career on the criticality of communication in technology. So you'll spot the bias as I ask: in the book, Jack, you talk about the importance of communication as an IT skill. I think you have a pie chart, and about 50% of that sort of desirable IT job skills circle is communication skills. And it's really clear to me why: if you're conveying risk and you're trying to secure budgets to mitigate risk, communication is a really important skill there. But I wondered whether effective communication of risk, and of the programs you've got in place to manage risk, also impacts on risk appetite, an organization's actual comfort with the risk that you are going to be left with. And how important is communication to that process? I'll ask you, Jack, because it was your pie chart.
Jack Freund [00:24:09] Yeah, I think it's enormous, especially for the risk analyst in cybersecurity. The risk analyst role really has to bridge two different disciplines. One is a highly technical security discipline that has its own language and nomenclature, and you have to, you know, impress those people with your ability to understand and really represent what's going on there. At the same time, you have to be embedded in the business that your company is in, on the other side, and understand how they communicate, and they all have their own specialized nomenclature; any sufficiently advanced profession has its own nomenclature for these types of things. So you really have to be very skilled at switching between one and the other and then building these mappings between the two of them. And I think this is very evident in that sort of communication about risk appetite. You know, there's this classical Greek guide for thinking about how to communicate effectively, and it really comes down to three parts: the logos, the pathos and the ethos. The logos side is the rational, mathematical side, if you will, very logical. The ethos is, you know, you as a communicator: are you trustworthy as a communicator? And then there's the pathos, the emotional connection. I think this is very evident in that risk appetite conversation, because as quantified as I like to make risk all the time, understanding whether or not a particular loss exposure is too much or not enough is 100% an emotional kind of reaction. I can tell you that you could lose $10 million, and I can show you the math that leads up to it. Whether that's acceptable or not to the decision makers is really up to them, and they have to decide what's rational and reasonable in pursuit of the business goals, objectives, or organizational mission, those kinds of things. So I think it's important to be able to talk about risk in terms of those goals and objectives, and then also, you know, have an understandable discussion around the emotional side of it, which really goes back to the sort of storytelling aspect we've been talking about; that's when you can truly advance that conversation.
David Fairman [00:26:28] Probably the one thing I would double click on is how you communicate. I think you're spot on in terms of being able to quantify, being able to articulate a potential loss event as a dollar value. I think the communication element is, to some degree, about understanding your organization and understanding your business, and being able to apply that so the stories we tell are real to the products and services that we deliver. So talk about how this loss event, or this potential risk that we see here, applies to these specific products or services, and what that looks like. It sort of comes back to where we started with this conversation: it could impact X% of our customers, it could impact X% of our employees, etc. So I think tying that back, and being able to communicate and make it really meaningful to the organization and how that organization does its business, will see security practitioners and risk practitioners, I think, have a lot more success in this conversation. And it becomes more real for the executives and for the board, because they can relate to it.
Emily Wearmouth [00:27:48] Are there any words that you see risk professionals use, sort of as their go-to words, that you just think make business leaders glaze over, that just don't land or resonate for them?
Jack Freund [00:27:58] You know, one that I have been particularly upset about is this sort of "assume breach" kind of mentality. I think that's a little overdone sometimes. I mean, I understand the design perspective of it, but I would never use that in an executive conversation. First of all, I think that's kind of legally triggering as well. But there are a lot of things you have to unpack there. I think there's a distinction between the sort of battlefield type of terminology that's sometimes used in cybersecurity versus business terminology. It's just a different space.
Emily Wearmouth [00:28:35] Any from you, David?
David Fairman [00:28:36] Not so much when it comes to risk quantification or risk appetite; I think that is broadly understood and accepted by business leaders. I think where they start to glaze over is when we start going down that technology path and using acronyms and jargon that are more technology related. I think sometimes, because a lot of security practitioners have come up through the technology path, we naturally, you know, orientate and anchor back to that. To give you a really good example: zero trust these days. I mean, it comes back to your "assume breach," Jack. You know, I'm seeing so many of my peers and colleagues talk about zero trust to their boards, and the board are asking them, what are we doing about zero trust? Do we really understand what that means? What are the objectives and the outcomes we're trying to achieve? So let's break those buzzwords down to the outcome we're trying to achieve, and have a discussion around that and how we do it, versus a, you know, a topic or a buzzword that seems to be the flavor of the day.
Jack Freund [00:29:46] Yeah. Anything you see on the vendor floor at the RSA Conference is, you know, probably primed for exclusion from an executive conversation.
David Fairman [00:29:54] Exactly. Exactly.
Emily Wearmouth [00:29:57] And I've seen you talk lately about some of the discomfort that legal teams sometimes have about the very idea of risk assessment and risk quantification. Can you talk us through that? Why do they have such a problem with it?
Jack Freund [00:30:08] Yeah, this was a surprising notion that was relayed to me by some folks I was talking to, where the idea was that they didn't want to go on record saying that they were okay accepting certain amounts of losses. You know, this sort of hearkens back to 20 years ago, when firms would just try and pretend that there wasn't a breach happening. That was very much enlightened by, you know, some of the SEC filings in the U.S. that said, we can never remove all risk, but we strive to do the right things; those two flavors of comment, I think, are really important. So that's kind of what this was: well, if we don't quantify it, then we don't know about it, so we can't be held liable during a discovery process if we're sued and people want to get records from us. And I don't think that's really the concern they had. I think the real concern was that they didn't want evidence that they were told there was a deficient control scenario and they chose not to fix it. I think that's really what it comes down to. But at the same time, that's not a tenable option either. We talked about trust earlier; as an investor or a consumer of a particular company's products or services, I'd be far more upset if they didn't do any risk assessments to find out what bad things could go wrong than if they did that and then chose not to fix certain ones of them. You know, there's always a sort of floor of what's table stakes in information security and what's negligent, that type of stuff. But not quantifying it, not adding priority, I think is just wrongheaded, and that's not the right direction.
Emily Wearmouth [00:31:43] David, are there any substantial benefits that go beyond risk reduction, any financial benefits of embracing risk quantification?
David Fairman [00:31:53] Yes, definitely. I've worked with a number of organizations, and I saw it myself in a previous role when we went down this path: how we could actually have more commercial conversations with our cyber insurance brokers and actually bring our insurance premiums down. Being able to do this well gives you a direct line of sight to P&L and cost savings, because it puts you in a much better position to have a quantified, tangible discussion with cyber insurers. And we know what we've seen in the cyber insurance industry over the last few years: how rigorous they've become in trying to understand the risk they're covering, and how incredibly premiums have risen. I think any organization looking at its cyber insurance premiums is asking, how do we start to bring these down? This is a key mechanism that allows you to do that.
Emily Wearmouth [00:32:58] And is that mechanism being driven by the organizations, or is it something that is gradually being demanded by the insurance companies?
David Fairman [00:33:04] I actually think it's a bit of both. I think the organizations that are at the forefront are starting to take that conversation to the cyber insurers. And I think the cyber insurers are now getting a bit smarter about how they look at this, and they're asking organizations to come with a much better approach, because they're actually thinking: if an organization has a much better idea of, and can quantify, its own risk posture and its risk priorities, that gives the insurer a higher level of assurance in underwriting the risk, because that organization obviously has a much clearer understanding of that risk posture and, hopefully, has the appropriate controls in place.
Jack Freund [00:33:42] That's a much more mature conversation, right? You have these external organizations, ratings agencies, cyber insurers, and now regulators saying, hey, have you looked at the risk this way? And if the answer is yes, then we know you're doing things that are, you know, much further along in maturity, as opposed to, oh, I'm not doing it at all because I don't want to know about any of it, which seems very wrongheaded. So yeah, I agree with you 100 percent. I think, you know, as a consumer: we did an assessment, we thought about how that could play out, we found some things that were wrong, we bought insurance to hedge, we held operational reserves in our capital reserves, we did different things; those are all very mature, sensible things that show me you know what you're doing. I'd even love for some of the basic things to get fixed, that would be great, and I think that's probably the real message here. But just knowing that all of these things are possible, and that we can control the risk in a variety of ways, is, I think, very important.
Emily Wearmouth [00:34:39] Honestly, it's so interesting discussing this with you both, but I can see my producer waving at me and pointing at his watch. For me, this is an area that is both very theoretical and very practical. And it also strikes me that it draws as much on psychology as it takes you back to a sort of maths class at school where you probably felt a little uncomfortable. I want to thank you both for taking the time for this conversation, especially because the time zones across Sydney, London and North Carolina made it, I think, a little challenging to schedule. So I appreciate you both finding the time while we're all awake, and I'll let you get to your breakfast and dinner, or whichever meal comes next for you. You've been listening to the Security Visionaries podcast, and if you enjoyed this episode, and if you're still listening at this point I'll allow myself to assume that you did, then subscribe on your favorite podcast platform so that you never miss one of our fortnightly episodes. Thank you, Jack. Thank you, David.
Jack Freund [00:35:33] Thank you, Emily.
David Fairman [00:35:34] Thank you very much. Great to be here.