How Cyber Fits Into Big-Picture Risk | Episode 106

Waterfall team


In this episode, Dr. Janaka Ruwanpura, Vice-Provost, University of Calgary joins us to look at where cyber risks fit into the “Big Picture” of overall risk at industrial operations.



https://www.youtube.com/watch?v=ChRdWIGy0D8


The Industrial Security Podcast, hosted by Andrew Ginter and Nate Nelson, is available everywhere you listen to podcasts.

About Dr. Janaka Ruwanpura

Dr. Janaka Ruwanpura is currently the Vice-Provost (International) and Associate Vice-President (Research) at the University of Calgary, and a professor at the Schulich School of Engineering, specializing in project management. You can read more about Dr. Ruwanpura on his Wikipedia page, as well as his LinkedIn profile.


How Cyber Fits Into Big-Picture Risk

Transcript of this podcast episode:

Please note: This transcript was auto-generated and then edited by a person. In the case of any inconsistencies, please refer to the recording as the source.

Nathaniel Nelson: Welcome everyone to the Industrial Security Podcast. My name is Nate Nelson. I'm here with Andrew Ginter, the Vice President of Industrial Security at Waterfall Security Solutions, who's going to introduce the subject and guest of our show today. Andrew, how are you?

Andrew Ginter: I'm very well, thank you Nate. Our guest today is Dr. Janaka Ruwanpura. He is a professor at the University of Calgary, and he is the Vice-Provost of the entire university, as well as associate VP of research. He's a professor of engineering and project management, and Janaka does a lot of work with risk very generally, so today we're going to explore how cyber risk fits into the big picture of risk inside of engineering, construction, and other kinds of projects and organizations.

Nathaniel Nelson: Then without further ado, let's get into the interview.

Andrew Ginter: Hello Janaka, and welcome to the podcast. Before we get started, can I ask you to say a few words about yourself and about the good work that you're doing at the University of Calgary?

Janaka Ruwanpura: Thank you, Andrew. My name is Janaka Ruwanpura. I'm currently Vice-Provost (International) and Associate Vice-President (Research) at the University of Calgary, and at the same time I'm also a professor in the Schulich School of Engineering, specializing in project management. In terms of my connection with the university, I look after the global engagement of the University of Calgary, which includes every aspect of it: academics, research, mobility, and industry connections. As for the University of Calgary, we are very proud that, being a young university, last year we became number five in Canada as a top-five research university. At the same time, the other key element I want to mention, which may be very interesting for your audience, is that for two consecutive years the University of Calgary has been number one in startup companies produced, which is a tremendous recognition and reputation for a university like ours, whereas the remaining four of the top five are much bigger in scale and size, and also older, more than a hundred years old.

Andrew Ginter: Yeah, cool. I am an alumnus of the University of Calgary and a great fan of the university, but our topic today is risk. You are an expert in risk in the context of engineering project management.

Janaka Ruwanpura: This is the problem.

Andrew Ginter: We're of course interested on the podcast in cyber risk, but cyber risk fits into a bigger picture of overall risk management: you've got the risk of hurricanes and fires and who knows what. So, as an expert on risk, can you start us at the top? What is risk? What's the big picture of risk? What are we worried about, and what should we be worried about?


Janaka Ruwanpura: Yeah, Andrew, the key element of the way I look at risk is that I always use the word risk together with opportunities. My expertise is mainly on the project risk management side of things. The key is that we look at every possible thing for a project: what are the elements, the challenges, the uncertainties that could create problems in moving ahead with our projects? From there we ask how we convert some of those negative things, the negative risks, into better opportunities. We identify them upfront and we come up with good solutions to deal with them, so that we can run projects with minimum impact from risks and uncertainty, and so that the projects will be successfully planned, designed, and implemented. I think that would apply in your domain, in the cybersecurity area: how do we identify these risks in advance, and then how do we come up with a sustainable, practical solution that would benefit the key stakeholders?

And to ensure success at the end, so we can say we have done a good job.

Andrew Ginter: So that makes sense at a high level, but I thought I heard you say that if you look at them hard, sometimes risks turn into opportunities. Can you give me an example? How does that work?

Janaka Ruwanpura: Yeah, Andrew. I've run a few risk analysis sessions with industry folks, and I can tell you that on one occasion we were doing a project in Fort McMurray, and we went through the complete risk analysis process, which I'm going to explain later. We identified a few risks, and when we looked at the impact of the risks on the project schedule, we realized that we could not meet the time target in terms of the number of weeks or months to complete the project.

At that moment we felt we needed to look at some alternate designs so that we could cut down the duration of the project. The team was very committed; they looked at some alternate designs, and as a result we did the same thing: we looked at them and we simulated to find out the new project duration, and we realized, yes, we could achieve the time duration. Similarly, I can think of another example which I can speak about openly: the Olympic Oval restoration that happened about twelve years ago at the University of Calgary. When we looked at the risks, we had a big challenge in September 2011 because the facility was already committed to other clients for practicing — as you know, this is the place we call the fastest ice — and when we looked at each of the risks we were very determined, the project team was so committed, and they came up with some creative solutions to reduce the duration of the project. That's why I'm saying that sometimes you look at risk in a negative way, but if you're committed to coming up with a better solution to deal with the risk, it creates an opportunity to come up with a better design: maybe a more efficient design, maybe a more sustainable design, maybe a more creative design that

helps the project team achieve the outcomes of the project in terms of reducing the duration, reducing the cost, maybe enhancing the efficiency, and things like that. So that's what I mean: don't always look at the negative side of the risks; look at how the risks can create additional opportunities.

Andrew Ginter: So Nate, Janaka was talking there about physical design, physical risk, simplifying designs. In the cybersecurity space, a lot of people face that same problem when it comes to patching. Imagine a power plant with four generating units, each with maybe a hundred PLCs. If your PLCs are on the same network as your control system, which is on the same network as your plant system, which has a firewall going to the IT network, and the IT network in turn has a firewall going out to the Internet, that's a very highly connected environment. Any security assessor coming in is going to look at this and say you really need to patch everything on your industrial network really aggressively, because it's so exposed to the IT network and to the Internet. And patching is really expensive: you would have to take the plant down in order to change the firmware on the PLCs, and you don't want to do that, you want to keep producing power, and you need to test the new firmware images extensively. A lot of people look at this and say, let's not do that. The risk that was identified is that we're exposed to attack, and the obvious fix is to fix the vulnerabilities, but what a lot of people do instead is what are called compensating measures.

They will put in additional layers of firewalls, they'll put in additional layers of security, they might throw in a unidirectional gateway, and they might air-gap the safety systems so that those simply cannot be compromised. In this way they reduce the risk by changing the design so that you don't have to do the really expensive patching anymore. So what Janaka is saying here makes a lot of sense.

Andrew Ginter: Okay, so, big-picture risk. When you're looking at a project, how do you get started with risk management?

Janaka Ruwanpura: The main key component, and I'm talking about large capital projects, is that at the very beginning of the project we come up with a good risk process to identify the risks, and then quantify the risks, so that we can come up with a good risk response plan. That's where it's important to bring in the key stakeholders who are involved in the project and who have expertise and knowledge from similar projects in the past, so that they can provide input for the current project. Before I get into the steps, we always emphasize: share the current information that you know about the project so that the participants involved in the risk analysis can understand it and come up with a better risk identification. When you look at risk identification, you want to ensure that everybody is committed to identifying the unique risks that are relevant to the project.


And that's where we ask the questions. For example, when we identify the risks, we have to think right away about how we're going to deal with them. So we ask two important questions: how urgent is it, and how important is it?

I can refer to Stephen Covey's time management matrix, where Covey described a two-by-two matrix: one axis is urgency, and the other is importance.

For example, we get what we call the reactive quadrant, which is urgent and important: there's an important risk, there's an urgent risk, and there is a particular way of dealing with that. Then there's quadrant number two, which is not urgent but important: that means we have time to really identify those risks, and we can deal with them proactively so that the team knows what to do. Then comes quadrant three, which is not important but urgent, which is also a bit reactive in nature; at the same time, it is difficult for us to reject it. We have to deal with it, because it's so urgent. And then comes the fourth quadrant, which is not important and not urgent. There are no drivers there, so do we really want to spend time looking at those? Coming back to step number one, as I said, the identification is really key: if the team does not identify the risks properly, we won't be able to come up with a robust risk management plan.

Nathaniel Nelson: Yes. What he's referencing there I've heard for years now under the name of the Eisenhower matrix, applied to the traits of successful people. But I guess the point he's making is that it can just as easily be applied to risk, because it's sort of universal.

Andrew Ginter: That's right. He attributed it to Stephen Covey, who I think documented it in one of his books — was it The 7 Habits of Highly Effective People? I'm not sure. I recall being introduced to it as a time management matrix, but it applies to risk as well. In the cyber space, what's something that's both urgent and important? There's ransomware on the OT network: this is an emergency, all hands on deck, fix this problem. That's urgent and important. Not urgent but important: the risk assessment just came back, the security assessment just came back, we're in trouble, we have to fix these problems before ransomware gets into the control network. An example of not important but urgent: we urgently need to change all of the passwords in all of the devices in all of our substations. Why? NERC CIP says you have to. Yeah, but those substations are heavily defended; we've got security nine ways to Sunday; nobody can get in there with a password and mess with the devices. It doesn't matter: if we breach the standard, we risk a million-dollar-per-day non-compliance fine. Fix this problem, fix it now. I don't care if it's not important security-wise; it's urgent compliance-wise.

So yeah, this matrix very much applies in the cyber space.
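The urgent/important matrix can be sketched in a few lines of code. This is a minimal illustration; the example risks and their urgent/important ratings are hypothetical, echoing the cyber examples above.

```python
# Hedged sketch: Covey's urgent/important matrix applied to cyber risks.
# The risks listed and their ratings are hypothetical examples.

def quadrant(urgent: bool, important: bool) -> str:
    """Map the two yes/no answers onto the four quadrants."""
    if urgent and important:
        return "Q1: reactive, act now"
    if important:
        return "Q2: proactive, plan and schedule"
    if urgent:
        return "Q3: urgent only, e.g. compliance deadlines"
    return "Q4: neither, question whether it deserves time"

risks = {
    "ransomware on the OT network":        (True,  True),
    "assessment findings to remediate":    (False, True),
    "NERC CIP password-rotation deadline": (True,  False),
}

for name, (urgent, important) in risks.items():
    print(f"{name}: {quadrant(urgent, important)}")
```

The point of the helper is simply that each quadrant implies a different response style, which is what drives the risk response planning discussed later in the episode.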

Andrew Ginter: OK so we’ve identified the risks. We have our matrix of what’s urgent vs important. What’s the next step?

Janaka Ruwanpura: The next step is interesting. When I've facilitated many risk analysis sessions, people come up with all kinds of risks. Then the question is: do you really understand the risk? If somebody asks you about this particular risk, can you justify that it is relevant to this project? So we ask questions like: do you have background understanding of this particular risk? Have you seen it happen on other projects? Do you think it's relevant for this project? If it happens on this project, would you be able to really analyze the problem? The reason we ask these questions is that if you identify a risk and say, hey, this is a high risk, the impact is going to be quite significant — how do you determine that if you don't understand the risk? That's what we call qualification. After the identification, we go through a step to qualify the risks: are you really champions of this risk, so that we can take the identified risks into the quantification stage? Before that, we need to make sure you understand the risks, and that if someone else on the team asks a question, you'd be able to defend whether these risks are relevant for the project.

Once we pass that stage, we can go to the risk quantification stage to determine two things: what is the probability of occurrence of this risk, and if the risk occurs, what would be its impact, in various respects? As I said, my background is more in capital projects, where the two key things we always talk about in risk management are how these risks impact the cost of the project and how they impact the time, or duration, of the project. But you can also look at other things: the impact categories could be reputation, safety, performance. So you can say, okay, if this risk happens, let's quantify its probability of occurrence and its impact. Especially when we are doing risk analysis with stakeholders involved, we want to make sure that everybody really understands the quantification process, and that's where we always adopt a standard methodology to look at the probability of occurrence. For example:

If I say, I have this risk which is very likely to happen, somebody would ask: what do you mean by likely? Can you define what likely is? For example, Andrew, even between you and me, if I use the word likely as a subjective term, what does that mean to you, and how do I interpret likely? That's where we always come up with a standard methodology that says: likely means there is a 40% chance of it happening, or a 50% chance. We come up with a quantitative methodology to define what we mean by a subjective term, and then convert that subjective meaning into a quantitative meaning.

Andrew Ginter: So that makes sense. But we're talking about risk, about things that might not happen. I might say: you're operating a large consumer goods factory, competing with the same kind of factory in another country, and that country has an active industrial intelligence wing in its government, and I think it's very likely that the large consumer goods factory — say, a laptop factory — is going to be targeted with a nation-state-grade, intelligence-agency-grade cyber attack. You might disagree. How do you resolve these things about events that haven't happened yet?

Janaka Ruwanpura: This is where, Andrew, there are two things. Sometimes when we do risk management and identification, we identify which are the strategic risks and which are the tactical risks. In the project management domain, we consider tactical risk management to be handled at the project level by the project people, whereas senior management will determine the strategic risks; even the existence of a project depends on how they look at the strategic risks. The example you have given is actually more geopolitical, which is a strategic type of risk that would decide whether we want to go ahead with the project or not. But anyway, the challenge I have faced in the quantification of risk is: do we think the same way? For example, I sometimes use a criterion like likelihood of occurrence, and we define it in five different subjective ways. "Almost certain" — now, what does almost certain mean to you and me? For us to have the same consistency, we define it and say: almost certain means about a 90% probability or above; likely means a higher probability, say between 70% and 90%; possible means 30% to 70%;

unlikely means 10% to 30%; and rare means 0 to 10%. So we come up with a framework where everybody is thinking along the same definitions, so that when we identify risks and when we quantify them, we get consistency from everybody. I think that is also important when we look at the impact, so I'll give you an example of that as well. If you were to come up with criteria for impact, in the simplest way, on a scale of 10, we could say a 10 means a catastrophic impact in terms of time or cost; serious means maybe 8 to 10 on that scale; moderate means anywhere from 4 to 6; and negligible means 0 to 2. So we could come up with criteria using the words catastrophic, serious, severe, moderate, minor, and negligible. But then we can ask: what do you mean by catastrophic impact? Catastrophic impact means, depending on the project value, that we could be talking about, say, a $10 million additional cost to the project.

And we are also talking about a six-month delay, versus negligible meaning maybe up to $10,000 of cost impact with one week of delay, you see. I think we need to come up with the subjective terms for impact and also put a value associated with each of them, in terms of cost and time, so that everybody on the team, when we analyze the risks, has a consistent mindset about two things: the probability of occurrence and the impact. I'm sure, Andrew, you could think of many examples in your domain of how you define the probability of occurrence for risks, and also how you see the impact of the risks.
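The calibration described here, agreeing up front what "likely" or "rare" means numerically, can be sketched as a lookup table. The likelihood bands below are the ones quoted in the episode; the helper function and its use are illustrative.

```python
# Hedged sketch: subjective likelihood terms mapped to probability bands.
# Band boundaries follow the episode; the lookup helper is illustrative.

LIKELIHOOD_BANDS = {            # term -> (low, high) probability
    "almost certain": (0.90, 1.00),
    "likely":         (0.70, 0.90),
    "possible":       (0.30, 0.70),
    "unlikely":       (0.10, 0.30),
    "rare":           (0.00, 0.10),
}

def likelihood_word(p: float) -> str:
    """Convert a numeric probability back to the agreed subjective term."""
    for word, (lo, hi) in LIKELIHOOD_BANDS.items():
        if lo <= p <= hi:
            return word
    raise ValueError("probability must be in [0, 1]")

print(likelihood_word(0.95))  # almost certain
print(likelihood_word(0.50))  # possible
print(likelihood_word(0.05))  # rare
```

An analogous table would map impact words (catastrophic, serious, moderate, negligible) onto cost and schedule thresholds, as in the $10 million versus $10,000 example above.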

Andrew Ginter: So Nate, the key word I took out of that was "strategic": strategic versus tactical risks. In a large organization, think of a power utility with 40,000 employees, lots of different people are involved in lots of different kinds of risk management at lots of different levels. Individual technicians who drive out to a high-voltage substation do not touch anything in the substation unless they know that it's been de-energized, ideally that they've de-energized it themselves, so that they don't get two hundred thousand volts flying through them, killing them on the job. Whereas senior management would tend to deal with risks like an earthquake collapsing the head office and having to relocate the head office's functions to a backup office on an emergency basis. But at what level of an organization should you be dealing with cyber risk? The answer that I heard, in terms of general principles, is that the highest levels of the organization have to be dealing with strategic risk, and strategic risk is risk that puts the entire existence or mandate of the organization at risk.

In the example of the computer factory that I gave to Janaka, the interference with the factory by a foreign intelligence agency trying to give the factories in its own country a competitive advantage, that interference could be existential. It could drive the computer factory out of business.

For example, if pricing information has been stolen from the IT network in this factory, allowing the factories in the other country to undercut the price of the products produced by this factory by ten cents on the dollar; or if the intelligence agency has wormed its way into the operations network and has been tampering with the devices, the PLCs controlling production, introducing flaws and defects into the product that have to be repaired at a massive cost. With this kind of interference you could drive the factory, the company, out of business. That level of threat is something that needs to be discussed at the board level; in my understanding, that's a strategic threat. Lower-level threats, say the risk that we fail to comply with the law regarding electromagnetic emissions, or other kinds of compliance risks, might be dealt with lower in the organization. But strategic risk has to be dealt with at the highest levels, and lesser risks are dealt with elsewhere, is what I took away here.

Andrew Ginter: Okay, so we've identified our risks, we've prioritized them in a sense, we understand which are strategic, and we've quantified them. What's next? How do we deal with them?

Janaka Ruwanpura: So now you can come up with a risk matrix, and the risk matrix will tell us, based on the probability of occurrence and the impact, which risks are high, which are low, and which are in the middle. That's where you look at it and say: we have a high risk, the probability of occurrence is very high, it's a catastrophic risk — do I want to bring that risk all the way down to a low level? We want to make either the occurrence of that particular risk rare, or its impact negligible. Or somebody says: let's also look at an alternate scenario — we accept that the risk could possibly occur, and if it happens there's maybe a moderate impact. That's where we now look at a framework for risk response planning, and that's where the two keywords I mentioned earlier come back again: proactive versus reactive. In my domain, when I do this, I actually have a kind of decision tree built for both proactive risk management and reactive risk management.
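The risk matrix described here can be sketched as probability times impact. The 1-to-5 scoring and the high/medium/low thresholds below are a common convention, not something prescribed in the episode.

```python
# Hedged sketch: a conventional 5x5 risk matrix. The scoring scale and
# thresholds are common practice, assumed for illustration.

def risk_level(probability: int, impact: int) -> str:
    """probability and impact are each rated 1 (lowest) to 5 (highest)."""
    score = probability * impact
    if score >= 15:
        return "high"     # candidate for elimination or aggressive mitigation
    if score >= 6:
        return "medium"   # plan a response, monitor
    return "low"          # may be acceptable as-is

# A very likely, catastrophic risk versus a rare, minor one:
print(risk_level(5, 5))  # high
print(risk_level(1, 2))  # low
```

The response-planning goal described in the interview is exactly to move a risk's cell: drive the probability toward "rare" or the impact toward "negligible" so its level drops.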

So what are the different options available when you're dealing with proactive risk management? We see that potential risk coming, but we have time to eliminate the risk, or to mitigate it, or to accept it, or to transfer it — four options, which I can elaborate on. I'll give you a kind of simple decision tree. We can say: the current probability of a particular risk is about 80%, and we have three choices. We can eliminate it: there's an 80% chance, and we want to take it to 0% so that we will never see this risk. Or we can say: the current probability is eighty, let's try to mitigate it down to a 20% probability, or a 10% chance of this risk happening — what can we do proactively to mitigate this risk? Or we can say: in a project environment there are various key stakeholders involved — an owner, a consultant, a contractor, other parties — and for this particular risk,

it may be better for us to transfer the risk to a party that can better handle it. So we can think of three options, eliminate, mitigate, or transfer, depending on the nature of the risk. But if you look at the reactive side, the word eliminate does not exist, because reactive means that something has already happened and you cannot eliminate it now. So your choices are either to mitigate the impact of the risk — through the risk analysis we identified that if this risk occurs it's a $100,000 impact, but I can mitigate it by spending maybe $60,000 so that the impact is cut down; we can even do something so that it will not have the same $100,000 impact — or to say: yes, we can see the signs of this risk, but rather than me as a stakeholder handling it, I could transfer the risk to another party who has better authority or accountability to handle it, and we deal with it that way. Or: the risk has already happened.

There's nothing much we can do; let's accept it and deal with the problem. I've also done some work in the disaster area, particularly natural disasters, tsunamis and tornadoes, and there, sometimes you have to accept the impact: it happened, and the question is how we deal with it now. So, depending on the nature, as I said, proactive versus reactive, you can come up with a decision tree that shows the different options and also the consequences of those options for the project, so that you can be successful in dealing with the risks.
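The mitigation arithmetic above, spending $60,000 to cut down a $100,000 impact, is essentially an expected-value comparison. Here is a sketch that reuses the episode's numbers plus an assumed 80% probability borrowed from the decision-tree discussion; the mitigation costs are hypothetical.

```python
# Hedged sketch: treat risk exposure as probability x impact, and mitigate
# when the reduction in expected loss exceeds the mitigation cost.
# The 80% probability and the mitigation costs are assumed for illustration.

def exposure(probability: float, impact: float) -> float:
    """Expected loss if we do nothing."""
    return probability * impact

def mitigation_worthwhile(p_before: float, p_after: float,
                          impact: float, cost: float) -> bool:
    """Mitigate when the saving in expected loss exceeds its cost."""
    saving = exposure(p_before, impact) - exposure(p_after, impact)
    return saving > cost

# 80% chance of a $100,000 impact; mitigation drops the probability to 20%.
print(mitigation_worthwhile(0.8, 0.2, 100_000, 55_000))  # True
print(mitigation_worthwhile(0.8, 0.2, 100_000, 65_000))  # False
```

This is only the quantitative leg of the decision tree; transfer and accept, as described above, involve contractual and organizational judgments that a formula doesn't capture.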

Andrew Ginter: So now that we've got some of the big picture, one of the things that has always puzzled me: I get deeply involved in cyber risk management, but not so much in management of the risks of earthquakes, or fires, or pandemics, or who knows what. Say you're building a hospital. The systems that you put in place have to protect the confidentiality of patient information. The design of the structure has to address the risk of earthquakes in the region, because we can't have the structure collapsing on the patients. The design of the electrical system has to allow for backup power supplies if the main power supply fails, because you've got to keep your patients alive, and electricity is used for that. So you've got different kinds of risks that you're managing. Do you ever have to trade off one against the other and say, this one's more important, I'm going to focus on it, and the other ones I'm just going to accept? Or is something else going on here?

Janaka Ruwanpura: Andrew, I think there are two different things to look at. One is that if we identify exactly the two risks you mentioned, and they have been identified in our risk matrix, through the probability of occurrence and the impact, as critical, then we need to handle them, and how we handle them, proactively versus reactively, may be two different things. The second is: at what stage could this happen? Is it happening in the design stage, the construction stage, or the commissioning stage? If they're both important and we need to tackle them, we don't trade off; we deal with them using different strategies. One could be proactively eliminated; maybe the other could be reactively mitigated. Those are two different things. And as time goes on, cybersecurity-related risks are becoming really critical in many engineering and construction projects — the example you gave with hospitals, and research facilities and universities, are becoming really critical now. So we don't trade off: if it's important and if it's high, then we must find solutions to deal with it.

Andrew Ginter: So Nate, the question that sticks in my mind: at Waterfall we work with heavy industry, with people who are dealing with powerful, dangerous physical processes. They deal with risk every day. What I've heard from time to time from different stakeholders in these organizations, depending on the organization, is: Andrew, we're not going to worry about cyber for now, we have bigger fish to fry, and then they talk about other risks. In a sense, my goal in bringing Janaka on was to try to understand how cyber fits into the bigger picture. What I just heard him say was: look, Andrew, if you've got a strategic risk, if the existence or the mandate of the organization faces a serious threat, you have to deal with that. The board has to deal with that, the executive has to deal with that. You cannot ignore material risks.

It doesn't matter if you have lots of risks on the table, you have to at least think about every one of them. And that's an insight I didn't have before: senior decision makers deal with major risks, due to fire, due to earthquake, due to cyber, sort of independently. But that still begs the question of when you can trade risks off, and that's where my next question goes. It's a little bit clarifying in terms of when you can trade stuff off, and it turns out it has more to do with different threats that have the same consequence, in a sense the same risk, as opposed to different risks. But if you've got different important risks, the lesson here is you have to deal with each of them.

Andrew Ginter: Instead of talking about risks with very different outcomes, leaking patient information versus the building collapsing, can we talk about risks that in a sense have the same consequence? A solar farm might have motors to move the solar panels to track the position of the sun, and they might have those motors because, when the motors are working properly, the farm produces twice as much power in a day. If ransomware gets in there and cripples the computers that control the motors, and the panels freeze, you only produce half as much power as you expected for the day. But you also might have mispredicted the weather. The weather is variable, sometimes it's cloudier than you expect, and you only produce half the power in the day that you thought you would. You might have a cloudy day dozens of times in the year; you might have a ransomware incident once every two or three years. When you have, in a sense, the same outcome from different causes of risk, is this a time where you might legitimately say, I'm going to trade off how much money I spend on one versus the other? When is this legitimate, what makes risks comparable?

Janaka Ruwanpura: Yeah, I think that's where I bring in what I call if-then scenarios. For example, you could look at each one of them individually in isolation, or you can look at them in a combined way. For example, what if ransomware as well as cloudy weather together have a more cumulative impact on the farm, versus looking at each individually? And in the cloudy situation, as you said, the weather is very random, maybe we don't know that one, versus ransomware. So that's where the team needs to look at all those possible risks, come up with scenarios, and look at those scenarios. And then that's where tools like simulation or decision trees or, as I said, the analytical hierarchy process, AHP, come in. We can evaluate each of these scenarios and see what the impact is, and then maybe as a result of that you could even come up with a better risk management strategy. That's the beauty of it, but the key is that it's a committed effort to identify these scenarios. When you identify the scenarios, you can actually analyze them and then come up with better ways of handling them.

And that will also determine how we practically deal with these things. Maybe we need to invest up front to deal with a risk, versus looking at the reactive scenarios of managing risks.
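Andrew's solar-farm example lends itself to a simple expected-loss comparison once the two threats share the same consequence. The sketch below annualizes each threat as frequency times loss per event; the dollar figure and the exact frequencies are illustrative assumptions, not numbers from the episode.

```python
# Sketch: comparing two threats that produce the SAME consequence
# (a half-power day at a solar farm) by expected annual loss.
# The loss figure and event frequencies are assumed for illustration.

LOSS_PER_HALF_POWER_DAY = 5_000.0  # assumed revenue lost on a half-output day

def expected_annual_loss(events_per_year: float, loss_per_event: float) -> float:
    """Annualized loss: event frequency times consequence per event."""
    return events_per_year * loss_per_event

# Cloudy days: "dozens of times in the year", per Andrew's example.
weather_loss = expected_annual_loss(36, LOSS_PER_HALF_POWER_DAY)

# Ransomware freezing the tracking motors: "once every two or three years".
cyber_loss = expected_annual_loss(1 / 2.5, LOSS_PER_HALF_POWER_DAY)

print(f"Weather: ${weather_loss:,.0f}/year, ransomware: ${cyber_loss:,.0f}/year")
```

Because the consequence is identical, the two annualized figures are directly comparable, which is exactly the situation where a spending trade-off is legitimate. A cyber scenario with a bigger consequence, say an outage lasting weeks, would need its own scenario rather than this comparison.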

Andrew Ginter: Well, thank you, Janaka, this has been educational for me. Thank you so much. Before we let you go, what should our listeners take away from this episode? What's the number one takeaway for you?

Janaka Ruwanpura: The key message that I want to pass on, as an academic and as somebody who has dealt with industry and worked with industry on the risk management side of things: I've seen people fail to make a commitment to do a proper job of a proper risk management process. Sometimes I see them treat it as a procedural thing or an ad hoc thing. They don't have the commitment; they're simply doing it because they have to do it. So my message is that it's really, really important, particularly in your domain of cybersecurity, to make sure that we do a proper risk analysis, to ensure that we identify the risks, really understand them, qualify them, quantify them, come up with better risk management and risk response options, and look at various if-then scenarios to see what's the best way of handling them. And that's where we can help from the University of Calgary. We have experts here in cybersecurity at the University of Calgary, in our computer science department and in our Schulich School of Engineering, and we have experts in other areas too, in the Faculty of Law and the Faculty of Arts

on the policy side of things, as well as experts in risk management, in project risk management, through the Schulich School of Engineering's Centre for Project Management. So there are a lot of things we can do to help and to support the cybersecurity area. And I hope my message gets through to you: make a commitment to a proper, comprehensive risk process, and you will be happier at the end of the day.

Nathaniel Nelson: So that was your interview with Janaka Ruwanpura. Andrew, do you have anything to take out of this episode?

Andrew Ginter: Yeah, I mean, I'm very grateful to Dr. Janaka Ruwanpura for joining us. You know, I might have mentioned I've been writing a book, and one of the big topics in it is cyber risk. I've been at it for years now, and I'm hoping to be done by October. Something that had confused me time and again is talking to people doing risk management and hearing stories like: look, we have bigger fish to fry than cyber. We're not so much worried about cyber taking down one of our high voltage substations; we worry more about squirrels eating through the insulation, getting electrocuted, frying themselves, short-circuiting everything and shutting down the substation. And I'd always tried to understand how that fits into the big picture, whether it really makes any sense. What Janaka cleared up for me was: look, strategic risks, important risks, you have to deal with them independently. If they're important, they're important, you have to deal with them. You can't trade off the risk of a fire against the risk of an earthquake; you have to deal with each of them. Where you can legitimately trade off is when you have multiple threats that have the same outcome. So if the cyber scenario you're looking at is one that would take down one substation, the same way that a squirrel would eat through the insulation and take down one substation,

it's reasonable to ask: how often do squirrels do this, how often does cyber do this, is this a problem worth solving? If instead your cyber scenario could take down the entire grid, that's a different animal. You can't compare that to squirrels; it's a different consequence. So that bit of clarity is something that had confused me for a very long time, and I'm grateful to Janaka for clearing that up for me.

Nathaniel Nelson: All right then, with that, thanks to Dr. Ruwanpura for speaking with you, Andrew, and Andrew, as always, thanks for speaking with me. This has been the Industrial Security Podcast from Waterfall. Thanks to everybody out there listening.

Andrew Ginter: It’s always a pleasure, Nate. Thank you.
