Israeli Cybersecurity – Dr. Gabi Siboni | EPISODE #9


Gabi Siboni joins us to talk about standards, challenges and current initiatives in Israel – perhaps the most thoroughly cyber-protected nation on the planet.

Dr. Gabi Siboni – Director of the Military and Strategic Affairs Program and the Cyber Security Program at the Institute for National Security Studies in Israel, and the founder and editor of the journal Cyber, Intelligence and Security.



Intro: The Industrial Security podcast with Andrew Ginter and Nate Nelson, sponsored by Waterfall Security Solutions.

Nate: Hello and welcome to the Industrial Security podcast, my name is Nate Nelson. I’m here today with Andrew Ginter, vice president of Industrial Security at Waterfall Security Solutions, and he’s going to be introducing today’s guest. Andrew.

Andrew: Hi, Nate, it’s great to join you again. Our guest today is Dr. Gabi Siboni. He is the director of the Military and Strategic Affairs program and the cyber-security program at the Institute for National Security Studies in Israel, he’s also the founder and editor of the Military and Strategic Affairs Journal.

Nate: And, Andrew, you got to speak face-to-face with Gabi. So, with that, we’ll cut now to your interview with Gabi Siboni.

Andrew: Thank you for joining us.

Gabi: Thank you.

Andrew: So, can you give our listeners a bit of background? What is the Institute for National Security Studies, what does it do, and what is your role there?

Gabi: So, the Institute is a nonpartisan institute in Israel, a think-tank that tries to enhance the national security of Israel. We have various programs within the Institute; I hold 3 positions. I’m the head of the Military and Strategic Affairs program and the head of the Cyber Security program, and I’m the founder and editor of the journal Cyber, Intelligence and Security. It’s an academic, peer-reviewed journal that is published both in Hebrew and English 3 times a year and is distributed free of charge to the public.

Andrew: Yeah, I’ve been on your website and I’ve seen the journal, it’s interesting reading. So, a think-tank focused on security. The focus of this podcast is cyber-security, and in particular, cyber-security for industrial operations. In 2016, you co-authored the document ‘Guidelines for a National Cyber Security Strategy’, where you talk about the role of the State and the role of private industry. In many countries, a lot of critical infrastructure is privately owned; in others, it’s government-owned. Can you talk about the role of government versus individual facilities and businesses? What role should each play in ensuring the security of the population?

Gabi: This is discussed further in the book I wrote that was just published last week, ‘Regulation in Cyberspace’. We developed a model to try to define the relations between government and, let’s say, the business sector. Now, to focus on industrial systems, which I think should be the focus of our discussion: in every country, some of them are held privately and some are held by the government or by public entities. Some are critical infrastructure and some are not. I think most industrial systems are privately owned, owned by companies that use industrial control systems for their own activity. So, when we discuss the relation between government and the owners or operators of industrial systems, we should divide those into 2. One is what we call critical infrastructure systems, which may affect the country’s continuity, its ability to continue critical services to the citizens: power supply, water supply, transportation, trains, even hospitals, a variety of those entities. What you define as critical infrastructure is, I assume, defined differently in various countries. In Israel, those entities are guided and regulated by the government. Our National Cyber Bureau, or Cyber Directorate, is in charge of the cyber-security regulation of critical infrastructure, but this is only a small fraction of the industrial control systems that are used in Israel. In our proposal in the book I just mentioned, we recommend that even private entities should be somehow regulated by the State, because if the systems of some of those entities are harmed, it may affect national security.
We don’t care if the business goes down; that’s the shareholders’ problem. But if it affects national security, I think the government should regulate them as well, in a certain model. So, we propose a model for how to regulate those. That, in a nutshell, is how I see the relations between the State and the operators of industrial control systems.

Nate: So, what Gabi’s talking about here is identifying critical infrastructures. And if you look online at DHS resources, you can see that the US has identified 16 sectors, and Canada has identified 10 sectors which qualify. What Gabi’s talking about seems pretty standard, correct?

Andrew: It is. What I heard Gabi say, though, is that for all of these critical infrastructure sectors, Israel has put regulations in place that apply to them, and that is unusual. If you look at the DHS advice to the sectors, they’ve got advice for a bunch of sectors, but there are regulations in place for, I think, 2 of them and nothing for everything else, and there’s even less in Canada. So, the breadth of regulation in Israel is unusual.

Andrew: We will put a link to your book in the show notes.

Gabi: It was published in Hebrew and is now being translated. So, it will take a bit; I think you can do it when it’s ready. I expect it to be within a couple of months.

Andrew: Okay, we’ll certainly do it when it’s ready. When we come to the topic of regulation, it’s a controversial topic. A lot of businesses see regulation as unnecessary cost. People describe the difference between security and regulation as, “Security is doing whatever I think is necessary to secure myself against threats that I’m worried about, and regulation is doing what someone else has told me to, whether it’s useful or not.” Can you talk about what kind of regulation you think makes sense? How prescriptive is it? How much leeway do individual businesses have for risk-based decisions?

Gabi: Regulation is sometimes considered as, let’s say, something that impedes business, which might be the case. But again, you have to find the balance between resilience, your ability to provide continuity in a time of crisis, and the business. So first, my main recommendation is that you should not regulate whatever does not affect the public. If it only affects the clients of the company, let’s say a bank, only its clients, and it does not affect the stability of the financial system, then so be it; the government should not intervene, and the shareholders should run their own business and decide their investment in security based on their business considerations. But if the entity or the firm is affected by a cyber-event, be it a deliberate attack or just an event that happens because of malfunction, negligence or whatever, the question arises: what will be the effect on public safety and security and on the national security environment? If there is a significant effect, then we should intervene.

I will give an example to make it a bit clearer. Assume you have a garage that takes care of cars, Simon & Co garage, a small garage in a little town that takes care of trucks and cars. Nobody should regulate what this guy is doing. But assume (I don’t know how it works in the United States, but in Israel it looks like this) that the vehicles of, let’s say, one of our large manufacturers or distributors of food are serviced at this site. Because the garage is part of the supply chain, through him you can affect those trucks; he can do a variety of mischief, or steal information from those trucks, having the whole list of clients, a variety of things. Or say this garage is also servicing cars for one of our security services, taking care of sensitive cars. If it was just an ordinary garage, we don’t care what happens. But because this garage is taking care of something sensitive, or something that can become sensitive, we should intervene and say, “Okay, we have to regulate what you do.” Or take a big caterer, another example, that supplies food. He gets a lot of personal details. If it’s a small amount of information that passes through him, then no problem, but if hundreds of thousands of people are registered with this caterer, then we want to make sure that the issue of privacy is somehow regulated. So, it’s not only a business issue for him. And the same goes for industrial control systems, of course. Take Teva, which is a huge pharmaceutical manufacturer in Israel. Teva has a lot of industrial control systems.
So, we say, “Okay, Teva should manage their own business,” but if something happens to Teva, something happens in the manufacturing line, then it’s not only the problem of the shareholders of Teva. There, you should have some kind of tool to intervene. That’s our model.

Nate: So, Andrew, is what Gabi’s saying that regulation is a matter of scale, that when it comes to private business, it’s not necessarily even the type of business that matters, it’s the size of the business?

Andrew: I think what he’s saying is that it’s the impact of the business that is the question here. And what is again unusual about the approach he’s describing, versus the approach we see in the United States, Canada and parts of Europe, is that it’s not even a sector he’s talking about; he’s talking about individual sites with significant impacts. There is no sector in which a garage is part of a critical infrastructure, but he makes the case that, if a risk assessment determines that a particular site is vital to the national interest, it comes into scope for this kind of assessment and, if necessary, regulation. So, that again is very different from the approach taken in a lot of other countries.

Nate: Right. It seems to me like it could just be something that works for a country as small as Israel, but if I were to extrapolate and think about how this idea would be used in practice in, say, the US, it seems to me that you’d have whole sectors and industries and so many more companies to deal with, that it would almost backfire as a strategy in the first place.

Andrew: It’s tough to say. We would want to get a policy guru from the US on a future podcast here and discuss the issue.

Nate: Hopefully we will, but in the meantime, let’s get back to Gabi.

Andrew: So, when we’re talking about regulation, we’re talking about standards. How involved is your organization in defining specific standards or regulations? Can you compare what you’re doing, or what Israel is doing, with what’s happening in other countries, for instance the NIST framework or NERC CIP or anything like that?

Gabi: I have to divide this between my business activity and my academic activity. On the academic side, we do not develop actual detailed standards. We investigate and research existing standards like NIST or ISO, a variety; we’ve been asked now to evaluate the risk of the supply chain for the financial sector in Israel. So, we evaluate a variety of models and come up with our own model that we recommend, but we don’t go into details. In my business activity, we have to go into details, because we provide risk assessments and consultancy to our clients. In Israel, usage varies; you can find various uses of NIST and ISO and a variety of other standards and frameworks. We have an advantage, because the National Cyber Directorate recently published, last year, let’s say the defense doctrine on cyber-security, both for large corporates and also for small companies. This is a very, very good tool, a very strong tool that we also use, and it is very detailed, also on the side of industrial systems. So, we have a very good selection of tools we can use. One is maintained and developed by our National Cyber Directorate, which is very good, and others are maintained by the US or ISO organizations. We can select whatever we choose, or whatever the client wants.

Andrew: So, Nate, I went online after the interview looking for the “defense doctrine on cyber security”. I did not find it; I think the name is something different. What I did find on the National Cyber Directorate’s website is a cyber-defense methodology for an organization. This is guidance, not a regulation, and it describes a range of organizations. The strongest recommendations in the guidance apply to any organization where the worst-case cost of a cyber-breach, the worst-case consequence, is 20 million Israeli shekels or more, which is about 6 or 7 million US dollars.

Nate: And that’s worst case. Frankly, look, 6 or 7 million dollars is a lot of money, but when we’re talking about big businesses, it actually sounds like a very low number to me.

Andrew: It is. And if you look at the measures recommended for businesses with that kind of worst-case consequence, the measures are very strong. I would compare those measures to the measures in the 2014 French ANSSI, A N S S I, regulations. And I’m on record in the past as describing those measures, as far as industrial cyber-security goes, as the strongest civilian measures in the world.

Nate: It sounds like what you’re telling me is Israel is very active in this regard, that they’re implementing pretty strong regulation, perhaps setting an example.

Andrew: Very much so. Not regulation exactly, this was guidance, but still, it’s way out in front.

Nate: Alright, let’s get back to Gabi.

Andrew: You’ve mentioned a couple of times, in your examples, the topic of supply chain integrity. I understand you folks are cooperating with people in other countries on this very, very difficult problem. Can you talk a bit about that work?

Gabi: So, the supply chain is a big problem. If you look at history, you’ll find that most of the, let’s say, published high-profile cyber-attacks have started through the supply chain, not directly through the organization. The supply chain was a route to take you into the organization. Now, I’m involved with an initiative called CISP, an American nonprofit organization trying to promote supply chain resilience in the energy sector in the United States, mainly by trying to, let’s say, vet hardware and to provide some kind of license to hardware, or registering hardware against a certain security standard that they will try to develop. My main idea is that you have the big companies competing, not on the security side, but on other issues, and to have the security side somehow standardized among those big manufacturers like Siemens, General Electric, Imran, you name it; this is my recommendation. However, the problem is not with new facilities. New facilities may be designed to have security embedded within them. The problem, I think the majority, is that we have legacy systems that are working with relays, with 50-year-old equipment, and the OT environment is less updated than the IT environment. So, to provide some kind of add-on to an existing facility, and to try to enhance the security through add-ons that you put in those existing facilities, isolating environments and raising the security to a higher level, I think it’s a huge effort and a very important effort.

Nate: This seems to me like a recurring theme in ICS: trying to adapt modern solutions to outdated OT technology and infrastructure.

Andrew: That’s right. It’s referred to in a lot of places as the greenfield vs. brownfield problem. It’s a difficult problem, because there’s a lot of very old equipment in industrial control systems; there’s an expectation that equipment runs for a long time. Think of your own kitchen: modern refrigerators have CPUs in them, they’re connected up to networks. Are we going to throw out our refrigerators when the vendor stops issuing security updates? Probably not, it’s still a working fridge. So, there are these expectations when there’s physical equipment involved. But his original point on supply chain, that’s also a very big problem. This is a very active research area in a lot of organizations, and a problem that has not really been solved. Registering suppliers, establishing chains of custody, cryptographic signatures: there’s a lot of work going on in this field.
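As an editorial aside, one of the building blocks Andrew mentions, cryptographic signatures on delivered firmware, can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual mechanism; the key and firmware bytes are hypothetical.

```python
# Sketch of a supply-chain integrity check: accept a firmware image only
# if its keyed digest matches the value the supplier published out-of-band.
# The vendor key and image contents here are hypothetical.
import hashlib
import hmac

def firmware_digest(image: bytes, vendor_key: bytes) -> str:
    """Compute a keyed digest (HMAC-SHA256) of a firmware image."""
    return hmac.new(vendor_key, image, hashlib.sha256).hexdigest()

def verify_firmware(image: bytes, vendor_key: bytes, published: str) -> bool:
    """Accept the image only if its digest matches the published value."""
    return hmac.compare_digest(firmware_digest(image, vendor_key), published)

key = b"hypothetical-vendor-key"
good = b"FIRMWARE v2.1 ..."
published = firmware_digest(good, key)          # supplier publishes this

tampered = good + b"backdoor"                   # modified in transit
accepted = verify_firmware(good, key, published)      # True
rejected = verify_firmware(tampered, key, published)  # False
```

Real supply-chain schemes layer public-key signatures, chains of custody, and hardware roots of trust on top of this basic idea, but the verify-before-install step is the same shape.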

Nate: And the next question you ask Gabi was more about supply chain, let’s listen in.

Andrew: You mentioned that a lot of compromises historically have been on the supply chain side. It’s my understanding that CSP is looking at the hardware supply chain. The compromises I recall are things like vendor laptops with crap on them being carried into sites, or people compromising suppliers, or stealing remote access credentials and remoting into their real target, but that’s all software.

Gabi: It’s a combination. I wouldn’t say that CSP is only related to hardware. Take even the software case you mentioned: for example, you have a PLC and a computer that is trying to update the ladder diagram, or whatever, to adapt the software of the industrial system to the process that this PLC is taking care of. Even this connection is managed by software, but also by hardware. So, you can manage this risk with 2 families of mitigations. One is hardware mitigation, maybe in the laptop or in the actual PLC hardware, and the other is software mitigation, and I would assume you will need both of them. And not only technological mitigation; maybe you will need a third kind of mitigation, which is just procedural mitigation. Let’s say you have only 1 computer that is authorized to be connected to a certain PLC, and that’s it. When that is not possible, you have to register another laptop to do that. So, there is a variety of controls that you have to put in place, some of which are hardware, some of which are software, and some of which are not technological at all, coming from processes and, let’s say, procedures of action. For example, I’ll give you a little example: those big turbine manufacturers want to have access to the production turbine, to make sure that all the parameters are in line with the envelope of operation of a turbine generating power.
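As an editorial aside, the procedural control Gabi describes, only registered laptops may connect to a given PLC, might be sketched as a simple allowlist. All device and PLC identifiers here are hypothetical.

```python
# Sketch of a procedural mitigation: only engineering laptops registered
# for a given PLC may connect to it. All identifiers are hypothetical.

AUTHORIZED_LAPTOPS = {
    "PLC-7": {"ENG-LAPTOP-01"},                    # 1 laptop per PLC
    "PLC-9": {"ENG-LAPTOP-02", "ENG-LAPTOP-03"},
}

def may_connect(plc_id: str, laptop_id: str) -> bool:
    """Return True only if the laptop is registered for this PLC."""
    return laptop_id in AUTHORIZED_LAPTOPS.get(plc_id, set())

def register_laptop(plc_id: str, laptop_id: str, approved_by: str) -> None:
    """Registering another laptop is itself a controlled, recorded step."""
    AUTHORIZED_LAPTOPS.setdefault(plc_id, set()).add(laptop_id)
    print(f"{laptop_id} registered for {plc_id}, approved by {approved_by}")
```

In practice this kind of rule would be enforced by a combination of the hardware and software mitigations Gabi lists, not by a lookup table alone, but the allowlist captures the procedural logic.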

Andrew: That’s right, vibration is the enemy of rotating equipment.

Gabi: Vibration, and maybe not only; maybe pressure, maybe pH. A variety of chemical and physical characteristics. They want to make sure that it is operated properly, because if they give you a license, they give you a guarantee that it will work, some kind of warranty, and they want to make sure of that. So, they need an online connection. If you have a Siemens turbine and it is installed, say, in Israel or in the United States, the Siemens office would like to have access to see this data. So, how is it managed? You cannot have a dedicated physical line connection between all the turbines of Siemens and the technological center of Siemens, and even that could be jeopardized. So, you have to find mitigation somehow beyond the technological environment. That’s the whole question of how you mitigate those kinds of risks, and you provide a variety of software, hardware, and non-technological controls. It happens also with airplanes now. Engines of airplanes are also transmitting operational data to the manufacturers; Rolls-Royce engines, I would assume, are transmitting data to Rolls-Royce control systems. So, I would assume that, at a certain risk, someone could tamper with the engine of an airplane while you are sampling the parameters of the engine from the ground.

Nate: If leased lines are used back to the turbine manufacturer in the example that Gabi used, does that not address the risk he was talking about? There’s no internet involved.

Andrew: Well, it addresses one kind of risk: the risk of attacking the communications as they pass across the internet. But look at the information flows. All cyber-attacks are information; every bit of information can be an attack. Where is the information flowing? It’s flowing from the turbine monitoring system into the central site, where information from lots of turbines is aggregated, and it’s flowing from the central site back into the monitoring system. That backwards flow is a potential attack. A lot of regulators worldwide are looking at connections like this and asking, “What does this really mean for our security? If somebody manages to compromise that central site, now they’ve got an attack channel back into hundreds or thousands of, in this example, turbines worldwide.” In a sense, this is very similar to the new fad in industrial control systems, the industrial Internet of Things. We have devices throughout control systems all over the world connected through firewalls, through the internet, into cloud sites all over the world; it’s in a sense the same problem. And so, I asked Gabi about this problem.
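As an editorial aside, the "backwards flow" Andrew describes can be sketched in a few lines. This is an illustration only, not any vendor's product: a bidirectional link lets a compromised central site push settings back into every connected turbine, while an outbound-only link simply has no inbound path. All names and numbers are hypothetical.

```python
# Illustrative sketch of the backwards-flow risk. A turbine's settings
# can only be changed from the central site if the link carries inbound
# traffic. All field names and values are hypothetical.

def apply_inbound(turbine, inbound_messages, allow_inbound):
    """Apply central-site messages to a turbine only on a bidirectional link."""
    for msg in inbound_messages:
        if allow_inbound:
            turbine[msg["setting"]] = msg["value"]  # the attack channel
        # outbound-only link: inbound traffic is never applied
    return turbine

# A compromised central site tries to disable an overspeed protection.
attack = [{"setting": "overspeed_trip_rpm", "value": 99999}]

t_bidir = apply_inbound({"overspeed_trip_rpm": 3600}, attack, allow_inbound=True)
t_oneway = apply_inbound({"overspeed_trip_rpm": 3600}, attack, allow_inbound=False)
```

The same asymmetry is why some regulators distinguish monitoring connections that only send data out from connections that can also accept commands back in.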

So, this is all within the scope of the supply chain integrity project; it’s not just manufacturing the equipment and delivering good equipment to the site, but the ongoing connections, you said. So, is this related at all to what people are calling the industrial Internet of Things, which is lots more connectivity, everything connected to the cloud?

Gabi: Well, this is a huge issue, because not all of them are what we call industrial control systems. You have a lot of connected devices, all the IoT family, billions and billions of connected devices: refrigerators, printers, cameras, everything that is connected. I don’t know if this recording system is connected. So, a lot of stuff is connected, and how to provide controls and mitigate the risk is a huge problem. Because of the cost of that equipment (you buy a camera for a few dollars, or maybe a sophisticated camera for a few tens of dollars, and it’s connected), how would you provide any security in this environment? You have to find other measures to provide this kind of security. We have the example of the attack on Dyn a couple of years ago, which was a DDoS attack. Dyn is a DNS provider registered in the United States, and I think hundreds of thousands, maybe 100,000, of IoT devices sending requests to their servers took the servers down for some time. Twitter went down, and a variety of companies whose names Dyn was serving were taken down using this kind of connected device: very simple, stupid connected devices. This is a good line of business; if you can go into this line of business, you’ll make a fortune if you find the Holy Grail.

Nate: Fun fact about that Dyn hack that Gabi was talking about: at the time of the incident, it was the world’s largest hack in terms of the volume of data flowing. It came in at 1.2 terabits per second of data flowing into this DNS provider. It happens that, a few months later, that same record was broken, but that just goes to show the scale of exactly the problem Gabi is referencing here.

Andrew: That is impressive, and it’s a huge scale. The concern that I have in this space, though, is less about equipment being repurposed to attack other sites; what I’m concerned about is incorrect physical control, which led into my next question here. So, let’s listen in.

We’re drifting away from industrial, but I’ve often wondered if we are not going to see the very same problems that we have in the industrial space show up in the consumer space. So, for example, walk into a consumer appliance store and buy a stove, a kitchen oven. Many of them have liquid crystal screens, and CPUs are controlling the burners. If someday they are connected through Wi-Fi, through the internet, to the vendor, and someone breaks into one of the vendors (there are going to be well-defended vendors and poorly-defended vendors) and downloads new firmware that turns on all the burners on all the stoves at 2:00 in the morning on Christmas Eve as a terrorist attack, we’re going to see houses burnt to the ground, we’re going to see people die. Is this something that the government, that anyone, is looking at?

Gabi: In my history, I used to work a lot on the process side. I’m a mechanical engineer, so I designed processes in petroleum and wastewater. We had to design pressure tanks, for example. So, if you have a pressure tank, and you have an OT industrial control system running and controlling the pressure in the tank, this is an example to answer your question. Good practice will always be to put in a mechanical relief valve. So, whether the control system is hacked, or whatever is done with it, or it is somehow malfunctioning, the pressure will not go above a certain point and explode the tank, because you have a mechanical relief valve. Sometimes we put in 2 mechanical relief valves, to have redundancy. That, I think, is the future of those devices. You can have a lot of software and hardware technological barriers to try to mitigate your scenario, but in the end, I would recommend that any such manufacturer have some kind of external precaution: a mechanical switch that is not connected, that will not allow things to go above a certain temperature, above whatever you set. It’s like a relief valve, but in a stove. And you can take this example to a variety of cases. So, good practice for this kind of equipment would always be to put in some kind of isolated safety system, not depending on the rest, and it goes for cars as well. You should always have isolated, mechanical systems that are very difficult to manipulate. It is very difficult to manipulate a mechanical pressure relief valve, very difficult; you would need someone to go up there and manipulate it physically.
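As an editorial aside, Gabi's relief-valve point can be made concrete with a toy simulation: even if the digital controller is fully compromised and drives pressure up every step, a purely mechanical valve bounds the physical outcome. The numbers here are illustrative only.

```python
# Toy simulation of Gabi's example: a hacked controller keeps raising
# tank pressure, but a non-networked mechanical relief valve vents
# anything above its setpoint. All values are illustrative.

RELIEF_VALVE_SETPOINT = 10.0  # bar; mechanical, not software-settable

def tank_pressure_step(pressure: float, controller_output: float) -> float:
    """One step: the (possibly hacked) controller adds pressure,
    then the mechanical relief valve vents any excess."""
    pressure += controller_output
    return min(pressure, RELIEF_VALVE_SETPOINT)

# A compromised controller commands maximum pressure increase every step.
pressure = 5.0
for _ in range(100):
    pressure = tank_pressure_step(pressure, controller_output=1.0)
# No matter how long the attack runs, pressure is capped at the valve
# setpoint, so the tank never reaches bursting pressure.
```

The cap prevents the explosion, but, as Andrew notes later, not the downtime: the process still cannot run normally while the controller is compromised.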

Andrew: This makes perfect sense. If we make these attacks difficult enough, we can’t have someone sipping coffee on another continent attacking us. The more difficult we make it, the better off we are.

Gabi: What I said is easier said than done, because cost will of course have an effect if you want to have those safety devices in cars or other devices. In cars, I would assume (I don’t know, I’m not in the car business), good practice would be to design those kinds of systems into cars, because I’ve seen cases where you would not be able to stop a car because of a software problem; it keeps going, and you don’t have any mechanical way to stop the car. That is not good practice for the design of cars. But I think that, in any case, we will have to have some backup safety systems in any reasonable connected device.

Nate: I like Gabi’s idea about sort of reverting back to mechanical precautions when it comes to cyber-security. It sort of reminds me of how old railroads worked. Today, we have all these electronic systems that send signals into command and control centers, but back in the day, of course, the way that you stopped 2 trains from running into one another was mechanical interlockings. So, what would amount to a command and control center today was a guy with these giant levers, about 6 feet tall, all lined up one next to the other, and you could pull on them. They still have some of them in old European towns, and listen, it worked. So, maybe the answer here could be a sort of reverting back to the days of old, where our means of creating redundancy in this technology could be not just creating 2 different paths to the same end from an electronic or software standpoint, but perhaps pairing software with a mechanical means of doing the same thing.

Andrew: That’s a good point, but as Gabi pointed out, there are cost issues. The reason that we moved away from manual things like that was partly labor costs, partly that people make mistakes, errors and omissions, and partly opportunities for increased efficiency if we can automate things; a lot of automation is about efficiency. On the other hand, he’s right that these analog, mechanical safeties are very reliable. A purely analog device cannot be hacked; information is irrelevant to it. It sounds a lot like what I hear from Andy Bochman at Idaho National Labs. He’s putting together some material on consequence-driven, cyber-informed engineering, CCE, and he’s got an article in Harvard Business Review that talks about analog backups. An issue is that analog backups can prevent disaster, they can prevent equipment damage; in my experience, though, they generally cannot prevent downtime. And if downtime for the power grid or for other critical systems is what we have to protect against, the analog approach is only part of it. But it is an important part, and it’s a part that I think has been under-emphasized recently and is being increasingly re-emphasized by people like Gabi and Mr. Bochman and others in the industry.

Nate: Now, you had 1 final question for Gabi, let’s hear it.

Andrew: We’re coming to the end here. Can I ask, what are you working on right now? What other thoughts would you like to leave with us, to think about, to research?

Gabi: In my academic activity, I just finished 1 book, and I’m trying to get into another one. It will be a handbook for those who run cyber risk management. We’ll review all the frameworks and provide a variety of applicable tools; it will not be an academic book, it will be an applied, practical book for those who are active in the field. I just published an article on guidelines for cyber-security risk management, and from this article we will expand into a book. Other research will be related to the supply chain, which is a very big issue, and we’re also trying to find ways to relate it to the financial sector. This is in my academic practice: to find ways to somehow put a framework on the damage of cyber-attacks in the financial sector, to try to, let’s say, normalize this, because we all try to evaluate risks, but it is very difficult to put a value on the risk. In the end, what is the value of the risk? We evaluate the risk and we look for mitigation, but we are unable to put figures on the risks, which is a very big issue, and that’s the ROI, it’s all economic issues. And in my business activity, I’m just like any other consulting firm, looking for work and trying to do it the best way I can.

Nate: Risk management seems like one of those subjects that is just going to come up over and over in the course of a podcast like this.

Andrew: Very much so, it’s a timely subject. We hear a lot of talk in the industry about risk-based decision making, but people never really define how to assess the risk. So, we definitely need more hands-on, specific guidance in this space, and I look forward to Gabi’s book. As far as I’m concerned, the more detailed and hands-on his how-to is, the better off we are.

Nate: And with that, thank you to Gabi Siboni and thank you Andrew.

Andrew: Always a pleasure.

Nate: Until next time, this has been the Industrial Security Podcast.

