CWE for Zero Days – not CVE | Episode 126

The MITRE CWE (Common Weakness Enumeration) database describes kinds of problems that can show up in the future, future zero days, rather than the CVE list, which describes vulnerabilities that were discovered in the past. Susan Farrell walks us through the CWE and how both vendors and owners & operators use it.


Susan Farrell joins to discuss weaknesses vs vulnerabilities

“…it’s basically looking at the weaknesses or bugs in a code that opens up the vulnerability or the ability for a cyber hacker to enter and attack your code.”


About Susan Farrell

Susan has over 20 years of experience in the technology industry and is passionate about finding and delivering cutting-edge solutions that address the complex and evolving challenges of cybersecurity, AI/ML, and industrial automation. As the OT/ICS Cybersecurity Champion and Head of R&D Commercialization at ObjectSecurity, she leads the go-to-market strategy, business development, sales, and marketing for products and services that protect critical infrastructure and enhance operational efficiency across various sectors, including defense, public sector, manufacturing, and energy.

Susan has a strong and proven track record of turning innovative ideas into tangible, revenue-generating businesses, working with diverse and cross-functional teams, and forging strong customer and partner relationships. She’s successfully closed the first sales of new products, supported the customer discovery and market research process, and created channel partner programs for driving technology adoption. She’s also participated in prestigious programs such as SBIR TAP and NSF I-Corps, where she gained valuable insights into entrepreneurship, industry requirements, and commercialization pathways. Susan considers herself a perpetual student of novel technologies, a frequent speaker at industry events, and a trusted advisor to senior-level stakeholders.


Transcript of this podcast episode #126: 
CWE for Zero Days – not CVE | Episode 126

Please note: This transcript was auto-generated and then edited by a real person. In the case of any inconsistencies, please refer to the recording as the source.

Nathaniel Nelson
Welcome, listeners, to the Industrial Security Podcast. My name is Nate Nelson. I’m here with Andrew Ginter, the Vice President of Industrial Security at Waterfall Security Solutions, who’s going to introduce the subject and guest of our show today. Andrew, how are you?

Andrew Ginter
I’m very well, thank you Nate. Our guest today is Susan Farrell. She is a strategic advisor for OT and industrial cybersecurity, and our topic is, more or less, scanning for zero days. She’s going to be talking about a MITRE repository I wasn’t aware of, the Common Weakness Enumeration repository, rather than the CVE repository that I was familiar with. So, zero days, weaknesses, not vulnerabilities: that’s our topic.

Nathaniel Nelson
Then without further ado here’s you with Susan.

Andrew Ginter
Hello Susan, and thank you for joining us. Before we get started, can I ask you to say a few words for our listeners about yourself and your background, please?

Susan Farrell
Absolutely, and Andrew, thank you so much for inviting me to this podcast. The first twenty years of my career I spent in oil and gas exploration and production, the upstream side of the business, working for the Halliburton software division Landmark Graphics, working for Silicon Graphics (SGI) in high-performance visualization of hydrocarbon reservoirs, as well as at Paradigm Geophysical. Then I shifted over to IT cybersecurity, and because of my oil and gas background that migrated over to a focus on OT/ICS cybersecurity. So for the last ten years, that’s what I’ve been very focused on. Also, the last couple of years I have been focused on taking OT/ICS cybersecurity that was developed for defense purposes and commercializing it for protecting critical infrastructure in nuclear, manufacturing, oil and gas, chemical, etc.

Andrew Ginter
Thank you for that. And our topic today is mapping for zero days. I’ve never heard of this. What is mapping for zero days? What problem are we solving here?

Susan Farrell
So it’s really interesting, Andrew. There’s a lot of focus, when you think about OT, ICS, or industrial vulnerability management, on looking at CVEs, or Common Vulnerabilities and Exposures, and those are the vulnerabilities in software code or firmware code that sits in embedded devices that have actually been published. There’s also another layer of vulnerabilities known as KEVs, or Known Exploited Vulnerabilities; again, those are the CVEs that have been known to be exploited in different environments and software.

What CWEs are, Common Weakness Enumerations, is a category system that was developed by MITRE alongside CAPEC, and it identifies weaknesses and vulnerabilities in software code, at both a hardware and a software level, so when you’re looking at firmware, for instance. That is what a CWE is. It’s basically looking at the weaknesses or bugs in code that open up the vulnerability, or the ability for a cyber attacker to enter and attack your code.

Andrew Ginter
Okay, and I’m familiar with the CVE system and database. I am aware of the Known Exploited Vulnerabilities list; I probably don’t consult it often enough, but I know it’s out there. CWE I’ve never heard of. Weaknesses. Can you talk about that a bit more? What is a CWE?

Susan Farrell
Sure, so again, it represents a weakness in the code. One example is a buffer overflow, and a buffer overflow is the ability for a cyber attacker to overinstruct code and be able to literally shut down the production of a PLC, for instance, or over-ratchet the PLC and cause it to explode, for instance. There’s even a CWE that represents malicious AI code; it’s called CWE-1039. There are CWEs that represent cryptographic problems in code, or dangerous functions within the code. So basically it’s just outlining the different weaknesses that code can have in it that can be exploited. One of the great activities that I did over this past year was participating in the CWE ICS working group, where we mapped CWEs to 62443, so that you can determine if there are any CWEs that might be resident in firmware on an OT/ICS device that put you at risk of unmet requirements from a compliance perspective.
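To make the buffer-overflow weakness Susan describes concrete, here is a minimal C sketch. It is illustrative only; the function and buffer names are hypothetical and not taken from any real PLC firmware. It shows the unchecked-copy pattern that the CWE entries for buffer overflows describe, followed by a bounded alternative.

```c
#include <string.h>

/* Hypothetical command handler, for illustration only: the incoming command
 * is copied into a fixed 32-byte buffer with no length check, the classic
 * buffer-overflow weakness pattern. */
void handle_command(const char *cmd_from_network)
{
    char buf[32];
    strcpy(buf, cmd_from_network);  /* WEAK: a command longer than 31 bytes
                                       overruns buf and corrupts the stack */
    /* ... parse buf and act on it ... */
}

/* A bounded copy removes the weakness. */
void handle_command_bounded(const char *cmd_from_network)
{
    char buf[32];
    strncpy(buf, cmd_from_network, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';
    /* ... parse buf and act on it ... */
}
```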

Nathaniel Nelson
Andrew, I’m quite familiar with CVEs, but CWEs are a new concept to me, and I’m not yet entirely clear on the distinction. I mean, a buffer overflow to me sounds like a vulnerability.

Andrew Ginter
Yeah I wasn’t familiar with Cw es either. But if you Google it Mitre’s CWE there’s a whole repository. There’s hundreds of these things and if. If there’s a buffer overflow in a CVE, it says something like this PLC if you send it that command it triggers a buffer overflow that can be exploited to Blah Blala these 17 models of PLCs run that same software. They have this vulnerability. That’s a CVE. That’s a known vulnerability. In CWE there’s an entry for buffer overflow that says a buffer overflow is when somebody sends a a message that is usually too long. And it it describes what is a buffer overflow and that says that’s it. It does not say it’s in this PLC or in that web browser or in CVE is sort of specific whereas CWE is sort of the idea saying well. If you’re not checking the length of your buffers. You could have a buffer overflow if you’re not careful about how you manage memory. You might use memory after you free it if you are not you know, careful about keeping track of of your your web browsers. You could have cross-site request forgery.
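As a companion to the "use memory after you free it" weakness Andrew mentions, here is a small, hypothetical C sketch of the use-after-free pattern. The session structure and function are invented for illustration; they are not from any real product.

```c
#include <stdlib.h>

/* Hypothetical session code, for illustration only: the session is freed,
 * but the caller keeps reading through the stale pointer (use after free). */
struct session {
    int authenticated;
};

int check_session(void)
{
    struct session *s = calloc(1, sizeof(*s));
    if (s == NULL)
        return 0;

    /* ... imagine an error path that tears the session down early ... */
    free(s);                    /* memory handed back to the allocator   */

    return s->authenticated;    /* WEAK: read of freed memory; whatever
                                   now occupies that memory decides the
                                   authentication result */
}
```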

Andrew Ginter
CWE describes types of vulnerabilities, whereas CVE says one of those types is specifically in these six products, and if you get the patch, you fix it. So in a sense CWE is a long list of stuff that could go wrong in software, and to me I thought it would be primarily useful to software developers: oh, I’m writing software, here are literally hundreds of things I could do wrong, let me go through them and see if I’ve done any of them. Does that make more sense?

Nathaniel Nelson
It does. So then, if I follow you correctly, it isn’t that we’re applying these CWEs to specific software products. They’re just a general list of categories, sort of like MITRE ATT&CK, but for vulnerabilities.

Andrew Ginter
I hadn’t thought of it that way, but yeah, MITRE ATT&CK is all of the types of ways that bad guys could attack us, and CWE is all of the kinds of vulnerabilities that could be latent in a particular product. So yeah, in a sense CWE is the MITRE ATT&CK of vulnerabilities. I hadn’t thought of it that way; I think that’s probably a good way to think of it.

Andrew Ginter
Okay, so buffer overflow I understand. You said you mapped it to 62443, though. The standard I look at all the time in 62443 is 3-3, which says you should put a firewall here, you should do antivirus there; it talks about security controls. It doesn’t talk about vulnerabilities. What part of 62443 did you map these vulnerabilities into?

Susan Farrell
So the three different parts that we as a group focused on were parts 3-3, 4-1, and 4-2. For example, each subgroup was assigned a specific CWE. The first CWE that we mapped as a group was CWE-798, which represents cryptographic issues in code, and it was not a hard exercise to map it to specific requirements within 3-3, 4-1, and 4-2. We just mapped the wording from the definition of CWE-798, for instance, to the specific parts within 3-3, 4-1, and 4-2, and then, as a result, what MITRE did is publish the mapping of the CWE, so that if you’re looking at setting your compliance requirements, you can do an analysis of the firmware to see if any of those CWEs exist, and then determine whether, by continuing to use that firmware in its existing state, it is causing compliance issues with 62443. The same can be done for 800-53 or 800-82, to map the compliance requirements to it.

Andrew Ginter
So could you give me an example of that? I mean, if you look at a PLC and say, well, it appears to be written in C (that’s sort of the signature of the binary in the firmware), so it is probably, possibly, susceptible to a buffer overflow somewhere. In what sense, (a), have I got that example right? And (b), if I’ve got it right, in what sense is that a compliance violation? There’s no requirement in 62443 that says your code must be free of bugs.

Susan Farrell
Ah, you’re correct, but it does say that you have to follow good cryptographic practice, or it does say that you cannot have any cross-site scripting weaknesses in your Industry 4.0 web application, for instance, or it will say that if you do have a buffer overflow issue in your code, you are not meeting the compliance requirements of certain parts of 3-3, 4-1, and 4-2.
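For the "good cryptographic practice" point, here is a minimal, hypothetical C sketch of one weakness a firmware review might flag: a cryptographic key hard-coded into the binary. The key value and function name are invented for illustration, not taken from any real device.

```c
#include <stdint.h>

/* Hypothetical firmware fragment, for illustration only: a cryptographic
 * key compiled into the image. Anyone who extracts the binary recovers the
 * same key for every deployed device. */
static const uint8_t DEVICE_KEY[16] = {
    0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
    0x08, 0x09, 0x0a, 0x0b, 0x0c, 0x0d, 0x0e, 0x0f
};

const uint8_t *get_session_key(void)
{
    return DEVICE_KEY;          /* WEAK: one fixed key, shared by all units,
                                   visible to anyone with the firmware image */
}

/* A stronger design provisions a unique key per device at manufacture or
 * commissioning time and keeps it in protected storage, so no secret
 * appears as a constant in the firmware. */
```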

Andrew Ginter
So Nate, Susan went through that real fast. Let me give you a concrete example. 62443-3-3 is the standard that applies to systems: if I have a system that I’ve designed in my power plant and I want it to be compliant with the standard, I would look at 3-3. 4-1 is a certified secure software development process: if I’m developing product and I want to be certified as a developer of secure product, that’s the standard my product development team is evaluated against; my processes are evaluated. 4-2 is when you certify actual product: if my team has produced a product and I want the product to be certified as compliant, it would be 4-2 that I’m compliant with. So, CWE is what’s possible, the vulnerabilities that are possible, and what we’re mapping is: look, if you wind up with one of these vulnerabilities, then you have a problem with this part of your compliance process. A concrete example: 3-3 has a rule, number 5.7, whose language is complicated, but effectively it says that if you’re going to send a password across a network, then you have to encrypt it so that people can’t just read your password.

Okay, and the example that Susan gave was a CWE saying, well, there might be a bug in your encryption process that lets people decrypt and see the password. So the mapping says: look, if you have this vulnerability in your products, in your system (and that’s an “if”; CVE is what’s actually there, CWE is what’s possible), then you would potentially have a problem with compliance with rule 5.7, because the bad guys could use the vulnerability to steal the password. That’s my understanding of how this mapping against standards works.
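As a concrete, hypothetical illustration of "a bug in your encryption process", the sketch below shows a home-grown, trivially reversible scheme for protecting a password in transit. The function name and XOR constant are invented for illustration only.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical firmware routine, for illustration only: the password is
 * "encrypted" by XOR-ing each byte with a fixed constant before it is sent.
 * Anyone capturing the traffic reverses this trivially, so the credential
 * is effectively sent in the clear. */
static size_t obfuscate_password(const char *password, uint8_t *out, size_t max)
{
    size_t i;
    for (i = 0; i < max && password[i] != '\0'; i++)
        out[i] = (uint8_t)password[i] ^ 0x5A;   /* WEAK: fixed, reversible XOR */
    return i;                                   /* number of bytes to transmit */
}

/* A compliant design would carry the credential over an authenticated,
 * standards-based channel (for example TLS) instead of a home-grown scheme. */
```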

Andrew Ginter
I’ve gone to the website; I see MITRE’s Common Weakness Enumeration, a long list of potential vulnerabilities, and I can see that if I’m a vendor (I mean, I’ve worked for vendors all my career), if I’m a person developing the software in a PLC and whatnot, I’m staring at a CWE talking about, I don’t know, cross-site scripting or cross-site request forgery, and these are vulnerabilities that my software developers should be looking out for when they’re writing the code. But this list here, you’ve sort of connected it a bit to compliance. Do end users use this list? What good is this list to an end user? Or is it primarily a tool for developers to say, we should be on the lookout for these 700 different things in our source code?

Susan Farrell
So there are several different use cases for CWEs, and you’re right, Andrew, it is a very valuable, useful resource for DevSecOps, especially from a shift-left perspective, when they’re looking at Cyber-Informed Engineering or secure-by-design of the devices: being aware of the CWEs during the continuous integration and continuous development of OT/ICS device firmware code. But beyond that, if you are an asset owner, let’s say that you have just completed your asset inventory. It is typical that 20 to 40% of your assets may be end-of-life or legacy assets, no longer supported by an OEM but still in production, still connected to your network, and you will want to either, (a), hire a threat hunter or threat modeler to look at the firmware of those

Susan Farrell
end-of-life devices to see if any of these CWEs are in the firmware. That can be done through reverse engineering; it can be done through binary analysis scanning, to see if these CWEs are present in your firmware. What actions you take after determining that there are potential zero days in your firmware are to either, (a), triage that end-of-life asset for replacement, or work with your OEM provider to see if there are any remediation steps to be taken. The other way of handling it is fencing the device, much like what Waterfall does with providing unidirectional communication to an end-of-life device, to protect it from exposure to a cyber attacker. So yes, there are many steps that both an asset owner and an OEM manufacturer can take to reduce risk if a CWE has been identified.
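For a sense of what that binary analysis looks for, here is a minimal, hypothetical C sketch of the "dangerous function" pattern mentioned earlier: an unbounded read that a scan of legacy firmware can flag even without source code, alongside a bounded alternative. The routine name is invented for illustration.

```c
#include <stdio.h>

/* Hypothetical maintenance-console routine, for illustration only: the
 * unbounded "%s" read is the kind of dangerous call a binary scan of
 * legacy firmware flags, because the call is visible in the binary itself. */
void read_operator_input(void)
{
    char line[64];
    scanf("%s", line);          /* WEAK: no field width; input longer than
                                   63 characters overruns the buffer */
    printf("operator entered: %s\n", line);
}

/* The bounded form reads at most 63 characters plus the terminator. */
void read_operator_input_bounded(void)
{
    char line[64];
    if (scanf("%63s", line) == 1)
        printf("operator entered: %s\n", line);
}
```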

Andrew Ginter
So this sounds almost too good to be true. If I’ve got a list of, I don’t know how many, hundreds of potential weaknesses, and I’ve got scanners available that can scan my firmware, then, let’s say I’m a vendor, I can scan my stuff and have it come back and say: here’s all of your vulnerabilities, fix them all, and now my code has none of these vulnerabilities. The only thing I’m exposed to is, I don’t know, misconfiguration, someone puts a weak password in, or new kinds of vulnerabilities being discovered. It sounds really good. Is this how people use this?

Susan Farrell
Well, quite truthfully, Andrew, there are no silver bullets out there for doing binary analysis and finding a hundred percent of potential zero days or CWEs, and not all CWEs are even exploitable. So even after you find a CWE: could a hacker get to it, is there a path to it? You really have to use the scanning as a part of your full stack when you’re looking at your vulnerability management. You’re using your SBOMs, you’re using your hardware bills of materials, you’re looking at endpoint protection, you’re looking at network monitoring, you’re looking at the configuration of the assets, you’re making sure that your vulnerable assets are fenced properly, and that the information is being properly fed to your SIEM or your SOAR so that you can trigger security policies to keep an eye on access to your more vulnerable or end-of-life or legacy assets. You want to add threat intelligence to your stack. You want to add

Susan Farrell
potential membership in an ISAC that is sector-specific to what you’re involved in. There are also the great capabilities that AI is adding to industrial automation, looking for anomalies within the code. So this is just a part of your full stack. But I’m advocating not to miss the ability to proactively look for potential zero days, in addition to monitoring for bad behavior on your network, because often, by the time that you see the bad behavior on your network, it may be too late. So add both proactive detection as well as responsive detection. An additional part of that full stack is also digital twins. Digital twins are not only providing value for mechanical asset integrity but also, from a cybersecurity standpoint, providing that redundancy and resiliency

Susan Farrell
for your critical infrastructure environment, to then apply patches, or to then have discussions on remediation for these potential zero days, triaging your end-of-life asset replacement strategy. I just want it to be viewed and not be glossed over. There’s a lot of value in knowing whether or not there are potential zero days sitting in your firmware.

Andrew Ginter
Okay, so you’ve covered a lot of ground in terms of where CWEs might add value to a defense-in-depth posture. Can I give you a concrete example? Walk me through this: let’s say that I am, I don’t know, a medium-impact power plant; that’s what I’m personally most familiar with. I’ve got to comply with NERC CIP, which means I’ve got to do a risk assessment, a security assessment, on my system every eighteen months or something like that, whatever the rule is. And let’s say as part of that I bring in an outside assessor; they come in and very carefully run, I don’t know, Nessus or something, on all of my hundred PLCs and HMIs and everything else, and it comes back and says: you have three vulnerabilities you didn’t know about. These three PLCs have a vulnerability; it’s in the CVE database, it was added ten days ago. And I go, what? I’m not supposed to have any of these. Quick, put a plan in place, patch this, figure out if I have to report this to the regulator. And a month or six weeks later, when the patch has been tested and rolled out, I’m done. I am aware of no vulnerabilities, no CVEs, evident in my asset inventory. I’m clean again. But now I’ve learned about CWEs; I go look at the CWE database, and it doesn’t have the name of my PLC in it.

Andrew Ginter
If I have that sort of, let’s say, NERC CIP infrastructure in place already, how would I use a CWE?

Susan Farrell
So that is an absolutely fabulous question, Andrew, and I think that’s a lot of where companies think that if they’ve handled all of their CVEs they can call it a day. And it is definitely a step forward to say that you’ve eliminated all of your CVEs; that’s a fabulous accomplishment. But again, a CVE is a published vulnerability. It is not going to recognize a potential zero day, and if you look at the OT threat report that you did, your 2023 threat report, some of the attacks that were reported were actually taking advantage of zero days and not published CVEs. So you’re not finished after you patch and remediate all of your CVEs. It is important to do a deeper dive and do vulnerability scanning of your firmware, looking for CWEs. That responsibility can be delegated to your third-party vulnerability management provider or your compliance provider. It can also open up dialogue through the OEM manufacturer relationships that you have. But the importance here is that you don’t stop just at the CVEs or the KEVs; you continue down the path of being proactive and seeing if there are any CWEs that reside in your assets.

Andrew Ginter
Okay, so maybe another way of asking the same question. Let’s take the other extreme. I have, I don’t know, a shoe factory with hundreds of PLCs in it, and other robots, and I know that I have hundreds or even thousands of instances of vulnerabilities: this PLC that I have 73 of has 19 vulnerabilities that I know about, and I just haven’t got around to patching it, so I’ve buried these vulnerable PLCs behind six layers of firewall and some encryption and some antivirus and some intrusion detection. If I already know that I have thousands of CVEs that I haven’t patched, does CWE do me any good? How mature do you have to be before you can get some benefit out of a CWE scanner or approach or whatever it is?

Susan Farrell
You know, I think just about any critical infrastructure can take advantage of it; I wouldn’t necessarily place a shoe manufacturer as being critical infrastructure. So look at the digital grid, power generators, electricity providers; you’re looking at wastewater treatment plants; you’re looking at nuclear power generation or anything that could affect national security; chemical refinery plants, because you’re looking at highly targeted sectors for potential terrorism. Those are the low-hanging fruit of sectors that take the most advantage of what we’re talking about. So, the more mature critical infrastructure providers.

Nathaniel Nelson
So Andrew, in their simplest form, CWEs and CVEs are both tackling ultimately the same sorts of software bugs, but CVEs from the after-the-fact perspective: known vulnerabilities, and you deal with them in turn. CWEs, as she mentions, are the more proactive side: looking for the things that will one day be CVEs if not dealt with today.

Andrew Ginter
Indeed, but to me CWEs highlight a very common blind spot that I see in engineering teams. I see way too many people, often technicians, hands-on, lower down, nose to the grindstone, make-this-stuff-happen people, focused on known vulnerabilities, and the calculus in their mind goes: well, if I can patch all of my vulnerabilities, if I can get rid of all of my vulnerabilities, then I am invulnerable. They work really hard to patch everything, and the whole system is blind to the fact that, well, that addresses the known vulnerabilities, but who knows what’s out there on the unknown side. And CWE is a tool that, once you have a glance at it, you go: there are hundreds of these things, hundreds of potential problems that are going to bite me if the bad guys find them before I do, before the good guys do, before there’s a patch available. Then we’re in trouble.

Andrew Ginter
So to me CWE is a powerful way of reminding people that it’s not just about patching; there are bigger fish to fry here, more dangerous scenarios that are possible.

Nathaniel Nelson
And this isn’t to undermine the necessity of, and the difficulty in, patching everything, but it strikes me that maybe part of the problem is that it’s simply easier to conceive of all of the known vulnerabilities than to creatively address and go through all the things that you don’t know about.

Andrew Ginter
That’s right, and to me part of the problem is the term vulnerability. When CVE uses the term, most of the vulnerabilities in the CVE database, tens of thousands of them, are bugs in the software that can be patched. If you look at the definition of the term vulnerability in the 62443 standard, it’s defined as any way that a system can be compromised, many of which are bugs in the software that can be patched, but others are

Andrew Ginter
configuration mistakes, or stolen passwords, or default passwords, or any of the many ways that you can attack a system without exploiting a software vulnerability, and the fact that we use the same word for both of these is very confusing. We had Eric Cosman on like a year ago, and he admitted: look, there’s confusion in the industry. Vulnerabilities are bigger than software bugs, and again, CWE reminds us about software bugs we haven’t even seen yet, but the bigger picture is that we need to be concerned about all of the ways that we can be attacked, not just exploits of known vulnerabilities. We do need to address those exploits, that possibility; I don’t want to belittle that, it’s a very hard problem to solve. But security is bigger than that, is sort of the lesson, and CWE is one of the ways of reminding us of that.

Andrew Ginter
And let me just continue in that theme. You’ve been working with this concept, with end users like this, for some time. Can I ask you: you said there are scanners available; if you run one of these scanners on a system that has a bunch of known vulnerabilities and a plan to get rid of them, or has already got rid of all of the known vulnerabilities, what do you come up with? How many instances do you come up with? How unhappy are these owners and operators to discover that there’s more to do? Can you talk about the experience of working with CWE?

Susan Farrell
Finding a CWE is not the end of the world for the OEM or the asset owner, and it provides a great collaborative opportunity. I just listened to a podcast from Dragos, for instance, that used the use case of working with Rockwell Automation, where they discovered some CWEs and then went through the reporting exercise. Actually, a CWE is a precursor to a CVE; I didn’t explain that earlier. You have to find a CWE, you have to determine it is exploitable, you then discuss it through CISA, as the root for management of that CWE for the OT/ICS sector, and then once it’s been communicated to the OEM, they’re given the opportunity to remediate, and then it becomes a named CVE. Dragos is a CVE Numbering Authority, also referred to as a CNA. They went through the whole scenario of finding CWEs associated with Rockwell Automation devices or assets, and found it to be a very positive, collaborative discussion, where Rockwell was very proactive in coming up with remediation before it was published. So finding a CWE can be a very positive activity, because it only reinforces our critical infrastructure cybersecurity position.

Andrew Ginter
Okay, so work with me for a minute. I’ve led development teams for much of my career. We were responsible for products that had hundreds of thousands, sometimes several million, lines of code in them. And in spite of the best efforts of me and every one of my development teams, every piece of production product that we released had bugs in it. There’s a certain defect density, depending on how vigorously you test, how vigorously you inspect, how carefully you design the code. In spite of your best efforts there are always defects residual in the code, more defects in larger artifacts, as a rule, than in smaller artifacts. And some of those defects are security vulnerabilities, just statistically. Some of those vulnerabilities are going to be discovered by the good guys and fixed by the good guys before the bad guys find them; some of them, potentially, the bad guys are going to find and use against us before we have a chance to patch them.

Andrew Ginter
This is sort of back of the envelope. This is my personal experience. These are the principles of software development in practice. You’ve been using these scanners, you’ve been using CWE. If I scan a random HMI with two million lines of code in it, do I find zero days?

Susan Farrell
So typically, the older the device or asset is, the more likely you are to find CWEs in it, because chances are it wasn’t included in a cyber-informed engineering program or a secure-by-design program. So yes, you’re going to find CWEs. Whether those CWEs are exploitable is then the next step that you take, so it’s not a doomsday message that every asset is going to include a CWE that’s exploitable. That’s why the importance is to triage it based off of resiliency, how old the asset is, and whether it’s connected to the network. So there are a lot of different factors in whether or not you even want to do this scanning on it, but you also need to prepare for how you would respond if these assets, or these zero days, are actually hacked. So get involved in incident response training programs like the ISA’s ICS4ICS, where teams

Susan Farrell
can be focused on the forensics of the zero day attack, how to respond to it, how to protect yourself from it going forward, and, most importantly, how to prevent it by being proactive. So there’s a strong connection between doing CWE scanning and awareness of zero day vulnerabilities, and your incident response training program.

Nathaniel Nelson
Andrew, how do you prepare teams for scenarios where a zero day vulnerability is uncovered and now you have to rush to patch everything?

Andrew Ginter
That’s a good question. I wish I’d asked Susan that question, but off the top of my head, there are a couple of things you have to do.

Andrew Ginter
We talked about the definition of vulnerabilities: it includes zero days, and it includes other kinds of misconfigurations or attack opportunities. For zero days specifically, to me there are a couple of things. You’ve got to look at the big picture and anticipate the possibility in your designs and in your risk planning. When a zero day is announced (there’s occasionally an announcement saying, here’s a CVE, a zero day is being actively exploited and there’s no patch available for it yet; it’s sort of an emergency announcement), that’s not really incident response, that’s more emergency management in the entire program. The intrusion detection vendors will rapidly produce a signature for the vulnerability being exploited: what bytes have to come across the network to exploit it. You can put that in place to see if you’re being attacked. You might even add a drop-the-connection rule to your firewall that says: if I see a packet come by that matches the signature, drop it. It’s called virtual patching; even before a patch is available, you can hopefully prevent the attack propagating into the vulnerable system.
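As a rough, hypothetical illustration of the virtual-patching idea Andrew describes: the signature bytes and drop logic below are invented, and real IDS/IPS rules are far more sophisticated, but conceptually an inline device compares traffic against a published exploit signature and drops matching connections until a real patch can be applied.

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical signature bytes, for illustration only. */
static const unsigned char EXPLOIT_SIGNATURE[] = { 0xDE, 0xAD, 0xBE, 0xEF };

/* Scan the payload for the signature bytes. */
static bool payload_matches(const unsigned char *payload, size_t len)
{
    if (len < sizeof(EXPLOIT_SIGNATURE))
        return false;
    for (size_t i = 0; i + sizeof(EXPLOIT_SIGNATURE) <= len; i++)
        if (memcmp(payload + i, EXPLOIT_SIGNATURE,
                   sizeof(EXPLOIT_SIGNATURE)) == 0)
            return true;
    return false;
}

/* Returns true when the connection should be dropped rather than forwarded. */
bool virtual_patch_should_drop(const unsigned char *payload, size_t len)
{
    return payload_matches(payload, len);
}
```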

Andrew Ginter
You can take other mitigations. You can say: that subsystem that has the vulnerability is not vital to operations, I’m going to disable it for a week and live without it until a patch is available. And more to your point, my guess is that if there’s an incident in progress, well, you’ve got to deal with the incident no matter how it got in. But at some point you want to find out how they got in, because once we’ve shut everything down and cleaned everything out, we don’t want to start it all up again until we know the bad guys aren’t going to get in five minutes later. So the forensic analysts, while we’re cleaning things out, are analyzing the data, trying to figure out how they got in, and we have to design our systems so that these incident responders, the forensic analysts, have enough information to be able to figure out how this happened. That means detailed audit logs, if we can, on all of the devices, especially the Windows and Linux machines, so we can see what the user did on the compromised machine to bring about these undesirable results, escalation of privilege or whatever. Some sites just keep copies of packets on their networks, or keep copies of packets that came in through, let’s say, the firewall interface.

Andrew Ginter
They keep them for a long time. It means a huge amount of storage, but it does mean that in an emergency your forensic team has a fighting chance of going back through those packets and trying to figure out where and when the attack came through, and what the attack looked like, so that they can figure out what kind of vulnerability has been exploited, report it to the authorities, and put measures in place. So, long answer: this is what it means to have a security program and a response team aware of the possibility of zero days, and especially the zero days in the CWE catalog.

Andrew Ginter
Well, Susan, this has been tremendous. I learned something; I didn’t even know that CWEs existed. Thank you for this. Before we let you go, can you sum up for us? What’s the main message, the one thing we should remember from this episode?

Susan Farrell
Well, thank you, Andrew. The one takeaway that I’d really like listeners to have is to be proactive instead of just being responsive. If you look at the overall NIST framework of being able to detect and respond, I think it’s really important to be proactive in your overall cybersecurity program. Know your assets. I think a lot of times, the larger the critical infrastructure is, the greater the chance that you don’t know every single asset in your infrastructure. And then, even more so, triage the assets based off of resiliency, and have a focus on your end-of-life or legacy assets that are still in production, still connected to your network. Because they’re end of life, you may not have had a patch, or even support for that asset, available to you for some time, and that represents the danger of a zero day lurking within your infrastructure.

Susan Farrell
And CWEs give you that opportunity to really do a deep dive into those triaged assets, to see if you’ve crossed all your t’s and dotted all your i’s from a compliance perspective.

Susan Farrell
Well, if any of the listeners want to get involved with the MITRE CWE ICS working group, I encourage you to do that. It’s a great group to get involved in. Also, reach out to your compliance service providers when you’re looking at your compliance alignment, and

Susan Farrell
introduce this concept when you’re looking at your vulnerability gap analysis and your asset inventory management. And feel free to reach out to me; I’m on LinkedIn. My LinkedIn is Susan Farrell, so it’s linkedin.com, forward slash in, forward slash Susan Farrell, and I look forward to discussing any of these topics with you.

Nathaniel Nelson
Andrew, that concludes your interview with Susan. Do you have any final thoughts to take us out today?

Andrew Ginter
Yeah, well, there’s something new in my understanding of the world. CWE is common weaknesses. They strike me as more useful for mature organizations that are looking forward at what might come at them, versus less mature organizations that are just getting started. It strikes me as very useful for developers: a long list of stuff to watch for, to avoid, when we are developing industrial products. As an aside, a lot of modern developers are thinking about problems like this, and they’re using tools like writing their products, where possible, in Java or other languages: Java manages memory automatically, other languages manage other kinds of things automatically, and that takes entire sets of possible vulnerabilities off the table. But the CWE repository, it strikes me, reminds us of the potential for new vulnerabilities coming at us, especially in older gear whose developers weren’t looking at their code this way, weren’t thinking about all these kinds of things, when they developed those products that a lot of us are still using. So, a useful resource.

Nathaniel Nelson
Okay, well, with that, thanks to Susan Farrell for speaking with you, Andrew. And Andrew, as always, thank you for speaking with me. This has been the Industrial Security Podcast from Waterfall. Thanks to everybody out there listening.

Andrew Ginter
It’s always a pleasure. Thank you Nate.
