
Bridging the Skills Gap: Effective Cybersecurity Leadership and Collaboration with Bill Anderson - Part II


About This Episode

In this episode, co-hosts Rachael Lyon and Jonathan Knepher are joined once again by Dr. Bill Anderson, Principal Product Manager at Mattermost and founder of Oculus, for a deep dive into some of the most pressing cybersecurity topics of today. Bill draws on his background in cryptography and experience with the defense and intelligence communities to break down the ever-evolving landscape of regulatory compliance, incident response, and the growing role of automation and AI in securing critical infrastructure.

 

Tune in as they discuss how regulations like GDPR, HIPAA, and CMMC are shaping industry standards and why proactive preparation is essential for effective cyber incident response. Bill also sheds light on the impending advent of quantum computing, the "store and harvest" threat, and the urgent need to adopt post-quantum cryptography. Plus, he takes us on a personal journey through his career in cybersecurity and offers predictions about the next wave of innovations that could change our digital lives.


      Rachael Lyon:
      Welcome to the To the Point cybersecurity podcast. Each week, join Jonathan Knepher and Rachael Lyon to explore the latest in global cybersecurity news, trending topics, and cyber industry initiatives impacting businesses, governments, and our way of life. Now let's get to the point. Hello, everyone. Welcome to this week's episode of the To the Point podcast. I'm Rachael Lyon, here with my cohost, Jon Knepher. We're excited to welcome back Dr. Bill Anderson. He's Principal Product Manager at Mattermost.

       

      [00:35] Regulatory Compliance: Catalyzing Better Security Practices

      Rachael Lyon:
      He's deep into cryptography, has extensive experience with the defense and intelligence communities, and is the founder of Oculus. So without further ado, let's get to the point. You know, here's a topic that you hear about again and again, and you wonder what the path forward is. What role does regulatory compliance play in shaping incident response protocols? Right? There's a lot of discussion on that. And what sort of data must be reported in the wake of a cyber incident? I'd be curious about your perspective there, because there have been so many conversations on this topic.

      Bill Anderson:
      Yeah. This is actually a good example where regulation resulted in better behavior in the industry. It's necessary sometimes. I'm not a socialist, but I do think that sometimes government actually does have a role to play in saying, look, there are some things that we all just need to do, and it creates a level playing field. It's actually very hard if you've got 500 competitors, and some of them are cutting costs by not doing things, things that are actually bad for consumers or bad for the industry. It's really hard for you to justify spending money when they're not. So these regulations come in and raise the expectation across the industry. A couple of examples, GDPR and HIPAA, both really matter a lot.

      Bill Anderson:
      So GDPR: the European Union got together and said, look, we have some standards around what is allowed to be shared regarding European citizens' personal information. And they created a structure. Great. Okay. Now we know how to follow that structure. Same thing with HIPAA. It's a healthcare framework. Then there's the defense industrial base, so companies that deal with information that is sensitive and related to government work.

      Bill Anderson:
      There's another one called CMMC 2.0, and you've probably heard of it. There's a bunch of stuff in CMMC 2.0 about being proactive around your security posture. One of the elements is to have a cyber incident response plan, and part of that is to say, when you have an incident, you must report it. I happen to have a list here of the things that you have to report. I don't remember the time frame. I think for some of these regulations there's actually a time frame, so when you have an incident that passes a certain threshold of seriousness, you have to report within seventy-two hours or twenty-four hours; it probably varies, and I don't know those offhand. But, again, people being people, you definitely have to put structures in place, like requiring a report within a certain time frame.

      Bill Anderson:
      Otherwise, if the reporting window were four years, it would be ineffective. But in the CMMC 2.0 reporting, there's a portal called DIBNet. I think it's a .gov site. Anyway, when you've had an incident, there's a standard report that you have to send in about what happened. So your name, a unique identifier for your organization, your CAGE code, your facility clearance level, any contracts that you're working on that are affected, your points of contact, your government points of contact, and then you have to get right down to what types of information were affected, what level of information it was, what the perceived impact was to what we call covered defense information, dates, and other details, so that on a systemic basis the government can look at patterns, which is fantastically valuable if you think about it. As I said before, the intel space thinks about how what you're asking reveals what you're interested in. Right? There's been an uptick in surprising and suspicious behaviors around access to this type of program.
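
      To make the reporting structure Bill describes concrete, here is a minimal sketch of how such an incident report might be modeled in code. The field names simply mirror the list above and are illustrative assumptions, not the actual DIBNet form schema, and the 72-hour window is only an example of the kind of deadline he mentions.

```python
# Hypothetical sketch of a CMMC-style incident report record.
# Field names mirror the list in the conversation; this is NOT the real DIBNet schema.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class CyberIncidentReport:
    organization_name: str
    cage_code: str                      # Commercial and Government Entity code
    facility_clearance_level: str
    affected_contracts: list[str]
    company_poc: str
    government_poc: str
    info_types_affected: list[str]      # e.g. categories of covered defense information
    info_level: str                     # classification / sensitivity level
    perceived_impact: str
    incident_date: datetime
    discovery_date: datetime
    reported_date: datetime = field(default_factory=datetime.now)

    def reported_within(self, hours: int = 72) -> bool:
        """True if the report was filed inside a hypothetical reporting window."""
        return (self.reported_date - self.discovery_date).total_seconds() <= hours * 3600
```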

      Bill Anderson:
      We may not be able to attribute where it's coming from, that's actually very hard, but there's a pattern that will show by looking at that sort of information. And so, as much of a pain as it is to do any sort of bureaucratic compliance, this is one that actually has a really good purpose. I don't know if they do the same sort of analysis in other spaces like HIPAA and GDPR. Maybe they do. But for me at least, in my company, Mattermost, we really focus on defense, intelligence, security, and critical infrastructure customers. It really matters that you have that kind of insight in order to secure them.

      Jonathan Knepher:
      So thinking back to what you mentioned about being prepared, so that you know what to do, what to report, how to remediate: what kinds of things should our listeners be doing to create that playbook in advance, so that they know the right things and they're on the ball when it happens, rather than scurrying around trying to figure it out

      Jonathan Knepher:
      in the heat of things. Right?

      Jonathan Knepher:
      Because when it's happening, it's too late.

      Bill Anderson:
      You actually kind of just anticipated the answer. The shameless part of the answer here is: yeah, get ready in advance, get that secure collaborative tool in place, develop those policies, train people on how to execute them, and run your system of analysis and response in a place that is not the same as the one that's going to get attacked next week. That's what Mattermost does, and shameless promotion here, but that's what we do. We have a secure collaborative workflow platform that government customers, intel, and all sorts of enterprises use for doing exactly this kind of cyber incident response. One of the things our customers often reflect back to us is: I discovered that I can't protect my Microsoft Teams infrastructure from within Microsoft Teams. Exactly, because it's down today, or it's not performing the way you'd expect, or it's the very system you're worried about exposing data. So these companies will set up Mattermost on a secure self-hosted platform and get their incident response teams spooled up and using it on a daily basis, so that no matter what's happening, it's not the same system, and they can actually execute against these playbooks, these workflows. So that's the first one: live in the tool, because when an incident is happening, that's not the time to open it up and learn.

      Bill Anderson:
      Your security team has to live in that all the time. And then, as I said at the beginning, because it's such a team sport, it turns out the rest of your organization wants to be in there too. So when there's an incident, the CEO needs to know there's an incident, or for certain compliance reasons you might have to inform the CFO or send a report to that defense industrial base portal that I mentioned before. Those are integrations that come out of the platform. I need to send this report. I need to notify this other person. I need to invite another team in to deal with it, because maybe a partner's network is being affected as well.

      Bill Anderson:
      And so the partner would have a separate instance of Mattermost running, and they can be connected over secure channels. So each entity is sovereign and their data is owned by them, but they can talk to each other about incidents that are in progress over a secure channel. There's no exposure to each other's systems, but you can be talking to them while you're dealing with these things. It's actually very, very sophisticated. But, as you kind of alluded to, it's all in the preparation. When the house is on fire, that's not the time to go looking to buy a fire extinguisher. Right? It's gotta be there already. Same with trying to pick up all your sensitive documents, wherever they may be, and put them somewhere safe. Obviously, detection and analysis comes next.

      Bill Anderson:
      So what are we using to determine when something's happening? There are lots of great third-party tools for doing that. Microsoft has a bunch of them, which we integrate with, but anybody's third-party XDR platform, that data comes into Mattermost. So you can see it, you're using it, and it's working in its own system, whether it's a third-party hosted system or your own system that's running. It doesn't matter. The point is you need to get the data to the team that's actually working on it. And then, when you know what's happening, hopefully you can figure it out, and there's that third phase of let's go to work.

      Bill Anderson:
      We know what services were affected. We have procedures for restarting and rebooting them. If you're actually in that scenario, which most companies are, okay, it's the 18-to-25-year-olds that are going to do that, and they hit the big green button to say, what's the procedure to restart the server? And the 40-year-old CIO or CTO or whoever's responsible has written down a policy already: do this, then do this, then do this. And the last thing is to put a retrospective into any of these procedures. In fact, what we're doing is enabling those retrospectives with AI, to look across prior runs and recognize changes in behavior. So an example being: we have a 27-step procedure for restarting our critical customer-facing financial portal.

      Bill Anderson:
      Okay. Great. And yet over the last ten times that we've had to run it, we always skip steps 25 to 27. Well, okay, but we did something else instead. Well, the AI can say, hey, we've noticed that you've changed your procedure. Do you want me to update the playbook? Or: we're noticing over the past 50 incidents that you've dealt with on the security front that we're seeing this type of attack.

      Bill Anderson:
      It's a shift away from what used to happen, and your response times are going up. Or you're trying to invite people to Teams, and those people aren't showing up for twenty-four hours when your SLA says that they'll be dealing with it in two hours. So it's about starting to use newer tools to give insights, and again, it always comes down to humans. Right? AI tools are wonderful. They can accelerate work for us. They can skip steps. They can summarize. But they can't really think. People have to do the thinking.
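
      As a rough illustration of the kind of retrospective Bill is describing, the sketch below compares recorded playbook runs against the written procedure and flags steps that are skipped every time, plus responses that blow past the SLA. The data shapes, numbers, and thresholds are invented for the example; this is not how Mattermost actually implements it.

```python
# Toy retrospective over incident-response playbook runs (illustrative data only).

PLAYBOOK_STEPS = list(range(1, 28))   # the 27-step restart procedure from the example
SLA_HOURS = 2.0

runs = [
    {"completed_steps": set(range(1, 25)), "response_hours": 1.5},
    {"completed_steps": set(range(1, 25)), "response_hours": 1.8},
    {"completed_steps": set(range(1, 25)), "response_hours": 3.1},
]


def always_skipped(runs, steps):
    """Steps defined in the playbook but missing from every recorded run."""
    return [s for s in steps if all(s not in r["completed_steps"] for r in runs)]


def sla_breaches(runs, sla_hours):
    """Indices of runs whose response time exceeded the SLA."""
    return [i for i, r in enumerate(runs) if r["response_hours"] > sla_hours]


if skipped := always_skipped(runs, PLAYBOOK_STEPS):
    print(f"Steps {skipped} were skipped in every run; update the playbook?")
if late := sla_breaches(runs, SLA_HOURS):
    print(f"Runs {late} exceeded the {SLA_HOURS}h SLA; response times may be drifting.")
```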

      Bill Anderson:
      What they can do is help us to think better, by filtering out some of the noise and summarizing, and by looking across vast datasets and discovering patterns. And so we're pulling that into the product so that the results are quicker, faster, better, cheaper, more powerful, whatever metric you care about.

       

      [11:38] Automating Detection, Response, and Continuous Improvement

      Rachael Lyon:
      Curious though, how do you see the role of AI evolving with regard to security tools, and looking ahead to quantum, right, 2030? I mean, there's a lot happening in the next few years. So how do you see this all kind of coming together? I love quantum.

      Bill Anderson:
      I'd love to talk to you about quantum, actually, because we touched on it briefly. You said 2030, and I'm really curious about that number for you. For me, it's a little longer, but it's somewhere in the next ten to thirty years that we're gonna have to make a complete shift. So we're doing something with a company called Crip that has some really great quantum security technology, and it's to upgrade that capability in the platform so that whenever it happens that the adversaries actually do have quantum-capable computers, our customers are unaffected. It's just like, yep, fine, we shifted years ago. It's not a problem.

      Bill Anderson:
      Our data's still safe. Nothing to see here, move along. That's actually the ultimate goal. So we're heavily interested in what's gonna happen there. But, yeah, to your question about what's happening with automation and AI-driven tools in general: real-time threat detection. Right? Because an AI, or an LLM, or maybe not an LLM but a machine learning algorithm, can effectively analyze really vast quantities of data, and it can just do it so much faster than humans. So there's that sort of just staying on top of everything.

      Bill Anderson:
      The little downside is still a lot of false positives. And the metric for improvement in that industry is always going to be reducing the false positives. It's a tricky challenge, though. You don't wanna over-reduce. So I think there's always going to need to be a human in the loop to make sense of things. Automated response is the second area. So, machine learning, I hate the word AI.

      Rachael Lyon:
      They're not synonymous? No.

      Bill Anderson:
      No. No. No. No.

      Rachael Lyon:
      No. No.

      Bill Anderson:
      No. No. So it's not gonna do anything that isn't, you know, probabilistically programmed, but they're still wonderful. But for automated response, same thing. I see this pattern, I do this thing.

      Bill Anderson:
      I see this other pattern, I do this other thing. That's great, because at least it shuts those open windows when the tornado comes through. We may not always shut the windows in time, because we may not notice something, or sometimes it shuts the windows too often. So again, human in the loop, and that's maybe more on the retrospective side of it. Let it run. I mean, it's a great idea. Let it run. Let it do its job.
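
      A toy version of that "see this pattern, do this thing" automation might look like the sketch below: high-confidence matches against known patterns trigger a contained action automatically, and anything ambiguous is escalated to a person. The rules, pattern names, and threshold are invented for illustration.

```python
# Hypothetical pattern-to-action rules with a human-in-the-loop fallback.

RULES = {
    "brute_force_login": "lock_account_and_alert",
    "malware_signature": "isolate_host",
    "data_exfil_pattern": "block_egress_and_page_oncall",
}
CONFIDENCE_FLOOR = 0.9  # below this, a human decides


def respond(alert: dict) -> str:
    action = RULES.get(alert["pattern"])
    if action and alert["confidence"] >= CONFIDENCE_FLOOR:
        return f"auto: {action}"                 # shut the open window right away
    return "escalate: human review required"     # human in the loop for everything else


print(respond({"pattern": "brute_force_login", "confidence": 0.95}))
print(respond({"pattern": "unusual_dns_tunneling", "confidence": 0.70}))
```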

      Bill Anderson:
      We can't all be awake twenty-four hours a day or have the attention to spend on every tiny little detail. The thing with attackers is that they're infinitely creative, because they're typically human. And so they look for those things that you're just not going to notice. Really, really clever. I love the clever stuff. Take it to an extreme: I'm hiding information inside a USB key that I can trick you into putting into your laptop, but it's actually got a separate processor in it. You never thought of that.

      Bill Anderson:
      By the way, these days everybody knows about that one. Even ten years ago, it was very effective. But humans need to be in that loop. And then I think the other one, which is a little ways out but is probably the biggest return, is predictive. I've looked at the historical data, and I think this is what's going to happen next. We think that there's a pattern here. They're going after your authentication systems.

      Bill Anderson:
      They're going after your firewall. They're going after some random number generators, whatever it is that you might happen to be using. So, you know who else is really good at predictive? Humans. We really are. And we have a lot more computational power in our brains than computers have to dedicate toward pattern analysis. They do it differently than we do, but we're really good at certain types of analysis, because we have a couple of orders of magnitude more computing power right now. Maybe not in ten years, but we still do now. And so humans can look at things and say, this may not be obvious, but I can see what they're trying to do here. Right? They're trying to break down the boundary here, cause the system to go into this state so I can get into this other one unnoticed, leave something there, then come back here.

      Bill Anderson:
      Now it's not checking because this system's distracted. You know? So that's the sort of

      Rachael Lyon:
      I'd love to hear it.

      Bill Anderson:
      human-level analysis that'll still help.

      Rachael Lyon:
      Still have a little bit of an edge.

      Bill Anderson:
      I think we still do.

      Jonathan Knepher:
      Yeah. We're still better pattern matchers at least. A little bit.

      Bill Anderson:
      At some things. Visual pattern matching, we're very good at. And about 80% of our perception comes in through our eyes. Right? So we just naturally evolved this really interesting capability to live in a three-dimensional world with binocular vision, which isn't particularly good, but we learned how to interpret things by what we see.

       

      [17:18] Quantum Computing and the 'Store and Harvest' Threat

      Jonathan Knepher:
      So I wanted to ask you another question on the quantum side, since you brought up a good point. Right? We don't know how far away it is, but you combine that with how cheap storage has gotten. Right? Like, are the adversaries collecting the encrypted data now, thinking they can decrypt it later? And what do we need to do today to protect against that temporal threat that's not even here yet?

      Bill Anderson:
      I love that expression. It's the temporal threat. Yeah. It has a name. It's called store and harvest, and they are doing it to us. We're probably doing it to them. And it's scary.

      Jonathan Knepher:
      But we've gotta be able to do something. Right? Like, are there quantum-safe things we can be doing now?

      Bill Anderson:
      Yep, exactly, and so that's the quantum shift. That's one of the things we're doing with Mattermost as well. Typically our customers host their own Mattermost instances, either on their own hardware, or in their own cloud infrastructure, or in some third-party private cloud infrastructure that they're renting. And so the customers make a lot of the determinations. We advise them, of course, on how to do it, but one of the things to do is to put the post-quantum algorithms into the platform now, before you need them, or at least around the time that you need them. I would argue that we need them right about now. And the reason is because of that store-and-harvest attack. So store and harvest basically means, and let me take a step back.

      Bill Anderson:
      The issue with quantum computers, which are real and are getting better, Google has a 105-qubit general-purpose quantum computer now. It's estimated that you'll need around 5,000 to 10,000 qubits, but when you get to a quantum computer that's around that big, it can run an algorithm that will reverse the public-key cryptography that is the underpinning of Internet security today. So nobody needs to worry. We're not there today. Everything's safe. Your credit cards are fine. And even if they weren't, the banks will back you up. But once computers get that big, the tools that we're using today will be reversible.

      Bill Anderson:
      They were based on interesting mathematical problems, the discrete logarithm problem and factoring large composite numbers into primes, that turn out to be much easier when you have a big quantum computer. Okay. So if we believe that technology is gonna progress, then within ten to thirty years those computers will be available to us and to our adversaries. Frankly, we'll get them first. We're just better at technology, so we will have them first. But in anticipation of that, we're storing data, and they're storing data now, of interesting transactions, not everything.

      Bill Anderson:
      They don't care about what I bought from Amazon, but they probably care about what the State Department was sending to other elements of the government. So they're storing those communications, and they'll run them through their quantum computers using this algorithm called Shor's algorithm to try to crack these mathematically hard problems. It's not the end of the world, because we've known about this for twenty years. I mean, twenty years ago I was working at a public key infrastructure startup, and one of our customers, in fact it was the California Medical Association, very far-thinking, said, what about quantum? When are the quantum algorithms coming? And the answer is: they're here now. The National Institute of Standards and Technology has already published the recommended standardized post-quantum secure public-key algorithm replacements, and we are adopting them. These have been publicly studied for the last five, maybe ten years, I'm not sure how long, at least the last five years, and they've been looked at every possible way by really good cryptographers worldwide.
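
      One widely discussed way to act on that today is a hybrid scheme: derive the session key from both a classical shared secret and a post-quantum KEM secret, so that breaking the classical piece alone is not enough. The sketch below only shows the key-derivation step; the two "shared secrets" are random stand-ins rather than real ECDH or ML-KEM outputs, so treat it as an illustration of the idea, not a vetted implementation or the method Bill's team uses.

```python
# Hybrid key derivation sketch: combine a classical secret with a post-quantum
# KEM secret via HKDF (RFC 5869) so the session key survives either one breaking.
# The secrets below are random placeholders, NOT real ECDH / ML-KEM outputs.
import hashlib
import hmac
import os


def hkdf_sha256(key_material: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF extract-then-expand over SHA-256."""
    prk = hmac.new(b"\x00" * 32, key_material, hashlib.sha256).digest()  # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                             # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]


classical_secret = os.urandom(32)   # stand-in for an ECDH shared secret
pq_secret = os.urandom(32)          # stand-in for an ML-KEM (Kyber) shared secret

session_key = hkdf_sha256(classical_secret + pq_secret, b"hybrid-session-v1")
print(session_key.hex())
```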

      Bill Anderson:
      Not me, by the way. I don't understand that stuff. It's hard. And they're believed to be good. Over time, that may change. We may replace some. We may improve some. That's a natural sort of give and take and learning experience with all cryptography, including the stuff we're using today.

      Bill Anderson:
      It went through iterations. So there's nothing new here. There's nothing for people to worry about. But the point is, we need to shift over to using those post-quantum secured algorithms now, so that when those computers are available, they can't reverse it.

      Rachael Lyon:
      It's funny you mention this. We actually had one of the fellows working on the post-quantum team on here, and he was so smart. I didn't understand half of what he said, but it was really fascinating how many years they've actually been putting into this work. It was astounding. Yeah. Yep.

      Bill Anderson:
      Well, it's fun for mathematicians. Yeah. I think one of my professors early on said that cryptography was the first honest work for mathematicians in two hundred years.

      Rachael Lyon:
      That's hilarious. This quantum threat. Are there any dirty secrets about quantum we're not talking about?

      Bill Anderson:
      I think I have a little hobby horse that I'll get on, and it's not about the algorithms themselves. It's about what is likely to happen next, which is large organizations being reticent to adopt the algorithms. And they will come up with reasons why they don't have to do it, despite the case that I just made to you. So imagine you're on the board of directors at a large publicly traded company, and your chief security officer comes in and says, you know, our corporate guidelines and policies say that we have a responsibility to protect our customers' data for twenty-five years. We're ten years away from the quantum apocalypse. We've already broken our own rules. Therefore, we need to take action and spend money to upgrade our cryptography. Now the board will probably, in some cases, look at this person like they have two heads and say, so, how are we being attacked? How much have we been breached? How much have we lost? What's our exposure? And the answer, of course, is none.

      Bill Anderson:
      We haven't been attacked yet. When's it gonna happen? Ten years? Or maybe thirty? And the board member, who's maybe 60 years old, says, not my problem. My problem is meeting guidance for next quarter. Otherwise, our stock's gonna drop, and I won't be able to retire happily. I give them more credit than that, but in the stark case, there is that sort of human nature of putting off these threats, even though, logically, they are real and they are going to happen. And so I think the dirty secret is we're gonna see people kick the can down the road until it's unavoidable and it's too late. And, you know, it might turn out to be a year-2000 problem, where we all worried, and nobody was flying.

      Bill Anderson:
      Nobody was flying at midnight in 1999. I wasn't. And it was fine. Right? We transitioned over, no problem. That'll probably be the case, probably, but boy, I wouldn't wanna be the one taking the responsibility and still be working in twenty years when it happens. I think they need to actually think a little bit more. And, again, I'd say go listen to your experts. Don't listen to me.

      Bill Anderson:
      Go talk to your CSO about what this really means for your business.

       

      [24:58] The Career Path of Bill Anderson

      Rachael Lyon:
      So are we ready to segue to the personal part, Jon? This is my favorite part of the conversation. Okay. So I'm always fascinated, Bill.

      Jonathan Knepher:
      I think so.

      Rachael Lyon:
      Because this is such an interesting industry. Right? And, I don't know, maybe for people of a certain age, myself included, growing up this just wasn't a thing, really. You know what I mean? You had to kind of discover it, find it somehow. And I'm always curious about the path. How did you get on this path? Was it purposeful, or was it just kind of happenstance, and it found you?

      Bill Anderson:
      It was a little bit of an accident. So I'm an electrical engineer by training, and I discovered that I'm terrible at power, I don't understand analog circuits, and I can't do semiconductor design because it's all nonlinear. So I'm bad at that stuff. What I discovered in undergrad was that I really like computers and communications technology. And there was a professor of mine from undergrad named Dr. Gord Agnew, who was the greatest professor because he could teach. He could really teach. And he taught a couple of courses for me in undergrad that were just eye-opening.

      Bill Anderson:
      Like, one day we designed a computer. We'd spent the previous weeks explaining here's how an arithmetic logic unit works, here's how memory works, here's how disks work. And then one day he said, now we're gonna build a PC, and laid it all out. It was wonderful. Just wonderful. He was such a great teacher. After I finished undergrad, I realized I really still didn't know anything, and I kind of wanted to know more.

      Bill Anderson:
      I'm a little bit competitive. I wanted to know more than my peers. So I figured I'd go back to university and get a master's degree; I'm an engineer, it'll be there. And of course I went to Dr. Agnew, and I thought I would just do a master's in communications technology in general. You know, the internet was kind of a big thing, actually still just about to be a big thing.

      Bill Anderson:
      It's been a while. Only, Dr. Agnew was the co-founder of a little company that turned out to be really successful, called CertiCom. That was the company successfully commercializing elliptic curve cryptography, which, by the way, is what we're all using right now as we talk to each other. It is the backbone of the Internet today and a very, very secure public-key algorithm. By the way, it's one of the ones that's going to be replaced because of quantum computers. It's gonna be broken eventually. So I ended up doing a PhD in cryptography-related technology.

      Bill Anderson:
      Secure voice technology, specifically. And the master's ended up turning into a PhD, and the PhD ended up being basically a five-year job interview to go work at CertiCom. And CertiCom did really, really well. We were a Canadian company; we went public on the Toronto Exchange and then expanded down to the US. And I'm Canadian originally, a dual citizen; I'm a US citizen now.

      Bill Anderson:
      I moved when we moved our headquarters to California, and I've been here ever since. So I've been in the States for, gosh, a long time, more than twenty-five years. And that was such a wonderful experience, because it was the go-go boom time when the first mobile applications and devices were exploding. So think PalmPilot mobile device, BlackBerry

      Rachael Lyon:
      at the time.

      Bill Anderson:
      I did

      Rachael Lyon:
      not call it. I'm not

      Bill Anderson:
      calling it. Pom. Yeah. I was a real evangelist for the for the Comcast. Oh, wow. Those companies all became our customers. And the reason was is that you couldn't do the older crypto algorithm called RSA. You couldn't do it on a mobile device.

      Bill Anderson:
      The mobile devices were too small, and elliptic curve had some advantages. So, great experience there. And the company took off with the rest of the companies in the internet explosion. So it was fun. It was great fun. You know, we saw Boxsters starting to appear, Porsche Boxsters. Everybody had a Boxster in the parking lot. It was short-lived.

      Bill Anderson:
      It was short-lived, unfortunately, but it was a great deal of fun at the time. And I grew up through the product management track. So although I was technical, my interest is in talking to people, understanding problems, and solving problems as an engineer. And I'm just not good enough to be a cryptographer, to be honest. I'm not that good of a mathematician, but I'm really good at understanding problems and trying to solve them. And so that's product management, and one product management job led to another, to another. I ended up being vice president of marketing and vice president for encryption products at a company called SafeNet that ended up getting acquired. They're now part of Thales.

      Bill Anderson:
      And then I had a crazy idea to do my own startup. So I left, I raised money, I took an investment from In-Q-Tel actually, which is the VC arm of the CIA and the other intelligence agencies, and, you know, built and sold a company there, then went to work for another VC, running their portfolio of companies. One of them was a quantum security company, by the way. And my job, I'm not an investor, I'm an operator, but my job there was to sell those security companies to other companies, because the VC needed to get out. So anyway, I've been in this space of security partly by accident, but honestly because I just cannot think of a more fun place to spend my time, because it keeps changing.

      Bill Anderson:
      And it is right in the middle. Security is right in the middle of information, and information is what drives the entire world economy now. And so to be peripheral but also important to enabling all of this wealth creation and lifestyle improvements and services and technologies and things that make people happy, it's been just the greatest career path for me.

      Rachael Lyon:
      You've seen so much in this industry. Right? I mean, you've been at the heart of so many different innovations and changes over the years. Now, if you were to put on your next-fifty-years hat, like the Jetsons, for example, what are the innovations that you see happening, particularly those that could impact day-to-day life, be it communications, be it otherwise? I'd be curious about your perspective, given everything you've seen throughout your career, right?

      Bill Anderson:
      Yeah. Well, it's hard to predict that far out. There are some very good futurists you should probably get on the program. Ray Kurzweil, for example, a very, very good futurist, if you can ever get him to talk. I think the need for information security will persist for the next fifty years, but it will probably start to become boring as those problems get solved, frankly, by real AIs. And so I think that eventually, quite far off, the need for specialists in this space probably diminishes. There will still always be a need for specialists.

      Bill Anderson:
      Fifty years is a long time, though. I think we're going to see a continuing, in fact probably expanding, requirement for security expertise for the next twenty years. And the reason is because we're still going through an explosion in complexity. You know, we started with PCs that weren't even connected to a network, and now we have Kubernetes-enabled, entirely virtual cloud infrastructure built out of services. I have some direct experience in that space, and it horrified me to see the new problems that the new tools have created. So, for example, and there may be experts listening to this who can correct me on points, I apologize if I get some of this wrong or if I'm a little out of date, but Kubernetes lets you deploy massive and complex systems, massive and complex code.

      Bill Anderson:
      It lets fewer people do a lot more. The thing it doesn't do is teach any security. It doesn't teach any common sense. And it makes the developers who are using it dependent on systems and methods and APIs that are built into these products, but they don't know what those mean. And as a result, there are horrifying security holes in most designs that haven't been vetted by security people. So, as an example, the 5G phone network is run on Kubernetes systems. When you stand up an instance of your local 5G tower, it's talking to a node, a whole bunch of nodes that are strung together.

      Bill Anderson:
      And they're images of a system that works. You stand up one, you expand its scope, you stand up another one, you reduce it, whatever. They're dynamic. But when they stand themselves up, they call all these other capabilities, these modules. The modules all get a digital certificate in order to communicate securely within the cluster. That certificate is issued automatically if you don't know what you're doing. All you have to do is be a node in the logical network that is defined as being that cluster and say, hi, I'm here.

      Bill Anderson:
      I'm a node, give me a cert, please. And the central quarterback, I forget what it's called, says, oh, okay, here you go. Second major part: the central quarterback uses an API and a module to say, I will create the certificate, and I will manage my root key right here locally in software, not on a piece of hardware. That's not secure either. Right? Now, there are mechanisms in Kubernetes and other related systems to do this properly, where that incredibly important root key is actually held on a piece of hardware.
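
      As a crude illustration of the two weaknesses Bill calls out, the toy audit below flags certificates handed out automatically to anything that merely claims membership in the cluster, and a CA root key kept in software instead of hardware. The record shapes and field names are hypothetical; this is not the real Kubernetes certificates API.

```python
# Toy cluster-PKI audit over hypothetical CSR records and CA configuration.

ca_config = {"root_key_storage": "software"}   # a hardened setup would say "hsm"

csr_requests = [
    {"node": "worker-17", "identity_verified": False, "auto_approved": True},
    {"node": "worker-18", "identity_verified": True,  "auto_approved": False},
]


def audit(ca_config, csr_requests):
    findings = []
    if ca_config["root_key_storage"] != "hsm":
        findings.append("CA root key is held in software; move it to an HSM.")
    for csr in csr_requests:
        if csr["auto_approved"] and not csr["identity_verified"]:
            findings.append(f"{csr['node']}: certificate auto-issued without verifying identity.")
    return findings


for finding in audit(ca_config, csr_requests):
    print(finding)
```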

      Bill Anderson:
      You can do it if you know you're supposed to. Do you think a 25-year-old new programmer knows that unless they've gone through extensive security training? Of course not. And so the sophistication and complexity that's enabled by these new tools has opened up so many weaknesses as well. So who knows what the next big thing is going to be? But I guarantee it's a lot more breaches, which will then be countered by better security tools, and back and forth. It's been a cat-and-mouse game forever. Not just cryptography, security in general is a cat-and-mouse game. So we're gonna see a back and forth, but eventually it'll settle down as humans stop designing these systems, and systems of systems of systems are actually using very well-established technologies that don't have these weaknesses, because they've been beaten out of them over fifty years. Right? We no longer have to worry about the security of my mouse driver.

      Bill Anderson:
      Why? Because it's been beaten up for twenty-five years. Right? There's nothing left to take out of it, necessarily, so we don't have to worry about that component. But we do have to worry about this vibe-coded new app that somebody who doesn't even know how to program just put together. Horrifying. Right? Don't ask ChatGPT to write your code unless you're willing to accept a lot of unexpected results.

      Rachael Lyon:
      Oh, I wanna be mindful of time. I know we're running a little long today, Bill. Thank you so much for today's conversation. This has been so much fun. I wish we could have a complete episode just on quantum, because I think we're just scratching the surface here on that conversation. So thank you for joining us, Bill. And to all of our listeners out there, thanks again for joining us for the To the Point podcast. And, until next time, be safe.

      Rachael Lyon:
      Thanks for joining us on the To the Point cybersecurity podcast brought to you by Forcepoint. For more information and show notes from today's episode, please visit forcepoint.com/podcast. And, don't forget to subscribe and leave a review on Apple Podcasts or your favorite listening platform.