Episode 9: Rob Knake

About This Episode

Rob Knake’s fingerprints are all over modern cyber policy. As White House Director for Cybersecurity Policy, he helped craft the nation’s most significant cybersecurity strategy in nearly two decades and guided the response to major national cyber incidents. He’s advised presidents, shaped global cyber norms, co-authored two influential books on cyber conflict, and served as a senior fellow at Harvard and the Council on Foreign Relations.

Featuring

Credits

Transcript

Rob Knake:
I’m to this day a huge fan of both capitalism and democracy. I think that these ultimately are the right economic systems, the Churchill quote, right? “Democracy is the worst form of government except for all the others.” The cyber threats were real. They were growing, but they weren’t killing people. And terrorism was a risk that was. And so it made sense that it was getting that kind of focus and attention and that cyber was not.

Voice Over:
Knake’s fingerprints are all over modern cyber policy. As White House director for cybersecurity policy, he helped craft the nation’s most significant cybersecurity strategy in nearly two decades. He’s advised presidents, shaped global cyber norms, co-authored two influential books on cyber conflict, and served as a senior fellow at Harvard and the Council on Foreign Relations.

Rob Knake:
My argument is that the cyber domain is fundamentally different than air, sea and land. It is a marketplace. It is a forum for the exchange of ideas, principles of freedom of speech. Ransomware is killing people today. Not paying a ransom could kill people, may be killing people today in hospital systems. So you are dealing with major public policy, philosophical questions, right? If you pay the ransom today and you encourage and you grow this industry, what does that mean for the future?

Nathan Sportsman:
Rob Knake, welcome.

Rob Knake:
Happy to be here.

Nathan Sportsman:
So I don’t typically do this when we first start chatting, but I’d like to go through your background a little bit to set the stage. Hopefully I don’t miss anything. So in no particular order, you were a senior counselor at the Department of Homeland Security. You were the director of cybersecurity at the White House under the Obama administration. You were the deputy director, I believe, for strategy and budget under the Biden administration. You were a senior fellow at the Council on Foreign Relations. You’re a fellow or a senior fellow, I believe, at the McCrary Institute. You’ve lectured at Georgetown. You have a bachelor’s from Connecticut College, a master’s in public policy from Harvard, and you’re the author of two books, The Fifth Domain and Cyber War.

Rob Knake:
That’s pretty good. Yeah.

Nathan Sportsman:
Did I miss anything notable?

Rob Knake:
No. I mean, the two books I wrote, I co-authored with Richard Clarke. I can’t claim to have written two books. I’ve written half of two books. I was a director, not the director at the NSC. There were about nine of us. I don’t like to sort of make it seem like I was the one person running cyber policy at the White House.

Nathan Sportsman:
Totally. Totally. You should try harder. So it’s good to have you here. Where I’d like to start is your background, and we talked about this a little bit last night, but I think our experiences shape our filters, our filters shape our framing and our arguments and that ultimately drives public policy. So can we sort of start at the beginning? Where’d you grow up?

Rob Knake:
New York City and then Boston.

Nathan Sportsman:
You mentioned your parents were Democrats. I think in your notes you mentioned they were New Deal Democrats. Can you talk a little bit about their focus on politics, what they did for a living, things like that?

Rob Knake:
Yeah. I mean, so my parents, I’m 46. I always seem to get that wrong. But I’m 46 and my parents were older, so they had me when they were 42. And so they were a lot older than most of my friends’ parents, even in New York City in that era. And so they were both born in the Great Depression, and their earliest memories were sort of deprivation, World War II, rationing. So that was their background. My dad was from the South. My mom was from New York City. And they both, I think, came out of that experience growing up with a very large sense of optimism, as well as a strong belief in the Democratic Party, in pushing for progress, pushing for equality, that things were getting better in America, that we were triumphant. And so they both really had this really strong liberal political leaning, unlike a lot of the people that I was surrounded by growing up.

Nathan Sportsman:
And so you mentioned optimism. I’ve also heard strong notions of shared sacrifice of moderation living at that time, some people call it the greatest generation. How did that imprint on you? Were those sort of discussions that were had at the dinner table? Did you talk about politics or ideals or philosophies growing up? How did that look being in a family like that?

Rob Knake:
Both my parents were fascinated by and really interested in politics and public policy and driven by the greater good. So yeah, that was dinnertime conversation in our family, right? Presidential politics, debates, local politics, who they were supporting in elections, why they were supporting them. What was the meaning of the big things that were happening at that time? What were we doing to address the problems in society? That was the focus of our family’s conversations.

Nathan Sportsman:
You mentioned a quote that Kennedy gave that sort of always imprinted on you. What was the quote?

Rob Knake:
Yeah. To whom much is given, much is required. I grew up thinking of that as a Kennedy quote. I didn’t realize, even though I went to Catholic school, that that was actually him quoting the Bible. I always thought, oh, this is Kennedy, right? This is what he’s saying. So he came out of this very privileged position and felt obligated to be giving back to society. It was really, I think, the motivating thesis of his political career. And I think it was what I was raised on, and I think it was what the teachers at my Catholic school growing up were trying to impress upon me and my classmates.

Nathan Sportsman:
I think it was, we talked about this last night, but I think it was Luke 12:48 as the passage to whom much is given, much is required or depending on the translation, much is demanded. And so that quote at that age, it resonated with you. You felt that there was a higher calling. There was something bigger we needed to be in pursuit of. I talked about this in another interview, but there’s a quote in Carnegie’s library in New York that says, “The highest form of worship is service to man.” Did you have those feelings even at that age in terms of calibration?

Rob Knake:
Yeah. I mean, it really made an impression on me. And I think it struck me that that tone, that idea, you see that throughout education, particularly at a Catholic school or at a private school that’s driving towards liberal ideals, right? You have to give back. The problem is, and it’s a problem that I think I’ve wrestled with but kind of come down on one side of, is that, okay, you kind of end up in this kind of mousetrap or hamster wheel where you say, “Okay, we’re inculcating these values in you and we’re giving you this superior education, and so we want you to give back because of that. But because you’ve had that superior education, had so much invested in you, you need to invest that in your own children.” And so you can never kind of get off that income hamster wheel. And so you say from K through 12 through college through grad school, “Go out and do good things for the world.” To quote Paul Simon, “Don’t worry about the marketplace. Try and help the human race.” That’s easier said than done if you’re Paul Simon’s kid than if you are the son of even a corporate lawyer in New York. And so for me, it was just like a stark decision: “This is what I’m going to do. This is what I want to focus on. I want to focus on government and public policy. I’m not going to say, okay, I’ve got to focus on making money and then after I’ve done that, then I’ll go focus on public policy.” Because I knew from looking at my own family that probably would never happen, right? My father died before he was able to retire. He had just started doing the kind of pro bono work that he was really enthused by. My uncle had the same plans. He was going to retire at 55 and he had all these things he wanted to do, environmental policy, and then he got Parkinson’s and that basically derailed his early retirement plans.
And so I took those lessons away and I said, “Okay, I may not be able to give my kids the same childhood that I had from an economic perspective, but I want to focus on the public policy work because that’s what I’ve been inspired to do.” And so that just became the early focus of my career.

Nathan Sportsman:
Is it possible to do both? And we’ll talk about philosophy in a minute, but if you look at, like, Confucianism, all of these philosophical systems, they pursue some level of virtue, however they define it. But in that particular system, there’s sort of a hierarchy to virtue. And so it’s first start with your family, then society, then the leader and that sort of thing. As they say on the airlines, in the event of a crash, put your oxygen mask on first before helping the person next to you, or sometimes they’ll joke, then pick your favorite child and work your way down. Is it possible to first make sure that your family is okay and then it’s sort of expanding concentric rings outward towards adjacency, towards society at large? Are the requirements and the work behind going into the public sector so much that that’s not possible?

Rob Knake:
I think it’s just a matter of perspective, right? So I had a good friend, very public policy motivated, very interested in politics, very interested in government, who I went to college with, a brilliant, brilliant guy. And he would quote me sort of George W., George Bush senior’s advice, which was, first you have to take care of your family, and then once you’ve squared that away, then it’s time for service, right? That’s one end of the spectrum. Easier to do, I think, if you’re from the Bush family than from an ordinary family, a family of more modest means. And so my friend, he took that approach and he’s had a great career, but he was like, “I’m going to go into corporate law.” So he was going to do what my father did. He’s like, “I’m going to go into corporate law. I’ll make a lot of money and then I want to move into government.” Well, here we are, what, 25 years after graduation, guess what he’s doing today?

Nathan Sportsman:
Still corporate?

Rob Knake:
He’s still in corporate law.

Nathan Sportsman:
Right? And so from high school, what happens next? Do you take time off? Do you immediately go to college? How were you thinking about it once you transitioned out of K through 12?

Rob Knake:
So yeah, went to Connecticut College, New London, Connecticut. I got an absolutely fabulous education there. They really took a chance on me. I wrote a sort of impassioned essay explaining, look, here’s the upward curve of my GPA and I’m on a strong trajectory, and if I can focus on the things I’m good at, take a chance on me. And the director of admissions, small school, you find these things out, he was like, “I want this kid. Let’s take a shot on him.” And so we did that. So I was sort of eternally grateful to Conn for accepting me and bringing me in. And then the funny thing about small liberal arts colleges is even at a mid-tier school, you’ve got absolutely amazing, brilliant professors who all went to Harvard, went to Yale, did their PhDs there, an incredibly competitive field. And so I had these just brilliant, brilliant professors, dedicated to their craft, in a classroom of 12 people.
And so the education I got there was unparalleled. I came out of college with a better understanding of international relations theory, of political economy, of Enlightenment European history than just about anybody, including my colleagues early on who had gone to Yale and Harvard and Princeton. I always say that you do much better studying the ideas of the famous names at Harvard at Connecticut College than you do taking the classes with the famous people at Harvard, right? So we read, in the ’90s a big work was The Clash of Civilizations by Samuel Huntington. It sounds like you’ve read that book. I took a class when I was in grad school with Huntington. And the thing is, when you’re in a class with Huntington, you can’t really debate whether or not his theory is fundamentally racist with him, right? You can do that in a class of 12 people at Connecticut College.
You can engage with their ideas in a much better way, and you also get a lot more time with your professors than you do in a lecture hall with 400 other students. So I really valued that education. I’m to this day a huge fan of both capitalism and democracy. I think that these ultimately are the right economic systems, the Churchill quote, right? “Democracy is the worst form of government except for all the others.” It is still absolutely and fundamentally true today. We get worse outcomes in non-democratic societies. I fundamentally believe that to be true. But I also think that even though some people are saying that we’re in sort of late-stage capitalism, that’s a concept that you keep hearing more and more in progressive circles, capitalism, when properly restrained, controlled, directed, produces the most goods and the most good for the most people.
And so yes, it really was a split, right? On one side, it was this push of sort of Marxist, communist philosophy into the East, and then it was the push by the United States into Japan and into Germany of these two other concepts, capitalism and democracy, that really allowed those societies to thrive, at least in my sort of philosophical undergraduate governance education, right? I don’t necessarily see that captured as a philosophical system until you get to Fukuyama, right? Who sort of looked back retrospectively and said, right, there is a march of history. Hegel was fundamentally right that what societies need to flourish is something that addresses your spiritual needs as well as your economic needs, and democracy, by recognizing the individual voter, the individual person, as valued in society, and capitalism, by providing good economic outcomes, address both those needs. And this is fundamentally the end of the sort of march of history in terms of a process of coming up with a system that works.
That was the idea that he captured and that I was quite taken with when I started learning about it and reading about it. And to get back to that undergraduate education question and the value of your traditional liberal arts education, here were all those ideas coming together. I’m learning about Hegel in that history class and then I’m reading ideas in a political economy class, in an international relations class. I think I had to read The End of History by Fukuyama in three classes, right? In an East Asian class, in a government class and in a European history class.

Nathan Sportsman:
So these philosophies sort of helped triangulate how you think about things. From college, you had an internship at the United States Coast Guard, is that right?

Rob Knake:
Yeah. So Connecticut College was founded as the Connecticut College for Women. It was founded as an all-women’s school to be the sister college to the Coast Guard Academy, which in 1911 was all men. And so for the first 60 years of Connecticut College’s history, it’s where Coast Guard officers went to meet their wives. They went to socials, they got married right after graduation. So Conn is literally across the street from the Coast Guard Academy. If you’re a Coast Guard cadet, you go to classes all day and then you do military exercises all afternoon. So you have no time to be a research assistant to a professor, right? That’s just not something that happens. And so that’s where Connecticut College students swoop in. And so this was just an incredible opportunity for me. I was a sophomore. I had a great summer plan to go scoop ice cream and work on a dock on an island off the coast of Maine.
Very excited about that. But one of my professors called me and said, “Hey, one of my graduating students, a senior, has dropped out of a summer gig at the academy working for a professor there, Steve Flynn. It’s yours if you want it. I’ve sung your praises to him. He has no other options, so go meet with him and take it.” I was like, “I don’t want to do this.” And he said, “No, you’ve got to go do this.” He put a number on it. He said, “This will put you in the top 10% of IR graduates in the country when you leave if you go do this.” So I went and did that. And that is what pulled me into the public policy world and into, we didn’t have this word then, but the homeland security world. It pulled me into border security, drugs, thugs, migrants, the things that the Coast Guard was working on in the late ’90s, the sort of downside of globalization.
And so I worked there and that is what brought me into the whole world of public policy, led me to the Council on Foreign Relations and eventually to cybersecurity as an offshoot of that.

Nathan Sportsman:
So he was the one that exposed you, helped expose you to public policy, like you said, access, opening doors for you. The two questions I have are, similar to your admission into college, you mentioned he went to bat for you. Why? What was it that he saw in you that caused him to help in that sort of way? And then secondly, what did you learn from Steve? What were some of the biggest takeaways, that quote that you mentioned, that we live in the means, the ends are just the story that we tell ourselves. Like I told you last night, when I read that, I could feel chills. It’s just a very, very powerful statement that will just carry with you from that point forward. What did you learn from Steve and why did he have such an interest in you?

Rob Knake:
So I mean, what did I do for Steve? I was a pretty good writer and so I did a lot of writing for him, and I was a hell of a researcher in that I got really curious and dove down deep. And so I think that made me something of a partner to him. The big thing I learned from him was how to do that, right? That in public policy issues, you just needed to drill down so much more deeply than many people did. And he had this just absolute gift for that. In part, he had operational experience that I’ve never had in my career. He had been, I think, possibly the youngest commander of a Coast Guard cutter ever. He had this picture of him as a baby-faced 22-year-old on the deck of a boat with his crew around him, I think off of Portland, Maine, where I live now.
And so he had this ability to kind of call bullshit as the rest of the public policy community was just starting to grapple with these issues that he’d spent the early part of his career working on. He was the guy, or one of the people, saying, prior to 9/11, “Hey, we need to worry about the threat to our critical infrastructure in the United States. We need to worry about this guy called Bin Laden, what he’s done to our embassies. He says he wants to bring the threat over here. Guess what? Our borders are open.” And so this was his message that he was sort of shouting out into the ether until, of course, 9/11 happened. Now, he wasn’t right about what the threat vector was going to be. He was focused on containers, maritime containers, as where the threat was going to come from.
Turned out that it was aviation being weaponized, not containers being weaponized, but he was right that the threat was there. What he understood about how transportation systems worked, how the maritime system worked, was so much more than everybody else. But he also had this great ability: he knew what he knew and he knew what he didn’t know, and he would go out and find it out. And so he would do kind of almost like ethnography, almost like an anthropologist, going out and really engaging deeply with the people who were operating ports, from the CEO to the guy manning the gates. And so the need to really drill down and then come back up to make policy was the biggest takeaway I had from my work with Steve.

Nathan Sportsman:
So you had mentioned through these internships with the Council on Foreign Relations, did I understand correctly you also had a chance to have exposure to Richard Clarke through Steve Flynn, is that right?

Rob Knake:
Well, so what happened is, after college, I worked for Flynn at the Council on Foreign Relations as one of his research assistants. And then I went to grad school at the Kennedy School, where my first year was 2004. It was the Kerry campaign, and the entire faculty on national security had taken a sabbatical to go work on the campaign full time. There were literally no courses in my concentration I could take, except, ironically, this one course taught by Richard Clarke and Rand Beers. And these guys had just left government. They were teaching for the first time. Rand was actually leading national security for the Kerry campaign. So while everybody else had decided they couldn’t teach, that they were going to work on the campaign full time, Rand was running national security for the campaign and he was teaching.
And so I took their course and I got to know both of them. And then I got hired by Dick and Rand to be their course assistant the next year because my old friend, Steve Flynn, made a phone call and said, “Hey, this guy was my research assistant. He’s really great.” And they said, “Fine, you’re hired. Be our teaching assistant next year.” And really for the next 10 years of my career, I worked for one of them or the other, first at Dick’s consulting firm, then I left for a while, wrote a book, went back to the think tank world, and then went to work for Rand at the Department of Homeland Security.

Nathan Sportsman:
In the course, so you were getting a master’s in public policy at Kennedy. What was the course on? Was it international affairs? Was it getting into cyber at this point?

Rob Knake:
Yeah. So it was really a kind of, here’s what we did, here’s this bucket of issues. So if Steve Flynn and all these other guys had worked on transnational threats in the ’90s, where cyber wasn’t even really an issue, that bucket of topics was what Rand and Dick had managed from basically the Reagan administration through Bush 43’s first term. And so they were just basically bringing this out, and it was their course on national security in the 21st century, is what they titled it. Really, the subtitle of it was Getting Shit Done in Government. That’s what it was about. Here’s how we got the things done that we got done, whether it was Plan Colombia or the pre-9/11 counterterrorism work or the first cyber strategy. That’s what they were teaching. And so that was one of my first heavy exposures to cyber, was with them.

Nathan Sportsman:
And do I remember Richard Clarke’s background right? He was focused at the time on terrorism, but then he eventually became the “cyber czar,” where he started to pivot and focus, in terms of national security, on cyber. Is that right?

Rob Knake:
Yeah. So I mean, Rand and Dick both had worked on cyber, counterterrorism, homeland security, intelligence issues for their whole careers and really had spanned every single White House, in Rand’s case, I think five White Houses in a row. Cyber became a public policy issue when, after Oklahoma City, we were first as a country looking at these transnational threats, these asymmetric threats. So you have the Oklahoma City bombing, you have the emergence of terrorism abroad, critical infrastructure becomes a term of art as something that we need to protect. And that’s when people started in 1996, ’97, ’98 saying, “We also need to worry about cyber. We also need to worry about the risk that we’re going to connect these systems to the internet and then make them vulnerable.” And people saw that right then. Now, if you go back far enough, you talk to Jay Healey about this in the history of cyber warfare, this was a perceived threat from the 1960s.
He could probably find earlier sources on it, but it was this buildup that really culminated in the late 1990s saying, “We need to start working on cyber issues.” And so Dick and his team at the NSC in 1998 wrote PDD 63, which is sort of the seminal first cybersecurity presidential decision document that really has set the course through to this day about what the nation’s cyber policy is, the public-private partnership, the division of responsibility between government and the private sector.

Nathan Sportsman:
And when was that document, what administration?

Rob Knake:
That was 1998, so that was the Clinton administration.

Nathan Sportsman:
Okay. And so as you identified what are the new threats that we need to deal with, and it’s more individuals in terms of how we think about things, whether it’s a bin Laden or a Timothy McVeigh, and then cyber is one of these emerging threats. Once the conclusion was reached, though, that cyber is not a medium that terrorists or someone like Osama bin Laden would use, did that cause that particular vector to kind of take a backseat for a couple of years or was there still focus?

Rob Knake:
No, I think for me it was, it’s not a vector yet for this particular threat actor at this time, but it’s going to be as we were building more connectivity into critical systems as we were doing that without thinking about security as we were becoming more digitally dependent. It was very clear to me that this was going to be a major area of public policy that wasn’t getting the attention that it needed to get. And that it was really, really fascinating. I mean, in terms of wicked problems, how do you build a secure society in cyberspace when you want to have open borders, when you don’t have the traditional controls and checks that you have in physical spaces? This is a really interesting challenge. And so it’s really fun to work on to this day.

Nathan Sportsman:
100%. And so that’s the point in which you’ve kind of crafted your life’s work from graduation from Harvard. What happened next? Where did you go from there?

Rob Knake:
So I went to Good Harbor, which Richard Clarke, Roger Cressey, John Tritak, Paul Kurtz had founded after they had all left government. These were the guys who in varying ways were working on early cyber policy at Commerce, at the State Department and at the White House. And so I went to work for them and really got an even deeper education, both in how public policy is done, from them, as well as in the cyber domain. And so I spent four years working for them before I sort of took what I’d learned and said, “Okay, I want to write a book about cyber war and I want to go back into the think tank world for a year and sort of reflect on what I’ve done.” So I did that. I went back to CFR, this time as a fellow in my own right, wrote Cyber War with Dick, published that, and then basically almost immediately got the call to go work for Rand Beers at the Department of Homeland Security and actually get into government and try and do some of the things that I’d been sitting at CFR for a year thinking about.

Nathan Sportsman:
And so this sort of time period, is this like 2004 to 2010?

Rob Knake:
Yeah, 2005 to 2010. Yeah.

Nathan Sportsman:
And as you’re writing that book, which to me was a siren song where you’re trying to educate folks on what’s to come, either writing that book or shortly after, what was the watershed moment for you where, okay, this isn’t a future vector, it has arrived?

Rob Knake:
So I think for me, it was understanding what was kind of being whispered around in DC at the time. In 2008, something happened. We’re not going to talk about what it was on camera, but something happened that spooked the Bush administration significantly enough to say, “Okay, we are going to launch what was called the Comprehensive National Cybersecurity Initiative at the end of the administration. As things are winding down, they’re winding this back up.” And so in some people’s telling of it, after Dick and other people left the administration, cyber kind of stopped getting the attention that it was getting when the 2003 strategy was put out. And then in 2008, you have National Security Presidential Directive 54, Homeland Security Presidential Directive 23, that launches this new initiative in January of 2008, a year before the end of the administration, and billions of dollars start flowing to cyber and people are suddenly very worried, very spooked: “Oh my God, is this the threat vector that we’ve neglected?” And so as I started getting inklings of that, that’s when I really turned to it as a matter of national security, not just a matter of, okay, how do we help companies secure themselves against intellectual property theft or other threats that were on the horizon at that point?

Nathan Sportsman:
And so how does that book and your prior experience and prior academics, what is the story arc from that to the White House? Did you move through DHS first or how did that play out?

Rob Knake:
Yeah. So Rand Beers, who’d been my professor and continued colleague since graduating, he was the undersecretary for the National Protection and Programs Directorate, so what today is now CISA, the Cybersecurity and Infrastructure Security Agency. So he oversaw all that. He brought me in as his, I think my title was special counselor or something like that. I wasn’t a lawyer, but somebody thought I was, so they made me a counselor, not an advisor. And I went to work for him. I thought I was going to be working on cyber issues. Phil Reitinger, his deputy, thought I was going to be working for him on cyber issues. And on my first day, we were having lunch in the Oak Room that the Coast Guard runs at the department’s headquarters and we’re talking about cyber and what the problems are, what we need to do. And he was like, “So I’m counterterrorism director for the secretary.
I’m leading our joint counterterrorism efforts across all the government and I need you to work on that.” So I spent my first year in government not working on cyber, but working on CT at the Department of Homeland Security. That was an eye-opening experience for both how government runs and a really strong contrast for what we needed to build in cyber when I finally got to that issue. The Department of Homeland Security really was founded on the counterterrorism mission. It was really founded to protect the continental United States, Hawaii and Alaska as well, but the US, CONUS, from threats, from terrorism. So while we were fighting them over there so we didn’t have to fight them here, which was sort of the thesis of the GWOT, DHS was trying to build the systems and the mechanisms to be able to share information on threats, to protect our borders, to keep people from getting in who shouldn’t get in, to protect our critical infrastructure.
That was still a work in progress in 2010 and 2011, the knitting together of systems and data from the intelligence community, from border security agencies, to be able to screen out threats at the border. And so that was mostly what I was working on: how do we make sure that we’ve got data flowing where it needs to flow, that we’re addressing privacy concerns, that we’re addressing civil rights concerns, that we’re doing it within the law, and yet we can also use this data effectively to protect the American people. And so I just got a very quick and solid education on what we built, what we needed to build for that problem. And when I moved into cyber, it was very clear we didn’t have any of that. We didn’t have the mechanisms to be able to use intelligence to protect the American people, to get it to where it needed to go in an effective manner.

Nathan Sportsman:
And so there was both the lack of a plan and vision in cyber, but also the person you were working for was focused on counterterrorism, and so therefore you needed to focus on it. So is that occurrence more than just happenstance? Is it actually a testament to what were still the government’s priorities? Even with Bush issuing this directive and putting this funding towards it, the government’s priorities were still, to your point, towards the global war on terror, towards protecting the homeland, but from terrorism, not from cyber. Is that accurate?

Rob Knake:
Yeah. Oh, that’s definitely accurate, and I think it was right. I think it is hard looking back now to remember that the terrorism threat carried well into the next decade after 9/11. We were still actively engaged all over the globe, right? We had drone strike operations happening all the time against terrorist targets. We had active terrorism threats. So I was at DHS working on CT issues when the bin Laden raid happened, right? That happened, I think, in May of- 2011? May of 2011, right? And so that tells you how current it was. The leader of Al-Qaeda was still out there and we still had not found him a decade later. So yeah, it made sense that counterterrorism was the more important issue. The clearest thing I can say on that is it was really a matter of grave national security and cyber wasn’t.
The cyber threats were real, they were growing, but they weren’t killing people and terrorism was a risk that was. And so it made sense that it was getting that kind of focus and attention and that cyber was not.

Nathan Sportsman:
And so you said you worked in counterterrorism for about a year, year and a half. Was it the bin Laden raid that caused the pivot and you started to work on cyber? What was it? Your focus was counterterrorism, and then you pivoted over to cyber at DHS. How did that happen after a year or so?

Rob Knake:
So there was a big push in 2011 to pull together a comprehensive cyber legislative package across every agency, DOD, DHS, DOJ, and that was being led out of the NSC team at the White House. So I got a little bit of time to work on that, and I worked on the team at DHS that was advising on the development of that law. So that let me kind of start to pivot out of the counterterrorism world into cyber. And then really for me, at about a year, that was the point at which, from a sort of legal trigger perspective, it’s okay for a political detailee or a political appointee at one agency to be detailed to the White House. There’s actually a law that says you cannot hire somebody at the Department of Homeland Security and then just put them on the White House staff. And so it was at that point when I started to get asked, would I want to go work on one of the NSC teams?
And cyber was far and away the area I wanted to work in because it was just so exciting and what they were doing was so new.

Nathan Sportsman:
Can you talk a little bit about that experience you had mentioned in your notes, the interview process or part of it to both interest you and also test you?

Rob Knake:
Yeah. So Sameer Bhalotra was the senior director. He’d come out of, I think, Senate Intel before the administration. He’d been brought in by Howard Schmidt, who’s kind of a legend in the field of cybersecurity. Did a lot of the very early work on cyber law enforcement forensics when he was at the Air Force. And then Howard was actually Dick Clarke’s deputy at the NSC in the Bush administration. Makes us all sound very incestuous, but so Howard had come in and he’d brought Sameer in with him as his first hire. So Sameer was senior director at the NSC. He was leading this legislative effort. We’d interviewed him when he was on Senate Intel for Cyber War. So I met him then, knew him then, and then we started to work together on the legislative package. And Sameer’s a very, very kind of clever, strategic guy. And so he asked me on fairly short notice, in my recollection, “Hey, can you come and meet with me tonight?
We got to get this legislative package out.” This was right after bin Laden had been killed. I think it was a pretty narrow window.
And I’m like, “Yeah, sure.” He’s like, “Great, meet me.” It was at least seven, if not nine at night. It was, “Come work with us after hours at the NSC.” And so I come and meet with him and he’s like, “Yeah, meet me in the sit room.” “Oh, okay, cool.” And so he takes me down to the sit room. Now the sit room, it’s not a single room. It’s basically a watch floor with three conference rooms: the big one that the cabinet can fit in, a sort of executive-size one, and then this little tiny one, like a six-person room. We go into the six-person room, right? That’s where Sameer and I are meeting, right? And this is the room in that famous photo of President Obama hunched over the officer who’s actually working the feed of the bin Laden raid, and then Hillary Clinton’s there and she’s like, sort of like that.
So we’re in that room. And this was of course all part of Sameer’s effort to recruit me, right? Test me: would I show up? Did I understand the hours that they were going to work? And then try to woo me in. He didn’t need to do any of that. I would’ve come and worked for him as soon as he asked, but he brings me into that room and I’m like, “Wow, this feels pretty crazy.”

Nathan Sportsman:
And along with Clinton and President Obama, Panetta and I think some other famous figures were in that room, and he asked you to sit down in a particular chair.

Rob Knake:
He probably had me sit in both chairs: the chair at the head of the table where the president would sit, and where any NSC staffer who’s running a meeting will sit, and the individual chair the president had sat in off to the side. So that felt pretty surreal to me.

Nathan Sportsman:
And so I would imagine that that kind of meeting, and I obviously never had any experience like this, but it sort of helps set the tempo and the magnitude of what you’re about to be working on. What was it like working in an administration? What was it like working at the White House?

Rob Knake:
Yeah. I mean, working in the NSC during the Obama years was pretty insane. Every generation thinks that the people who came after them have gotten soft, particularly as the NSC has grown. So the NSC keeps getting larger and larger, and then it keeps getting tamped back down. So we were a fairly large team. I think there were eight or nine directors, two senior directors, and a special assistant to the president, first Howard Schmidt, then Michael Daniel. And if you ask Paul Kurtz or Dick Clarke or Mark Montgomery, any of the people who started out doing cyber in the Bush era, there were just two of them working cyber: much smaller teams, much more focused, much longer hours. But hours would be very long and very intense. You’d be getting in early, 7:00 AM I think most days, and then you were lucky if you were leaving at 7:00 PM.
So I had small kids at the time. I would not see them a lot of mornings before they were up and then I would be out and then I wouldn’t see them. When I got home, they would be asleep. So very long hours where you’re basically spending all your time rapidly researching, writing and negotiating. That’s really the role of an NSC staffer. And the bigger part of it is the negotiating, right? You’re building coalitions to try and move policy forward through that process.

Nathan Sportsman:
And so 10, 12 hour days, the public sector obviously doesn’t pay what the private sector pays, but I imagine mission sort of overtakes it. Did you ever meet him? Did you ever share a glance with the president? Did you ever get to shake his hand, anything like that?

Rob Knake:
On my way out, yeah. I mean, you get to do the White House photo op on your way out. Cyber wasn’t that important an issue. There were probably, if I counted, three, maybe four meetings during the Obama administration with the president on cyber, and usually very small teams. So I think Michael Daniel, Howard Schmidt, Tom Donahue, and Sameer each probably had one meeting where they got to go meet the president and work on an issue. What I had was a sort of marginalia-comment relationship with the president, where we would send up a memo. It’s one of the great things about working on the NSC: if you’re the director on the NSC, your name goes on the memo. And if it’s a POTUS memo, it’s going to him, and then you’d get back sort of scribbles on the edge of, “I like this. I don’t like this.
I approve this.” And like, okay. So you felt like you got some kind of direct relationship, but I never had that. I was never at that level of, “Oh, I’m an advisor to the president in that way.” I always say I’m the consummate green badge. I never had a blue badge across two administrations. I was working on an important topic, but I was on the outer periphery of the president’s inner circle.

Nathan Sportsman:
And so framing that important topic and where it sits in orders of magnitude. You had mentioned previously cyber was important, but people were not dying from cyber like they were in the war on terrorism. Can you kind of walk me through how that problem took shape and changed, the things that happened, and then ultimately the public policy that you worked on to try and deal with the problem as it changed and evolved over time? Are you comfortable starting with Stuxnet or do you want to talk about something else?

Rob Knake:
Yeah, man. I think that first legislative package I was working on was still motivated by briefings on threats, on capabilities, and some understanding of Stuxnet, which was like a forbidden word; you were not allowed to bring up Stuxnet in any kind of context when we were working on cyber matters in the Obama years. But I think understanding the capabilities of that, regardless of who was behind it, really spooked people: “Okay, this is the kind of real-world physical damage that you can see on something that is very analogous in a lot of ways to civilian critical infrastructure.” So I think that Stuxnet had this motivation of helping people understand real-world effects are out there. There were other things that helped people get that understanding. The Aurora vulnerability that Sandia National Lab was able to demonstrate, the fact that you could see that video on early YouTube, I think helped inform the policy direction that we were trying to move in to get ahead of this threat.
That legislation failed, and it failed pretty miserably. We were fighting for it by the time I’d come over to the NSC. It was the administration’s legislative package. It was actually a law that was released out of the administration saying, “Here’s the law we want Congress to pass.” It gets adopted by Senator Collins and Senator Lieberman, and then it really gets killed because this was 2011 into 2012: an election cycle had begun and we were still recovering from the Great Recession. And so the idea that we had this legislative proposal that was going to give regulatory authority to the Department of Homeland Security over vast swaths of our critical infrastructure to enforce cybersecurity rules was something that a lot of people at that time were dead set against. So the Chamber of Commerce, which is right across Lafayette Square from the White House, made it their mission to kill this bill, arguing that it was going to stifle innovation, arguing that as the economy was recovering, the one bright spot was what we were doing in the digital realm.
It was where we were leading, and that regulation would kill off that innovation and it would kill off jobs. And the chamber at that time, between their sort of Greco-Roman columns, there were like four of them, had spelled out “jobs” with one letter between each column, facing the White House. And at a certain point, the White House pulled back support for the bill and said, “Okay, we’re not going to push this forward,” because the idea was too early and it was not palatable in an election year. And so we pivoted from that to saying, “Okay, well, what can we do with the existing authorities that we already have?” And so I got tasked with writing what would become Executive Order 13636, which was our response to say, “Okay, we can’t regulate. Let’s take the voluntary approach that everybody says we should take.
Let’s try and put some teeth into it. Let’s put it on steroids. Let’s see if we can take the idea of a public-private partnership, which has been a vague notion since PDD-63 in 1998, and actually build a partnership and begin to build the mechanisms to actually work with the private sector on a voluntary basis, and also to know whether it’s working.” So we did a lot of things in that order, but probably the most notable was we created what would become the NIST Cybersecurity Framework, to say, “Look, here’s a standard of care that we expect critical infrastructure companies, regardless of whether or not they’re regulated, to start to meet.” And while that started with just kind of a nudge, without any kind of hard power behind it, it began to shape investments in cybersecurity by private sector critical infrastructure operators and their suppliers.

Nathan Sportsman:
And I definitely want to dig into this public-private partnership, but to make sure I understand the legislation that was ultimately killed. So we’re going from a telling to a suggesting with the private sector, based on the economic backdrop as well, the fear for loss of jobs. But Operation Aurora, for people that might not be aware, attribution is, most folks agree, China, but this was an operation that affected a number of companies, most notably, because they did something unusual, Google. Google actually went public and said, “Here is what we’re seeing, here are the patterns, and we’ve been compromised.” And unusually, from what I understand, in that safety net of a company like Google admitting that something had happened, other companies came forward: Adobe, Rackspace. The media gets excited, they dig in, and they ultimately conclude that about 30 to 40 companies were impacted in this one operation.
I listened to a speech from Peiter Zatko a number of years ago, and in that speech he said, “What the media doesn’t say is that it wasn’t 30 or 40 companies that were impacted in that one operation. It was 3,000 to 4,000 companies that were affected in that single operation.” And so even in that context, lawmakers understanding that, and then we kind of touched on Stuxnet a little bit, the concern was still more towards ensuring that we recover from what was almost a great depression and ultimately the Great Recession. Is that correct?

Rob Knake:
Well, so I think really, it’s funny, there’s a couple tangents here, one of which is we were really bad at naming things in ways that had a clear nomenclature then. So at about the same time, we had both Operation Aurora, which is Google getting targeted by China along with the other companies, and you also had the Aurora vulnerability, which was this vulnerability that Sandia showed with SCADA systems, supervisory control and data acquisition systems, industrial control systems: the ability to essentially make a turbine, through a remote attack, light on fire, basically explode. So those two things, both with the same name, were two major motivations of cyber policy in that era. The law was really focused on critical infrastructure, right? The kind of damage that we had seen could be caused by something like Stuxnet or, in the sort of unclassified space, what we’d seen with Aurora.
There was separately a massive focus on intellectual property theft and what we were going to do about that. And so there were two sides of the house in the NSC cyber directorate in those years. There was what we called the evil cyber lord, which was Tom Donahue, who was an ex … He was a CIA cyber guy. He was leading that. He had a small team doing diplomacy and offensive operations and intel, and then there was a domestic side that was doing the public policy around legislation and law. I was one of the few people, because I was there for so long, I was there for four years, who crossed over that kind of divide on certain topics. And so I was exposed a lot to the China and IP space and what do we do about that. The focus on IP theft was handled both through trying to get companies to recognize that their IP was being stolen, and to argue that it was materially impactful and they should not hide it, but also diplomatically with China.
And that’s, I think, probably one of the greatest successes that we, speaking holistically, in the Obama administration had: changing China’s behavior on that front over a number of years.

Nathan Sportsman:
You’re right, there are two separate threads, and there’s a bit to unpack there. So on the Stuxnet side, I don’t think we mentioned it, but it was malware that spun up centrifuges to the point that they would actually destroy themselves. And to your point about Sandia, that was a demonstration, in a more controlled way, of how you could compromise OT environments, operational technologies, critical infrastructure. Operation Aurora, I think it was still around the same time, 2000-

Rob Knake:
2008. Yeah. I think 2009, 10? I thought it was 10. May have been 10.

Nathan Sportsman:
Still right around that time. And so not focused on disruption or destruction, but focused more on the theft of IP. So both have impact, maybe on different timelines. But with the backdrop of all of that, legislation that was more regulation was ultimately stymied, and the alternative was, well, if we can’t enforce, I think you call it shoves versus nudges, let’s nudge the private sector, particularly OT, on how to think about cybersecurity and give them a framework or a playbook for helping to defend themselves. But it was more of a suggestion, a way for them to think about it, not a requirement.

Rob Knake:
Yeah. So Cass Sunstein, the University of Chicago law professor who was on the same faculty as President Obama, was the director of the Office of Information and Regulatory Affairs, a very DC office-within-an-office, OIRA, inside the Office of Management and Budget. So he was the director of that office. He had done a lot of academic work on regulation, on the effectiveness of nudges, that he was bringing into that role: “Here’s not what you have to do, but here’s what you should do. We’re going to give you information and we’re going to try and change consumer or corporate behavior with that information.” And so in our earliest meetings with OIRA about what we were trying to do with the legislative package, I literally think I got handed a copy of his book on nudges, or told to go buy one. And when I read that, I said, “Okay, I get it.
This is the direction that he is personally trying to drive OIRA and government in is to say, okay, how can we partner more? How can we encourage without putting in very strict mandates that the private sector then works to try and get around or meet in the most de minimis way? How do we create something better through partnership?” So that was the theory and that was what we were trying to encourage with 13636 was to say, “Look, there is this potential hammer out there of regulation where we can get new authorities for that, or we can use our existing authorities to add cyber requirements where that’s possible, but let’s go through this period of trying to do it on a voluntary basis.” And so that’s what we did with 13636 and with the NIST cybersecurity framework.

Nathan Sportsman:
And so a lot of your book, to me, when we walk through some of the possible solutions, is a game of incentives. Classic Charlie Munger: show me the incentive and I’ll show you the outcome. So you have carrots and you have sticks. What is the carrot on a nudge or a suggestion? And by the way, I haven’t said this: I believe in the CSF. We use it at the company. A lot of our customers use it. It’s a great framework for helping companies think about not just detection and response, but holistically, to form what you call in the book resilience. But where is the carrot to get people to adopt the suggestion or the framework? Where’s the incentive?

Rob Knake:
Yeah. So I mean, theoretically, people want to do the right thing. Theoretically, that’s an incentive. The second theory is that the market will start to respond: “Hey, have you implemented the cybersecurity framework? Have you tested your controls against it? Have you assessed against it?” And so your clients and your customers will start demanding that. Now, to be clear, I think there was a lot of innovative policy, amazing work that Adam Sedgwick did leading the development of the framework at NIST, giving us this cohesive way to go up to a board member and explain cyber, and then take those controls all the way down to your lowest-level programmer to get them implemented. It was amazing work. But it ultimately was a five-year experiment that showed this was not sufficient, that this was not going to get us to a place where companies were making sufficient investment in cyber.
And so I think it may have been the right idea at the time, but ultimately nudges alone really weren’t sufficient for the corporate world. You really need what Dick and I called in The Fifth Domain the shove of regulation. And that was what Anne Neuberger and the NSC team and the national cyber strategy I worked on in the Biden administration were focused on, saying, “Look, we need to regulate. We know we need to regulate.” And we can talk about all the reasons for that, but a lot of it just comes back to the simple analysis that if you’re not a cybersecurity company, you want to spend as little as possible on cybersecurity. And if you spend more than as little as possible, you are going to put yourself at a disadvantage to your competitors. And so that’s why we’ve seen, I think, a little bit of enlightened self-interest in many industries saying, “You know what?
We actually need to be regulated. We don’t want to compete on this issue. We make chemicals. We’re pipeline companies. We don’t want to be put in a position where we’ve got to make bottom-line decisions about this. We want a level playing field of investment.”

Nathan Sportsman:
So what would version two look like for you?

Rob Knake:
Version two of-

Nathan Sportsman:
Not of the framework itself, but how you give it a shove. What would that look like in terms of regulation?

Rob Knake:
So I think we’ve started to get there, with a little bit of backsliding. If you go back to 2011, there were really two theories on how you regulate for cybersecurity at the highest level, or really, who regulates. Do you have a centralized regulator? That was what we opted for in 2011: empowering the Department of Homeland Security. I was all in favor of that, very supportive of that vision at the time. In retrospect, I think the option that Jim Lewis and CSIS laid out when they first looked at this in 2008 was the right vision. They said existing regulators, who know their industries and have relationships with the regulated entities, should be doing the regulating. So you don’t want a cyber regulator and a safety regulator, or a cyber regulator and an environmental regulator. You want the environmental regulator to have cyber rules. You want the safety regulator to have cyber rules, and you want them to do that. It’s actually much more efficient and going to be much more effective to do that rather than having two different systems of regulations.
So that was the approach that the Biden administration under Anne Neuberger pursued: to say, “Okay, here’s where we have existing regulations. They need to be beefed up. Here’s where we have existing regulatory authority. We need to use it. And here’s where we don’t have authority and we need to go get it.” So I think fundamentally that’s the right approach. What you regulate is still, I think, an open question. Who regulates has been answered.

Nathan Sportsman:
So the who versus the how. And to make sure I understand the who correctly: as an example, if you have a medical device, the FDA regulates that industry. The FDA should also have cybersecurity in its requirements, versus a separate standalone entity. The FDA should issue the guidance for cybersecurity because it understands that particular industry, that particular vertical. Same thing maybe for the SEC when it comes to companies and things like that. Is that what you mean in terms of-

Rob Knake:
Yeah. It’s easier to inject cyber into an existing regulatory body than to teach a cyber regulator how a whole industry works and develop separate regulations and separate systems and separate relationships with the regulated entities. So I think that’s the first thing. I think we’ve got that right, at least as a matter of direction and policy.

Nathan Sportsman:
And then you mentioned the harder part is then the how. What do you mean by that?

Rob Knake:
So how do you regulate? What do you require? Everybody in cybersecurity is well aware of the gap between compliance and security. Okay, we’ve done these things, we’ve checked the boxes. Are we secure? No. For some industries, for some companies, that’s okay. Other people say, “Oh yeah, but we see a corporate value in investing above that baseline.” And so they go well beyond what is minimally required. So the question is, is that the right way to regulate, which is essentially better and better checklists, better and better sets of requirements where we’re dictating what the inputs and outputs are for a cyber program? Or do we try and regulate outcomes? Do we try and say, “You’ve got to be able to prevent a bad thing from happening, you have to prove that you can prevent the bad thing from happening, and you have to put up the financial wherewithal so that if the bad thing does happen, you’ll be able to make your customers or the victims whole”? Those are the two potential methods for how you can regulate in cyberspace.
We’re mostly doing the first one. We’re mostly saying, “Here are the things that you need to do.” All we’re doing on the second one is saying, “Here’s what you need to disclose if that first stack fails.” If everything you’re required to do doesn’t stop it and there’s still an incident, you have to disclose that to your regulators, to the public, to your victims. What we haven’t really figured out is how to say, “Okay, we don’t even need to dictate the things you need to do for a cyber program. We just need to ensure that this bad outcome isn’t going to happen.” How do we do that in cyberspace? That’s still an open question.

Nathan Sportsman:
And so this is where we’re tying in philosophy. So, John Stuart Mill: outcome-focused versus we don’t have a point of view on what you do or how you get there; we just care about the outcome. Where do you fall on this sort of stuff? And I want to share some of the things I’ve seen, but between those two, how do you think about things?

Rob Knake:
So when I think about the incentives, I want to create incentives so that companies are held liable when bad outcomes happen and have the financial wherewithal to make their victims whole. And that creates the incentive for them to do the right things to the left of boom, whatever those right things may be. They may be everything in the NIST CSF or they might be nothing in the NIST CSF. I think we’re less effective at regulating individual controls, and we need to become effective at regulating those bad outcomes. So it’s really about saying, look, you need to manage the risk. And that risk isn’t just for your corporation, it’s not just for your shareholders, it’s for society at large. And I think there are many lessons we can take from other fields about how we’ve effectively done that, and we have yet to apply those in cyberspace.

Nathan Sportsman:
So I am a capitalist and I’m also a big proponent of innovation, and we don’t want security to put sand in the gears. But I do worry that we’re shifting from an outcome-based approach to a best-effort or activity-based approach. And even with the insurance industry, I know a lot of stuff is happening there, where insurance companies are running these various tools. From our perspective, we’ve looked at those tools, and we can see the activity, and it’s, for lack of a better word, all garbage. It’s forcing defenders to boil oceans and fix things that literally do not matter. And then while they’ve gone through and fixed that stuff, we can come over the top and actually get in the way an attack is actually going to go down. And so with software and liability, I would worry that folks would focus on the activities to check the box versus focusing on the final outcome.
And so even if we’re going to stage-gate it, I feel like it has to ultimately end, even if it’s year five, with you being responsible for the outcome. We’ve given you a bridge to get there. I do agree software is hard, but I feel like so is building bridges, so is building buildings, and we have certifications and requirements around that. There is a mechanism to ensure that buildings get built safely, versus it’s kind of the wild, wild west when it comes to software. And the general concern I have is that technology is no longer a choice. We’re all in on that, and security follows technology, but we’re still treating security as though it has optionality. There is a clear underpinning and dependence on technology in society, and I think until you have the heavy hand of government, some of this stuff just isn’t going to get addressed in the way that we want it to.

Rob Knake:
Yeah. So I mean, I would go back to the idea that we had to go through the process of saying, within this cybersecurity framework, let’s try voluntary and build off of that, before we could get the level of consensus that we got with the Biden strategy to say, yeah, it is time to regulate, right? And we probably are going to see some backsliding on that in this administration, but I think in that period we really shifted the Overton window, the window of what’s possible in a public policy context, towards regulation. Most people in the field who had been dead set against regulation in 2011, in ’21, ’22 they were like, “Yep, we recognize it’s needed.” So I think that’s the same place that we’re in now with software liability, right? We’ve got to first define what’s a reasonable standard of care, and then if we can start getting that adopted, create a safe harbor for doing those things, I think 10 years down the road, you could see strict liability for software.
We say, “Hey, it’s going to be on you to prevent bad outcomes completely and totally. And maybe you’re going to have to have some insurance behind you to pay the costs of those bad outcomes if they occur.” So there is a potential model out there that gets to what you’re talking about where we bring that strict liability for a bridge into the software realm, but we’ve got to spend some time creating the systems and the mechanisms to know what good looks like and to help companies be able to get there.

Nathan Sportsman:
Can you talk to us a little bit about what was SolarWinds? What was Sunburst? What exactly happened there?

Rob Knake:
Yeah. You may have some of the facts better on it than I do, but essentially in this incident, SolarWinds, which provides IT management software, got hacked by Russian intelligence. They got into their software build capability, and then they used that to launch a malicious update to all their customers. So I think 30,000 customers, something like that, got sent this malicious payload in a SolarWinds software update. They installed it, it started beaconing back, and the Russians said, “Oh, we’re in this company, we’re in this company, we’re in this company. These are the ones we want to target.” They shut off the activity in the ones they didn’t, and then they started harvesting intelligence from government agencies, from critical infrastructure operators, from software operators. And so they were able to use one company’s poor cybersecurity in a supply chain attack to get into thousands of companies, and then ultimately harvest intel from about a hundred victims that may have really done a pretty good job of protecting their perimeters and doing all those things on that checklist, but were not able to detect and stop a malicious intruder that had essentially become an insider threat.

Nathan Sportsman:
I think SolarWinds had about 38,000 customers; of those, I think 18,000 were affected. And a couple of things were unusual. Dwell time typically, I think CrowdStrike talks about it being about three months these days, but in the SolarWinds case, the Russians dwelled in the network, or networks, given the scope of it, for over a year before anyone detected them. Among these 18,000 companies, I mean, we’re talking about Microsoft was compromised. We’re talking about Mandiant FireEye was compromised. I think FireEye was the one that actually broke the story because they were impacted. So coming back to software liability: SolarWinds, I think, was at one time a publicly traded company, but at that time it was private equity, and private equity is cash flow and cost control. And I know some of the people that worked at SolarWinds when that stuff went down; they were not investing enough in security.
So who’s liable? Is Mandiant liable? Is Microsoft liable? What is SolarWinds’ culpability in that event, which had serious ramifications for the private sector, maybe even for the US government as well?

Rob Knake:
So I’ll say something about the private equity industry that has bothered me for a long time. If you take any of the ratings platforms that go out and do external scoring of your cybersecurity, it’s not a perfect science, it may be more art than science, but they all show the same thing: when companies get bought by private equity, their scores go down. And that’s even true if they are buying companies in the cybersecurity or IT industry. So that is a market problem that we have not solved and that needs to be addressed. In Mandiant’s case, my hat is off to them for what they did.
Their tools got defeated, as I understand the story, right? Their ability to detect what was going on in their network got defeated. It was only when multifactor authentication was tripped when it shouldn’t have been, and an alert employee said, “Hey, wait, there is something off here because I didn’t ask to get a ping on my phone,” that they were off to the races, and they went crazy. They put, according to some people, like half their investigative resources on figuring this out and tracing it down. And then they did something that a lot of companies wouldn’t have done. They went public. They said, “We’re a cybersecurity company. This is material. We can’t argue that it’s not.” A lot of companies, when they have IP theft happen, call their lawyers, their lawyers call economists, and they come up with an argument that it’s not material because the IP theft loss is not going to ultimately impact their current markets as of today.
And this has become a very large industry, arguing that IP thefts are not in fact impactful. But they didn’t do that. They went public, and doing that alerted the rest of industry that this was what was going on. And so for me, the real point where I felt just utterly defeated by SolarWinds wasn’t the fact that the Russians had been able to compromise SolarWinds. It wasn’t that SolarWinds was able to be used to break into all of these companies. It was that everybody but Mandiant, at first brush, wasn’t able to detect it at all. And then, putting me in a worse mood about our future: as we’d argued in The Fifth Domain, we’d built all this infrastructure to detect threats and to share those threats, so that you might beat one of us, but you can’t beat everyone. That seemed to not be true, because companies started coming out and being like, “Oh, we saw that.
We stopped that.” The CEO of Palo Alto came forward and said, “Yes, they got into our network, but thanks to one of our great cybersecurity tools, we stopped it. We didn’t think it was a big deal. We didn’t investigate it further. We protected our company and our customers from it. We put out protections for them, but we didn’t share it within the community.” This was a real breakdown, because five years before, Palo Alto had helped start one of the main cyber threat sharing initiatives. They had said, “Hey, we don’t want to compete on this kind of intelligence. We want to protect everybody. We’ll make money by giving you that first line of protection, but we want to protect the whole ecosystem.” So that was one of the first breakdowns, right? Palo Alto didn’t share that knowledge with the rest of the ecosystem. As the years went by, the number of companies that have come out to say, “Oh, we saw it and we stopped it.
We didn’t recognize it was important. We didn’t tell anybody else,” just keeps growing. And so that’s where I said, “Okay, we have a breakdown in the idea that we’re building collective defense. Our collective defense failed on that day.”

Nathan Sportsman:
I want to take a detour before pivoting over to ransomware, because this is where I have some thoughts about some of the assertions from the book. So there’s a capital incentive. Private equity is a great way to understand the economics of how businesses think, which led to software liability, the idea that maybe there was some ownership there and they could have done stuff to address things. So there’s a perversion of incentives between keeping the lights on and dealing with security. People are always going to bias towards keeping the lights on, just like what you talked about with the GWOT versus cyber, right? It’s kind of a spectrum of things. So that’s number one. The second issue is that FireEye, we could debate that particular technology, but Mandiant itself, Mandiant proper, is a world-class organization, and they failed to detect because they were ultimately going up against the GRU, a nation state.
In the book, you argue for sort of this cohabitation, a public-private partnership, but it feels like even if we can get the incentives right, Google, I think, can take care of itself. Maybe JP Morgan or Goldman Sachs can take care of itself, but most companies cannot defend against the GRU or against the PLA if they’re being targeted.
Does government not need to have a greater hand in this, and not put the ownership on the private sector? To come back to one of your quotes that I really liked in the book, from Alexander, it was that the responsibility of the union is to protect, or to defend, the Commonwealth. Isn’t cyber, in what we just described in that SolarWinds attack, isn’t that some form, some mechanism of war, state-on-company activity? Where is the government’s hand in that, versus telling Mandiant, Microsoft, whoever, the ownership is on you to deal with this?

Rob Knake:
So I want to try and do justice to General Alexander’s arguments here, even though I don’t think that they’re right. His argument, and he’s made it any number of ways, but one analogy he’s used is, when there’s a nuclear missile threat, we don’t go to Ford Motor Company and say, “Hey, there’s a nuclear missile threat, so invest in some anti-ballistic missiles, get a battery of Patriots from Lockheed Martin, put them on your rooftop, good luck.” And that’s effectively what we’ve done in cyberspace. That’s his argument. My argument is that the cyber domain is fundamentally different than air, sea, and land, right? It is a marketplace, it is a forum for the exchange of ideas; principles of freedom of speech exist, and non-government interference is how the internet has grown. And so the idea that we can really draw on that analogy and say, “Okay, this should be a government responsibility to stop attacks from the Russians,” would have the implication of recreating the internet in a way that we would not want.
That wouldn’t be in JP Morgan’s interest. That wouldn’t be in your average American’s interest, right? Rebuilding a great firewall of the United States would maybe be effective for cybersecurity in some marginal use cases, but it certainly would be a tool for censorship, for oppression, for surveillance. And so if we don’t want that outcome, the only alternative that I’ve ever been able to see is something where we put that responsibility on the private sector, and then government, to use a military analogy, is the supporting unit, not the supported unit. I used to break this down, and Michael Daniel used to get mad at me when I would say this, but I would steal a line from Home Depot, right? It’s, “You can do it, we’re here to help,” right? Government is going to support it, but the responsibility for cybersecurity has to be on the owners and operators of those systems.
They’re the ones who know those systems and operate those systems. Government needs to create the incentives to secure them, and it needs to overcome the barriers to sharing information and to collective action. And then it needs to do the things that only government can do, right? Only government can do offensive cyber operations. Only government can collect intelligence. Government has the role in carrying out sanctions to punish bad actors. Government has the role in carrying out military activity. Those are the things the government can do. And so it needs to do those things in support of helping to protect the private sector, but we’re not going to come up with a system in which we say, “Well, it’s a nation state threat, so now it’s the government’s problem and it’s not yours.”

Nathan Sportsman:
So I understand this argument of X could lead to Y, and Y could lead to Z. And I’ve seen this pattern over and over again, whether it’s escalation theory in a nuclear world, or the domino theory when we were talking about communism, the idea that if we let one country fall, this and this and this will happen. But is the assertion actually true, though, that by having choke points or kill points on the internet, where we can control the stuff that comes in, there will be this natural progression towards a surveillance state or something along those lines, and the outcome that was not intended would actually happen? I always try to draw on patterns or analogs to see an example, and I’m always hard pressed. I know this is a very common argument, we can’t do this because this will lead to this, which will lead to that, but that assumption, like, how do we know that that’s actually true?
And for me, the way I make decisions is by asking: by saying yes to this, what am I saying no to? Or by saying no to this, what am I saying yes to? And by keeping it operating more in this idea of, it’s built from academia, it’s the free and unfettered flow of information, if that’s the route that we take, we have to understand that what we’re saying yes to is that the underpinnings of society, which are now dependent on technology and the digital world, are at risk. And even in your book, I think you talked about an example, maybe it was your book, maybe it was another book, but I think Guam was in your book, where the Chinese had compromised power plants, the water system, stuff like that. And the argument was made, well, what is the difference between that and hiring an insider to go around and put C4 along the power plant, which is ultimately going to have the same kinetic effect?
So I guess my question is this: I don’t know if I buy the idea that having government more involved in the internet, more centralized, to protect the citizens will actually lead to this authoritarian surveillance state.

Rob Knake:
So yeah, let me break a couple pieces of that apart. We made the analogy, I think it was in The Fifth Domain, it may have been as early as Cyber War, that what people call preparation of the battlefield, right, the idea that, “Oh, I’m breaking into a system, not doing something today, but I’ve maintained access to it for the purpose of carrying out malicious activity, a destructive activity, or I may even have left behind a payload to do that,” would be the same as wrapping a power pylon in C4, right? So we think, we argue, that that’s very dangerous behavior, that it should be treated as a hostile act, and that we need to convince adversaries not to do it. Parenthetically, we think that the laws of war should develop so that we wouldn’t do it either, right? That’s something that would be in our interest to forego if others would forego it.
Parenthetically, it would not be in our interest to forego that if others would not, right? So that’s how we’d like to see the shape of warfare develop in the cyber domain, right? Say, “No, no, preplanting is not okay. What you’ve done in Guam is far too aggressive, right? Get the hell out. Forego the capability to carry that out within a cyber conflict, because you’re going to bring us into conflict sooner.” When it comes to the idea that it should be the government’s responsibility, this really goes back to Homeland Security Presidential Directive 23 in 2008, which is now declassified. One of the things in it is the idea that we were going to develop what was called the National Cyber Protection System, the NCPS, which was the program of record that funds flowed into for what was the government protection system called Einstein: Einstein One, Two, and Three.
And the idea was that, with the major carriers, we would build the ability to filter all traffic going to government agencies. That was the initial plan. The overall plan, I think, was always to say, then we could expand this for the whole country, or in a time of conflict we could say, route the traffic that’s going to our critical infrastructure owners and operators through this. So that was the vision that people had for how we could provide national protection by government. That ran headlong into Edward Snowden and was killed off. Why was that killed off? Well, doing that kind of inspection on the backbone, routing traffic and looking at it with deep packet inspection, means that you have to decrypt the traffic or it needs to be unencrypted. In 2008, everything was unencrypted. In 2011, ’12, ’13, most traffic was still flowing unencrypted across the internet.
And so you could have envisioned that kind of system. Snowden happens. Everybody comes to understand the vulnerability of communications, and we get Let’s Encrypt, we get Google encrypting everything, and we get Google and Yahoo and Microsoft collaborating to encrypt messages among their systems by default. And so that was the only vision that anybody ever had for saying, how could we protect the whole country at scale and make it a government responsibility, using intelligence to create signatures that we would use to protect all of these entities? That’s no longer technically feasible, if it ever was, if it ever would have worked, but it certainly is no longer technically feasible. So I don’t think anybody’s ever come up with a system that says, “Okay, this is what it looks like for the government to take over the responsibility of cybersecurity against nation states for the country, for individual critical infrastructure.” What does that look like?
Does that look like CISA or Cyber Command sitting on the network, sitting inside corporate networks, installing their own tools, deciding what packets get to go and what packets don’t, setting your IT policies, deciding what you’re going to purchase and what you’re not? Nobody’s ever been able to come up with a vision for how that could actually work. And so I think the best outcomes that we can envision are knitting those individual enterprises together: government doing what it can, setting requirements so that companies need to make these investments, setting requirements that force the disclosure of incidents, creating the mechanisms for sharing information, and then ultimately tightening those relationships between the corporate world and government so government can actively do the things that only government can do. And since SolarWinds, we did make a lot of progress on that, with NSA’s Cybersecurity Collaboration Center, with some of the efforts at CISA.
We are starting to knit together real-time collaboration between government and industry, but we don’t have it yet. And we’re not in a place where we could say, “Okay, somebody is being whacked in cyberspace. We need to go out and reach and touch somebody, and on a tactical level, disrupt that activity.” We still don’t have the linkages to make that happen quickly enough.

Nathan Sportsman:
So sticking with incentives, one of the other things that you mentioned in the book is ransomware. In 2017, 2018, we see kind of a spike. My understanding is ransomware was kind of a cottage industry, and then something specific happens in 2017, which was SamSam, this Iranian group that had this ransomware. And two things happened. One is they actually upleveled what they charged; I think it was like $20,000, which was unheard of at that point. But the second thing was they had really, really great IT support. It turns out that the malware they were infecting people with was not super great. It caused problems, and even after victims paid, they couldn’t recover. And being a good IT support team, they actually helped their victims fix or update the ransomware to address it and ultimately helped the victims recover. And I think what happened was companies and victims understood that this is a business, and if you pay, you will actually get your stuff back.
And then you just saw this escalation from there, to higher and higher targets. I’ve seen ransoms up to the millions and tens of millions. Some folks say that, in terms of incentives, one of the great ways to try and deal with this is to make payment on ransomware illegal. I think Italy has looked at this. I know the US has talked about it. What’s your position on that? And are there any unintended consequences from doing something like that?

Rob Knake:
Yeah. I mean, there certainly can be unintended consequences from doing that. I have been a long-term proponent of banning ransomware payments, maybe not ransom payments, but ransomware payments. Getting to the philosophical issues, right? Ransomware is killing people today. Not paying a ransom could kill people, may be killing people today, in hospital systems. So you are dealing with major public policy, philosophical questions, right? If you pay the ransom today and you encourage and you grow this industry, what does that mean for the future? And so I think what we’ve seen over time is you began with ransomware groups having very low capabilities, even targeting grandmas and things like that in the early days, to now going after larger and larger, more impactful targets, asking for more and more money. Well, why can they do that? Because they’ve been getting payouts, and maybe they’ve been buying Lamborghinis and leather jackets or whatever is in in Moscow these days, but they take some of that money and they hire more teams, and they professionalize, and they build their capabilities, and they compete to discover zero days.
And so they’re only getting stronger, right? I think when we wrote The Fifth Domain, we still wouldn’t have said that there were ransomware groups that rivaled nation-state capability, but there clearly are today. How did they get there? People have been paying them ransoms. And so from that perspective, we’ve got to stop the payment of ransoms. We’ve got to prevent that from happening, and that growth of capability. I’m willing to do it in a couple different ways. I’m willing to say, one, let’s start with ransomware versus any ransoms, right? It is absolutely a fact, and Kemba Walden used to always raise this as a point, and she’s totally right: business email compromise still costs businesses more money in the United States today than ransomware. The difference is that it has far less of an impact on the functioning of society than ransomware does, right? Fraud carried out through business email compromise does not cause hospitals to shut down and have to divert ambulances to other hospitals.
It doesn’t cancel operations. And so given that, we’ve really got to stop the deployment of ransomware. And the best way to do that is to say, “You won’t get paid if you do this, full stop. We will make it illegal to pay you.” Maybe they still do extortion. Maybe we say, “Okay, that’s fine. That’s not a national security issue. Maybe that’s acceptable.” But I think we’ve got to stop the payment of ransomware. We can do it over time. We can give companies time to get their cybersecurity in place, but we’ve got to have a prohibition.

Nathan Sportsman:
And to your point about hospitals, I mean, that’s not theory. There are incidents of that happening. I had the privilege to be invited to Columbia with Evan, who you know, and Jay, and Charles, the CTO of Mandiant, attended. We had this discussion in the class as kind of a group discussion, and Charles’ argument was, if you make it illegal to make payments, all that will happen is you’ll set up third parties that will actually funnel the money through. Do you think that that’s true, an unintended consequence? I guess it’s almost similar to Prohibition, where people are going to find alcohol one way or the other. And if that is an unintended consequence, how do you prevent these shadow brokers from actually facilitating

Rob Knake:
Payment anyways? Yeah. So I mean, you just mentioned two of my great sparring partners on these debates, Evan Wolff and Jay Healey, and we’ve probably debated this for over a decade now. I have more faith in corporate America, I guess, than Mr. Carmakal does. I think that if you set a rule, corporations follow it, because CEOs do not want to break the law and make themselves personally liable for the fate of their corporations. So they’re not going to do it. They’re not going to do it. Now, we already have those ransomware negotiators out there. We already have these companies that negotiate payments as part of their incident response. So yeah, we have those conduits. Are those people, who came out of law enforcement, who’ve grown up in the cybersecurity community, going to break the law? Actually, I really don’t think so. I don’t think we will see this going underground.
I also think that it’s harder and harder to hide ransomware activity. And so if you’ve been ransomed and your systems are locked up and you can’t operate, you’re not going to keep that from the media, you’re not going to keep that from your customers. And so the truth would out. I’m not worried that we would just end up with a black market economy of paying ransoms.

Nathan Sportsman:
So this is sort of Kierkegaard, this sort of leap of faith. I agree. I think in board meetings, cybersecurity is a going concern; so is compliance, and so is the legality of things. Boards spend a ton of time worrying about and talking about those things. But kind of going back to a conversation from last night, although on camera I’m not going to say the company or the vendor, there have been known incidents where entire companies have been stood up to hack back on behalf of corporations against these ransomware gangs and put a stop to them where the government hasn’t. If that is true, and these are publicly traded companies that I know have done this, what’s the delta between hacking back, which is illegal, and not paying ransoms? Yeah.

Rob Knake:
I think that’s an interesting example. It does make me smile, because the activity pretty clearly violates the Computer Fraud and Abuse Act. There probably also were lawyers who argued that there are exigent circumstances and we could defend this in court and change the law. And so I think that’s an interesting technique and tactic that the companies have been using. On that kind of activity, this is where what I want to see is a tighter link between the US private sector and the government, right? Because who do I think should be carrying out these activities? Well, if we think ransomware operators are a threat to national security, this is something that the intelligence community and Cyber Command should be doing. They have the authority, and they should have the capability, and it should be a priority to do that. But you have to do two things. You have to raise the priority of that as a threat within the intelligence community, within the national security community, and then you also need to develop those linkages so that you can share that information in real time with the government.
We don’t really have those mechanisms on either end to do it now. So we’re not taking the fight to the ransomware operators the way that we should. I would also say that I think outside the cyber domain is where the government probably has the best tools against the ransomware operators. And we need to hold states accountable for allowing this activity to happen on their territory. So we’ll go back to my counterterrorism experience and that era. I think we talk about this in the book. Mike Sheehan, kind of a personal hero of mine, amazing, amazing guy, a Green Beret who was on the NSC. That movie, The Peacemaker with George Clooney, may or may not have been made about him. He was one of the first, maybe the first, ambassador for counterterrorism. And before 9/11, he went to the Taliban and he said, “Hey, you’ve got to stop providing sanctuary to Bin Laden, to Al-Qaeda.
You cannot do it. We will hold you accountable.” And the Taliban did not understand that, and they said, blah, blah, blah, blah, blah. And he said, “Okay, let me try and make this clear to you. If there’s an arsonist living in your basement and you let him stay there, and he goes out every night and burns down your neighbor’s house, you’re responsible. Let me be more to the point: we will hold you responsible.” That message did not get through clearly to the Taliban before 9/11. It did get through afterwards, and our whole conception of nation-state sovereignty shifted: it doesn’t just come with rights, it also comes with responsibilities. Why is this important in cyberspace? Well, if you’ve got ransomware actors operating out of Moscow and we can’t get to them, we need to hold Moscow responsible, and that’s what we’re not doing effectively in this space today.

Nathan Sportsman:
And so two things there. One I heard, and to quote movies, a clear and present danger, similar to what’s happening with the cartels, where we’re now sending CAG down there to deal with that stuff because it’s been prioritized. So you’re saying we need to prioritize these ransomware groups and answer the question, does this rise to the level of national security, given some of the shenanigans that they’re doing? And the second point was to hold those groups, or the individuals, or the countries that are providing them safe harbor accountable. And that doesn’t have to be cyber on cyber. It could be economic, it could be whatever the case is. But what I’m hearing when you describe that is deterrence, versus what a lot of the book is about, which is resilience. And so that’s kind of what I wanted to ask you about, using ransomware as an example: a company’s ability, and by resilience we mean the ability, to recover operations in a timely manner.
So if a company is doing backups, even though these things might dwell in the network for a while, and so it’ll be hard to pick the point for differential backups where they can get to a clean state, if they can recover from a ransomware attack, they don’t have to pay, because they can just pull back their data. But it doesn’t address... There’s a difference, I think, between disruption and destruction, patient loss in that case, or IP theft. And what a lot of these ransomware groups do now is they don’t just encrypt the data or disrupt business operations. The second part of the playbook is, “and we’ll also publicly disclose this.” And so to me, resilience doesn’t quite cover the full triad of confidentiality, integrity, and availability. I think it covers maybe two, but the confidentiality part you’re not going to get with resilience.
There has to be deterrence that stops these ransomware groups from doing this: if you do this, this is what will happen. And coming back to the discussion of New York, and I know I’m being a little bit long-winded, but Ed Koch and how he handled New York in the ’80s, versus, I think it was Dinkins from ’89 to ’91 or ’92, and then ultimately Giuliani, it was a pivot, from how I studied it, to one of deterrence. That’s how New York got cleaned up. It was the broken windows theory, safe streets, safe city, and that’s how you got to a place that was very different in New York from the ’70s and ’80s, to being, again, comfortable walking around downtown. But that’s a path of deterrence. In the book, was I misunderstanding, or was I interpreting things correctly, that deterrence was kind of deprioritized with a concern around escalation, which is what we talked about, and instead the priority was on resilience?
How do you think about that?

Rob Knake:
So I just generally think that deterrence hasn’t been that useful a concept for thinking about cybersecurity. We want to put everything into this deterrence framework that we understand. Now, that said, it doesn’t mean that offensive operations, disruptive activity against an adversary, are not something that we should do and should do more of, right? I don’t know, in fact, if you carried out a drone strike in Moscow and blew up the apartment that one ransomware group or another was operating out of, whether that would deter other groups. Would we be able to message around that effectively? Would other groups understand why that was carried out? Would we be able to say, “This happened because you targeted a hospital and you killed children”? Or would people be like, “Well, who knows why that happened?” So the messaging around deterrence activity is often very hard to unpack.
I’ll give you another example. For reasons that we may not understand, Russia has been deterred from carrying out cyber attacks in the United States to change our behavior in relation to the war in Ukraine. We theorized, I worried, that the first thing that was going to happen when the tanks rolled into Ukraine was they were going to do a shot across the bow in the US, say, “Hey, we could touch your critical infrastructure. We can turn out the lights in San Antonio. We can turn them out in New York. Look, we’ve done this.” They didn’t do that. I don’t understand, and I’m not sure anybody understands, their calculation for why they were deterred from that, what they think we would have done in response. And so, given that, it’s very hard to think about how we deter ransomware operators from engaging in this.
That said, possibly the lesson, to go back to your analogies to 1990 and New York, wasn’t so much that we deterred crime, but that the NYPD grew, expanded, and arrested tons and tons of people. Now, whatever side of the debate you’re on about that, whether that was the right thing or the wrong thing or a necessary thing to do, if you look at the crime statistics from that era, it really looks like it was the NYPD arresting people. It wasn’t necessarily convicting and putting people in prison. It wasn’t these other interventions. That’s the difference between other cities that saw their crime rates go down and New York, which saw its crime rate go down more. And so maybe the answer there is to not necessarily think about how you deter crime or how you deter ransomware groups, but how you just simply disrupt the activity and eliminate those threats, even if you can’t convince other people to be deterred from carrying it out.

Nathan Sportsman:
So I think in New York, and then coming back to ransomware: from what I understand, it was sort of a multi-step approach. There were, I think, something like another 5,000 cops put on the beat. They were arresting for frivolous misdemeanors as part of it. They removed sex shops and things that were seen as not appropriate for main streets. And a combination of those things ultimately resulted in more foot traffic, which resulted in less crime, which resulted in more foot traffic, which resulted in less crime, and you had this sort of self-perpetuating loop. And so coming back to ransomware, I don’t think it would be an A-to-B thing: this ransomware group did this, so therefore we’re going to send a drone to that ransomware group’s house. I think you have to do a one-to-many mapping, and I don’t even think it has to be cyber. Rather than targeting individual ransomware groups, go to the oligarchs that are running the FSB or SVR, or whoever, and say, “Hey, we are going to freeze these assets.
We’re going to pick up your yacht and the Mediterranean unless you put a stop to this stuff and let them handle the Conti’s and the rebels of the world and shut that stuff down.” In terms of deterrence, I think that still fits in the scope, even though it’s not one-to-one and it’s not cyber to cyber, but it’s the desired effect or the outcome that you would want to see.

Rob Knake:
Yeah. So I mean, what you’re talking about is, I think, raising the consequences for the activity on those who you can reach in some way and getting them to pressure the group. Yeah, I think that’s exactly what we need to see. The problem is that in the US-Russia relationship as it is today, we don’t have that many levers. There aren’t that many oligarchs’ yachts that we haven’t already seized. There aren’t that … We don’t have the ability to arrest people when they’re vacationing in the Seychelles anymore because they’ve stopped vacationing in the Seychelles. So we’ve lost a lot of the levers that we had over Russia in order to get them to address the ransomware problem, right? I’d like to think that we could do with Russia what we did with China and IP theft in the Obama administration: get them to change their behavior, convince them that it’s not worth it, convince them that this is impacting the overall bilateral relationship.
The problem is we don’t have that at this time. So I think disruptive activity in cyberspace on ransomware operators is definitely something that we need to increase. I’m not sure that alone will be effective. And so therefore, I think we’re left with saying, how do we either take away their impetus by saying you can’t pay them ransoms and/or increase our resilience as a society and increase the resilience and the security of the companies and the organizations they’re targeting to make it harder for them to carry out their activity. I don’t see an option where we don’t have to do that.

Nathan Sportsman:
Where’s your head at today from writing the book? I know you talked a lot about technology and the notion of affirmative assets and things that are coming online for us. Do you feel that we’re making progress or even winning when it comes to cybersecurity?

Rob Knake:
So I think that the jury’s still out. I mean, it’s really about, do we get the incentives right? Technologically, we can solve these problems. Technologically, we can build secure software. Technologically, we can defend systems. We can make the investments. Right now, I think we are pulling back from many of the levers that we were trying to use to change behavior. So this is a Jay Healey-ism, right? We need leverage. Government has leverage to cause massive change with small acts. So you do something as small as create the cybersecurity framework, and that causes billions to be invested. You help build information sharing centers; that has leverage, that helps you achieve scale. And so that was the direction that we were pushing in. And I was starting to feel that a lot of the small problems with the strategy that we’ve been pushing on now for 20 years were starting to get solved.
The connective tissue between government and the private sector is getting stronger. What we needed to do was to really build that connectivity out and address 8,000 issues of privacy and legality and security and trust. And we can envision kind of having a network defense system that’s able to rapidly evolve and respond to threats. AI, I think, in the ultimate long run can be beneficial to cybersecurity. I think in the short run, the effect I’ve seen is that the spear phishing emails I’m getting hit with are now written in pristine and clear English, where before they weren’t. So right now, just at that level, it feels like adversaries are taking advantage of AI in a way that defensive systems are playing catch-up on. So that gives me a moment of pause. On the other hand, the technological vision is out there. So if we can build the incentives, we can get to a much more secure future where cybersecurity isn’t really a policy issue, where there aren’t two offices in the White House that are working on cyber policy, where it can kind of fade to the background because it’s not something that any president or any CEO actually wants to do or wants to focus on.
It’s not that positive outcome that somebody runs for office to try and achieve. It’s a negative that you want to set aside as quickly as possible. If you’re a CEO, you want to spend as little on it as possible. So the goal here would be building a more resilient society in cyberspace so we can get it off the front page so it’s no longer disrupting operations at hospitals or schools. So it’s not the focus it is today.

Nathan Sportsman:
I agree. And this point from Jay Healey about government needing leverage, and that there’s been sort of a shift. Is it a rubber band effect where maybe we’re taking a couple of steps back, but ultimately we’ll still progress? Or are we talking about a shift such that we’re almost resetting and we’re going to have to start over? Where is your head at on that?

Rob Knake:
I don’t think we know yet. The people who have gone into the Trump administration on cyber so far are very good. They’re known quantities in the community. I respect them, and it’s going to be a question about whether an overall push towards deregulation takes cybersecurity regulation with it. So do we see backsliding on that front, or do we see the effort that we started in the last administration on harmonizing regulation and reducing the burdens of regulation continue, while also recognizing that where necessary we need to implement new regulation? I think that is in fact possible in this administration. I think there’s a certain practicality that will emerge. So the example I always go to on regulation is when Colonial Pipeline happened, when it got hacked. Colonial Pipeline gets hacked by ransomware actors who apparently did not even intend to shut down gas production. That was a byproduct of the response by Colonial Pipeline.
TSA very quickly used existing authorities and went from a voluntary cooperative mechanism to saying, “Here are the requirements.” They did that within days, and there was almost not a peep from the private sector, from Republicans in Congress saying, “Don’t do this.” Nobody tried to roll back the authority. It was very clear that a voluntary approach wasn’t getting the job done. And so regulations got put in place, industry worked with them to shape them, make them better, tailor them to the pipeline industry, and we have regulation today. I think if that same thing happened in this administration, practicality probably would prevail and people would say, “We have the authority. We need to do this. Why? Because we can’t have gas lines up and down the east coast of the United States.”

Nathan Sportsman:
And along with this notion of potentially loosening regulations, or perhaps even getting rid of regulations altogether, we’ll see what happens, there is also, from my understanding, a loosening on offensive cyber operations, that it’s sort of game on now and things are a lot more aggressive than they were prior. Is that your understanding?

Rob Knake:
So my understanding is that’s the stated intent, right? The stated intent is we want to focus on offensive operations. We want to do more disruptive activity. Whether or not that will represent an actual increase in tempo, I think still remains to be seen.

Nathan Sportsman:
So as we were discussing last night, I just have a very general naive but genuine question, why do we not pay public servants what we pay people in the private sector? I fundamentally and truly don’t understand.

Rob Knake:
So I think I know why, but I disagree with it, right? So when I was trying to recruit people to come into ONCD, the Office of the National Cyber Director, the first people I called on were my colleagues from the NSC, people I’d worked with in the Obama administration, and I was almost totally unsuccessful. And the reason was always the pay gap, the disparity. I had one colleague say, “All right, I’ll humor you for a minute. What is this paying?” And I said, “$172,000 a year.” And he said, “That’s a lot of money for a boy growing up in Biloxi, Alabama, wherever he grew up. But I’ve got tuition at two private schools in Washington DC.
I don’t even think I could cover that on that government salary.” So the gap between what government pays and what the private sector pays has just gotten so huge. At the same time, it is very hard to ask your average voter in the United States who’s not making $172,000 to say, “Hey, we need to pay the going rate for a cybersecurity professional in this field. And so somebody who’s making $350,000, $600,000, $1.2 million, $2.2 million a year, we should pay private sector rates for that person in the government.” That’s a really, really, really hard sell to the American people. And so that’s why people in Congress make $170,000, and that’s why I think some of them are so damn grumpy, because they’re living in their offices because they can’t afford to live anywhere else and maintain two houses on that kind of salary. So it is a major impediment to getting good people into government, and even more so to getting good people to come back into government.
A lot of good government advocates, they worry about the revolving door effect, the idea that people are going in and out of government. I worry about the revolving door effect stopping because people, once they’ve been in government, will never want to go back because the pay is so bad, the hassles are so great, the benefits are no good, and the risks are so high. So I think that’s something that we need to address. We’ve addressed it in other areas, right? The pay disparity between a private sector doctor and a VA doctor is pretty low. We pay VA doctors pretty well. We’ve figured that out. We figured out that we had to do that to provide veterans care. We need to do that across the board because while I’m inspired to government service and many people are inspired to government service, at the end of the day, that’s not enough if what you’re doing is the same job that you can do in the private sector, right?
So for me, the kind of roles that I’ve had, there’s nothing like them in the private sector. If I want to do them, the only place I could do them is in the government at whatever the pay’s at. But if my job is to defend an agency in cyberspace, that’s actually not that different than the job of defending a corporate entity in cyberspace. So there’s very little reason to think, well, if JP Morgan’s paying X, Y should CISA or the Department of Justice pay the same person X minus Y in order to do that job. It’s something that I think we need to address if we want the kind of efficient and effective government that people seem to be calling for right now.

Nathan Sportsman:
And a lot to unpack there. So one, to Jay Healey’s point, if government is a point of leverage, you want the best and the brightest and you want them for tenure, right? And if you look at Apple and Tim Cook’s team, on average, his executives are with him for 15 years. There’s something to be said about connective tissue. And so one, attracting these folks into government, and two, retaining them, because these folks often have competing offers from the private sector. And kind of what we had talked about between mission and idealism, but family and financial, there is a Maslow hierarchy of needs, and we need to make sure that those needs are met, particularly for folks that are public servants, and make sure that they stay and that experience stays with them versus moving on to an investment bank or wherever folks go. And to your point about efficiencies, I think if you found ways to have pay parity with the private sector, you would find efficiencies through that.
I know it seems counterintuitive, but bringing those folks in and allowing them to stay in post over the duration, I think, would be helpful. And it just … Yeah, I don’t understand it. It doesn’t make full sense to me. And so I just wanted to get your thoughts on that. I told you last night, the concern I have is that when that system is set up, ultimately you are left with the Roman Republic, where it is just the landowning class, the ultra-high-net-worth individuals for whom pay doesn’t matter anymore, that can pursue a role of public service or a role of mission, but for folks that don’t have that background or come from that, it’s much harder for them to be able to contribute. And so, you in the private sector now, what are your pursuits now? What excites you about cybersecurity? What are you up to these days?
Are you thinking about writing another book?

Rob Knake:
I think about writing another book. I don’t know if anybody thinks about reading another book. And so that’s always the question, right? Is there still a market out there if I spend a year hammering away, right? Will anybody buy it? Much less read it. So what I’ve really been focused on is trying to take what I see as fundamentally public policy challenges, but reframe them and look at how can they be addressed in the private sector? Can we reframe them? Can we look at how to create an opportunity that a company will want to fill? Is there a market that nobody’s recognized or that we can create through a little bit of nudging in Congress or a little bit of nudging in an agency or simply by inventing a new technology to help address a public policy problem? And so I’m working with a number of startups on various things that they’re doing.
I’m working with a VC firm and talking to lots of others about what they’re doing, trying to say, okay, if I can’t be in government, if this isn’t my time for that, how can I address some of these problems holistically with the tools that the private sector has? And what I find is that you can typically find a lever, a mechanism to take a problem that the market hasn’t solved and figure out a way for the market to solve it rather than having the government step in. So we’ve talked a lot about AI in this conversation.
My two times in government, there have been big pushes after huge disasters on open source, right? Where the community gets together and says, “Oh my God, this open source library was totally unsupported and yet it was absolutely critical.” In the Obama administration, it was Heartbleed. “Okay, we need to fix this.” And we look at it, and then companies make commitments and they say, “We’ll sponsor these nonprofit organizations that host these tools.” And then we forget about the issue, and five years later, the next one comes out and we say, “Oh my God, we’ve discovered open source as a problem all over again, and what are we going to do about it? Oh, okay. We’re going to invest and we’re going to get the foundations together and we’re really going to do it once more with feeling.” Well, my optimism right now is that with AI, we’ve reduced the cost of looking at open source, figuring out where there are vulnerabilities and potentially also fixing those vulnerabilities.
So one of the things I got excited about when I was at RSA last week is I think I found out about like three companies that were focusing on this issue. How can we make open source more secure? And because AI will lower the cost of doing that, they actually think that there’s a market now for that kind of activity. So it’s those kind of mechanisms where you’re saying actually a technology is going to come along and solve a problem that public policy could not in fact solve. And that’s what I’ve been working on.

Nathan Sportsman:
100%. And whether it’s Heartbleed or Log4j or whatever the latest big one is, coming back to that OpenAI security conference, what Dave Aitel presented was a technology that he calls Aardvark, a little bug eater. And OpenAI, I don’t even know if they’re going to commercialize it. It looks like it’s just in the spirit of mission. They want to help solve the open source issue, and they’re leveraging that technology, which runs on AI, to find bugs in open source packages, which is pretty cool. Plus a number of other commercial companies that you mentioned that you heard about at RSA. And so what about this show? Where do we fit on your journey into the private sector? Why did you decide to do this? I think at the time of this recording, all of the folks that we’ve interviewed, at least publicly, have been former hackers, things like that.
What made you decide to come on the show? What are you hoping to achieve or do by being here?

Rob Knake:
Yeah, in part, I think this is really our common friend Evan Wolff’s push, right? And why did he want us to have this conversation? Well, he’s been operating in this role for almost his whole career, between the hackers and the regulators and the policy makers, as a lawyer in this space, bringing people together, often resolving disputes. In my whole career, I think that there’s been a certain tension between particularly the hacking community and the policy community. You mentioned Dave Aitel, right? Dave and I, I consider him a friend now, but we first met under circumstances of him being like, “Knake doesn’t know what the fuck he’s talking about.” And somebody being like, “You need to get to know him. He does know what he’s talking about.” And we met and we had oysters and beer and hugged it out, and he became a really valuable resource for me looking at policy problems.
And I think I’ve taught him some things about how policy actually works. So bridging those gaps between the hacking community, the technical community and the policy community is really important. And so I think these kinds of dialogues really, really help. If people can start to understand, okay, policy is not that easy. It’s not that simple. It’s not that these people don’t understand technology. It’s possibly that me, from the technical community, I don’t understand how the policy process works. I don’t understand how the legal process works. I don’t understand the impediments of things like the Constitution to certain outcomes. And so I want to have that dialogue and continue to have that dialogue with the community. I’ll end by quoting my friend and colleague, Tarah Wheeler, who’s also one of these people who kind of bridges all of these different communities. And she says, “So many people in the hacker community always say, ‘Why don’t they just, whatever it is, why don’t they just do X, Y, and Z?
Why don’t they just ban ransomware?’” It’s a great example of a very simple idea that’s really hard to implement and that would have huge consequences if you did it. And I’m in favor of it, but I can tell you all the reasons why we don’t just do that. So having that dialogue is really important to getting this community engaged in the policy process and figuring out how to contribute in meaningful ways to improving and helping build a more secure cyberspace.

Nathan Sportsman:
That’s awesome. We want the same thing. The show to me is about connection. So Rob, I appreciate you coming on. Thank you so much.

Rob Knake:
Thank you so much for having me. Appreciate it.