Episode 105: William Loomis and Stewart Scott on Securing the Software Supply Chain

On the latest episode of DevOps Radio, guests William Loomis and Stewart Scott of the Atlantic Council join host Brian Dawson to discuss one of the toughest issues in cybersecurity today: securing the software supply chain.

Brian Dawson: You’re listening to DevOps Radio, the podcast for software developers. In each episode, we speak to the developers, managers, and executives who are leading the DevOps charge and discuss their real-world CI/CD trials and challenges. This show is sponsored by CloudBees, the enterprise software delivery company.

Hello. Thank you for joining us for another episode of DevOps Radio. I’m your host today, and I’m joined by Stewart Scott, an Assistant Director with the Atlantic Council’s GeoTech Center, and Will Loomis, Assistant Director with the Atlantic Council’s Cyber Statecraft Initiative.

I’m pretty excited, because this is a pretty unique episode for us. We’re not gonna get into the nuts and bolts of software delivery. We are gonna talk software delivery supply chain, but here, we really get an opportunity to talk about where technology meets cyber security, where technology meets geo policy. And I think this is a fantastic extension to some of the conversations that we’ve not only had about DevSecOps, but that we’ve also had about ethical technology and technology and ethics, and also, as we were just talking about in our pre-call, it’s very timely when we’ve just had some very public attacks and compromises, ransomware from the supply chain, we’ve had the Executive Order issued by Biden, and so, we’ll get into all of that. 

So, Stewart—hello, how are you doing?

Stewart Scott: I’m good. Thanks for having us.

Brian Dawson: Good. Thanks for joining. Will, how are you?

Will Loomis: I’m great, Brian. Thanks so much for having us today. We’re really excited to be here.

Brian Dawson: Yeah, thank you for joining. I’m excited to dig in.

So, before we get started and we get into the meat of the discussion, what I’d like to do is let you, who can explain it much better than I can, tell our listeners what it is that you do at the Atlantic Council.

So, why don’t we start with Will, we just heard you last. Will, can you give our listeners a brief overview of what your focus is, what the Atlantic Council does, and what your focus at the Atlantic Council is?

Will Loomis: Yeah, absolutely. So, Brian, the Atlantic Council is a nonpartisan think tank based in D.C. that really focuses on solving some of the nastiest and most complicated geopolitical questions with allies and partners on a global basis. I am an Assistant Director at the Atlantic Council’s Cyber Statecraft Initiative. So, my team at the Cyber Statecraft Initiative really focuses a lot on the role of technology and the technology industry in geopolitics, cyber safety, the security of the Internet, and even growing a more capable cyber security policy and technical workforce.

So, the bottom line is, we try and tackle ugly technical problems through a policy mindset and highlight some of those broader pieces of significance and geopolitical context in order to put those technical issues in a larger, more geopolitically focused sphere. 

So, happy to talk more about some of the particular projects that I work on. I know we’ll get there in a couple minutes here, but that’s just a quick overview of the Council and the work that I do for CSI.

Brian Dawson: Awesome, awesome. A lot of questions already brewing for me, but I guess I’ll get over to Stewart. Stewart, can you share a bit of your background with our listeners?

Stewart Scott: Yeah, absolutely.

Brian Dawson: I mean, what you focus on—excuse me.

Stewart Scott: No problem. So, yeah, I used to work with the Cyber Statecraft Initiative before moving to the GeoTech Center, and a lot of our work is focused on kind of broader tech politics questions, so we’ll do work looking at data privacy law, both in different countries and then between countries, data governance and use cases and computing at the edge and AI development. So, slightly broader topics, not quite as nitty-gritty technical as Will’s group, but we do have a good bit of overlap, and that’s where we get to do some really awesome collaborative projects. 

I started with CSI working on their Breaking Trust project and got to continue working on it after moving to GeoTech, and it’s been a pleasure.

Brian Dawson: That’s all compelling, and I have to say I’m a bit jealous. It sounds like I would love to wake up every morning and do this stuff.

Now, before we dig into specifics, I am curious to ask either of you if you could help me understand, what is—how much do y’all focus on work that draws from or influences private sector versus work that draws from or influences government?

Will Loomis: Yeah, I’m happy to take a first hack at this, Brian. For sure, I think particularly in the cyber security space, it’s basically impossible to try and tackle these problems without engaging and interacting with both the government and the private sector. 

So, when you look at issues like software supply chain, like cloud security, amongst others, a lot of the key infrastructure here is owned and operated by private sector entities. So, it’s pretty impossible to just come at these issues through a government lens.

Brian Dawson: Mm-hmm.

Will Loomis: Obviously, it’s important to engage with and push policy towards some of those key stakeholders within the U.S. government, but a big thing that the Atlantic Council focuses on and I think is even more important for our work at the Cyber Statecraft Initiative is working with allies and partners. And in the cyber sphere, it’s essential that private sector actors are at the table and are a part of the conversation, perhaps even more than they are right now.

Brian Dawson: Okay, awesome. And Stewart, is there—I know you worked on the Cyber Statecraft Initiative, but as you go into this new work, what’s the split of private sector versus government there?

Stewart Scott: I think it’s largely what Will described. I think especially within any kind of tech enterprise, it’s really hard to separate the two. I think when we get into the nitty-gritty of the supply chain, a huge part of what we look at is how the government, as the largest customer, can influence with its policy the security of the products it buys and therefore its own security.

Brian Dawson: Fantastic.

Stewart Scott: So, it’s definitely really exciting to try to parse apart whose role is what.

Brian Dawson: Alright, awesome. So, you both are closely working on the Atlantic Council’s Cyber Statecraft Initiative Breaking Trust project. Can you tell me a bit more about that project?

Will Loomis: Yeah, absolutely, Brian. So, the Breaking Trust project started back in September 2019. My boss and the Director of the Cyber Statecraft Initiative, Dr. Trey Herr, pulled me into his office, and on his little white board, he had the famous quote from Marc Andreessen: “Software is eating the world.” And he pointed at it and he said, “Will, this is your next problem.”

So, you know, for us, it was really clear that society has a software problem. Software is no longer confined to computers; it now controls power generators, it controls medical hardware, and it influences planetary-scale data sets. And it’s not just that. You know, a generation of Western defense systems, stuff like the F-35 amongst others, really relies on commercial, off-the-shelf software and technologies to function. And, you know, this is a huge issue for us, because, although this has changed a little bit post-SolarWinds, and despite the significance and broad use of software, software supply chain security has still been an under-appreciated domain of national security policymaking.

So, a physical system is rarely modified once it leaves the factory, but software is continually updated, which means that the supply chain is long and often depends on an intricate chain of trust: users have to trust both their vendors and the developers that made that software.

So, for us, you know, trust in software that one did not build themselves might be practically impossible, but leaving the task of establishing and rigorously enforcing this level of trust in others’ code unaddressed was not something that we felt comfortable with.

So, for us, we really felt like we needed to (a) shine a light on the problem of software supply chain security, and also put forward some paths and recommendations to try and help solve this issue on a systemic basis. So, you know, this boiled down into two big things, one of which informed the other.

The first was the creation of our data set. So, the project created a data set of 138 software supply chain attacks and vulnerability disclosures collected from public reporting over the last decade. Using that data set, we were able to put forward two reports, Breaking Trust and Broken Trust. Report one really focused on software supply chain attacks historically, pulling out trends that showed how they're popular, impactful, and used to great effect by states. Report two, which was put together as a response to the Sunburst campaign, really focuses on putting the Sunburst campaign in context and pulling out some lessons for policymakers and cyber security practitioners on how we can take lessons from Sunburst in order to better secure our systems going forward.

So, that’s kind of a quick overview of the project—but yeah, back to you, Brian.

Brian Dawson: So, the Breaking Trust was really more sort of a state of play observation of supply chain vulnerabilities and how they can be exploited, and then report two, Broken Trust, was an analysis of the Sunburst campaign. And for our listeners, can you explain what the Sunburst campaign is, or was?

Will Loomis: Yeah, absolutely. And just quickly, a lot of you may have heard the campaign referred to as SolarWinds. The operation was a high profile and wide reaching software supply chain campaign against a wide slew of private and public sector actors in the U.S. It was undertaken by Russian intelligence, the SVR, as has been attributed in the last couple months. And although the campaign demonstrated a natural evolution in the trends of software supply chain compromises that we talked about in the initial Breaking Trust report, it was interesting in two ways, and I’m happy to talk more about that later.

But for us, the SolarWinds Sunburst operation was not only a really concrete example of what we talked about in the Breaking Trust report; it also demonstrated an evolution of those tactics, and it’s a really good example of how a sophisticated actor can not only hit a wide variety of targets, but do so in a very quiet and sneaky way to remain undetected for long periods of time. And Stewart, feel free to hop in and add anything that I missed.

Stewart Scott: Yeah, absolutely. I think just emphasizing that the data set was at the heart of both your reports is important. It wasn’t just a matter of kinda listing all these different attacks, but finding a way to compare and group them that makes sense to the non-technical reader. I think that really paid off for the Sunburst report, beyond just adding another line to the data set. We could point to past entries and say, “Oh, you know, this attack is clearly large and significant, but it’s so clearly similar to these previous methods. That should be easier for policymakers to respond to in a coherent way.”

Brian Dawson: Yeah, and that actually leads well into my next question, and that’s, you know, coming off of these reports to your general body of work, what are some of the strategies that you have identified to help increase overall awareness around attacks on the software supply chain or software pipelines?

Will Loomis: Yeah, so, for me—and, you know, I’m gonna kinda pivot on this question a little bit.

Brian Dawson: Okay.

Will Loomis: So, for me, as I mentioned earlier, one of the big reasons behind doing this project was to shed light and bring new visibility to this problem. Before we tackled this, there wasn’t a lot of gathered empirical data with both this volume and this kind of specificity on the types and classifications of compromise.

Brian Dawson: Right, right.

Will Loomis: And for us, this was really key. We were able to analyze the database, pull out some key trend data to show where the problems lie, and then be able to use this as a real resource to go to individuals and key stakeholders in the private and public sector and be like, “Hey, you know, these are where we see the concentrated forms of cyber risk sit within the software supply chain, and these are some ideas that we’ve had that can help use this information to better inform cyber security decision making and to better protect systems going forward.”

So, you know, in terms of strategies, for us, it was really about increasing visibility on the problem, and I’m more than happy to go into a little bit more detail on some of those trends that really shed light on where adversaries are targeting us, because I think that’s a really useful takeaway. But I wanted to just quickly go to Stew and see if he had anything else to add—yeah.

Stewart Scott: Yeah, sure. I think a huge part was just creating a kinda common framework to compare everything in. It lends itself to aggregate data and analysis, where there are great opportunities for graphics and visualization, but it also just lets you kind of appreciate some of the attacks in a very down to earth way. You can point to case studies, you can appreciate how sophisticated some are and how others are just kinda taking advantage of pirated copies of Pokémon GO.

Brian Dawson: Right.

Stewart Scott: And just getting those anecdotes and that kinda link from the very tangible end user experiences that kind of everyone has nowadays to these much broader things that policymakers have to consider. I think it’s important, just kind of explaining the importance and complexity of the ecosystem.

Brian Dawson: Yeah, and it sounds like part of when we talk strategy, it is couched in tying sort of the anecdotal indicators that we all see and feel, but yet cyber security vulnerability for the average layperson is very nebulous, right? 

We hear about an attack there, we hear about an attack here, we hear more about how we need to secure passwords. If you’re in my space in the development side of the house, you hear about securing your supply chain, managing dependencies, you may do some scans, and then you hear about SolarWinds or Sunburst and the compromise to the supply chain. But it’s still all really nebulous and anecdotal, so it sounds like the strategy here and what is novel is really tying it to empirical data, structured research, pairing that with the anecdotal information and then getting that socialized and pushed out.

I will wanna dig into, I wanna talk data in a minute, and I wanna talk key takeaways and observations. But before I do, let’s jump ahead a bit. What have the results been? So, as you guys, you know, did this work and released the Breaking Trust and Broken Trust reports, what have you seen? Have you achieved your goals in terms of increasing awareness? What has the response or reaction been?

Stewart, do you wanna lead with that?

Stewart Scott: Sure. I think that tees up some discussion about the recent Executive Order well. A direct product of our work was our own recommendations, based on kind of what the threat landscape looks like, what attackers love to go after, and which of those targets are most fruitful. We put out some recommendations, and it’s been really nice to see that they're not ignored by the Executive Order. It’s talking about the same things.

Brian Dawson: Wow.

Stewart Scott: I think a big one for us, we call it minding the blast radius. I think SolarWinds especially, and some of the related attacks that preceded it, show a clear tendency: adversaries love to go after these very nitty-gritty programs that have lots of privileges and authorities, that can run quietly, and that are kind of in the background of everything. And so, it was good to see that the Executive Order talked about zero trust architecture and defining and mapping kinda critical software, right? The underlying assumption in the order is, you have to assume that large networks are going to be breached, right?

Brian Dawson: Yeah.

Stewart Scott: Like, at the end of the day, if you’re faced by 1,000 Russian software engineers with a lot of money and time on their hands, you can only do so much to prevent initial incursion, and a lot of the rest of your job has to be figuring out how to bound that, right? How to make sure that there’s not kinda random processes in the background that have runaway permissions that are super useful for attackers to find. Will, do you have some other ones you wanted to tag on?

Will Loomis: Yeah. I mean, I guess I would just, I have two things specifically and then I’ll zoom out for a second.

First, you know, for me, with this problem, it’s challenging to see concrete results on an organization-to-organization or company-to-company basis in a really short period of time. So, for us, what we see as the big wins is getting those folks in the executive offices at companies, getting those folks in key places in government, to realize that this is a problem. And to edit the way that they're thinking about the problem in order to make sure that not only their time, but also the limited resources that they have for cyber security, are spent on mitigation strategies that actually raise their security instead of just thrown—I’m not gonna say bad things about endpoints or firewalls, but just thrown at whatever the new cyber security system or server is that they can get. But actually using those resources intelligently, thinking about where the highest concentrations of risk are for them, and then actually trying to address those concentrations of risk.

So, a little bit abstract, and something that’s hard to quantify on an organization basis, but you know, we’ve heard a lot of really good sentiment from those key stakeholders that we're engaging: that they're aware that this is an issue and that they're starting to put together the processes to think about their organizational cyber security in a better, more critical-technology-focused mindset.

Just two quick things, one that I saw directly in the EO and one that I would like to see gain more attention over time. The first is, the EO talks a lot about SBOM, the software bill of materials, which is essentially like a menu label for a piece of software that shows, hey, these are all of the different pieces of code, from different places, that are used in this item. So, for us, that’s really important. Allan Friedman of NTIA finally got his way, and it’s great to see that this kind of focus on trust has made its way into the EO. This is something for us that was big.
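Will’s menu-label analogy can be made concrete in a few lines. This sketch is not any official SBOM format (real-world efforts standardize on formats like SPDX or CycloneDX); the component names and the helper function are made up, purely to illustrate the idea of recording what a piece of software contains, along with a digest to verify each part:

```python
import hashlib
import json

def sbom_component(name: str, version: str, source: bytes) -> dict:
    """One SBOM entry: what the code is, its version, and a digest to verify it."""
    return {
        "name": name,
        "version": version,
        "sha256": hashlib.sha256(source).hexdigest(),
    }

# A toy application that bundles two third-party pieces of code.
sbom = {
    "component": "example-app",
    "version": "2.1.0",
    "dependencies": [
        sbom_component("logging-helper", "1.4.2", b"def log(msg): ..."),
        sbom_component("http-client", "0.9.1", b"def get(url): ..."),
    ],
}

print(json.dumps(sbom, indent=2))
```

The point of the digest field is that a consumer can recompute it over the code they actually received and detect any swap, which is exactly the transparency the EO is asking for.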

One thing that I would like to see more of is a focus on open source. So, for us, you know, open source and open source code is a critically, critically important part of the software supply chain ecosystem. It wasn’t part of Sunburst and it wasn’t part of the EO, so it’s not getting as much attention, but it’s a critically under-defended attack area in the software supply chain. It has a really wide potential for blast radius, and for us, it really needs more support and more funding from the USG as well as the private sector.

Brian Dawson: That is really, really interesting. It hits upon a couple of threads that I wanted to pull on before we go back and talk about a few other things like the Executive Order data. And that is, interestingly enough, as we speak here, and as our listeners will learn, by the time this episode airs, I will have moved on from our sponsor, CloudBees, to go work at The Linux Foundation, where we’re actively taking a look at the role of cyber security in OSS as well as open source and free software.

And I think, to date, as developers, when we talk about open source security, we’ve often only lightly considered sort of an open source project’s security posture before we adopt it. Oftentimes, when it’s hands on keyboard and the developer needs to grab functionality, they find the most popular open source module, the one that is closest to what they need, and they pull it in. And then security from there is really just based on static scans, right? Which gives us a bit of security, but I would argue it starts a bit too late, right?

And what you'll see happening in the open source communities, with the Linux Foundation being one of the leaders, is really a movement to ensure that you’re embedding enterprise grade—I would even say government grade—security practices into your open source supply chain before it even goes downstream to be consumed by, say, commercial vendors. So, it’s really interesting that you brought that up.

That does also, though, transition me to wanting to paraphrase something you said, Will, and it goes back to something you said, Stewart: oftentimes, when we think cyber security, people are thinking about network security—and really, networks are just part of your supply chain.

But as we found with the rogue version of Apple’s IDE and development environment, or as we found with SolarWinds, there’s a bunch of other places in your supply chain that could get attacked. And Will, I think you said, “Yes, we can go out and buy the greatest tool to secure our endpoints, but we really need to change how we’re thinking about security within organizations.” And that almost aligns to our DevOps trinity, where we talk about people and culture, process and practices, tools and technology.

And I’ll open for either of you all to comment on this question. It is one thing to focus on technologies that you can buy, but it sounds like you’re saying you haven’t fully developed a posture unless you’ve addressed people and culture, process and practices. Is that fair?

Will Loomis: Yeah. So, first of all, Brian, congratulations on the transition, that’s awesome. The Linux Foundation does great work. In terms of what you’re talking about in people and processes—I mean, honestly, I think this is extremely true for the software supply chain, and a lot of what you talked about in terms of open source and being able to have clean and concise cyber security practices and policies across an organization is a key first step towards raising that cyber security posture.

In terms of the people thing, you know, it’s a problem for the software supply chain, but people, and the insecurity and lack of knowledge around people, are a problem for the cyber security industry more broadly. So, in general, I think it’s kind of on a community level, and a lot of people have talked about upping education and trying to build basic cyber hygiene principles into U.S. education at a very early level. Obviously, that’s hard to do, but a lot of the problems that we see (ransomware right now is a big one in the news after the shutdown of the Colonial Pipeline) can be stopped by just having a better state of cyber hygiene on an individual-to-individual and organization-to-organization basis.

So, when it comes to people and processes, just making sure that people have 2FA enabled, making sure that they have differentiated passwords, like, the 101 stuff in cyber security, it is a key first step that organizations need to be taking, and we’ve still seen some lag in the uptake of those types of policies.

And, you know, although that’s not gonna solve all of your complicated issues and it’s not gonna completely protect you, software supply chain or otherwise, it’ll raise that baseline for security, which, in a very easy and manageable way, can take out huge clumps of concentrated cyber risk for your organization.

I know Stew might have some thoughts on this as well, so I also wanna make sure that he has a second to jump in.

Stewart Scott: Yeah, I think a lot of our work kinda looks at cyber security starting from the attack angle, starting from analyzing kinda different incidents. And I think that highlights just kinda how interrelated those three pillars you mentioned are. Within any kind of sophisticated attack—take SolarWinds, for example—you see kinda side by side these incredibly resourced organizations doing really technical stuff, but also building a lot of that on top of something like Will mentioned, a compromised development account that had a bad password.

So, I think anything to kinda highlight how those pillars just, they're interrelated and interdependent as much as they are important on their own, is useful.

Brian Dawson: Yeah, yeah. I can absolutely—look, all of us, it’s funny. In just focusing on developers as practitioners who are focused, sort of motivated on moving fast, creating, being innovative, oftentimes, I say we find ourselves trying to remove obstacles, right, so that we can move faster, innovate more. And I’d say at times, those shortcuts that are wired to remove those obstacles aren’t always the most prudent in terms of security posture.

And just to underscore what I think the three of us have said here, or sounds like we agree on: you know, there is a culture or a personal element. Look, we can push out and mandate that you use 2FA or multifactor authentication, but there’s always a human vulnerability. So, we need to bring along the mindset of the people in these organizations to understand and agree that, when we talk quality first in development, we also need to develop a security first mindset.

And I’ll put a call out to the vendors that listen to this podcast and the platform Ops teams—look, it’s up to us, just as we’ve tried to balance speed and quality with DevOps, to also balance speed and security, right? To ensure—and obviously, that’s the DevSecOps movement—but to ensure that it is low cognitive friction for our practitioners to execute across the software delivery supply chain in a manner that is secure.

I’d like to ask, my other path to the data, and I’ll get there in a minute, gentlemen, and I’ll start with you, Stewart. I suspect, and this just may be a yes or no, or there may be a deeper answer, but we as the general public gauge cyber security and the status of cyber security through what we hear through the news, right, a number of the high profile attacks we’ve talked about, we can go all the way back to Heartbleed or Experian—or Equifax, excuse me. But I’m assuming that is really just the tip of the iceberg. That, as y’all did your research and gathered your data, I’m assuming what we’re aware of as the general public pales in comparison to all of the activity that occurs on a larger scale.

Is that fair? Or what is it that we as the general public don’t know that you have learned through your research and through your work?

Stewart Scott: Sure. I can take a hack at that. I mean, to be fair, I don't think there is anything the general public doesn’t know that we—Will and I—do know. I think there’s a good question in there about kind of the different biases in data set reporting. I think the general public gets exposed kind of to the biggest and worst compromises.

Brian Dawson: Yes.

Stewart Scott: And there’s a huge, huge swath of these kind of very low effort but still very successful attacks. I think we’ve done some cut out work from the Breaking Trust report on just compromised apps in app stores. None of them is particularly destructive to national infrastructure, none of them is particularly complicated or hard to avoid, just don’t download random apps that aren’t purveyed by the actual vendor.

But even under that, I think there’s a layer, too, of kind of unknowable things, right? There’s all those attacks that never happened because of defenses that were put in place and used successfully. And it’s important to keep in mind that our data set, then, really only contains things that worked, and only things that worked that were publicly disclosed. And I think even within our data set, there were a couple of attacks where our own data is pretty sparse, just because a lot of what happened took place kinda behind closed doors. It was never fully disclosed outside of a small corporate environment.

Brian Dawson: Yeah. Yeah, that’s funny. It seems that there’s a bit of a pseudo-psychological element of a lack of transparency, shame, risk to the business in reporting this stuff. And Will, you might have gotten ready to jump in, I thought I heard something, but I’d also ask, as you all respond, if you can fill us in a bit: there’s been a heavy focus on the data, that really being core to the strategy here and a differentiating factor.

Can you tell us a bit more about how you gathered data and what the data is?

Will Loomis: Beautiful transition for me, Brian, so thanks a lot, here. So, one thing that I want to hit quickly from what Stew said a second ago: Stew said that a lot of the information and the data that we put together is publicly reported, and that’s 100 percent true. So, we found all 138 cases in our data set through specific news sites, looking at outlets like ZDNet or Elredge or CyberScoop that focus more on the technical side of it and a little bit less on kind of the big headline side of it.

So, that’s a little bit on how we collected the data. But in terms of what the public is missing, for me, it’s a lot about the volume. So, you know, you only hear about the top 2 percent of cyber attacks, the big-name ones. You hear about Colonial because it shut down all of this fuel transfer and caused all these increased fuel prices; you hear about SolarWinds because it’s been such a high level deal within the government. But in general, these types of attacks do not gain a lot of attention.

So, for us, I definitely think it was important for the public to understand that—hey, you know, there’s a lot of these, and not only are they impactful, but they can actually influence you as the end user. So, I’ll kinda use this as a transition to talk briefly, kind of, about the five key takeaways that we found in very straightforward terms from the report.

So, kind of in order, our first key takeaway was that these attacks utilizing vulnerabilities in the software supply chain are used a lot by state actors. For us, out of our 138, there were at least 30 attacks that have been directly attributed to state actors, especially China and Russia. And, as you said a second ago, we actually think that’s an underrepresentation of the larger number of state driven attacks against the software supply chain. A lot of those sophisticated attacks from state actors never reach the public ear, even through news orgs like Elredge or ZDNet that focus on technical issues, because this stuff is kept proprietary by private actors. So, although it’s clear that these techniques are used a lot by state actors, and that became super evident in the SolarWinds Sunburst campaign, we think they're used by state actors to an even higher extent.

The second big trend here is, there are a lot of attacks in our data set that undermine code signing certificates and public key cryptography. So, there’s a lot of abuse of code signing in these attacks, and for someone unfamiliar with what code signing looks like: basically, public key cryptography acts as a stamp that says, “Hey, this is someone that we trust and who has access to this piece of programming,” or, “This patch is clean because it’s been signed with this particular cryptographic signature.”

We’ve seen a lot of malicious actors get in the middle of that process and either steal legitimate code signing certificates or create their own to abuse that process in order to get malicious code out to those end users. And for us, this really gets at that root of trust between the developer, the provider, and that end user, and that causes a lot of problems here.
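The code-signing trust model described here can be sketched in a few lines. This is a simplified illustration, not any vendor's actual tooling: it uses an HMAC as a stand-in for real public-key certificate signatures, and the key and artifact values are hypothetical.

```python
import hashlib
import hmac

SIGNING_KEY = b"vendor-private-key"  # hypothetical vendor signing key

def sign_artifact(artifact: bytes, key: bytes) -> str:
    """Vendor side: produce a signature over the release artifact."""
    return hmac.new(key, artifact, hashlib.sha256).hexdigest()

def verify_artifact(artifact: bytes, signature: str, key: bytes) -> bool:
    """Client side: refuse to install unless the signature checks out."""
    expected = hmac.new(key, artifact, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

update = b"legitimate patch v1.2.3"
sig = sign_artifact(update, SIGNING_KEY)

# A clean update verifies; a tampered payload does not.
assert verify_artifact(update, sig, SIGNING_KEY)
assert not verify_artifact(b"malicious payload", sig, SIGNING_KEY)

# The failure mode in the data set: if an attacker steals the signing key,
# malware they sign verifies exactly like a legitimate update.
forged = sign_artifact(b"malicious payload", SIGNING_KEY)
assert verify_artifact(b"malicious payload", forged, SIGNING_KEY)
```

The last three lines are the point: signature verification only establishes that the holder of the key signed the artifact, so a stolen or attacker-issued certificate turns the trust mechanism itself into the attack vector.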

The last three I’ll hit a little bit more quickly, and I'm happy to come back to them at the end. The first of these is open source. So, as I mentioned earlier, you know, this is a big vector for us. You know, we saw a lot of attacks that could have a really wide blast radius by just getting at one or two small open source pieces of code in a library.

Another key one for us is attacks that targeted software updates. So, this is what we saw with the SolarWinds side of the Sunburst compromise. They were able to break a code signing cert and sneak malicious code into a software update. So, this is like, you know, in the top right corner of your computer when you get a notification saying, “Hey, it’s time to update your Microsoft Word or your operating system” and you press that update button. In 26 percent of our categorized attacks, malicious actors were able to sneak malware and malicious code into software updates that seem legitimate. So, you know, this is another way that malicious actors have been able to exploit the trust that is so inherent in the software supply chain and get malicious code to end users.

The last one, and one I think that will be pretty near and dear to your heart, Brian, is those compromised apps and app development tools. So, this was a quarter of our total incidents in the data set. We’ve seen a little bit in the past couple months, we’ve seen this become more prominent for apps in the Apple store. But chiefly in our data set, you know, it’s a lot of people sneaking malicious code, either directly into apps on the Google Play store or in other third party app stores, or using techniques like typosquatting to post an app that looks extremely similar to a legitimate app and kind of entice folks to download this copy. So, we saw that a lot, particularly in China with slightly different versions of TikTok as well as a couple other examples.
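As a rough sketch of how a defender or app store might screen for the typosquatted names described here, a similarity check against known legitimate names catches close-but-not-exact lookalikes. The name list, threshold, and helper function are illustrative assumptions, not a real store policy.

```python
from difflib import SequenceMatcher

# Hypothetical list of known legitimate package/app names.
LEGITIMATE = {"requests", "numpy", "tiktok"}

def looks_like_typosquat(candidate: str, known: set, threshold: float = 0.85) -> bool:
    """Flag names suspiciously close to, but not exactly, a known name."""
    name = candidate.lower()
    if name in known:
        return False  # exact match: this is the real package
    return any(
        SequenceMatcher(None, name, legit).ratio() >= threshold
        for legit in known
    )

print(looks_like_typosquat("requestz", LEGITIMATE))  # True: one letter off
print(looks_like_typosquat("tik-tok", LEGITIMATE))   # True: near-identical
print(looks_like_typosquat("flask", LEGITIMATE))     # False: a distinct name
```

Real registries use richer signals (publisher identity, download history, homoglyph detection), but the core idea is the same: distrust names that trade on their similarity to something users already trust.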

But yeah, in summary, for us, those are the five key trends. We tried to make them reasonably easy to understand, but for us, these really highlight where the key concentrations of risk are amongst our software and our software supply chains, and they're the key drivers of the recommendations in our report.

Brian Dawson: I think that’s fantastic, and there’s a ton in the report that drills down on your five key takeaways, things I’d love to ask questions about, and we may get some more time in. But no, thanks for that summary.

And that sets me up to ask, and ask on behalf of the audience—how can businesses ensure the security in their software supply chain in light of these key takeaways? What do they need to do to ensure their supply chain gets the necessary attention to avoid running into one of those five key areas?

Stewart Scott: I can take a first swing. I think something we come back to, and it applies to government as much as industry, is defending that supply chain. The first thing it requires is knowing the supply chain, and that’s kind of what the underlying problem is, right? There’s so much interlocking dependency and complexity that you have no idea, ultimately, what an application depends on that’s out of your control. And that’s where tools like SBOMs are incredibly useful, to be able to map not necessarily where you’re vulnerable, but just where you’re exposed, and to be able to track who, at the bottom of the chain, is effectively in charge of your security, even if that was never your intention.

Will, do you wanna add to that? I know you’re a huge SBOM fan.

Will Loomis: One thing that I would say, you know, definitely building on what Stew said, I think increasing that knowledge of what software and what systems your organization is using is really critical. We saw this after SolarWinds and after some of the Microsoft Exchange exploits that have happened over the course of the last couple months that there were organizations that didn’t really know what they were running and didn’t understand or have an idea of whether or not they were exposed to these types of compromises.

And those are big, high profile services. Like, imagine, on a much more granular level, a piece of NPM code gets compromised. Organizations have very little idea which pieces of software and which systems they're using might depend on that one little package. So, for us, definitely a huge emphasis on increasing visibility and pushing organizations to be more knowledgeable and more critical about the systems and the software that they use.
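As a toy sketch of how an SBOM helps answer "are we exposed?" after a dependency compromise like the NPM scenario above: match the SBOM's component inventory against an advisory feed. The trimmed CycloneDX-style document and the advisory set here are hypothetical; real SBOMs come from generator tooling, not hand-written JSON.

```python
import json

# Hypothetical, heavily trimmed CycloneDX-style SBOM for one application.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "left-pad", "version": "1.3.0", "purl": "pkg:npm/left-pad@1.3.0"},
    {"name": "lodash", "version": "4.17.21", "purl": "pkg:npm/lodash@4.17.21"}
  ]
}
"""

# Hypothetical advisory: this (name, version) is known compromised.
COMPROMISED = {("left-pad", "1.3.0")}

def exposed_components(sbom: dict, advisories: set) -> list:
    """Return SBOM components that match a known-compromised (name, version)."""
    return [
        component
        for component in sbom.get("components", [])
        if (component["name"], component["version"]) in advisories
    ]

sbom = json.loads(sbom_json)
hits = exposed_components(sbom, COMPROMISED)
for component in hits:
    print("EXPOSED:", component["purl"])
```

The value is in the inventory itself: without a machine-readable component list per application, this lookup is a manual scramble across teams every time an advisory lands.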

The other key thing here, and this is just as important for the private sector as it is for government, is that we need to be ruthless about prioritizing cyber risk and putting our resources where they're needed most. So, you know, understanding where your risks lie is a key first step. But once you’ve identified those quote-unquote problem children within your systems and within your organization, you need to be able to spend the sometimes limited resources that your organization has for cyber security in the right place, ruthlessly identifying and prioritizing those most concentrated areas of risk and then using those resources to address them. For us, it’s gotta be step 1B, right after you identify where those most critical systems are within your overall network.

Brian Dawson: Awesome, awesome. And now, moving down to incentivizing this, right, incentivizing that you use appropriate practices and tools, what role does the government play in that, or do you think the government can or should play in that?

Stewart Scott: I can start on that. I think we spent a good chunk of our second report, Broken Trust, looking at the myriad of government acquisition processes, right? The government buys and uses a huge, huge amount of private sector IT infrastructure, and the kind of decisions that it makes about how it procures, and how it ensures that what it procures is secure, kinda inevitably trickle down into private sector practice.

So, I think, you know, going back to that EO, leveraging the federal acquisition and the DFARS kind of branch of that, the DoD version, to make sure that they have metrics for what vendors are secure is a huge step, and I think there are provisions in the EO for making at least parts of those metrics publicly available, right?

And so, then, in an interaction between two completely private entities, they can still look at those scores, right? They can still say, “Oh, this person has been vetted by the government and is generally pretty good at cyber security, I can maybe trust them more.” So, just kinda contributing to that more open flow of information about general cyber security and cyber hygiene practices is a huge step.

Brian Dawson: Awesome. Will?

Will Loomis: Yeah, I definitely agree with everything Stew said there. You know, just one thing else that I would add in terms of incentivization is, you know, it’s really important that—so, obviously, the acquisition side is a big incentive in terms of pushing companies and pushing private sector actors to be more secure about their software.

You know, the other thing I wanted to add is just, you know, it’s really important that organizations actually kind of understand how to go about improving their cyber security postures. So, you see the NIST cyber security framework amongst many, many, many other guidelines that have come out from the government to be like, “Hey, these are the key things you need to be thinking about. These are the key steps that you need to be taking in order to raise that cyber security posture.”

But some of that is very abstract and not easily implementable on an organization-by-organization basis. So, I think some continued partnership between the public and the private sector on being more granular and more specific about how, just as an example, you know, you can actually go as an organization and attempt to implement the five components of the NIST cyber security framework. Like, what does that actually look like for a software as a service company? What does that look like for different types of organizations, and how, on an operational basis, can that be implemented?

For me, that’s one of the most critical parts of this. It’s education, but it’s also making sure that kinda the standards and acquisition and regulatory frameworks are actually understandable by the people who are supposed to be protecting their systems. Because right now, there’s definitely some gaps there, and that causes some of the friction when it comes to uptake on the private sector side and making sure that their systems are systemically secure.

Brian Dawson: Awesome, awesome. Well, this gets us to a—well, first, let me just say this, and it’s not a wrap up—thank you for this last portion in particular, right? A concise summary of what your findings are, what are some actions enterprises can take, and how private sector and government might collaborate together to better our situation and better what I’ll call our overall shared security posture.

As you all said earlier, when you talk about this space, it is very difficult if not impossible to draw a hard line between private sector and government. They're deeply intertwined, and share in vulnerability and exposure.

Now, one of our standard DevOps Radio questions that I’d like to present to you guys is: what is a resource, a book, a podcast, a person to follow that has helped you and that you recommend our listeners read or otherwise consume?

And I’ll start out by offering one, it’s a great fireside read, and that’s the Breaking Trust and Broken Trust reports. Do make sure that you seek those out and download those. But for each of you personally—Will, we’ll start with you, do you have a particular resource that you highly recommend our listeners go get?

Will Loomis: So, two resources that I’ll put forward, and then something a little bit more fun. The first is a podcast, Risky Biz. Really, really good, not only at hitting kind of the top level news when it comes to cyber and cyber security, but they're really timely, they're well resourced, the recordings are very accessible, and it gets below some of those very high level, top level media items that everyone reports and actually gets to some of the nitty-gritty in a very accessible way. So, for me, that’s a really good listen and a good way to keep up with the times when it comes to cyber security.

A comic strip, actually, that I wanna recommend—

Brian Dawson: Oh, nice.

Will Loomis: - Little Bobby, written on a weekly basis by the founder of the company Dragos, Rob Lee. Little Bobby’s primary purpose is to have fun, but there’s also a lot of education built into that, particularly around the industrial control system world and there’s some good inside jokes there for the cyber security community, so definitely would recommend Little Bobby, a great and very enjoyable quick read every Sunday.

And then something non-cyber or DevOps related, but a classic, just finished the book Cat’s Cradle by Kurt Vonnegut—amazing satire, dark yet funny, imaginative, quick read, would highly recommend.

Brian Dawson: Awesome, alright. I’ve just lined all of those up as you were speaking. They were mostly new to me. I think I had some exposure to Little Bobby, but I’m gonna go consume all of those. Thanks for that, Will.

And Stewart—so, Will kind of threw the gauntlet down. What do you got for us?

Stewart Scott: [Laughter] Yeah, I definitely, definitely second the Risky Biz podcast. I did a lot of the work on the data set, and I absolutely loved some of the Ars Technica articles, which did a great job of kinda converting the nitty-gritty into common English.

Aside from that, there are definitely some great Twitter feeds out there from some DevSecOps folks that have been really helpful. And in the vein of more relaxed reading, I’ve been doing some Murakami reading. Hard-Boiled Wonderland and the End of the World was a fun one.

Brian Dawson: Okay. I had not heard of that, now I am curious—Hard-Boiled Wonderworld, there we go. Hard-Boiled Wonderland and the End of the World, alright, I’m gonna have to give that a read.

Awesome. Well, gentlemen, before I let you go, I am gonna ask and we’ll go in reverse order of what we just did for you to share your final thoughts, final takeaways, words of wisdom, learnings for our audience here. So, Stewart, I’d like to start with you—any final thoughts?

Stewart Scott: Yeah, well, first of all, thanks so much for having us. I think it’s been really great to shed some more light on software supply chain security. I think it’s definitely been, to our minds, underserved up until now, and it's great to see some of the agencies in government getting a bit more support.

I think one thing I’m kinda looking forward to down the line is, I think there’s a big role for cloud service providers in a lot of this, and kind of offloading security there. I’m really interested in looking at some of the infrastructure and complexity management questions that start to come up from that. I think it’ll be a really exciting problem for The Atlantic Council and Will with Cyber Statecraft to tackle.

Brian Dawson: Yeah, great, great. And I’ll just call out before I hand it over to you, Will, it’s come up recently in many discussions online and offline that I’ve had, and that is, we have now built a very powerful yet complex stack with our cloud services, and we’ve had discussions about, it’s time to shift towards understanding that complexity if not reducing that complexity, right? And I think that dovetails well with some of what we see in the report, with service providers being a common attack vector and, as we move to consume more of cloud services and shared services, our need to have an understanding that begets the proper security posture.

So, sorry for stealing a bit of your last words, Stewart, but over to Will. Do you have any final words of wisdom for our listeners, here?

Will Loomis: Yeah, first of all, I just wanted to echo what Stew said—you know, big thanks, Brian, to DevOps Radio and to CloudBees for having us. Really, really fun conversation, and always grateful for any opportunity to step up on my soap box and talk about these issues.

In summary, for me, operations like Sunburst amongst others have really demonstrated that we—and this is both the private sector and the public sector—need to better understand that, you know, (a) compromise is inevitable, and it's resiliency not prevention that’s key going forward. And also, we need to think differently about how and what we prioritize when it comes to defense.

So, I talked a little bit about that earlier, but for me, those are two of the key pillars going forward. Government and industry need to better understand what products they're using and how their networks are set up, identify where those biggest points of risk are, and then set up stronger and more layered levels of both protection and monitoring to lower that risk down the road.

And then, you know, one last thing is, I’d kick myself if I didn’t foot stomp it again. There’s been a lot of conversation post Sunburst and in the EO about software supply chains, looking at compromises to updates and stuff like that. Just one more time, we cannot let open source slip. It’s a huge attack vector, it is the backbone of huge swaths of the Internet, and right now, it’s critically under-defended. So, we need to put more support, more time, and more funding into protecting open source.

That’s it for me, though, Brian, and I really appreciate the opportunity to join you today.

Brian Dawson: Thank you. I’m gonna jump in and steal a little bit of your thunder, too, on your final words. Thank you, actually, very powerful, and I would underscore something I love there—resiliency, not prevention. You know, not at the cost of prevention, but what we really need to do is focus on resiliency, and I’m really encouraged by the fact that, in terms of words capturing meaning, that word resiliency is really starting to gain traction in our space.

And then much more personally—yes, amen, I think most of us have come to an understanding of how much open source powers the world today. And it does underscore how important it is that that is well defended, and I’m encouraged by hearing you say that, I’m encouraged by the fact that in past weeks, I’ve spent significant time engaged in calls with multi-party efforts focused on just that, right? Securing the open source portion of the supply chain which helps power the Internet much beyond that.

Gentlemen, I just want to say thank you—phenomenal work. I have not gotten to digest it all. I am now absolutely inspired to go in and do some fireside reading of both reports and keep a continued eye towards the work coming out of Atlantic Council and specifically in the areas that you two are focusing on. I very much appreciate not only the time that you’ve given to our listeners today, but the immense amount of time that must have gone into the work that gives us the insights that are gonna help us all do better. 

So, again, thank you, Stewart. Thank you, Will. I appreciate your time and I appreciate your efforts.

Will Loomis: Yeah, thanks, Brian. You know, these issues are important to us, too, and that’s why we put the time in and, yeah, always thankful to have an opportunity to talk about them and very thankful to you guys.

Brian Dawson: Alright. 

Stewart Scott: Same here. Thanks so much.

Brian Dawson: Alright. So, with that, that concludes another episode of DevOps Radio. Before you close this, make sure you hit subscribe so that you get notified when the next episode with more insightful, intelligent, and generally phenomenal speakers comes about. Again, thank you for listening. Have a great day.

Brian Dawson

Brian is a DevOps evangelist and practitioner with a focus on agile, continuous integration (CI), continuous delivery (CD) and DevOps practices. He has over 25 years as a software professional in multiple domains including quality assurance, engineering and management, with a focus on optimization of software development. Brian has led an agile transformation consulting practice and helped many organizations implement CI, CD and DevOps.

Follow Brian Dawson on Twitter.