
Interview: Reflections on the State of the Software Supply Chain Report

Approx read time: 8.5 mins



Derek Weeks wears many hats in our industry.  He is the Co-Founder and Core Organizer for All Day DevOps and the SVP and Chief Marketing Officer at The Linux Foundation. He’s also a Senior Advisor at OpenSSF.  Derek was a major part of our last six State of the Software Supply Chain reports, and was involved in the development of the 2021 report.

Today, we share an interview with him, covering his insights and perspective on this and previous reports.


Overview

Q: What did you think of the 2021 report?  What was this year’s biggest takeaway?

First, the depth of the discussion and analysis there is part of what makes this a world class report. Few others have actually looked at the data to understand current software development practices.  

Second, there's a perception that good methods are in play and that organizations want to do the right thing.  They really believe they are doing the right thing.  The reality is that, in high velocity, high intensity software development, best practices are not necessarily followed.  Nor are they easy to visualize and understand.  

With so many components and dependencies within an application, it's absolutely impossible to track or evaluate them manually.  When you include the aid of computers to assess what's in the software, find the dependencies, and determine software quality?  Only then do you begin to get an understanding of the complexities around the use of open source in software development.

There are certainly massive advantages to using open source – no one's coding from scratch anymore on those specific functions.  But that doesn't alleviate the fact that responsible development means using higher quality, more secure components.

So there are some big topics in this report: the large scale of downloads, the number and speed of adversary attacks, and the political angle of how government is getting more and more involved.

But I love the considerations around how innovation happens (or does not happen) in software development. I think the thing that stood out to me the most is the discussion on mean time to update and mean time to repair.

Q: Any surprises in the report?

An "aha" moment for me was when you look at the research done on the most popular projects versus the least popular. The most popular projects have considerably more known vulnerabilities.

That plays into two angles: one is that the most popular projects have more eyes on them. There are more people using and relying on them in the community, making it more likely that a security researcher is either curious or lucky enough to find the vulnerabilities in those components and make them visible to the community.

So the saying goes that "given enough eyeballs, all bugs are shallow" (credited to Eric S. Raymond). More vulnerabilities are discovered through more eyeballs on the more popular projects, so that's good for the community at large.

But are they resolved?  Are they capturing all those issues?  We know all code is not perfect and that all vulnerabilities are not found simultaneously. They're found gradually over time, especially in the popular projects.

At the same time, the projects that are used most often have more known vulnerabilities, and the adversaries know this. Bad actors aren't looking for a rarely-used, sleepy little project. No, they're looking for what is the most efficient attack vector.

Attacks at Scale

It's really a "skeleton key" approach, meaning if I can open one door, I can open 1,000.  Those are the vulnerabilities that attackers are looking for.  I want to find and exploit known issues in the most popular projects that are easy to find because someone else has already discovered them.

Opening 1,000 doors at once means you're much more efficient as an adversary.  And in the business of attacking, finding that one door in a popular project is much more effective.

You could also read it as there's safety in numbers – more eyeballs on the most popular projects means vulnerabilities are discovered.  But there is also risk: if you're asking "what do I protect?" or "what do I fix first?", you have to fix the vulnerabilities in your most popular projects first, because that's where all the adversaries are aiming.

Then and Now

Q: Any perspective coming from your experience on previous reports?

I'm very grateful to see the project continuing over the years. Where it says the "seventh annual" on the first page?  My reaction was “my gosh I did six of these!”

The historical perspective of this document is really important. If it changed every single year, with new themes and new approaches, you would lose something. That year-over-year history is part of a very important topic tied to the heart of innovation.

In fact, the structure of the report over the last four years has been very similar.  I think that's important for the community and Sonatype.  It's nice because any reader who finds an "aha" moment of their own in this year's report can look back at last year or the year before, and ask: how did this compare?  How are we improving?  How have things progressed?  Have we taken just one step forward or five steps in the last year, two years, three years?

Some of the ongoing topics:

  1. Open source development community – What's happening in terms of new projects being built, new versions being released, and how often those are downloaded and consumed by enterprises or individuals.  Those numbers have grown exponentially over the years.

It seems there's always someone out there saying "it's going to slow down at some point," but that "some point" has not happened in seven years.  It not only hasn't slowed down, it's accelerated.

  2. Adversaries – Security is only a problem if you have opposition, and the report continues to include those realities.  We've maintained a focus on what our adversaries are doing and what we should be aware of.

  3. Recommendations – Every year, we've looked at software development best practices, as well as government or community policy and guidance.

An (Increasingly) Big Picture

One major difference from when we started is visible across the whole industry, especially in the U.S. with the Biden Executive Order back in May and the cybersecurity meeting at the White House in August. Those show how the problem we're talking about has been elevated over the years.

When I started doing the report seven years ago, the conversation was not at a national level. But over and over again in the report, we've shown how the topic has grown in importance as open source news has risen in prominence in the technology community.

When it got to the Executive level, it's not just that the White House said something on the topic.  It's that the White House is asking top industry leaders to address this problem, a critical problem for the country. That is very meaningful in terms of driving action in the largest and smallest companies around the country.

So it can't get to a higher level in the U.S. That evolution over time and the increasing importance of this is clear. It's now continuing to progress on the world stage, as the report details. 

Faster Copycats

Q: Any changes to the report between 2020 to 2021 reports that caught your eye?

If we look at the research process: while the report itself is written within a couple of months, the research behind it is a year long.  I was there for the first part, and I saw a lot of the malicious code attacks.

In particular, one of the biggest stories in the report is the Alex Birsan research <link> that led to malicious code injections and the subsequent dramatic growth in those types of attacks. Alex came out back in February and said "here are Dependency Confusion-type attacks that I staged as a researcher affecting 35 companies." Then, a month or so later there were 10,000 copycat attacks. 

This shows the curiosity of the software development community but also the eyeballs of the adversaries. If they were looking for new attack vectors, they clearly found one within Alex's work and very quickly started exploiting that.
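Mechanically, dependency confusion exploits how many build tools resolve versions. Below is a minimal sketch in Python (the `resolve` function, package name, and index layout are all made up for illustration) of why those copycat attacks work: if a resolver merges candidates from internal and public package indexes and simply picks the highest version, an attacker who publishes a copycat of an internal package name to the public registry with an inflated version number wins the resolution.

```python
def resolve(package, indexes):
    """Pick the highest-versioned candidate across all indexes,
    mimicking a resolver that treats every index as equally trusted."""
    candidates = [index[package] for index in indexes if package in index]
    # Version tuples compare element-wise, e.g. (99, 0, 0) > (1, 4, 2)
    return max(candidates, key=lambda c: c["version"])

# The organization's private index holds the real package...
internal_index = {"acme-internal-utils": {"version": (1, 4, 2), "source": "internal"}}
# ...but an attacker publishes a copycat name with an inflated version publicly.
public_index = {"acme-internal-utils": {"version": (99, 0, 0), "source": "public"}}

chosen = resolve("acme-internal-utils", [internal_index, public_index])
print(chosen["source"])  # prints "public" – the attacker's package wins
```

The usual mitigations follow directly from this sketch: pin exact versions, scope internal names so they can't be shadowed, or configure the resolver to never consult the public index for internal packages.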

When it comes to security vulnerabilities and remediation in software development, you're only as good as your fastest attackers. After all, your adversaries can look at a new attack vector and begin to exploit it within a couple of days. So you're only really safe if your organization can find and remediate vulnerabilities within a day or two.

On the other hand, if it takes a couple of months to find, identify, and remediate vulnerabilities, then during that window you're in a very precarious position. That's just one instance of one type of attack that adversaries are using to penetrate open source software within applications. So you have to look at best practices in both mean time to update and mean time to remediate internally.

You also have to look at how fast the adversaries are at attacking those potential vulnerable points within your applications.

Complexity vs. Fundamentals

Q: How has your perspective on this topic changed since your move to the Linux Foundation?

Derek Weeks: One thing that's clear in this report and has been for years: it doesn't matter which company you work with. There's no silver bullet to resolve this. The issue revolves around:

  • What is the open source development community doing as stakeholders?

  • What are technology companies doing as stakeholders, introducing technological decisions?

  • What are our organizations or communities doing in terms of training developers or training the security community?

  • What is government doing to encourage or mandate actions taking place within innovation to protect the critical nature of the infrastructure, applications, and data behind them? 

There's not one company, angle, or specific solution that can fix this problem. It is a very, very large-scale issue. Tooling is certainly part of that but you need a multi-faceted set of solutions. 

I was just speaking with a CSO this morning. He said it's really interesting being in that role and seeing the complex nature of attacks that can happen within an organization. You can get into really nuanced areas in this space, but 95% of what happens is just at the basics level.

In the case of open source:

  • Did you choose a good open source component or a bad one? 

  • If you know the difference between good and bad, how are you evaluating that? 

  • What evaluation criteria were used?

  • How did that impact your choice on a particular open source project or version?

More education and best practices are needed, but we also need a fundamental approach to the problem.  This foundation is needed before we think about a level 2, 3, 4, or 5 of maturity. As an industry, we need to do more work at level 1.

There are a lot of things going on in this community and there's a lot of education that needs to happen on all fronts.

Active Projects

Q: One of the surprising statistics from the report says that only 6% of component releases across the four studied ecosystems are actually being used in production.  That suggests that some 94% of open source is almost dormant.

Derek Weeks: It makes sense looking at it, but don't let that bend your perception of the criticality of this. If there are 37 million component releases and only 6% of that 37 million are being used, that is still a huge number, right? That means roughly 2.2 million component releases are the most popular. You can't even picture 2.2 million of anything, so even at 6%, don't let that small percentage fool you; the numbers are still huge.
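A quick back-of-the-envelope check of that scale (using the report's 37 million figure and the 6% share quoted above):

```python
# 6% of 37 million component releases – the "widely used" slice is still enormous.
total_releases = 37_000_000
in_production_share = 0.06
widely_used = total_releases * in_production_share
print(f"{widely_used:,.0f}")  # prints "2,220,000"
```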


Written by Luke Mcbride

Luke is a writer at Sonatype covering everything from open source licenses and liability to DevSecOps trends and container security.