Following the software supply chain attack on SolarWinds and the worldwide panic over the vulnerability affecting Log4j, government and regulatory bodies around the world have been trying to address a looming problem: how do you secure and protect software supply chains as they become a greater target for cybersecurity attacks?
In the United States, the two Presidential Executive Orders of February 2021 and May 2021 started the conversation about protecting critical U.S. federal systems from cyberattacks. This has since turned into a steady drumbeat of activity that is intensifying and spreading beyond the borders of the United States and into the private sector.
For example:
- NIST has put out several publications, including the comprehensive guidance in "Software Security in Supply Chains."
- OMB issued another memorandum called "Enhancing the Security of the Software Supply Chain through Secure Software Development Practices."
- Several legislative routes are being discussed, including the Securing Open Source Act of 2022 (introduced in the Senate in September).
- There are industry initiatives like the Linux Foundation's OpenSSF Open Source Software Security Mobilization Plan (which I’ve been a part of) that aim to provide guidance on the topic as well.
Since this is a global concern, other governments are also acting:
- In July 2022 the UK government issued a Proposal for Legislation to "Improve the UK's Cyber Resilience," which highlights the immense impact even small security risks in the supply chain can have.
- Germany issued the IT Security Act 2.0 (IT-SiG 2.0).
- Japan passed the "Act on Promotion of Economic Security by Integrated Implementation of Economic Measures."
- This is on top of European Union publications including the 2021 report titled "Understanding the increase in Supply Chain Security Attacks," and most recently, the main topic of this article, the proposed Cyber Resilience Act.
Requiring SBOMs was a starting point
From the beginning, I was (and still am) a big proponent of SBOMs. At Sonatype, we contributed the initial security extension to CycloneDX several years ago because we wanted to use it as the basis of our integration APIs instead of inventing yet another format. We continue to be active in driving adoption of this and other standards across the ecosystems we influence, and of course we support ingesting and exporting SBOMs in our tools.
I was happy to see the Executive Order making SBOMs a requirement for sales to the US government, something that had been socialized around Congress since 2013. This was an important catalyst for the industry. However, in talking to prospects and others since then, I have realized that we missed the mark a tiny bit. Too many people are focusing only on "I need to produce an SBOM, by any means necessary," rather than "I really should be managing my supply chain… and if I do that, then, of course, I can emit an SBOM from my tooling."
I've used the metaphor that ONLY requiring SBOMs is a bit like telling auto manufacturers that all they need to do is print out and include a bill of materials in the glove box of every car they sell. We know that would be ridiculous in the physical world. Neither governments nor the public would be okay without the ability to recall bad or defective car parts.
Yet, that is what so many organizations have taken from the US Executive Order requiring SBOMs, focusing only on being able to ship the bill of materials without taking a step back to think about how they should really be managing their supply chain. The SBOM is a very, very important tool - and you can’t manage your software supply chain without it - but it is still just the first of many steps if good hygiene is the goal.
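To make the tooling side of this concrete, here is a minimal sketch of what a CycloneDX SBOM (in its JSON form) declares and how downstream tooling can enumerate the components inside. The two component entries are hypothetical examples for illustration, not a real product inventory:

```python
import json

# A minimal CycloneDX-style SBOM document. The component entries below are
# hypothetical examples, not an actual inventory of any product.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "log4j-core",
      "version": "2.14.1",
      "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"
    },
    {
      "type": "library",
      "name": "jackson-databind",
      "version": "2.13.4",
      "purl": "pkg:maven/com.fasterxml.jackson.core/jackson-databind@2.13.4"
    }
  ]
}
"""

def list_components(doc: dict) -> list[str]:
    """Return 'name@version' for every component declared in the SBOM."""
    return [f'{c["name"]}@{c["version"]}' for c in doc.get("components", [])]

sbom = json.loads(sbom_json)
print(list_components(sbom))  # ['log4j-core@2.14.1', 'jackson-databind@2.13.4']
```

The point of the metaphor above is that this document is only an inventory: being able to list what's inside is the prerequisite, but actually managing the supply chain means acting on that inventory, for example identifying and replacing a vulnerable component when a disclosure lands.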
What is the Cyber Resilience Act?
The Cyber Resilience Act (CRA) is the European Union's proposed regulation to combat threats affecting any digital entity and to "bolster cybersecurity rules to ensure more secure hardware and software products."
The European Commission itself describes two main goals that guided the development of this proposal:
- Create conditions for the development of secure products with digital elements by ensuring that hardware and software products are placed on the market with fewer vulnerabilities and ensure that manufacturers take security seriously throughout a product’s life cycle; and
- Create conditions allowing users to take cybersecurity into account when selecting and using products with digital elements.
Why the Cyber Resilience Act is good for software supply chain security
Just like all of the other proposals, the CRA calls for vendors and producers of software to have, among many other things, a detailed understanding of what’s inside their software (an SBOM). However, and most importantly, the CRA demands that we go one step further and have the ability to recall, which implies active management of the entire supply chain. This is the approach we’ve been missing from so many other policies:
From the placing on the market and for the expected product lifetime or for a period of five years after the placing on the market of a product with digital elements, whichever is shorter, manufacturers who know or have reason to believe that the product with digital elements or the processes put in place by the manufacturer are not in conformity with the essential requirements set out in Annex I shall immediately take the corrective measures necessary to bring that product with digital elements or the manufacturer’s processes into conformity, to withdraw or to recall the product, as appropriate. (Page 40, paragraph 12 of the Cyber Resilience Act).
Why the Cyber Resilience Act (might) be bad for open source
With all of the good that the CRA brings in evolving the regulatory conversations past SBOMs, the current draft has some problematic language that could actually hurt the future of open source.
But first, what it gets right about open source. Page 15, Paragraph 10 attempts to exempt, or carve out, open source software (OSS) from the regulations, saying:
In order not to hamper innovation or research, free and open-source software developed or supplied outside the course of a commercial activity should not be covered by this Regulation. This is in particular the case for software, including its source code and modified versions, that is openly shared and freely accessible, usable, modifiable and redistributable.
This is good, even great. OSS and project maintainers should be exempt from regulations that impose liability, as applying it to them would quash innovation and the sharing of ideas via code.
However, in the same paragraph, the CRA attempts to draw a line between commercial and non-commercial use of open source software:
In the context of software, a commercial activity might be characterized not only by charging a price for a product, but also by charging a price for technical support services, by providing a software platform through which the manufacturer monetises other services, or by the use of personal data for reasons other than exclusively for improving the security, compatibility or interoperability of the software.
In other words, it appears that a developer or supplier deriving commercial benefit from the open source software would make it subject to the CRA. While one can see the intent of the language, as it's written, there is A LOT of ambiguity around the phrase "developed or supplied outside the course of commercial activity."
Combine this with page 43, paragraph 4, which addresses the distribution of software, and the text can be interpreted to mean that open source producers, the developers of the software themselves, might be held liable and in violation of this proposed regulation if their open source projects are put to commercial use:
Distributors who know or have reason to believe that a product with digital elements, which they have made available on the market, or the processes put in place by its manufacturer are not in conformity with the essential requirements set out in Annex I shall make sure that the corrective measures necessary to bring that product with digital elements or the processes put in place by its manufacturer into conformity are taken, or to withdraw or recall the product, if appropriate.
Upon identifying a vulnerability in the product with digital elements, distributors shall inform the manufacturer without undue delay about that vulnerability. Furthermore, where the product with digital elements presents a significant cybersecurity risk, distributors shall immediately inform the market surveillance authorities of the Member States in which they have made the product with digital elements available on the market to that effect, giving details, in particular, of the non-conformity and of any corrective measures taken.
Further, on page 44, paragraph 6, the proposed regulation requires distributors to act when a manufacturer ceases operations:
When the distributor of a product with digital elements becomes aware that the manufacturer of that product ceased its operations and, as a result, is not able to comply with the obligations laid down in this Regulation, the distributor shall inform the relevant market surveillance authorities about this situation, as well as, by any means available and to the extent possible, the users of the products with digital elements placed on the market.
What does this mean for open source software?
In terms of what this means for open source software, it's very, very unclear. How would this apply to companies like ours that run Maven Central? Or to other public repositories, like PyPI and npm? Would organizations that provide open source content and derive a commercial benefit from that activity suddenly be willing to shoulder potentially unlimited liability for the content?
Further, since not every open source vulnerability applies to all possible usage of a particular component, it's impossible for a repository like Central or npm to assess the impact of every vulnerability. It is very rare that a vulnerability applies to all uses of a component. So, what is one to do in that situation? If we remove the component to solve an issue for one user, we may cause irreparable damage for another who is using it safely.
Another issue arises from trying to know when an open source project ceases to exist as part of a repository. Many times projects can get to a stable place and stop making active updates, but it is very difficult to tell if that means they are completely unresponsive to future vulnerability disclosures or not. How can a steward of a public repository be expected to know the status of millions of open source projects, and should that steward be in the position to determine whether a project is dormant or dead?
Imagine the unintended consequences
The current wording in the CRA would create a mess for open source. And, I’m almost positive, a very unintended mess that would affect access to the European market.
If this regulation becomes European Union law without further clarifications, the effect on open source software could be quite detrimental. If open source producers and distributors who also derive commercial benefit from developing or distributing open source are suddenly liable for every defect and vulnerability within a public repository, the only logical conclusion is a balkanization of open source.
For responsible stewards of open source repositories, limiting or even shutting down access for potential developers should be a last resort reserved for hostile nations under export controls. But the liability associated with the current regulation would make this a prudent course of action in markets that adopt it as drafted. The consequence would be Central, npm, PyPI, and countless other repositories suddenly becoming inaccessible to the European Union, which would be disastrous both for the EU and for the ecosystem as a whole.
Further, if liability for publishers stands, you can also "imagine" specific projects moving to block contributors from affected countries, resulting in talented developers not being permitted to contribute to open source projects.
Last, but certainly not least, this could very conceivably lead to projects changing licenses to specifically exclude usage inside of products that are shipped to the European Union.
These (hopefully) unintended consequences stem from an otherwise very admirable piece of legislation, one that aims to raise the cybersecurity posture of digital products in a more advanced way than many of its counterparts, including the current legislation and policy landscape in the US. That said, significant work needs to be done for the CRA to avoid potentially catastrophic consequences for European market access and for open source as a whole.
Written by Brian Fox
Brian Fox is a software developer, innovator and entrepreneur. He is an active contributor within the open source development community, most prominently as a member of the Apache Software Foundation and former Chair of the Apache Maven project. As the CTO and co-founder of Sonatype, he is focused on building a platform for developers and DevOps professionals to build high-quality, secure applications with open source components.