OpenSSF’s Alpha-Omega Project – Good, but not enough.

On February 1, the Linux Foundation’s OpenSSF announced the launch of its Alpha-Omega project, an initiative aimed at securing both the most critical “core ecosystem” open source projects and another 10,000 widely deployed open source projects. Microsoft and Google have kicked in a collective $5 million to help fund this effort.

There are three important points to raise about this initiative. One is technical and two are non-technical. Let’s start with the technical one.

On the surface, the OpenSSF’s initiative has a lofty and noble goal – to secure the open source software supply chain. However, its reliance on sigstore’s technology, particularly cosign and Rekor, is less than optimal. The flaw in the architecture of this system is two-fold. First, cosign is based on certificates and keys – ancient technologies. What happens if you sign billions of artifacts with a certificate and that certificate is later compromised? Will you go back and revoke the trust of all those assets? And what if the trust root itself becomes compromised? These are serious questions that need to be asked and answered.

OpenSSF also plans to use Rekor, which is meant to be an immutable, tamper-resistant ledger that functions as a transparency log of metadata about artifacts in a supply chain. Rekor uses Google’s Trillian for its underlying append-only data structure and requires MySQL or MariaDB and Redis underneath. Ultimately, this leaves room for vulnerability. There are too many moving pieces at play, and more pieces means more components you need to trust – or, really, shouldn’t need to trust (you should be employing Zero Trust). If the underlying layers are not unequivocally immutable, there is room for doubt and risk. To us, that’s far from optimal. The Rekor site also states: “IMPORTANT: This instance is currently operated on a best-effort basis. We will take the log down and reset it with zero notice. We will improve the stability and publish SLOs over time.” Yikes!

Now let’s look at how Codenotary solves these issues with our open source immutable database, immudb, which has become very popular lately. Since immudb is itself an immutable database and doesn’t rely on any external piece of infrastructure, there is no room for doubt and no room for risk. What goes into immudb is stored in immudb, in a tamper-proof and verifiable way. What’s more, the state of the database can always be verified by the client, so you never need to blindly trust what immudb is telling you. It is one single integrated solution that provides all the required functionality. This is a more compelling, more assured, more confident model, and it’s the reason we’ve built Codenotary Cloud on immudb, which can scale to billions of artifacts and notarizations/authentications.
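
To make this concrete, here is a minimal sketch of a verified write and read using immudb’s Go SDK. The address, credentials, database name, and key layout below are placeholders for illustration only:

```go
package main

import (
	"context"
	"log"

	immudb "github.com/codenotary/immudb/pkg/client"
)

func main() {
	ctx := context.Background()

	// Connect to an immudb instance (address and credentials are placeholders).
	opts := immudb.DefaultOptions().WithAddress("127.0.0.1").WithPort(3322)
	client := immudb.NewClient().WithOptions(opts)
	if err := client.OpenSession(ctx, []byte("immudb"), []byte("immudb"), "defaultdb"); err != nil {
		log.Fatal(err)
	}
	defer client.CloseSession(ctx)

	// VerifiedSet writes the entry and returns a cryptographic proof that the
	// client checks against its locally kept database state.
	if _, err := client.VerifiedSet(ctx, []byte("artifact:sha256:abc123"), []byte("notarized")); err != nil {
		log.Fatal(err)
	}

	// VerifiedGet reads the entry back and verifies the proof client-side,
	// so the client never has to blindly trust the server's answer.
	entry, err := client.VerifiedGet(ctx, []byte("artifact:sha256:abc123"))
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("verified value: %s", entry.Value)
}
```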

Aside from these technical issues, there are two other big concerns. While $5 million sounds like a lot of money (and it is), it seems like a paltry sum for carrying out this mission. The depth of penetration of open source into the global software supply chain is daunting. When you start looking at potential Alpha projects, say the Linux kernel, the effort there alone could consume almost all of that funding. How do you then get to all the Omega projects? Which leads us to the second non-technical point.

Did you hear about the Fortune 500 corporation that emailed an open source maintainer to “Demand Answers” about his software package, which they had never paid for and which they had just realized was being used in their software? We did. And we didn’t think it was funny. This story showcases what is perhaps the most obvious truth about how the software supply chain can be secured: by paying maintainers. Any project or initiative, such as the one launched by OpenSSF, will never be complete and never fully actualized unless money is earmarked to pay the maintainers of Omega software. Otherwise, the outcome will just be to identify more holes in a maintainer’s code, which will cause them to work more hours, not fewer, for the same non-existent compensation. Open source has always been about community, and it’s time the community rethinks the value proposition when it comes to paying maintainers.

So on the whole, yes, the OpenSSF’s announcement is a positive step. People are starting to take securing the open source software supply chain seriously. Now we need the right approach in combination with the right technologies to make the dream a reality. It’s clear we still have a long way to go.



Use Case - Tamper-resistant Clinical Trials

Goal:

Blockchain PoCs were unsuccessful due to complexity and lack of developers.

Still, the goal of data immutability as well as client verification is crucial. Furthermore, the system needs to be easy to use and operate (allowing backups, maintenance windows, and so on).

Implementation:

immudb is running in different datacenters across the globe. All clinical trial information is stored in immudb, either as individual transactions or as whole PDF documents.

Having that single source of truth with versioned, timestamped, and cryptographically verifiable records enables a whole new level of transparency and trust.
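
As a rough illustration, and reusing a Go client session opened as in the earlier sketch, a trial document could be stored as a verified entry like this (the key scheme and file path are made-up examples):

```go
// Sketch: store a clinical-trial PDF as a verified, timestamped entry.
// The key scheme "trial:<id>:doc:<name>" is illustrative, not prescriptive.
pdf, err := os.ReadFile("trial-042/consent-form.pdf")
if err != nil {
	log.Fatal(err)
}

// Each write produces a new, verifiable version of the entry; previously
// written versions can never be altered or silently removed.
hdr, err := client.VerifiedSet(ctx, []byte("trial:042:doc:consent-form"), pdf)
if err != nil {
	log.Fatal(err)
}
log.Printf("document stored in tx %d", hdr.Id)
```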

Use Case - Finance

Goal:

Store the source data, the decision, and the rule base for government financial support in a timestamped, verifiable way.

A very important piece of functionality is the ability to compare a historic decision (based on the past rulebase) with the rulebase at a different date. Fully cryptographically verifiable time-travel queries are required to make that comparison possible.

Implementation:

While the source data, the rulebase, and the documented decision are stored as verifiable blobs in immudb, the transaction itself is stored using immudb’s relational layer.

That allows the use of immudb’s time-travel capabilities to retrieve verified historic data and recalculate it against the most recent rulebase.
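
A possible shape of that, again assuming an open Go client session as in the earlier sketch, is outlined below. The table, columns, and values are invented for illustration, and the BEFORE TX temporal syntax should be checked against the immudb version in use:

```go
// Create a table on immudb's relational (SQL) layer.
_, err := client.SQLExec(ctx, `
	CREATE TABLE IF NOT EXISTS decisions (
		id        INTEGER,
		applicant VARCHAR,
		amount    INTEGER,
		rule_ref  VARCHAR,
		PRIMARY KEY id
	)`, nil)
if err != nil {
	log.Fatal(err)
}

// Record a decision together with a reference to the rulebase blob it used.
_, err = client.SQLExec(ctx,
	"INSERT INTO decisions (id, applicant, amount, rule_ref) VALUES (@id, @applicant, @amount, @rule)",
	map[string]interface{}{"id": 1, "applicant": "ACME", "amount": 50000, "rule": "rulebase:v7"})
if err != nil {
	log.Fatal(err)
}

// Time travel: read the table as it was at an earlier transaction, so the
// historic decision can be compared against the current rulebase.
res, err := client.SQLQuery(ctx,
	"SELECT id, applicant, amount, rule_ref FROM decisions BEFORE TX 100", nil, true)
if err != nil {
	log.Fatal(err)
}
_ = res // iterate res.Rows and recalculate with the most recent rulebase
```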

Use Case - eCommerce and NFT marketplace

Goal:

No matter if it’s an eCommerce platform or NFT marketplace, the goals are similar:

  • High transaction volume (potentially millions per second)
  • Ability to read and write multiple records within one transaction
  • Prevent overwrites or updates of existing transactions
  • Comply with regulations (PCI, GDPR, …)


Implementation:

immudb is typically scaled out on hyperscalers (e.g. AWS, Google Cloud, Microsoft Azure) and distributed across the globe. Auditors are also distributed to track the verification proofs over time. Additionally, the shop or marketplace applications store immudb’s cryptographic state information. That high level of integrity and tamper evidence, while maintaining very high transaction speed, is key for companies choosing immudb.
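
One way several related records can be written atomically in a single immudb transaction, assuming an open Go client session as above and the immudb schema package (github.com/codenotary/immudb/pkg/api/schema), might look like this; the keys and JSON values are illustrative:

```go
// Write an order together with its line items in one atomic immudb transaction.
_, err := client.SetAll(ctx, &schema.SetRequest{
	KVs: []*schema.KeyValue{
		{Key: []byte("order:1001"), Value: []byte(`{"buyer":"alice","total":149.90}`)},
		{Key: []byte("order:1001:item:1"), Value: []byte(`{"sku":"NFT-42","qty":1}`)},
		{Key: []byte("order:1001:item:2"), Value: []byte(`{"sku":"NFT-77","qty":2}`)},
	},
})
if err != nil {
	log.Fatal(err)
}
```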

Use Case - IoT Sensor Data

Goal:

Sensor data received by IoT devices collecting environmental data needs to be stored locally in a cryptographically verifiable manner until it is transferred to a central datacenter. The data integrity needs to be verifiable at any given point in time, including while in transit.

Implementation:

immudb runs embedded on the IoT device itself and is continuously audited by external probes. The data transfer required for auditing is minimal and works even with minimal bandwidth and unreliable connections.

Whenever the IoT devices are connected to a high-bandwidth link, the data is transferred to a data center (a large immudb deployment), and the integrity of the data at both source and destination is fully verified.
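
A sketch of what such an external audit probe might do, assuming it can reach the device-local immudb with an open Go client session as in the earlier examples: it periodically fetches the database’s current cryptographic state, which is only a transaction ID plus a root hash and therefore cheap to transfer over a constrained link:

```go
// Audit probe: fetch the current cryptographic state of the device-local
// immudb. Tracking this tiny state over time lets the probe detect any
// tampering with previously stored sensor data.
state, err := client.CurrentState(ctx)
if err != nil {
	log.Fatal(err)
}
log.Printf("db=%s txId=%d rootHash=%x", state.Db, state.TxId, state.TxHash)
```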

Use Case - DevOps Evidence

Goal:

CI/CD and application build logs need to be stored in an auditable, tamper-evident way.
Very high performance is required, as the system should not slow down any build process.
Scalability is key, as billions of artifacts are expected within the next few years.
In addition to integrity validation, data needs to be retrievable by pipeline job ID or digital asset checksum.

Implementation:

As part of the CI/CD audit functionality, data is stored within immudb using its key/value functionality. The key is either the CI/CD job ID (e.g. from Jenkins or GitLab) or the checksum of the resulting build or container image.
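
A minimal sketch of that key/value layout with the Go SDK, assuming an open client session as in the earlier examples; the job ID, checksum, and evidence payload are placeholders:

```go
// Store build evidence keyed by the CI job id and by the image checksum,
// so it can later be looked up and verified either way.
evidence := []byte(`{"pipeline":"release","job":"4711","image":"sha256:9f2a...","status":"passed"}`)

if _, err := client.VerifiedSet(ctx, []byte("job:4711"), evidence); err != nil {
	log.Fatal(err)
}
if _, err := client.VerifiedSet(ctx, []byte("sha256:9f2a..."), evidence); err != nil {
	log.Fatal(err)
}

// Later, retrieve and verify the evidence by either key.
entry, err := client.VerifiedGet(ctx, []byte("job:4711"))
if err != nil {
	log.Fatal(err)
}
log.Printf("evidence: %s", entry.Value)
```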
