
Trusted software supply chains with SigStore

Trojanised libraries are a growing problem in the software supply chain: almost every Java, PHP, Python or Node project pulls in a dozen third-party libraries, which in turn chain-load further libraries. Compiling a Java project or installing a Node or Python one means a continuous stream of third-party code fetched from repositories such as Maven, npm or PyPI, so abuse becomes just a matter of statistics.

SigStore is an interesting development with a chance of becoming an industry standard: it is backed by a number of large players, yet developed in a transparent and open manner. Its objective is clear: provide strong cryptographic signatures for “build artifacts”, which in practice means source code bundles, libraries, container images and everything else we currently use as inputs to software build workflows.

SigStore provides the following functionality:

  • Strong cryptographic signatures ensuring the integrity and authenticity of the signed content.
  • Timestamping and a public, append-only ledger (transparency log) of signing certificates and signed content hashes, preventing signature forgery, signer account takeover, signature backdating and so on.

All of that is done using modern cryptographic algorithms (ECDSA, SHA-2) within the well-established X.509 framework. At a high level, part of this functionality had been in use since the 1990s through the PGP web-of-trust model, which had one major deficiency: it didn’t work outside of relatively small, closed groups. PGP signatures worked great for packages developed as part of a Linux distribution, which could establish an actual web of trust among its package maintainers, who in turn were responsible for similar chains of trust to the upstream packages. They didn’t work well for very large, heterogeneous software distributions such as npm or PyPI, simply because there was no easy, scalable way to establish whether the PGP signatures on the thousand packages imported by a software project were all generated by their respective authors rather than by someone who took over a project with the sole purpose of trojanising it. The opposite side of the spectrum was Microsoft code signing: a proprietary walled garden that applies to a single operating system and doesn’t really cover the scenario of building software out of largely open-source libraries.
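The transparency log behind these guarantees can be pictured as an append-only hash chain: each entry commits to all previous ones, so a past entry cannot be altered or backdated without breaking every later hash. A minimal sketch of the idea (the real Rekor log uses a Merkle tree, not this flat chain):

```python
import hashlib
import json

def entry_hash(prev_hash: str, record: dict) -> str:
    """Hash the previous entry's hash together with the new record."""
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class TransparencyLog:
    """Toy append-only log: each entry chains to its predecessor."""

    def __init__(self):
        self.entries = []  # list of (record, hash) tuples

    def append(self, record: dict) -> int:
        prev = self.entries[-1][1] if self.entries else "0" * 64
        self.entries.append((record, entry_hash(prev, record)))
        return len(self.entries) - 1  # the log index reported to the signer

    def verify(self) -> bool:
        """Recompute the whole chain; any tampering breaks it."""
        prev = "0" * 64
        for record, h in self.entries:
            if entry_hash(prev, record) != h:
                return False
            prev = h
        return True

log = TransparencyLog()
idx = log.append({"artifact_sha256": "…", "cert": "…", "time": 1667481063})
assert log.verify()
# Backdating an existing entry invalidates the chain:
log.entries[0][0]["time"] -= 3600
assert not log.verify()
```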

In its simplest form, SigStore services can be used through the sigstore-python implementation:

$ python -m pip install sigstore
$ echo "Hello world" > test.txt
$ sigstore sign test.txt
Waiting for browser interaction...
Using ephemeral certificate:

Transparency log entry created at index: 6424650
Signature written to file test.txt.sig
Certificate written to file test.txt.crt

What follows after the initial sigstore command is the primary difference from the PGP model, where long-term signing keys are used. SigStore instead relies on short-lived signing keys, issued on demand whenever a signature is needed and bound to an OpenID-established user identity. This means that each time you run sigstore manually from the command line, you need to complete an OpenID Connect/OAuth2 session in a web browser. (Doing this in automated release pipelines would be inconvenient, so those use “ambient credentials”: essentially API tokens and certificates exposed to the pipeline as environment variables.)
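The lifecycle of such an ephemeral key can be sketched conceptually as follows. This is not the real protocol: HMAC stands in for ECDSA (Python's standard library has no ECDSA), and the structures are illustrative; the point is only that the key exists for a single signature and is then discarded:

```python
import hashlib
import hmac
import secrets

def keyless_sign(artifact: bytes, identity: str):
    """Conceptual sketch of the 'keyless' signing flow."""
    key = secrets.token_bytes(32)                # 1. ephemeral key, generated on demand
    cert = {                                     # 2. a short-lived certificate binds the
        "identity": identity,                    #    OIDC-verified identity to the key
        "pubkey": hashlib.sha256(key).hexdigest(),
    }
    sig = hmac.new(key, artifact, hashlib.sha256).hexdigest()  # 3. sign the artifact
    log_entry = {"cert": cert, "sig": sig}       # 4. cert and signature go to the public log
    del key                                      # 5. the key is discarded, never reused
    return sig, cert, log_entry

sig, cert, entry = keyless_sign(b"Hello world\n", "user@example.com")
```

Because the certificate and signature are recorded in the transparency log, verification never needs the (long-gone) private key, only the logged evidence.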

Ultimately, it is your OpenID identity that ends up in the signing certificate: note the attribute in the X509v3 Subject Alternative Name section of the certificate. Another attribute indicates which OpenID provider supplied this authenticated identity to the SigStore workflow. Relying on OpenID allows a very broad range of authentication levels, from simple automated “ambient credentials” to strong, person-bound hardware authenticators for a specific release.

$ openssl x509 -in test.txt.crt -text -noout
        Version: 3 (0x2)
        Serial Number:
        Signature Algorithm: ecdsa-with-SHA384
        Issuer: O =, CN = sigstore-intermediate
            Not Before: Nov  3 13:11:03 2022 GMT
            Not After : Nov  3 13:21:03 2022 GMT
        Subject Public Key Info:
            Public Key Algorithm: id-ecPublicKey
                Public-Key: (384 bit)
                ASN1 OID: secp384r1
                NIST CURVE: P-384
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature
            X509v3 Extended Key Usage: 
                Code Signing
            X509v3 Subject Key Identifier: 
            X509v3 Authority Key Identifier: 

            X509v3 Subject Alternative Name: critical
            CT Precertificate SCTs: 
                Signed Certificate Timestamp:
                    Version   : v1 (0x0)
                    Log ID    : DD:3D:30:6A:C6:C7:11:32:63:19:1E:1C:99:67:37:02:
                    Timestamp : Nov  3 13:11:03.947 2022 GMT
                    Extensions: none
                    Signature : ecdsa-with-SHA256
    Signature Algorithm: ecdsa-with-SHA384
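Note the ten-minute validity window in the certificate above (Not Before to Not After). A verifier does not require the certificate to still be valid when verification runs; what matters, as I understand the model, is that the signing timestamp recorded in the transparency log falls inside that window. A sketch of the check, with the timestamps from the certificate above:

```python
from datetime import datetime, timezone

def signed_within_validity(not_before: datetime, not_after: datetime,
                           log_timestamp: datetime) -> bool:
    """Accept the signature if the timestamp recorded in the transparency
    log falls inside the certificate's short validity window, even if the
    certificate itself has long since expired."""
    return not_before <= log_timestamp <= not_after

# Values from the certificate and SCT shown above:
nb = datetime(2022, 11, 3, 13, 11, 3, tzinfo=timezone.utc)
na = datetime(2022, 11, 3, 13, 21, 3, tzinfo=timezone.utc)
ts = datetime(2022, 11, 3, 13, 11, 3, 947000, tzinfo=timezone.utc)
assert signed_within_validity(nb, na, ts)
```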

Verification of the test artifact does not require any authentication: it relies on cryptographic verification of the signature against the file contents, plus a check that the signing certificate and signature are present in the online transparency log:

$ sigstore verify test.txt
OK: test.txt

If the file contents have been tampered with, verification fails:

$ sigstore verify test.txt
FAIL: test.txt
Failure reason: Signature is invalid for input
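Conceptually, this failure comes down to a digest mismatch: the signature covers the SHA-256 digest of the original content, and any change to the file, however small, produces a different digest. Illustrated with the standard library:

```python
import hashlib

original = b"Hello world\n"
signed_digest = hashlib.sha256(original).hexdigest()  # digest covered by the signature

tampered = b"Hello w0rld\n"                           # one changed byte
assert hashlib.sha256(tampered).hexdigest() != signed_digest  # verification fails
```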

The above verification will accept any cryptographically valid signature on the file, but it doesn’t establish any link to the signer’s identity. Two extra options provide that:

$ sigstore verify --cert-email --cert-oidc-issuer test.txt
OK: test.txt

The --cert-email option requires, for the signature to be considered valid, that it was created by a particular identity confirmed through OpenID Connect (OIDC). The second option, --cert-oidc-issuer, requires that this identity was certified by a particular OIDC provider. The latter alone could be used, for example, to validate all signatures issued by a particular organisation without limiting their validity to a particular signer.

Signing and verification can be embedded into automated build pipelines, as demonstrated in one of my Ansible projects, where the release.yml pipeline creates a GitHub release, signs build artifacts with SigStore, attaches them to the release and then triggers an import into Ansible Galaxy. The key part of the pipeline control file is written in the declarative GitHub Actions language in YAML format. There’s no explicit OpenID step here; as explained above, sigstore uses “ambient credentials” provided by the GitHub platform:

      - name: Sign release with Sigstore
        uses: sigstore/gh-action-sigstore-python@v0.0.9
        with:
          inputs: ${{ steps.version.outputs.version }}.tar.gz
          release-signing-artifacts: true
          upload-signing-artifacts: true

      - name: upload signed asset
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ github.token }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ${{ steps.version.outputs.version }}.tar.gz
          asset_name: ${{ steps.version.outputs.version }}.tar.gz
          asset_content_type: application/gzip

      - name: upload sigstore certificate
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ github.token }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ${{ steps.version.outputs.version }}.tar.gz.crt
          asset_name: ${{ steps.version.outputs.version }}.tar.gz.crt
          asset_content_type: application/x-x509-ca-cert

      - name: upload sigstore signature
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ github.token }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ${{ steps.version.outputs.version }}.tar.gz.sig
          asset_name: ${{ steps.version.outputs.version }}.tar.gz.sig
          asset_content_type: application/octet-stream

      - name: Build and Deploy Collection
        uses: 0x022b/galaxy-role-import-action@1.0.0
        with:
          galaxy_api_key: '${{ secrets.ANSIBLE_GALAXY_TOKEN }}'

This is not implemented by the Ansible Galaxy platform as of the time of writing, but the logical next steps would be:

  • Ansible Galaxy checks whether a release imported from GitHub is accompanied by .sig and .crt files and, if so, verifies them using sigstore, importing the role only if the signature is valid (the maintainer could also configure the presence of a signature to be mandatory).
  • The ansible-galaxy utility on the consumer system (which may be an end user’s machine or another build pipeline) does the same: it checks for the presence of signature files and verifies them when the role is downloaded.
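The consumer-side check described above boils down to pairing each release asset with its .sig and .crt companions and refusing to proceed where either is missing. A sketch of that pairing logic (hypothetical, since as noted ansible-galaxy does not implement it yet):

```python
def plan_verification(assets: list[str]) -> dict[str, dict]:
    """For every artifact in a release, locate its .sig and .crt
    companions; a missing companion (None) means verification cannot
    proceed for that artifact."""
    names = set(assets)
    plan = {}
    for asset in assets:
        if asset.endswith((".sig", ".crt")):
            continue  # companion files, not artifacts themselves
        plan[asset] = {
            "sig": asset + ".sig" if asset + ".sig" in names else None,
            "crt": asset + ".crt" if asset + ".crt" in names else None,
        }
    return plan

release = ["role-1.2.3.tar.gz", "role-1.2.3.tar.gz.sig", "role-1.2.3.tar.gz.crt"]
plan = plan_verification(release)
assert plan["role-1.2.3.tar.gz"]["sig"] == "role-1.2.3.tar.gz.sig"
```

Each artifact with a complete pair would then be handed to sigstore for actual cryptographic verification; a policy flag could make an incomplete pair a hard failure.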

Find me on the Fediverse, feel free to comment!