Removing PGP from PyPI (2023)
59 points
7 hours ago
| 16 comments
| blog.pypi.org
woodruffw
5 hours ago
[-]
This is slightly old news. For those curious, PGP support on the modern PyPI (i.e. the new codebase that began to be used in 2017-18) was always vestigial, and this change merely polished off a component that was, empirically[1], doing very little to improve the security of the packaging ecosystem.

Since then, PyPI has been working to adopt PEP 740[2], which both enforces a more modern cryptographic suite and signature scheme (built on Sigstore, although the design is adaptable) and is bootstrapped on PyPI's support for Trusted Publishing[3], meaning that it doesn't have the fundamental "identity" problem that PyPI-hosted PGP signatures have.

The hard next step from there is putting verification in client hands, which is the #1 thing that makes any signature scheme actually useful.

[1]: https://blog.yossarian.net/2023/05/21/PGP-signatures-on-PyPI...

[2]: https://peps.python.org/pep-0740/

[3]: https://docs.pypi.org/trusted-publishers/

reply
westurner
42 minutes ago
[-]
It's good that PyPI signs whatever is uploaded to PyPI using PyPI's key now.

GPG ASC support on PyPI was nearly as useful as uploading signatures to sigstore.

1. Is it yet possible to - with pypa/twine - sign a package uploaded to PyPI, using a key that users somehow know to trust as a release key for that package?

2. Does pip check software publisher keys at package install time? Which keys does pip trust to sign which package?

3. Is there any way to specify which keys to trust for a given package in a requirements.txt file?

4. Is there any way to specify which keys to trust for a version of a given package with different bdist releases, with Pipfile.lock, or pixi or uv?

People probably didn't GPG sign packages on PyPI because signing with a registered key/DID wasn't easy, and wasn't required in order to upload.

Anyone can upload a signature for any artifact to sigstore. Sigstore is a centralized cryptographic signature database for any file.

Why should package installers trust that a software artifact publisher key [on sigstore or the GPG keyserver] is a release key?

gpg --recv-keys downloads a public key for a given key fingerprint over HKP/HKPS (HTTPS with the same CA cert bundle as everything else).
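As an illustration, an HKP lookup is just an HTTPS GET against a well-known path. A minimal sketch (the keyserver hostname is only an example, and this builds the URL without fetching anything):

```python
# Build an HKP "get" lookup URL for a key fingerprint, per the HKP draft
# spec's /pks/lookup interface. The keyserver host is an example.
from urllib.parse import urlencode

def hkp_lookup_url(fingerprint: str, keyserver: str = "keys.openpgp.org") -> str:
    # op=get returns the key itself; options=mr asks for machine-readable output
    query = urlencode({"op": "get", "options": "mr", "search": "0x" + fingerprint})
    return f"https://{keyserver}/pks/lookup?{query}"
```

Fetching that URL returns an ASCII-armored public key, which is why the transport trust here reduces to the ordinary HTTPS CA bundle.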

GPG keys can be wrapped as W3C DIDs FWIU.

W3C DIDs can optionally be centrally generated (like LetsEncrypt with the ACME protocol).

W3C DIDs can optionally be centrally registered.

GPG or not, each software artifact publisher key must be retrieved over a different channel than the packages.

If PyPI acts as the (package, release_signing_key) directory and/or the keyserver, is that any better than hosting .asc signatures next to the downloads?

GPG signatures and wheel signatures were and are still better than just checksums.

Why should we trust that a given key is a release signing key for that package?

Why should we trust that a release signing key used at the end of a [SLSA] CI build hasn't been compromised?

How do clients grant and revoke their trust of a package release signing key with this system?

... With GPG or [GPG] W3C DIDs or whichever key algo and signed directory service.

reply
woodruffw
26 minutes ago
[-]
> It's good that PyPI signs whatever is uploaded to PyPI using PyPI's key now.

You might have misunderstood -- PyPI doesn't sign anything with PEP 740. It accepts attestations during upload, which are equivalent to bare PGP signatures. The big difference between the old PGP signature support and PEP 740 is that PyPI actually verifies the attestations on uploads, meaning that everything that gets stored and re-served by PyPI goes through a "can this actually be verified" sanity check first.

I'll try to answer the others piecewise:

1. Yes. You can use twine's `--attestations` flag to upload any attestations associated with distributions. To actually generate those attestations you'll need to use GitHub Actions or another OIDC provider currently supported as a Trusted Publisher on PyPI; the shortcut for doing that is to enable `attestations: true` while uploading with `gh-action-pypi-publish`. That's the happy path that we expect most users to take.
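In workflow terms, that happy path looks roughly like the following (a sketch: the `pypi` environment name is a placeholder, and the exact inputs should be checked against the `gh-action-pypi-publish` docs):

```yaml
jobs:
  pypi-publish:
    runs-on: ubuntu-latest
    environment: pypi        # placeholder; the environment registered as a Trusted Publisher
    permissions:
      id-token: write        # lets the job mint the OIDC token Trusted Publishing needs
    steps:
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          attestations: true # generate and upload PEP 740 attestations
```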

2. Not yet; the challenge there is primarily technical (`pip` can only vendor pure Python things, and most of PyCA cryptography has native dependencies). We're working on different workarounds for this; once they're ready `pip` will know which identity - not key - to trust based on each project's Trusted Publisher configuration.

3. Not yet, but this is needed to make downstream verification in a TOFU setting tractable. The current plan is to use the PEP 751 lockfile format for this, once it's finished.

4. That would be up to each of those tools to implement. They can follow PEP 740 to do it if they'd like.

I don't really know how to respond to the rest of the comment, sorry -- I find it a little hard to parse the connections you're drawing between PGP, DIDs, etc. The bottom line for PEP 740 is that we've intentionally constrained the initial valid signing identities to ones that can be substantiated via Trusted Publishing, since those can also be turned into verifiable Sigstore identities.

reply
politelemon
5 hours ago
[-]
This feels like perfect being the enemy of good enough. There are examples where the system falls over but that doesn't mean that it completely negates the benefits.

It is very easy to get blinkered into thinking that the specific problems they're citing absolutely need to be solved; there's also quite possibly an element of using those problems as an excuse to reduce maintenance overhead without understanding the benefits.

reply
creatonez
5 hours ago
[-]
Its benefits are completely negated in real-world use. See https://blog.yossarian.net/2023/05/21/PGP-signatures-on-PyPI... - the data suggests that nobody is verifying these PGP signatures at all.
reply
dig1
4 hours ago
[-]
I stopped reading after this: "PGP is an insecure [1] and outdated [2] ecosystem that hasn't reflected cryptographic best practices in decades [3]."

The first link [1] suggests avoiding encrypted email due to potential plaintext CC issues and instead recommends Signal or (check this) WhatsApp. However, with encrypted email, I have (or can have) full control over the keys and infrastructure, a level of security that Signal or WhatsApp can't match.

The second link [2] is Moxie's rant, which I don't entirely agree with. Yes, GPG has a learning curve. But instead of teaching people how to use it, we're handed dumbed-down products like Signal (I've been using it since its early days as a simple SMS encryption app, and I can tell you, it's gone downhill), which has a brilliant solution: it forces you to remember (or rather, write down) a huge random hex monstrosity just to decrypt a database backup later. And no, you can't change it.

Despite the ongoing criticisms of GPG, no suitable alternative has been put forward, and the likes of Signal, Tarsnap, and others [1] simply don't cut it. Many other projects that have been running for years (with relatively good security track records, like the kernel, Debian, or CPAN) have no problem with GPG. That's my 5c.

[1] https://latacora.micro.blog/2019/07/16/the-pgp-problem.html

[2] https://moxie.org/2015/02/24/gpg-and-me.html

[3] https://blog.cryptographyengineering.com/2014/08/13/whats-ma...

reply
wkat4242
4 hours ago
[-]
Yeah, I still use PGP a lot, especially because of hardware-backed tokens (on YubiKeys and OpenPGP cards), which I use a lot for file encryption. The good thing is that there are toolchains for all desktop OSes and mobile (Android, with OpenKeychain).

I'm sure there are better options, but they're not as ubiquitous. I use it for file encryption, password management (pass), and SSH login, and everything works on all my stuff, with hardware tokens. Even on my tablet, where Samsung cheaped out by not including NFC, I can use the USB port.

Replacements like FIDO2 and age fall short by not supporting all the use cases (file encryption for FIDO2, hardware tokens for age) or by not having a complete toolchain on all platforms.

reply
nick__m
4 hours ago
[-]
I use PKCS#11 on my YubiKeys; would I gain something by using the PGP functionality instead?
reply
wkat4242
3 hours ago
[-]
Easier tooling, at least on Linux. PKCS#11 requires a dynamically linked library (.so) which you pass to programs using it (like SSH), which is a bit annoying, and because it's binary linking it's not an API you can easily debug. It tends to just crash, especially with version mismatches. The GPG agent API is easier for this: it works over a socket and can even be forwarded over SSH connections.

Usually you end up using OpenSC/OpenCT for that. The tooling to manage the virtual smartcards is also usually not as easy. I haven't used PIV for this (which is probably what you use on the YubiKey to get PKCS#11), but it was much harder to get going than simply using gpg-agent/scdaemon/libpcscd and the command "gpg --card-edit" to configure the card itself.

It's still not quite easy, but I found it easier than PKCS#11. I used that before with JavaCards and SmartHSM cards. The good thing about the PIV functionality is that it integrates really well with Windows Active Directory, which PGP of course doesn't. So if you're on Windows, PIV (with PKCS#11) is probably the way to go (or FIDO2, but that's more Windows Hello for Business than AD functionality; it depends on whether you're a legacy or modern shop).

The big benefit of yubikeys over smartcards is that you can use the touch functionality to approve every use of the yubikey, whereas an unlocked smartcard will approve every action without a physical button press (of course because it doesn't have such a button).

reply
aborsy
1 hour ago
[-]
I second this!
reply
Diti
4 hours ago
[-]
The article you linked to doesn't seem to say anything about "nobody verifying PGP signatures". We would need PyPI to publish their Datadog & Google Analytics data, but I'd say the set of users who actually verify OpenPGP signatures intersects with the set of users faking/scrambling telemetry.
reply
woodruffw
4 hours ago
[-]
I wrote the blog post in question. The claim that "nobody is verifying PGP signatures (from PyPI)" comes from the fact that around 1/3rd had no discoverable public keys on what remains of the keyserver network.

Of the 2/3rd that did have discoverable keys, ~50% had no valid binding signature at the time of my audit, meaning that obtaining a living public key has worse-than-coin-toss odds for recent (>2020) PGP signatures on PyPI.

Combined, these datapoints (and a lack of public noise about signatures failing to verify) strongly suggest that nobody was attempting to verify PGP signatures from PyPI at any meaningful scale. This was more or less confirmed by the near-zero amount of feedback PyPI got once it disabled PGP uploads.
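As a back-of-envelope check of the coin-toss claim, using just the percentages above:

```python
# Rough arithmetic on the audit numbers quoted above: ~1/3 of keys were
# not discoverable at all, and ~50% of the discoverable ones had no
# valid binding signature at audit time.
discoverable = 2 / 3        # keys findable on the major keyservers
valid_of_those = 0.5        # of those, fraction with a live binding signature
p_usable = discoverable * valid_of_those
print(round(p_usable, 3))   # 0.333, i.e. worse than a coin toss (0.5)
```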

reply
opello
1 hour ago
[-]
This all makes sense.

PEP 740 mentions:

> In their previously supported form on PyPI, PGP signatures satisfied considerations (1) and (3) above but not (2) (owing to the need for external keyservers and key distribution) or (4) (due to PGP signatures typically being constructed over just an input file, without any associated signed metadata).

It seems to me that the infrastructure investment in sigstore.dev vs. PGP is somewhat arbitrary. For example, on the PGP side, a PyPI keyserver and tooling to validate uploads could address (2) above, and (4) could be handled similarly to PEP 740, with signatures over provenance objects. Maybe Sigstore is "just way better", but it doesn't seem so cut-and-dried a technical argument from the things discussed in these comments and the linked material.

It's perfectly responsible to make a choice. It just seems unclear what the scope-of-work difference would be, despite a somewhat implicit suggestion across the discussions and links in the comments that it was great. Maybe that's an unreasonable level of detail to expect? But the general "dogging on PGP" is what I've found disappointing in my casual brush with this particular instance of PGP coming up in the news.

reply
woodruffw
1 hour ago
[-]
(2) is addressed by Sigstore having its own infrastructure and a full-time rotation staff. PyPI doesn't need to run or operationalize anything, which is a significant relief compared to the prospect of having to operationalize a PGP keyserver with volunteer staffing.

(I'm intentionally glossing over details here, like the fact that PyPI doesn't need to perform any online operations to validate Sigstore's signatures. The bottom line is that everything about it is operationally simpler and more modern than could be shaped out of the primitives PGP offers.)

(4) could be done with PGP, but would go against the long-standing pattern of "sign the file" that most PGP tooling is ossified around. It also doesn't change the fact that PGP's signing defaults aren't great, that there's a huge tail of junk signing keys out there, and that to address those problems PyPI would need to be in the business of parsing PGP packets during package upload. That's just not a good use of anybody's time.

reply
opello
20 minutes ago
[-]
> having its own infrastructure

This seems like a different brand of the keyserver network?

> PyPI doesn't need to run or operationalize anything

So it's not a new operational dependency because it's index metadata? That seems more like an implementation detail (aside from the imagined PGP keyserver dependency), one that could be accommodated in either system.

> like the fact that PyPI doesn't need to perform any online operations to validate Sigstore's signatures

I may be missing something subtle (or glaring) but "online operations" would be interactions with some other service or a non-PSF service? Or simply a service not-wholly-pypi? Regardless, the index seems like it must be a "verifier" for design consideration (2) from PEP 740 to hold, which would mean that the index must perform the verification step on the uploaded data--which seems inconsequentially different between an imagined PGP system (other than it would have to access the imagined PyPI keyserver) and sigstore/in-toto.

> ... PyPI would need to be in the business of parsing PGP packets during package upload.

But the sigstore analog is the JSON array of in-toto attestation statement objects.

reply
jacques_chester
5 hours ago
[-]
Maintaining this capability isn't free, it's of dubious benefit, and there are much better alternatives.

On a cost benefit analysis this is a slam dunk.

reply
nightfly
5 hours ago
[-]
What are these "much better alternatives"?
reply
arccy
5 hours ago
[-]
https://www.sigstore.dev/

The emerging standard for verifying artifacts, e.g. in container image signing, npm, Maven, etc.

https://blog.sigstore.dev/npm-public-beta/

https://www.sonatype.com/blog/maven-central-and-sigstore

reply
binary132
1 hour ago
[-]
Emerging standard = not yet the standard
reply
bjornsing
5 hours ago
[-]
Does it matter much if the key can be verified? I mean, it seems like a pretty big step up security-wise to know that a new version of a package is signed with the same key as previous versions.
reply
woodruffw
4 hours ago
[-]
> I mean it seems like a pretty big step up security wise to know that a new version of a package is signed with the same key as previous versions.

A key part of the rationale for removing PGP uploads from PyPI was that you can't in fact know this, given the current state (and expected future) of key distribution in PGP.

(But also: yes, it's indeed important that the key can be verified i.e. considered authentic for an identity. Without that, you're in "secure phone call with the devil" territory.)

reply
jacques_chester
6 hours ago
[-]
I performed a similar analysis on RubyGems and found that, of the top 10k most-downloaded gems, less than one percent had valid signatures. That, plus the general hassle of managing key material, means this was a dead end for large-scale adoption.

I'm still hopeful that sigstore will see wide adoption and bring authorial attestation (code signing) to the masses.

reply
wnissen
5 hours ago
[-]
I agree, where is the LetsEncrypt for signing? Something you could download and get running in literally a minute.
reply
arccy
5 hours ago
[-]
reply
crote
27 minutes ago
[-]
I don't think Sigstore is a good example. I just spent half an hour trying to understand it, and I am still left with basic questions like "Does it require me to authenticate with GitHub & friends, or can I use my own OIDC backend?" It seems like you can, but there are cases where you need to use a blessed OIDC provider, but you can override that while self-hosting, and there are config options for the end user to specify any OIDC provider? But the entire trust model also relies on the OIDC backend being trustworthy?

The quickstart guide looks easy enough to follow, but it seems nobody bothered to document what exactly is happening in the background, and why. There's literally a dozen moving pieces and obscure protocols involved. As an end user, Sigstore looks like a Rube Goldberg trust machine to me. It might just as well be a black box.

PGP is easy to understand. LetsEncrypt is easy to understand. I'm not an expert on either, but I am reasonably certain I can explain them properly to the average highschooler. But Sigstore? Not a chance - and in my opinion that alone makes it unsuitable for its intended use.

reply
Diti
4 hours ago
[-]
Specifically, the CA signing the code certificates (that are valid for 10 minutes) is https://github.com/sigstore/fulcio.
reply
SethMLarson
4 hours ago
[-]
I've authored a proposal to deprecate the expectation of PGP signatures for future CPython releases. We've been providing Sigstore signatures for CPython since 3.11.

https://peps.python.org/pep-0761/

reply
aborsy
1 hour ago
[-]
What’s the current best solution for associating a public key to an identity or person?

This is not related to cryptography protocols.

OpenPGP key server verifies email. Keybase was a good idea but seems dead. Maybe identity providers?

reply
woodruffw
58 minutes ago
[-]
> Maybe identity providers?

That's essentially all Sigstore is: it uses an identity provider to bind an identity (like an email) to a short-lived signing key.
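As a deliberately toy, stdlib-only model of that binding (the real system uses X.509 certificates from Fulcio and asymmetric keys, not HMAC, and this is not Sigstore's actual protocol or API):

```python
# Toy model of the core idea: an identity provider attests to an identity,
# a short-lived key signs the artifact, and the verifier checks both the
# signature and the identity binding. Illustrative only.
import hashlib
import hmac
import os
import time

def issue_ephemeral_credential(identity: str, issuer: str, ttl: int = 600):
    """Stand-in for the CA: bind an identity to a freshly generated key."""
    key = os.urandom(32)  # short-lived signing key
    cert = {"identity": identity, "issuer": issuer, "expires": time.time() + ttl}
    return key, cert

def sign(key: bytes, artifact: bytes) -> str:
    return hmac.new(key, hashlib.sha256(artifact).digest(), "sha256").hexdigest()

def verify(key: bytes, cert: dict, artifact: bytes, sig: str,
           expected_identity: str, expected_issuer: str) -> bool:
    # Check the identity binding, the credential lifetime, and the signature.
    return (cert["identity"] == expected_identity
            and cert["issuer"] == expected_issuer
            and time.time() < cert["expires"]
            and hmac.compare_digest(sign(key, artifact), sig))
```

The point of the sketch: the verifier never manages a long-lived per-developer key; it asks "was this signed by a key the provider bound to *this* identity, while the binding was still valid?"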

reply
dang
53 minutes ago
[-]
Discussed at the time:

Removing PGP from PyPI - https://news.ycombinator.com/item?id=36044543 - May 2023 (187 comments)

reply
troismph
1 hour ago
[-]
I am curious why we still need PyPI to hold packages: it may be better to install from GitHub.

GitHub provides a much better integrated experience: source code, issues, docs, etc.

reply
MPSimmons
1 hour ago
[-]
I don't think this is that terrible an idea, actually. Before PyPI disabled searching, I'd say the value of centralization came from that, and possibly from security, but I think any claim of security from a central repo is deluding ourselves these days. There are so many opportunities for supply chain attacks that maybe this isn't actually worse. Requiring pip to refer to a GitHub owner/repo might eliminate some of the squatter problems we have, too.
reply
rurban
5 hours ago
[-]
On the other hand, PGP keys were widely successful for CPAN, the Perl 5 repo. It's very simple to use there, not as complicated as with PyPI.
reply
0xbadcafebee
5 hours ago
[-]
I dunno. I mean, sure, it's a worldwide-mirrored, cryptographically secure, curated, hierarchically and categorically organized, simple set of flat files, with multiple separate community projects, to test all packages on all supported Perl versions and platforms, with multiple different frontends, bug tracking, search engines, documentation hubs, security groups, and an incredibly long history of support and maintenance by the community.

But it's, like, old. You can't make something new be like something old. That's not cool. If what we're doing isn't new and cool, what is the point even?

reply
upofadown
4 hours ago
[-]
> Of those 1069 unique keys, about 30% of them were not discoverable on major public keyservers, making it difficult or impossible to meaningfully verify those signatures. Of the remaining 71%, nearly half of them were unable to be meaningfully verified at the time of the audit (2023-05-19).

A PGP keyserver provides no identity verification. It is simply a place to store keys. So I don't understand this statement. What is the ultimate goal here? I thought that things like this mostly provided a consistent identity for contributing entities with no requirement to know who the people behind the identities actually were in real life.

reply
woodruffw
4 hours ago
[-]
You're thinking one step past the failure state here: the problem isn't that keyservers don't provide identity verification, but that the PGP key distribution ecosystem isn't effectively delivering keys anymore.

There are probably multiple reasons for this, but the two biggest ones are likely (1) that nobody knows how to upload keys to keyservers anymore, and (2) that keyservers don't gossip/share keys anymore, following the SKS network's implosion[1].

Or put another way: a necessary precondition of signature verification is key retrieval, whether or not trust in a given key identity (or claimant human identity) is established. One of PGP's historic strengths was that kind of key retrieval, and the data strongly suggests that that's no longer the case.

[1]: https://gist.github.com/rjhansen/67ab921ffb4084c865b3618d695...

reply
upofadown
2 hours ago
[-]
The SKS keyserver thing was 5 years ago. It seems to be working. Was uploading a key somewhere a requirement for submitting to PyPI? Why were the keys not available from PyPI?

It just seems to me that there wasn't anything here in the first place. Something something PGP keys. Perhaps they were hoping for someone to come along and make a working system and no one ever did.

reply
woodruffw
2 hours ago
[-]
Could you clarify: which part seems to be working? The SKS servers certainly aren't, and the keyservers that are currently online don't appear to gossip or share keys with each other. That's why the post's dataset comes from querying the biggest/most popular ones manually.

> Was uploading a key somewhere a requirement for submitting to PyPi?

Where would "somewhere" be? If it was PyPI itself (or a server controlled by PyPI), replacing the key material would be trivial and would largely defeat the purpose of having signatures instead of just hashes.

In the past, "somewhere" could have been a gossiping SKS server. But that would tie PyPI's reliability and availability to that of the SKS network, which was never great even at its prime.

> Why were the keys not available from PyPi?

For the reason mentioned above: if PyPI is trusted to distribute the key material, then an attacker can simply replace the keys used to sign for the package. This makes it no better than having PyPI distribute hashes (which it already does), but a lot more complicated.

To my understanding, the reason PyPI originally accepted PGP keys is because someone asked for it and baseline expectations around security were more laissez-faire at the time: there was no baseline expectation that millions of people might be using `pip` (or `easy_install` at the time), and that all of them deserve the same integrity and authenticity properties as a small core of expert users. Those expectations have shifted over time towards the belief that ordinary users should also have signatures accessible to them, and I'm inclined to believe that's a good thing.

reply
ploxiln
4 hours ago
[-]
These keys could have related signatures from other keys that some users or maintainers may have reason to trust.

(But for 30% of keys this was not even theoretically possible, while for another 40% of keys it was not practically possible, according to the article.)

reply
hiatus
5 hours ago
[-]
Should be tagged 2023.
reply
badgersnake
5 hours ago
[-]
Most people do security badly so let’s not do it at all.

Right.

reply
otikik
5 hours ago
[-]
Unfortunately we live in a world of limited time and resources, and priorities need to be adjusted accordingly.

Honestly, I would put the blame on PGP. It has a … special UX. I tried to use it on 3 separate occasions, and ended up doing something else (probably less secure) because I couldn't manage to make the damn thing work. I might not be a genius, but I am also not completely stupid.

reply
opello
5 hours ago
[-]
Wouldn't another very good answer be for PyPI to host a keyserver and require that keys be sent to it before they can be used in package publishing?
reply
wnissen
5 hours ago
[-]
Wouldn't that make the maintenance burden worse? Now PyPI has to host a keyserver, with its own attack surface. And presumably 99.7% of the keys would be used only for PyPI, so folks would have no incentive to secure them. The two modes that work are either no signing, or mandatory signing like many app stores. The middle way is the worst of both worlds: no security for 99+% of packages, but all the maintenance headache. And mandatory signing raises the possibility that PyPI would be replaced by an alternate repository that's easier to contribute to. The open source world depends to a shocking degree on the volunteer labor of people who have a lot of other things they could be doing with their time, and a "small" speed bump for enhanced security can have knock-on effects that are not small.
reply
opello
5 hours ago
[-]
Sure, it all hinges on whether the signatures provided any value. And the conclusion seems to be that they didn't.

Without something showing "keyservers present an untenable risk" (and Debian, Ubuntu, Launchpad, and others have keyserver infrastructure), it seems like too far a conclusion to reach casually. Of course it adds attack surface, for the simple fact that a public-facing thing would be stood up where once it was not. Though that isn't the kind of trivial conclusion I imagine you had in mind.

I don't see why there's a binary choice between "signing is no longer supported" and "signing is mandatory" when before that wasn't the case. If it truly provided no value, or so small a value with so high a maintenance burden that it harmed the project that way, then it makes sense--but that didn't seem to be the place from which the article argued.

reply
zitterbewegung
5 hours ago
[-]
From here: https://caremad.io/posts/2013/07/packaging-signing-not-holy-... which is linked from the article: since PyPI has so many packages, and anyone can sign up to add a package, it would be extremely unmanageable.
reply
opello
5 hours ago
[-]
That's fair, and I appreciate that detail even without having followed the link in the original article. But while not being "the holy grail", why must the perfect be the enemy of the good, if it was providing value?

I certainly allow for the "if it was providing a value" to be a gargantuan escape hatch through which any other perspective may be removed.

But by highlighting the difficulty in verifying signatures and saying it was because the keys were hard to find (or may have been expired, or had other signing errors per the footnote), a fairly straightforward solution presents itself: add keyserver infrastructure, check it when signed packages are posted, and reject the upload if key verification fails against that keyserver.

All told, it seems like it wasn't providing value, so more resources weren't thrown at the effort. But something about highlighting how "keys being hard to find" helped justify the action doesn't quite pass muster to my mind.

reply
bee_rider
5 hours ago
[-]
I dunno, not all projects are equally important or popular, so it seems to me that the number of downloads which had keys is the better metric to look at.

But if there are fundamental issues with the key system anyway, the percentages don't matter.

reply
woodruffw
4 hours ago
[-]
You're absolutely right that the number of downloads is probably a more important metric! But also yes, I think the basic "can't discover valid keys for a large majority of packages" is a sufficient justification, which is why I went with it :-)

The raw data behind the blog post is archived here[1]. It would be pretty easy to reduce it back down to package names, and see which/what percent of those names are in the top 500/1000/5000/etc. of PyPI packages by downloads. My prediction is that there's no particular relationship between "uploads a PGP key" and popularity, but that's speculative.

[1]: https://github.com/woodruffw/pypi-pgp-statistics
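That reduction could be sketched as follows (hypothetical: the inputs below are made up, not the archived dataset's actual schema or real PyPI data):

```python
# Given the package names that had PGP-signed uploads and a top-N list by
# downloads, compute what fraction of the top N ever uploaded a signature.
def pct_of_top_signed(signed_packages, top_packages):
    signed = set(signed_packages)
    hits = [name for name in top_packages if name in signed]
    return 100.0 * len(hits) / len(top_packages)

# Illustrative call with fake names: 2 of the 4 "top" packages signed.
print(pct_of_top_signed(["alpha", "beta"], ["alpha", "gamma", "delta", "beta"]))  # 50.0
```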

reply
nonameiguess
4 hours ago
[-]
I feel like there is a broader issue being pushed aside here. Verifying a signature means you have a cryptographic guarantee that whoever generated an artifact possessed a private key associated with a public key. That key doesn't necessarily need to be published in a web-facing keystore to be useful. For packages associated with an OS-approved app store or a Linux distro's official repo, the store of trusted keys is baked into the package manager.

What value does that provide? As the installer of something, you almost never personally know the developer. You don't really trust them. At best, you trust the operating system vendor to sufficiently vet contributors to a blessed app store. Whoever published package A is actually a maintainer of Arch Linux. Whoever published app B went through whatever the heck hoops Apple makes you go through. If malware gets through, some sort of process failed and can potentially be remediated.

If you're downloading a package from PyPI or RubyGems or crates.io or whatever, a web repository that does no vetting and allows anyone to publish anything, what assurance is this giving? Great, some package was legitimately published by a person who also published a public key. Who are they exactly? A pseudonym on Github with a cartoon avatar? Does that make them trustworthy? If they publish malware, what process can be changed to prevent that from happening again? As far as I can tell, nothing.

If you change the keystore provider to sigstore, what does that give you? Fulcio just requires that you control an e-mail address to issue you a signing key. They're not vetting you in any way or requiring you to disclose a real-world identity that can be pursued if you do something bad. It's a step up in a limited scope of use cases in which packages are published by corporate entities that control an e-mail domain and ideally use their own private artifact registry. It does nothing for public repositories in which anyone is allowed to publish anything.

Fundamentally, if a public repository allows anyone to publish anything, does no vetting and requires no real identity disclosure, what is the basis of trust? If you're going to say something like "well I'm looking for .whl files but only from Microsoft," then the answer is for Microsoft to host its own repository that you can download from, not for Microsoft to publish packages to PyPI.

There are examples of making this sort of simpler for the consumer to get everything from a single place. Docker Hub, for instance. You can choose to only ever pull official library images and verify them against sigstore, but that works because Docker is itself a well-funded corporate entity that restricts who can publish official library images by vetting and verifying real identities.

reply
exabrial
3 hours ago
[-]
Not Invented Here!
reply