The State of OpenSSL for pyca/cryptography
65 points
3 hours ago
| 6 comments
| cryptography.io
| HN
woodruffw
2 hours ago
[-]
I think this part is really worth engaging with:

> Later, moving public key parsing to our own Rust code made end-to-end X.509 path validation 60% faster — just improving key loading led to a 60% end-to-end improvement, that’s how extreme the overhead of key parsing in OpenSSL was.

> The fact that we are able to achieve better performance doing our own parsing makes clear that doing better is practical. And indeed, our performance is not a result of clever SIMD micro-optimizations, it’s the result of doing simple things that work: we avoid copies, allocations, hash tables, indirect calls, and locks — none of which should be required for parsing basic DER structures.
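The "avoid copies and allocations" point can be sketched in miniature. This is not PyCA's parser (theirs is in Rust); it's a hypothetical Python illustration of zero-copy DER tag-length-value (TLV) reading, where memoryview slices stand in for the borrowed byte ranges a Rust parser would use:

```python
# Minimal DER TLV header parser, sketched in Python. Hypothetical code:
# it illustrates the "no copies, no allocations" idea via memoryview
# slices, and omits the strictness checks a real DER parser needs
# (minimal-length encodings, bounds checks, constructed vs primitive).

def parse_tlv(buf: memoryview, offset: int = 0):
    """Parse one DER TLV at `offset`; return (tag, value_view, next_offset)."""
    tag = buf[offset]
    length = buf[offset + 1]
    offset += 2
    if length & 0x80:  # long form: low 7 bits give the byte count of the length
        n = length & 0x7F
        length = int.from_bytes(buf[offset:offset + n], "big")
        offset += n
    value = buf[offset:offset + length]  # a view into buf, not a copy
    return tag, value, offset + length

# Example: a SEQUENCE containing one INTEGER (value 5)
der = memoryview(bytes([0x30, 0x03, 0x02, 0x01, 0x05]))
tag, value, _ = parse_tlv(der)
assert tag == 0x30                       # SEQUENCE
inner_tag, inner_val, _ = parse_tlv(value)
assert inner_tag == 0x02                 # INTEGER
assert inner_val[0] == 5
```

Nothing here requires hash tables, indirect calls, or locks — which is the commenters' point about why a straightforward parser can beat OpenSSL's.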

I was involved in the design/implementation of the X.509 path validation library that PyCA cryptography now uses, and it was nuts to see how much performance was left on the ground by OpenSSL. We went into the design prioritizing ergonomics and safety, and left with a path validation implementation that's both faster and more conformant[1] than what PyCA would have gotten had it bound to OpenSSL's APIs instead.

[1]: https://x509-limbo.com

reply
RiverCrochet
6 minutes ago
[-]
Remember LibreSSL? That was born of Heartbleed, IIRC, and I remember presentation slides saying there was stuff in OpenSSL to support things like VAX, Amiga(?) and other ancient architectures. So I wonder if some of this is there because of that.
reply
tialaramex
36 minutes ago
[-]
It is extremely common that a correct implementation also has excellent performance.

Also, even if somebody else can go faster by not being correct, what use is the wrong answer? https://nitter.net/magdraws/status/1551612747569299458

reply
woodruffw
27 minutes ago
[-]
> It is extremely common that a correct implementation also has excellent performance.

I think that's true in general, but in the case of X.509 path validation it's not a given: the path construction algorithm is non-trivial, and requires quadratic searches (e.g. of name constraints against subjects/SANs). An incorrect implementation could be faster by just not doing those things, which is often fine (for example, nothing really explodes if an EE doesn't have a SAN[1]). I think one of the things that's interesting in the PyCA case is that it commits to doing a lot of cross-checking/policy work that is "extra" on paper but still comes out on top of OpenSSL.

[1]: https://x509-limbo.com/testcases/webpki/#webpkisanno-san
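To make the "quadratic searches" concrete, here is a hypothetical sketch (my simplification, not PyCA's code) of dNSName constraint checking in the style of RFC 5280 §4.2.1.10: every SAN must be tested against every permitted and excluded subtree, and this repeats for each constrained CA in the path. Real implementations also handle IP, email, and directory-name constraint types.

```python
# Hypothetical sketch of dNSName constraint checking (RFC 5280-style).
# A name satisfies a constraint if it equals the constraint or extends it
# by adding labels on the left.

def dns_name_matches(san: str, constraint: str) -> bool:
    """True if `san` falls within the dNSName constraint subtree."""
    if constraint == "":
        return True  # an empty dNSName constraint matches every name
    return san == constraint or san.endswith("." + constraint)

def check_name_constraints(sans, permitted, excluded) -> bool:
    """Check every SAN against every subtree: O(SANs x constraints)."""
    for san in sans:
        if any(dns_name_matches(san, e) for e in excluded):
            return False
        if permitted and not any(dns_name_matches(san, p) for p in permitted):
            return False
    return True

assert check_name_constraints(["a.example.com"], ["example.com"], [])
assert not check_name_constraints(["a.test.org"], ["example.com"], [])
assert not check_name_constraints(["bad.example.com"], ["example.com"],
                                  ["bad.example.com"])
```

Skipping these nested loops entirely is exactly how an incorrect implementation can look faster on a benchmark.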

reply
jmspring
32 minutes ago
[-]
I’d say correct on the common path. OpenSSL, hand-waving aside, deals with a lot of edge cases that the "correct" path doesn't handle. Even libraries like libnss suffer from this.
reply
some_furry
41 minutes ago
[-]
Now I wonder how much performance is being left on the table elsewhere in the OpenSSL codebase...
reply
Avamander
18 minutes ago
[-]
Given the massive regression with 3.x alone, you'll probably be happier if you don't know :/
reply
Retr0id
1 hour ago
[-]
By the way, pyca/cryptography is a really excellent cryptography library, and I have confidence that they're making the right decisions here. The python-level APIs are well thought-out and well documented. I've made a few minor contributions myself and it was a pleasant experience.

And my personal "new OpenSSL APIs suck" anecdote: https://github.com/openssl/openssl/issues/19612 (not my gh issue but I ran into the exact same thing myself)

> I set out to remove deprecated calls to SHA256_xxx to replace them with the EVP_Digestxxx equivalent in my code. However it seems the EVP code is slow. So I did a quick test (test case B vs C below), and it is indeed about 5x slower.
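The overhead in the linked issue is per-operation context setup, and there's a Python-level analogue worth knowing: CPython's hashlib is itself built on OpenSSL's EVP digests, so each `hashlib.sha256()` constructor call pays initialization cost that dominates when hashing many small messages. A common (hedged: this is a general technique, not something from the article) workaround is to copy one pre-initialized context instead of creating a fresh one each time:

```python
import hashlib

# hashlib in CPython wraps OpenSSL's EVP digests, so every hashlib.sha256()
# call pays context-setup cost. Copying a single pre-initialized context
# moves that cost to import time; the digest produced is identical.

_SHA256_PROTO = hashlib.sha256()  # initialized once

def sha256_fast(data: bytes) -> bytes:
    h = _SHA256_PROTO.copy()      # cheap context copy instead of a full init
    h.update(data)
    return h.digest()

msg = b"hello world"
assert sha256_fast(msg) == hashlib.sha256(msg).digest()
```

Whether `copy()` actually wins in your workload depends on message sizes and the OpenSSL version underneath, so measure before relying on it.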

reply
owenthejumper
58 minutes ago
[-]
The article highlights HAProxy's blog post with essentially the same name (from 2025): https://www.haproxy.com/blog/state-of-ssl-stacks

Since then, HAProxy has effectively abandoned OpenSSL in favor of AWS-LC. Packages are still built with both, but AWS-LC is clearly the path forward for them.

reply
Avamander
2 hours ago
[-]
I'm glad that they're considering getting rid of OpenSSL as a hard dependency. I've built parts of pyca/cryptography with OpenSSL replaced or stripped out for better debugging. OpenSSL's errors just suck tremendously. It shouldn't be that difficult for them to do it for the entire package.

Though I'd also love to see parts of pyca/cryptography being usable outside of the context of Python, like the X.509 path validation mentioned in other comments here.

reply
dfajgljsldkjag
1 hour ago
[-]
It is honestly surprising that OpenSSL has been the standard for so long given how difficult it is to work with. I think moving the backend to Rust is probably the right move for long term stability.
reply
ameliaquining
21 minutes ago
[-]
Note that all cryptographic primitives are still going to be in C via an OpenSSL-like API for the next while; the current proposal is to migrate from OpenSSL to one of its forks. Various bits of backend logic that aren't cryptographic primitives (e.g., parsing) have been rewritten in Rust; additionally, https://github.com/ctz/graviola is mentioned near the end as a possible implementation of cryptographic primitives in a combination of Rust and assembly (without any C), but it's not especially mature yet.
reply
tialaramex
16 minutes ago
[-]
A couple of things probably made this more likely for OpenSSL than for other libraries, though I think this phenomenon (sticking with a famous library which just isn't very good) is much more common than most people appreciate.

1. OpenSSL is cryptography. We did explicitly tell people not to roll their own. So the first instinct of a programmer who finds X annoying ("Let's just write my own X") is ruled out as likely unwise, or attracts backlash from their users: "What do you mean you rolled your own TLS implementation?"

2. Even the bits which aren't cryptography are niches, likely entirely unrelated to the true interest of the author using OpenSSL. The C++ programmer who needs to do an HTTPS POST but mostly does 3D graphics could spend a month learning about the Web PKI, AES, the X.500 directory system and the Distinguished Encoding Rules, or they could just call OpenSSL and not care.

reply
formerly_proven
2 hours ago
[-]
> Finally, taking an OpenSSL public API and attempting to trace the implementation to see how it is implemented has become an exercise in self-flagellation. Being able to read the source to understand how something works is important both as part of self-improvement in software engineering, but also because as sophisticated consumers there are inevitably things about how an implementation works that aren’t documented, and reading the source gives you ground truth. The number of indirect calls, optional paths, #ifdef, and other obstacles to comprehension is astounding. We cannot overstate the extent to which just reading the OpenSSL source code has become miserable — in a way that both wasn’t true previously, and isn’t true in LibreSSL, BoringSSL, or AWS-LC.

OpenSSL code was not pleasant or easy to read even in v1, though. Figuring out what calls into where, under which circumstances — e.g. when many optimized implementations exist (or will exist, once the many huge Perl scripts have generated them) — was always a headache with only the code itself. I haven't done this since 3.0, but if it has regressed this hard as well, it has to be really quite bad.

reply
ak217
2 hours ago
[-]
I have a hacky piece of code that I used with OpenSSL 1.x to inspect the state of digest objects. This was removed from the public API in 3.0, but in the process of finding that out I took a deep dive into the digests API, and I can confirm it's incomprehensible. I imagined there must be some deep reason for the indirection, but it's good to know the Cryptography maintainers don't think so.

Speaking of which, as a library developer relying on both long-established and new Cryptography APIs (like X.509 path validation), I want to say Alex Gaynor and team have done an absolutely terrific job building and maintaining Cryptography. I trust the API design and test methodology of Cryptography and use it as a model to emulate, and I know their work has prevented many vulnerabilities, upleveled the Python ecosystem, and enabled applications that would otherwise be impossible. That's why, when they express an opinion as strong as this one, I'm inclined to trust their judgment.

reply