https://blog.cr.yp.to/20220805-nsa.html
I'm actually quite surprised that anyone is advocating the non-hybrid PQ key exchange for real applications. If it isn't some sort of gimmick to allow NSA to break these, it's sure showing a huge amount of confidence in relatively recently developed mechanisms.
It feels kind of like saying "oh, now that we can detect viruses in sewage, hospitals should stop bothering to report possible epidemic outbreaks, because that's redundant with the sewage monitoring capability". (Except worse, because it involves some people who may secretly be pursuing goals that are the opposite of everyone else's.)
Edit: DJB said in that 2022 post
> Publicly, NSA justifies this by
>
> * pointing to a fringe case where a careless effort to add an extra security layer damaged security, and
> * expressing "confidence in the NIST PQC process".
Why is that so surprising? Adopting new cryptography by running it in a hybrid mode with the cryptography it's replacing is generally not standard practice and multi-algorithm schemes are pretty niche at best (TrueCrypt/VeraCrypt are the only non-PQ cases that come to mind, although I'm sure there are others). Now you could certainly argue that PQ algorithms are untested and risky in a way that was not true of any other new algorithm and thus a hybrid scheme makes the most sense, but it's not such an obviously correct argument that anyone arguing otherwise must be either stupid or malicious.
The cool thing is the dramatic security improvements against certain unknown unknowns for approximately linear additional work and space. Seems like a pretty great advantage for the defender, although seriously arguing that quantitatively requires some way to reason about the unknown unknowns (the reductio ad absurdum being that we would need to use every relevant primitive ever published in every protocol¹).
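To make "approximately linear additional work" concrete: the usual hybrid pattern is just to run both exchanges independently and hash the two shared secrets (plus some transcript context) into one key, so an attacker has to break both. A minimal sketch in Python, assuming you already have the two shared secrets from e.g. X25519 and ML-KEM-768 (the label string and names are mine, not any standard's):

    import hashlib

    def hybrid_shared_secret(classical_ss: bytes, pq_ss: bytes,
                             transcript: bytes) -> bytes:
        # Breaking either exchange alone does not reveal the output;
        # the only extra cost over PQ-only is one classical exchange
        # plus one hash call.
        return hashlib.sha3_256(
            b"hybrid-kem-v1" + classical_ss + pq_ss + transcript
        ).digest()

Real combiners (e.g. the ones used for the TLS hybrid groups) differ in details, but the shape and the cost profile are the same.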
I see PQC as somehow very discontinuous with existing cryptography, both in terms of the attacks it tries to mitigate and the methods it uses to resist them. This might be wrong. Maybe it's fair to consider it an evolutionary advance in cryptographic primitive design.
The casual argument from ignorance is that lattices are apparently either somewhat harder to understand, or just less-studied overall, than other structures that public-key primitives have been built on, to the extent that we would probably currently not use them at all in practical cryptography if it weren't for the distinctive requirements of resistance to quantum algorithms. I understand that this isn't quantitative or even particularly qualitative (for instance, I don't have any idea of what about lattices is actually harder to understand).
Essentially, in this view, we're being forced into using weird esoteric stuff much earlier than we'd like because it offers some hope of defending against other weird esoteric stuff. Perhaps this is reinforced by, for example, another LWE submission having been called "NewHope", connoting to me that LWE was thought even by many of its advocates to offer urgently-needed "hope", but maybe not "confidence".
I'd like not to have to have that argument only in terms of vibes (and DJB does have some more concrete arguments that the security of SIKE was radically overestimated, while the security of LWE methods was moderately overestimated, so we need to figure out how to model how much of the problem was identified by the competition process and how much may remain to be discovered). I guess I just need to learn more math!
¹ I think I remember someone at CCC saying with respect to the general risk of cryptographic backdoors that we should use hybrids of mechanisms that were created by geopolitical rivals, either to increase the chance that at least one party did honest engineering, or to decrease the chance that any party knows a flaw in the overall system! This is so bizarre and annoying as a pure matter of math or engineering, but it's not like DJB is just imagining the idea that spy agencies sometimes want to sabotage cryptography, or have budgets and staff dedicated to doing so.
https://en.wikipedia.org/wiki/Lattice-based_cryptography#His...
2005 (LWE), 2012 (LWE for key exchange), earlier (1990s for lattice math in general), 2017 (Kyber submission), later (competition modifications to Kyber)?
I can see where one could see the mathematics as moderately mature (comparable in age to ECC, but maybe less intensively studied?). As above, I don't know quite how to think about whether the "thing" here is properly "lattices", "LWE", "LWE-KEX", "Kyber", or "the parameters and instantiation of Kyber from the NIST PQ competition". Depending where we focus our attention there, I suppose this gives us some timeframe from the 1980s (published studies of computational complexity of lattice-related algorithms) to "August 2024" (adoptions of NIST PQ FIPS documents).
Edit: The other contextual thing that freaks out DJB, for those who might not be familiar, is that one of the proposed standards NIST was considering, SIKE, made it all the way through to the final (fourth) round of consideration, whereupon it was completely broken by a couple of researchers bringing to bear mathematical insight. Now SIKE had a very different architecture than the other proposals in the fourth round, so it seems like a portion of the debate is whether the undetected mathematical problems in SIKE are symptomatic of "the NIST competition came extraordinarily close to approving something that was totally broken, so maybe it wasn't actually that great at evaluating candidate algorithms, or at least maybe the mathematics community's understanding of post-quantum key exchange algorithms is still immature" or more symptomatic of "SIKE had such a weird and distinctive architecture that it was hard to understand or analyze, or hard to motivate relevant experts to understand or analyze it, unlike other candidate algorithms that were and are much better understood". It seems like DJB is saying the former and you're saying the latter.
?
The point is to trust no one and no thing that we cannot examine freely, closely, and transparently. And to maintain healthy skepticism of any entity that claims to have a virtuous process to do its business.
> trust me from experience.
Your lived experience tells you to trust the NSA, at least as it relates to NIST standards.
It failed to raise my confidence at all.
> The IESG has concluded that there were no process failures by the SEC ADs. The IESG declines to directly address the complaint on the TLS WG document adoption matter. Instead, the appellant should refile their complaint with the SEC ADs in a manner which conforms to specified process.
This complaint? https://cr.yp.to/2025/20250812-non-hybrid.pdf
Engineering concerns start in section 2 and continue through section 4.
It seems you haven't read it.
Ah, yes, procedural complaints such as "The draft creates security risks." and "There are no principles supporting the adoption decision.", and "The draft increases software complexity."
I don't know what complaint you're reading, but you're working awful hard to ignore the engineering concerns presented in the one I've read and linked to.
This is the retort of every bureaucracy which fails to do the right thing, and signals to observers that procedure is being used to overrule engineering best practices. FYI.
I'm thankful for the work djb has put into these complaints, as well as his attempts to work through process, successful or not, as otherwise I wouldn't be aware of these dangerous developments.
Excuses of any kind ring hollow in the presence of historical context around NSA and encryption standardization, and the engineering realities.
It's not a board's job to handle every engineering complaint themselves, simply because they are rarely the best suited people to handle engineering complaints. When something is raised to them it's a matter of determining whether the people whose job it is to make those decisions did so appropriately, and to facilitate review if necessary. In this case the entire procedural issue is clear - Dan didn't raise a complaint in the appropriate manner, there's still time for him to do so, there's no problem, and all the other complaints he made about the behaviour of the ADs were invalid.
As was https://en.wikipedia.org/wiki/Dual_EC_DRBG which was ratified over similar objections.
That made it no less of a backdoor.
> it's not their job
As I said about excuses.
But as has been pointed out elsewhere, the differences between the Dual EC DRBG objections and this case are massive. The former had an obvious technical weakness that provided a clear mechanism for a back door, no technical justification for it was ever meaningfully presented, and it wasn't an IETF discussion anyway. The counterpoints to Dan's engineering complaints (such as they are) are easily accessible to everyone; Dan just chose not to mention them.
The complaint seems well referenced with evidence of poor engineering decisions to me.
> Dual EC DRBG ... had an obvious technical weakness that provided a clear mechanism for a back door
Removing an entire layer of well-tested encryption qualifies as an obvious technical weakness to me. And as I've mentioned elsewhere in these comments, it opens users up to a https://en.wikipedia.org/wiki/Downgrade_attack should flaws in the new cipher be found. There is a long history of such flaws being discovered, even after deployment, several examples of which DJB references.
I see no cogent reason for such recklessness, and many reasons to avoid it.
Continued pointing toward "procedure" seems to cede the case.
I am curious what the costs are seen to be here. djb seems to make a decent argument that the code complexity and resource usage costs are less of an issue here, because PQ algorithms are already much more expensive/hard to implement than elliptic curve crypto. (So instead of the question being "why don't we triple our costs to implement three algorithms based on pretty much the same ideas", it's "why don't we take a 10% efficiency hit to supplement the new shiny algorithm with an established well-understood one".)
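For a rough sense of scale, using published parameter sizes (32-byte X25519 public keys; 1184-byte encapsulation keys and 1088-byte ciphertexts for ML-KEM-768), the classical layer is a small fraction of the hybrid's bytes on the wire. This only counts handshake bytes, not CPU time, which is a separate question:

    X25519_PUB  = 32      # bytes, each direction
    MLKEM768_EK = 1184    # encapsulation key, client -> server
    MLKEM768_CT = 1088    # ciphertext, server -> client

    pq_only = MLKEM768_EK + MLKEM768_CT          # 2272 bytes
    hybrid  = pq_only + 2 * X25519_PUB           # 2336 bytes
    print(hybrid - pq_only)                              # 64 extra bytes
    print(round(100 * (hybrid - pq_only) / pq_only, 1))  # ~2.8% overhead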
On the other hand, it seems pretty bad if personal or career cost was a factor here. The US government is, for better or worse, a pretty major stakeholder in a lot of companies. Like realistically most of the people qualified to opine on this have a fed in their reporting chain and/or are working at a company that cares about getting federal contracts. For whatever reason the US government is strongly anti-hybrid, so the cost of going against the grain on this might not feel worth it to them.
As a response to this only, while djb's recent blog posts have adopted a slightly crackpotish writing style, PQC hybridization is not a fringe idea, and is not deployed because of djb's rants.
Over in Europe, the German BSI and French ANSSI both strongly recommend hybrid schemes. As noted in the blog, previous Google and Cloudflare experiments have deployed hybrids. This was at an earlier stage in the process, but the long history of lattices that is sometimes used as a (reasonable) argument against hybrids applied equally when those experiments were deployed, so here I'm arguing that the choice made at the time is still reasonable today, since the history hasn't changed.
Yes, there is also a more general "lots of PQC fell quite dramatically" sentiment at play that doesn't attempt to separate SIKE and MLKEM. That part I'm happy to see criticized, but I think the broader point stands. Hybrids are a reasonable position, actually. It's fine.
The German position:
https://www.bsi.bund.de/SharedDocs/Downloads/EN/BSI/Publicat...
"The quantum-safe mechanisms recommended in this Technical Guideline are generally not yet trusted to the same extent as the established classical mechanisms, since they have not been as well studied with regard to side-channel resistance and implementation security. To ensure the long-term security of a key agreement, this Technical Guideline therefore recommends the use of a hybrid key agreement mechanism that combines a quantum-safe and a classical mechanism."
The French position, also quoting the German position:
https://cyber.gouv.fr/sites/default/files/document/follow_up...
"As outlined in the previous position paper [1], ANSSI still strongly emphasizes the necessity of hybridation1 wherever post-quantum mitigation is needed both in the short and medium term. Indeed, even if the post-quantum algorithms have gained a lot of attention, they are still not mature enough to solely ensure the security"
So you've constructed a strawman. Another indication of ceding the argument.
> and the answer we have from a whole bunch of people who are qualified
The ultimate job of a manager or a board is to take responsibility for the decisions of the organization. All of your comments in this thread center around abdicating that responsibility to others.
> This isn't actually an engineering hill I'd die on
Could have fooled me.
> we basically have djb against the entire world
Many of your comments indicate to me that clashing personalities may be interfering with making the right engineering decision.
"Why adopt a protocol that may rely on a weak algorithm without any additional protection"
Does not accurately represent the situation at hand. And that seems intentional.
"Why weaken an existing protocol in ways we know may be exploitable?" is a more accurate representation. And I believe the burden of evidence lies on those arguing to do so.
It really seems like you're trying not to hear what's been said.
There are absolutely NSA technical and psychological operations personnel who are on HN not just while at work, but for work, and this site is entirely in-scope for them to use rhetoric to try to advance their agenda, even in bad faith.
I'm not saying mjg59 is an NSA propagandist / covert influencer / astroturf / sockpuppet account, but they sure fail the duck test for sounding and acting like one.
It has certainly affected my perception of the individuals involved.
People can reasonably disagree with the djb position. His blog posts are notoriously divisive, and that doesn't make everyone on the other side a secret NSA influencer.
Please assume good faith, or discussions turn into personal attacks and wild accusations.
This turns a thread about cryptography into a thread about attacking someone's particular posting style. This is not going to advance the discussion in any sort of useful direction, the only thing this can do is divide people further while cementing existing positions.
If your IDS thinks well-known free software people are NSA agents because they disagree in a style you don't like, the problem is with the IDS.
Anyway, sounds like I'm being dismissed for being "divisive" despite raising substantive security concerns, just like djb. Readers: form your own conclusions about the repetitive patterns here; don't listen to the people telling you not to trust your own eyes.
Note the hallmarks: zero engagement with the substance of the critique (functional equivalence), ad-hom strawman attacks against my character as a response to a misrepresentation of my position, emotional manipulation techniques: demanding focus on tone / civility, maligning moral character of opponent (accusations of divisiveness), still trying to reframe a critique about behavior into an attack against identity that it isn't.
It is dishonest to state categorically that a person is not an X unless one is in a position to know.
A pattern of behavior is a kind of evidence and the observed pattern of behavior does not seem to be in dispute.
There is no evidence presented that the person making a categorical statement is in a position to know about anyone's role or lack of a role in the NSA's clandestine activities.
In 2016, Isis Lovecruft was romantically involved with Jacob Appelbaum. Isis lost a coveted PhD student spot studying under Bernstein to… Jacob Appelbaum. Isis broke up with Jacob and accused him of sexual abuse in a spectacularly public manner.
Isis became romantically involved with Henry de Valence, another Bernstein PhD student. Valence became acquainted with Appelbaum. Later, under Isis’ direction, Valence published a wild screed full of bizarre accusations trying to get Appelbaum expelled and Bernstein fired. When this failed, Isis dumped Valence and publicly accused him of sexual abuse.
Isis Lovecruft is now married to Matthew Garrett. Obviously Matthew is going to work to discredit Bernstein, because if he fails, he knows what the next two steps are.
Well, if you're working in a standards development organisation, then your manager probably should.
It looks like (in the US at least) standards development organisations have to have (and follow) very robust transparency processes to not be default-liable for individual decisions.
(Unlike most organisations, such as where you and your manager from your scenario come from)
Nobody has any issue forcing other people to use 2FA, which preferably requires a smartphone, but a simple reply to qsecretary is somehow heinous.
The $250 is for spam, and everyone apart from bureaucrats who want to smear someone as a group knows that this is 1990s bravado and hyperbole.
* Targets with sufficient technical understanding would use hybrids anyway.
* Average users and unsophisticated targets can already be monitored through PRISM which makes cryptography moot.
So...what's their actual end game here?
The NSA starts by requiring some insecure protocols be supported, and then when support is widespread they start requiring it be made a default by requiring compliance testing be done with default config.
From this privileged network position, if both sides support weaker crypto that NSA lobbied for, they can MitM the initial connection and omit the hybrid methods from the client's TLS ClientHello, and then client/server proceed to negotiate into a cipher that NSA prefers.
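A minimal sketch of the worry (ignoring TLS 1.3's transcript-based downgrade protections, so this is the naive version, not a claim about a concrete break of a correct implementation): if both sides accept a non-hybrid group and negotiation just takes the first mutually supported entry from the client's list, an on-path attacker who strips the hybrid entry lands both ends on something only as strong as the PQ algorithm alone. The group names here are illustrative labels, not the exact IANA codepoints:

    def negotiate(client_groups, server_groups):
        # naive: first mutually supported group wins
        for g in client_groups:
            if g in server_groups:
                return g
        raise ValueError("no common key-exchange group")

    client = ["x25519+mlkem768", "mlkem768", "x25519"]
    server = ["x25519+mlkem768", "mlkem768", "x25519"]

    # on-path attacker strips the hybrid entry before it reaches the server
    tampered = [g for g in client if g != "x25519+mlkem768"]

    print(negotiate(client, server))    # x25519+mlkem768
    print(negotiate(tampered, server))  # mlkem768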
Intelligence is a numbers game, they never get everything, but if your net is wide enough and you don't give up, you'll catch a lot of fish over time
1. adopt hybrid/dual encryption. This is safe against a break of the PQC layer which seems entirely plausible given that the algorithms are young, the implementations are younger, and there has been significant weakening of the algorithms in the past decade.
2. Adopt PQC without a backup layer. This approach is ~5% faster (PQC algorithms are pretty slow), with the cost of breaking encryption for everyone on the internet if any flaw in the PQC algorithms or implementations is found.
Are you implying that djb blew the matter out of proportion?
The reason this is a poor quality analogy is that fundamentally ECDSA and Ed25519 are sufficiently similar that people had a high degree of confidence that there was no fundamental weakness in Ed25519, and so it's fine - whereas for PQC the newer algorithms are meaningfully mathematically distinct, and the fact that SIKE turned out to be broken is evidence that we may not have enough experience and tooling to be confident that any of them are sufficiently secure in themselves and so a protocol using PQC should use a hybrid algorithm with something we have more confidence in. And the counter to that is that SIKE was meaningfully different in terms of what it is and does and cryptographers apparently have much more confidence in the security of Kyber, and hybrid algorithms are going to be more complicated to implement correctly, have worse performance, and so on.
And the short answer seems to be that a lot of experts, including several I know well and would absolutely attest are not under the control of the NSA, seem to feel that the security benefits of a hybrid approach don't justify the drawbacks. This is a decision where entirely reasonable people could disagree, and there are people other than djb who do disagree with it. But only djb has engaged in a campaign of insinuating that the NSA has been controlling the process with the goal of undermining security.
The problem with this statement to me is that we know at least one of the four finalists in the post-quantum cryptography competition is broken, so it's very hard to assign a high probability that the rest of the algorithms will be secure from another decade of advancement (this is not helped by the fact that, since the beginning of the contest, the lattice-based methods have lost a significant number of bits of security as better attacks have been discovered).
Seems dumb not to have like 10.
Yes, and at the same time all of modern crypto is incredibly cheap and can be added as wished on almost every application without any visible extra costs.
So the answer to the GP is not that trivial one. The actual answer is about software complexity making errors more likely, and similar encryption schemes not really adding any resiliency.
https://mailarchive.ietf.org/arch/msg/tls/RK1HQB7Y-WFBxQaAve...
Trust the process!
I wonder who else could reasonably host a standardization process? Maybe the Linux Foundation? All the cryptography talent seems to be working on ZK proofs at the moment in the Ethereum ecosystem; I think if Vitalik organized a contest like NIST's, people would pay attention.
The most important thing is to incentivize attackers to break the cryptography on dummy examples instead of in the wild. Ideally: before the algorithm is standardized. The Ethereum folks are well setup to offer bounties for this. If a cryptographer can make FU money through responsible disclosure, then there is less incentive to sell the exploit to dishonest parties.
This implies that what is actually being offered is Security Through Ignorance.
Is this encryption sound? Maybe, who knows! Let's wait and find out!
Maybe an stunnel for CurveCP, or something like PQConnect
History has shown djb is usually right
He has been far more productive at writing software and developing cryptography that has avoided security vulnerabilities than any of the IETF WG members. The best part about his software IMHO is that it is small with low resource requirements and can primarily serve ordinary individual computer users, as opposed to large, complex, steep learning curve software primarily serving corporations like the ones that publish RFCs and send people to IETF meetings
Anyone reading this comment is probably using djb's cryptography in TLS. His contributions to today's internet are substantial
It really says a lot about "IETF" and other Silicon Valley pseudo-governance that a talented and trustworthy author, who has remained an academic when so many have sold out, gets treated like a nuisance
Dual EC wasn't a shockingly clever, CS-boundary-pushing hack (and NSA has apparently deployed at least one of those in the last 20 years). It was an RNG (not a key agreement protocol) based on asymmetric public key cryptography, a system where you could look at it and just ask "where's the private key?" There wasn't a ton of academic research trying to pick apart flaws in Dual EC because why would there be? Who would ever use it?
(It turns out: a big chunk of the industry, which all ran on ultra-closed source code and was much less cryptographically literate than most people thought. I was loudly wrong about this at the time!)
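For anyone who hasn't seen why "where's the private key?" is the right question: here's a toy discrete-log analogue of the Dual EC structure (a multiplicative group instead of elliptic-curve points, tiny insecure parameters, no output truncation; emphatically not the real construction). Whoever chose P and Q can know d with P = Q^d, and that d turns a single output into full state recovery:

    # Toy analogue only: real Dual EC uses EC points and truncated x-coords.
    p = 2**61 - 1              # small prime, illustration only
    Q = 5
    d = 123456789              # the hidden relation: "where's the private key?"
    P = pow(Q, d, p)           # published as an innocent-looking constant

    def step(state):
        output = pow(Q, state, p)       # what the RNG emits
        next_state = pow(P, state, p)   # internal state update
        return output, next_state

    state = 987654321
    out1, state = step(state)
    out2, state = step(state)

    # Anyone holding d recovers the internal state from one output
    # and predicts every output after it.
    recovered = pow(out1, d, p)         # == Q**(d*s1) == P**s1 == next state
    assert pow(Q, recovered, p) == out2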
MLKEM is a standard realization of CRYSTALS-Kyber, an algorithm submitted to the NIST PQ contest by a team of some of the biggest names in academic PQ cryptography, including Peter Schwabe, a prior collaborator of Bernstein. Nobody is looking at MLKEM and wondering "huh, where's the private key?".
MLKEM is based on cryptographic ideas that go back to the 1990s, and were intensively studied in the 2000s. It's not oddball weird cryptography. It is to the lineage of lattice cryptography roughly what Ed25519 was to elliptic curve cryptography at the time of Ed25519's adoption.
Utterly unlike SIKE, which isn't a lattice algorithm at all, but rather a supersingular isogeny algorithm, a cryptographic primitive based on an entirely new problem class, and an extremely abstruse one at that. The field had been studying lattice cryptography intensively for decades by the time MLKEM came to pass. That's not remotely true of isogeny cryptography. Isogenies were taken seriously not because of confidence in the hardness of isogenies, but because of ergonomics: they were a drop-in replacement for Diffie Hellman in a way MLKEM isn't.
These are all things Bernstein is counting on you not knowing when you read this piece.
Currently the best attacks on NTRU, Kyber, etc. are essentially the same generic attacks that work on something like Frodo, which uses unstructured lattices. And while resistance to those unstructured attacks is pretty well studied at this point, it is not unreasonable to suspect that the algebraic structure in the more efficient lattice schemes could lead to more efficient attacks. How much more efficient? Who knows.
And now, in a world where QR + pre-QR algos are typically being introduced in a layered fashion, they're saying "let's add another option, to reduce the number of options" which at least looks very suspicious
Practical quantum computers are probably not very close, but you can certainly use the fear of them as a chance to introduce a new back-door. If you did, you'd have to behave exactly as the NSA is doing right now.
Dual EC isn't the only comparison he's making. He's also making a comparison to DES, which had an obvious weakness: the 56-bit key limitation, similar to the obvious weakness of non-hybrid. In neither case is there a secret backdoor. At the time of DES, the NSA publicly said they used it, to make others confident in it. Similarly, the NSA is saying "we do not anticipate supporting hybrid in NSS", which will make people confident in non-hybrid. But in the background, NSA actually uses something more secure (using 2 layers of encryption themselves).
I wonder what your strategy here is. Muddying the waters and depict Bernstein as a renegade? You have made too many big-state and big-money apologist posts for that to work.
You find it offensive now to compare ML-KEM and SIKE because SIKE was so thoroughly broken and demonstrated to be worse than pre-quantum crypto. But ML-KEM may already be broken this thoroughly by NSA and friends, and they’re keeping it secret because shipping bad crypto to billions of people enables SIGINT. The idea that your professional crypto acquaintances might be on the NSA’s payroll clearly disturbs you enough that you dismiss it out of hand.
Bernstein is proposing more transparency because that is what was promised after the Dual-EC debacle. Do you disagree with Bernstein because he advocates for transparency (which could prevent bad crypto shipping), or because of his rhetorical style?
You’ve admitted you were “loudly wrong” when you announced Dual-EC couldn’t be an NSA cryptography backdoor. Snowden let us all know the NSA spends $250 million every year secretly convincing/bribing the private sector to use bad cryptography. Despite that history, you are still convinced there’s no way ML-KEM is an NSA cryptographic backdoor and that all the bizarre procedural errors in the PQ crypto contest are mere coincidences.
[checks my text messages] Lucy just texted me, Thomas. She’s outside waiting for you to kick her football.
You saw a similar thing in Bernstein's earlier railing against the NIST contest (which he participated in), happily whipping up a crowd of people who believed Tancrede Lepoint or Chris Peikert or Peter Schwabe might have been corrupted by NSA, because nobody in that crowd have any idea who those three researchers are.
It's really gross.
“Apache chunked encoding is not exploitable” —- Dowd, 2002
What I think you're not seeing is that this isn't a SIKE vs. Lattice kind of debate; it's a Curve25519 vs. P-256 kind of debate. P-256 was never broken. Curve25519 made smart engineering decisions that for years foreclosed on some things that were common in-the-real-world implementation pitfalls. P-256 has closed that gap now, but for the whole run of the experience they were both sane choices.
That's a generous interpretation. Another parallel would be Rijndael vs. Serpent, where the Serpent advocates were all "I don't know about this Rijndael stuff, seems dicey". Turned out: Rijndael was great.
But Bernstein wants you think that rather than a curve-selection type debate, this is more akin to a "discrete log vs. knapsack" debate. It isn't.
I'd use a hybrid if I was designing a system; I am deeply suspicious of all cryptography, and while I don't think Kyber is going to collapse, I wouldn't bet against 10-15 years of periodic new implementation bugs nobody knew to look for.
But I'm cynical about cryptography. It's really clear why people would want a non-hybrid code point.
Let me just say this once as clearly as I can: I sort of don't give a shit about any of this. A pox on all their houses. I think official cryptographic standards are a force for evil. More good is going to be done for the world by systems that implement well enough to become de facto standards. More WireGuards, fewer RFCs. Certainly, I can't possibly give even a millifuck about what NIST wants.
But I also can't be chill about these blog posts Bernstein writes where it's super clear his audience is not his colleagues in cryptography research, but rather a lay audience that just assumes anything he writes must be true and important. It's gross, because you can see the wires he's using to hold these arguments together (yes, even I can see them), and I don't like it when people insult their audiences this way.
It does though. It's just been engineered to be integral to the unibody. And there are crumple zones, airbags, seat belts, ABS, emergency braking systems, collision sensors, and more layered defenses in addition.
No sane engineer would argue that removing these layers of defense would make the car safer.
Which is why many engineers wear the ring.
Folks do love to argue though.
To me it really isn't. TLS has no need for it. But let's focus the context on some US government organisations that want this for the FIPS maturity level they're aiming for. Why would these organisations want a weaker algorithm for TLS than what is standardised; more importantly, how does it benefit deployment, except to save a tiny bit of computation and eliminate some ECC code? I'm not going to go so far as to say it is nefarious, but I will throw in my 2 cents and say it doesn't help security and is unnecessary.
Unless NSA pays you $10 million, as they did to RSA, to make said obviously bumbling attempt the default in their security products.
https://en.wikipedia.org/wiki/Dual_EC_DRBG#Timeline_of_Dual_...
https://www.reuters.com/article/us-usa-security-rsa-idUSBRE9...
Or unless the presence of such less secure options in compliant implementations enables a https://en.wikipedia.org/wiki/Downgrade_attack
>Surveillance agency NSA and its partner GCHQ are trying to have standards-development organizations endorse weakening ECC+PQ down to just PQ.
The NSA spends about half of its resources attempting to hack the FBI and erase its evidence against them in the matter of keeping my wife and me from communicating. The other half of the staff are busy commenting online about how unfair this is, and attempting to get justice.
There are no NSA resources left for actions like the one I quoted. I don't think NSA is involved in it.
They are not running out of resources.