Crypto, meaning cryptography, was on the agenda in Amsterdam from the 13th to the 15th of April with another Real World Crypto Symposium, held as both an in-person and online event. Read this blog by Keyfactor’s David Hook to learn about his three key takeaways from the event.
The executive summary is that there’s a lot going on! For those of you looking for more in-depth detail on things affecting us in the short term, I thought I’d look at three topics of interest: the NIST PQC competition continues, certificate revocation lists (CRLs) are still a thing, and the idea of crypto-agility as a principle is proving to be a lot more involved than any of us first thought.
The NIST PQC competition continues
Late last year, it was announced that the final algorithm set from the NIST PQC competition would be chosen in late March of this year. As one speaker so eloquently put it, we are now in late March, for very large values of late. NIST have not made any further announcements and the submission teams do not appear to have more information either.
A possible reason for the pause is that there have been further advances in the analysis of the candidates. An attack on Rainbow has meant that all its parameter sets have been effectively downgraded one level in security. More recently, an improved dual lattice attack has been published which also downgrades the security of algorithms like SABER, CRYSTALS-Dilithium, and CRYSTALS-Kyber.
This does not mean that any of these algorithms are inherently broken, but it may mean that parameter sizes need to be revised. That has knock-on effects on the utility of the algorithms, which in turn probably affects an algorithm’s suitability for selection. So maybe – just maybe – NIST are having a bit more of a think.
One thing seems clear: the “shape”, as it were, of the algorithms we will migrate to in the future does not look like it’s going to change. More on this one a bit later!
CRLs are still a thing
On the CRL front, Mike Hamburg’s talk was about rediscovering the CRL. For those who have just joined us, here’s some quick history. When CRLs started to get rather large, at least for some CAs and their users, the idea of looking up certificate status online – as opposed to distributing full CRLs – arrived with OCSP (the Online Certificate Status Protocol). Like most good ideas, this led to other issues, resulting in OCSP stapling.
The elephant in the room with OCSP has always been what to do when the OCSP server is down. There have been a couple of incidents where we’ve found out the hard way, probably the most famous being the one which led to blogs with remarkably succinct titles such as “What Happened to My Mac? Apple’s OCSP Apocalypse”.
Mike’s talk was a fresh look at CRLite, an already active proposal to move back to a locally downloaded equivalent of a CRL. Originally proposed in 2017, it provides a much smaller representation. While CRLite does not provide as much information as an actual CRL, it does tell you whether a certificate is revoked, which in most cases is all that is required. It turns out that only a few percent of certificates are ever revoked, which makes filter maps such as Bloom filters an effective way to represent a CRL with a substantial reduction in size, and frayed ribbon filters appear to produce an even smaller footprint. The resulting structures are small enough to be download-friendly. CRLite, as described, is much broader than a single CRL for a single CA, but the lessons and techniques learned are equally applicable to a single CA. If CRL management and issuance is of interest, this work is worth following.
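To make the filter idea concrete, here is a minimal sketch in Java of a Bloom filter over revoked serial numbers. Everything in it – the class name, the sizing, the hash construction – is illustrative rather than taken from CRLite itself, and note that a single Bloom filter produces false positives; CRLite eliminates those by layering filters into a cascade.

    import java.math.BigInteger;
    import java.security.MessageDigest;
    import java.util.BitSet;

    // Illustrative only: a plain Bloom filter over revoked certificate serial
    // numbers. A filter like this can return false positives, which is why
    // CRLite layers filters into a cascade rather than using a single one.
    public class RevocationFilter
    {
        private final BitSet bits;
        private final int size;       // number of bits in the filter
        private final int hashCount;  // number of bit positions set per entry

        public RevocationFilter(int size, int hashCount)
        {
            this.bits = new BitSet(size);
            this.size = size;
            this.hashCount = hashCount;
        }

        public void addRevoked(BigInteger serial) throws Exception
        {
            for (int index : indexesFor(serial))
            {
                bits.set(index);
            }
        }

        // false means "definitely not revoked"; true only means "possibly
        // revoked", so a real deployment needs a way to confirm a hit.
        public boolean maybeRevoked(BigInteger serial) throws Exception
        {
            for (int index : indexesFor(serial))
            {
                if (!bits.get(index))
                {
                    return false;
                }
            }
            return true;
        }

        // derive hashCount bit positions by hashing the serial with a
        // one-byte domain separator for each position
        private int[] indexesFor(BigInteger serial) throws Exception
        {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            int[] indexes = new int[hashCount];
            for (int i = 0; i != hashCount; i++)
            {
                digest.update((byte)i);
                byte[] hash = digest.digest(serial.toByteArray());
                indexes[i] = new BigInteger(1, hash).mod(BigInteger.valueOf(size)).intValue();
            }
            return indexes;
        }
    }

The reason this wins on size is exactly the few-percent observation above: the filter only has to be sized for the revoked set, not for every certificate ever issued.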
Crypto-agility is not as easy as it sounds
Finally, there were a series of talks on all things post-quantum (PQ) which finished with a talk on cryptographic transition and agility. I’m going to paraphrase here a bit to keep things short, so if I haven’t quite captured what David Ott and his co-authors were trying to get across, I hope they accept my apologies.
We talk about crypto-agility a lot, and while a few have tried to point out that it’s not as easy as it sounds, a lot of us still seem to assume it’s like Y2K: just audit your databases, tweak those two column reads and comparisons, and it’s all good. It has become quite clear that it’s not at all like that. As the leader of one session pointed out, you cannot just add PQ to the start of the protocol and expect it to work. Not all the new algorithms map directly onto the methods we currently use, and as an industry we are still working out which side of the line different protocols are on. For some, a few minor edits and you have added PQ; for others, to quote a famous line from Star Trek, it really is a case of “it’s life, Jim, but not as we know it”.
That said, we seem to have a good idea of the characteristics of the “new” algorithms, so as far as experimentation goes, there is no need to wait for the final standards to be published. We can start now. This is particularly relevant for people looking at hybrid encryption (such as combining a classical algorithm like ECCDH with a PQC KEM like FrodoKEM), as this means a change in the way things are done.
On this final note, I’d like to mention that the just-released Bouncy Castle 1.71 features four of the NIST finalists and alternate candidates: FrodoKEM, Classic McEliece, SABER, and SPHINCS+, as well as a HybridValueParameterSpec class to allow the output of the KEM algorithms to be used for hybrid encryption. The same algorithms will be available in Bouncy Castle for C# soon as well. With some outside assistance, we expect to have implementations of the others completed over the next few months, so if you’d like to start investigating these algorithms, some resources are already available, and the missing resources are coming.
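To give a flavour of what the hybrid step involves, here is a minimal sketch of the combination stage, assuming you have already arrived at two shared secrets: one from a classical agreement such as ECCDH, the other from a PQC KEM such as FrodoKEM. It uses Bouncy Castle’s lightweight HKDF implementation directly; in the JCA layer the HybridValueParameterSpec does the equivalent mixing inside the KeyAgreement for you, so treat this purely as an illustration of the concept.

    import org.bouncycastle.crypto.digests.SHA256Digest;
    import org.bouncycastle.crypto.generators.HKDFBytesGenerator;
    import org.bouncycastle.crypto.params.HKDFParameters;
    import org.bouncycastle.util.Arrays;

    public class HybridSecretExample
    {
        /**
         * Derive a session key from a classical shared secret and a PQC KEM
         * shared secret by concatenating the two and running the result
         * through HKDF-SHA256. Purely illustrative - in the Bouncy Castle
         * JCA layer, HybridValueParameterSpec performs an equivalent mixing
         * step inside the KeyAgreement itself.
         */
        public static byte[] deriveSessionKey(byte[] classicalSecret, byte[] pqcSecret)
        {
            // neither secret is used alone: an attacker must break both the
            // classical algorithm and the PQC one to recover the session key
            byte[] ikm = Arrays.concatenate(classicalSecret, pqcSecret);

            HKDFBytesGenerator hkdf = new HKDFBytesGenerator(new SHA256Digest());
            hkdf.init(new HKDFParameters(ikm, null, "hybrid-example".getBytes()));

            byte[] sessionKey = new byte[32];   // 256-bit key, e.g. for AES
            hkdf.generateBytes(sessionKey, 0, sessionKey.length);

            Arrays.fill(ikm, (byte)0);          // clear the combined secret

            return sessionKey;
        }
    }

The info string here is invented for the example; in a real protocol the salt and info inputs would be dictated by the protocol itself.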
If you want to start trying out Bouncy Castle for yourself, see these links: