For those of you not up with the news in Australia, last week saw the end of another COVID lockdown in Melbourne. Last week also saw NIST running its third PQC Standardization Conference and doing it virtually. Something for which this correspondent was most grateful. Of course, it did mean getting up at midnight, but that did also help cure any nostalgia I was feeling for jet-lag!
Sadly, the first item of news for those of us hoping to see the competition come to an end was that the final list of algorithms will not be announced before December. NIST also expects to spend a further year on the work required to document the selected algorithms as formal standards. It is going to be a while yet before we are done.
Speakers from NIST again emphasized the importance of giving the process time and of not rushing out to commit to a particular algorithm just yet. The competition has driven a lot of new research and while this is clearly a good thing, it is important to recognize the use of the word “new”. Progress is still being made in the development of security proofs and formal methods for analyzing the candidates and this will also, in turn, help drive the final selection of algorithms.
That said, it seems pretty clear that the hash-based SPHINCS+, while an alternate algorithm rather than a finalist, is going to be standardized at some stage. There are already IETF standards for two similar Merkle tree-based algorithms, LMS and XMSS, and SPHINCS+, while stateless, is easily the most conservative submission in the competition, using well-understood techniques that some would even classify as boring. In this line of work, boring is good, especially when a technique increases the prospect of things staying that way. As an added bonus, while signature sizes for SPHINCS+ are larger than those we traditionally deal with, there is something quite attractive about being able to drive an entire PKI using just a hash function.
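To get a feel for what signing with "just a hash function" means, here is a minimal sketch of a Lamport one-time signature, the simplest ancestor of the hash-based family. To be clear, this is not SPHINCS+ itself (SPHINCS+ combines few-time signatures with Merkle trees to get a stateless many-time scheme), and the parameters here are purely illustrative:

```python
import hashlib
import secrets

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    # Secret key: a pair of random 32-byte preimages for each of the
    # 256 bits of the message hash.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hash of each preimage.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg):
    # The 256 bits of SHA-256(msg), most significant bit first.
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one preimage per bit of the message hash.
    # One-time only: reusing sk leaks preimages for both bit values.
    return [sk[i][bit] for i, bit in enumerate(bits(msg))]

def verify(pk, msg, sig):
    # Hash each revealed preimage and compare against the public key.
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits(msg)))
```

A quick run shows the shape of the trade-off: signatures are 256 × 32 bytes, but security rests on nothing more exotic than preimage resistance of the hash.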
The other item of note was that Section 2 of the KDF standard, NIST SP 800-56C, which was revised in August last year, now explicitly allows for a hybrid calculation of the shared secret Z value: an approved algorithm can be combined with some other method while still allowing the resulting implementation to be FIPS compliant. Very useful if you would like to quantum-harden a key agreement now.
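The hybrid construction amounts to concatenating the classically agreed secret with the auxiliary one and running the result through an approved KDF. A minimal sketch, using HKDF (RFC 5869) as one example of a two-step extract-then-expand KDF; the function and parameter names are illustrative, not taken from any particular API:

```python
import hashlib
import hmac

def hkdf_extract(salt, ikm):
    # RFC 5869 extract step: PRK = HMAC-Hash(salt, IKM).
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk, info, length):
    # RFC 5869 expand step: iterate HMAC to produce `length` bytes.
    okm, block = b"", b""
    for i in range((length + 31) // 32):
        block = hmac.new(prk, block + info + bytes([i + 1]), hashlib.sha256).digest()
        okm += block
    return okm[:length]

def hybrid_key(z_classical, t_auxiliary, salt=b"\x00" * 32,
               info=b"hybrid key agreement", length=32):
    # Hybrid shared secret in the SP 800-56C style: the approved secret Z
    # (e.g. from ECDH) is concatenated with an auxiliary secret T
    # (e.g. from a PQC KEM) before key derivation.
    return hkdf_expand(hkdf_extract(salt, z_classical + t_auxiliary), info, length)
```

The point of the concatenation is that the derived key stays strong as long as either component secret does, which is exactly the hedge you want while the PQC candidates are still settling.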
So what does this mean for us at Bouncy Castle? Originally, we were thinking that, around this time of year, it would be worth starting to progress implementations of the finalists that had been selected. While the lack of a final decision means we cannot do this, things do seem to have stabilized enough for us to at least put together some trial implementations. We think this is worthwhile because, while it is still too early to commit to one of these algorithms, given the additions to SP 800-56C there are some changes that can be made now, and it is a good time for people to start experimenting if they wish to. Make no mistake, these algorithms are different! Classic McEliece offers ciphertexts in the range of 128 to 240 bytes, but its public key sizes range from 260K bytes to just over 1357K bytes. As Tanja Lange pointed out when giving the presentation "Classic McEliece: conservative code-based cryptography", the ciphertext sizes are great, some would even say fantastic, for some purposes, but you want to be very careful about rushing out and setting up a server that will simply accept certificates over a megabyte in size from the rest of the Internet. It is time to start thinking about the options, but also to understand what the implications are.
So, watch this space for further news, and watch https://github.com/bcgit/bc-java for the progress. If you would like to find out more about the conference and the presentations given, they are now online at: https://csrc.nist.gov/Events/2021/third-pqc-standardization-conference and also well worth a visit.
The author wishes to acknowledge that travel to his home office to participate in the event was funded by PrimeKey and Crypto Workshop. Stay safe!