Report by Luca de Feo on the 3rd PQC Standardization Conference

The 3rd PQC Standardization Conference, organized by NIST, took place online from June 7 to 9, featuring a mix of live talks, pre-recorded talks, and panels. The oral exchanges were complemented by a text-based forum, provided by an app well known for its lack of end-to-end encryption, where some topics were eventually debated at length. Slides for the talks will be available in a few days, and video recordings in a few weeks. In the meantime, I will give a personal account of the conference based exclusively on my recollections. I took no notes, and I was often preparing or eating dinner at the same time, so nothing of what I will report should be taken as an established truth.

Throughout the three days, each of the finalists and alternate candidates had a 15-minute slot to present their updates for the 3rd round. In most cases, there were minimal or no updates. Picnic appears to be the most notable exception, with important changes to the structure of the LowMC block cipher. Rainbow and GeMSS had some explaining to do, in response to recent advances in cryptanalysis, and GeMSS had to drop some parameters. Vadim Lyubashevsky and Dan Bernstein possibly gave the most opinionated talks; I recommend watching both when they are available.

Several contributed talks reported on various aspects of post-quantum cryptography. Lattices had the greatest share; I especially enjoyed the talks by Thomas Espitau and Yu Yang on variants of Falcon… or maybe it was the excellent Château Latour I was having at the same time? I also enjoyed the “Applications” session, which opened my eyes to how difficult it is to put any of the PQC candidates in constrained environments such as smartcards and IoT.

Of particular interest to the readers of this blog should be the three contributed talks on isogeny-based cryptography:

• Péter Kutas (joint work with Christophe Petit) gave an excellent, if somewhat time-constrained, survey talk on several different “torsion point attacks” against SIDH and variants, which have previously appeared in this blog. The take-away message is that SIDH, SIKE and B-SIDH are well protected against all of them, be it because of the Fujisaki–Okamoto transform, or because of their intrinsic limitations, but the broader space of generalizations of SIDH a cryptographer might imagine is somewhat limited by these attacks, as it has already been repeatedly shown. I would certainly like to see more research in this promising direction, which has applications beyond cryptanalysis.
• Élise Tasso (joint work with Nadia El Mrabet, Simon Pontié and myself) presented an in-the-lab confirmation of a fault-injection attack on SIDH first proposed by Ti. The attack is alarmingly easy to mount (ok, we used equipment worth 40k€, but that’s only because we’re rich), but at the same time:
  ◦ it requires multiple repetitions of key generation with the same secret key, something that should never happen in a correct implementation of SIDH or SIKE;
  ◦ it appears to be difficult to exploit in the presence of key compression;
  ◦ it has a countermeasure so simple and cheap that it may as well be included by default in the reference code.

The old-timers of this blog will not be surprised to learn that the best talk of the conference was delivered by Craig Costello. In only 5 minutes, Craig pretended to use SageMath code to generate pairs of toy SIDH public keys (one for Alice, one for Bob), discard the secrets, and (clumsily fail to) upload the public keys to a GitHub repo. Then, he announced that Microsoft is offering $5,000 for the solution of the smaller instance, named $IKEp182, and $50,000 for that of the larger instance, named $IKEp217. The prize money matches what the SIKE team estimates to be the material cost of breaking the instances, so think twice before reallocating your Bitcoin mining resources.

If you believe this stunt, then be our guest and start cracking, but don’t come whining when some mysterious bounty catcher from Australia claims the big prize. For now, I can only observe that everybody seems to trust Microsoft and no one has even forked the GitHub repo (but, do you trust GitHub, anyway?). As a public service to the community, here is the SHA-1 of the latest commit, dated June 9, 2021: 72dc1cb50d5a78fee605757e2f33043b2f36f9b4, and here is the SHA-512 of the contents of the repo (excluding .git and .gitignore), as of June 9:

$ git clone https://github.com/microsoft/SIKE-challenges
$ cd SIKE-challenges/
$ cat * | sha512sum


Anyway, the video recording will be available in a few days, and you will all get a chance to have a close look at Craig’s sleight of hand. Be on the watch for video stitching by “those cheeky devils at the NSA”!

— Luca de Feo

Some recent papers in isogeny crypto

There have been quite a few papers on isogeny crypto posted in the last few months. Here is a brief summary of some of them. I thank the members of my research group for fruitful discussions about these papers.

1. Improved torsion point attacks on SIDH variants by Victoria de Quehen, Péter Kutas, Chris Leonardi, Chloe Martindale, Lorenz Panny, Christophe Petit and Katherine E. Stange.

This is a greatly revised and expanded version of an earlier paper by a subset of the authors. I am writing about the March 2021 version.

The paper builds on an idea of Petit (published at ASIACRYPT 2017) to exploit the fact that SIDH gives the image of torsion points. To be precise, suppose $E_0$ is an elliptic curve over $\mathbb{F}_p$ with known endomorphism ring (for simplicity let’s take $j(E_0) = 1728$). Let $E_1$ be an elliptic curve such that there is an isogeny $\phi : E_0 \to E_1$ of degree $A$. Suppose we are also given $\phi(P), \phi(Q)$ where $P, Q$ generate the subgroup of points of order $B$ on $E_0$. The attacker wants to know $\phi$. The general approach is to choose a suitable endomorphism $\theta$ on $E_0$ such that the endomorphism $\phi \circ \theta \circ \hat{\phi} + [d]$ of $E_1$ has degree divisible by $B$. One can then compute this isogeny and hence work back to determine $\phi$. Two specific technical improvements in this paper over Petit’s work are the use of the dual isogeny and the Frobenius map.

The paper contains a number of results, but the main headline result is to give a polynomial-time attack on “over-stretched” SIDH, where $B > pA$ (Petit’s original paper needed the stronger condition $B > A^4$). This is an important result, but it has no impact on standard implementations of SIDH (such as the SIKE submission to NIST), which have $A \approx B \approx \sqrt{p}$. However, the paper is a warning about non-standard variants of SIDH, and the paper discusses several such situations.

2. The SQALE of CSIDH: Square-root Vélu Quantum-resistant isogeny Action with Low Exponents by Jorge Chávez-Saab, Jesús-Javier Chi-Domínguez, Samuel Jaques and Francisco Rodríguez-Henríquez.

The CSIDH approach to isogeny crypto is very appealing for crypto constructions and has attracted a lot of interest; however, the problem is that it is attackable using Kuperberg’s quantum hidden-shift algorithm. Several works, in particular the two EUROCRYPT 2020 papers by Peikert and Bonnetain-Schrottenloher, have seriously challenged the proposed CSIDH parameters and suggested they do not meet the minimum required post-quantum security levels. So is CSIDH doomed?

The paper analyses a natural approach to rescue CSIDH, by using a much larger prime (and hence much larger class group) but by choosing private keys from a small subset of the class group, namely the ideals of the form $\prod_i \ell_i^{e_i}$ with $|e_i| \le m$ for small $m$. (A similar proposal was made by myself and Luca De Feo in our SeaSign paper, in the context of lossy keys.)
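As a back-of-the-envelope illustration (my own, not taken from the paper): with exponents bounded by $m$, each $e_i$ takes one of $2m+1$ values, so $n$ small primes give a key space of size $(2m+1)^n$, and one can compute how many primes are needed to reach a target key-space size.

```python
import math

# Rough sketch (illustrative numbers, not the paper's parameters):
# with n small primes and exponents e_i in {-m, ..., m}, the restricted
# key space has (2m+1)**n elements.  For a key space of at least
# 2**key_bits ideals we need n >= key_bits / log2(2m+1).

def min_primes_needed(m, key_bits=256):
    """Smallest n with (2m+1)**n >= 2**key_bits."""
    return math.ceil(key_bits / math.log2(2 * m + 1))

for m in (1, 2, 5, 10):
    print(f"m = {m:2d}: need n >= {min_primes_needed(m)} small primes")
```

Smaller $m$ makes each group-action evaluation cheaper per prime, but forces more primes and hence a larger $p$: this is exactly the trade-off behind the large primes in the proposed parameters.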

The paper is mostly about Kuperberg’s quantum algorithm and its analysis. It gives arguments that suggest Kuperberg’s algorithm is not applicable to this variant of the problem. The analysis seems plausible, though I am no expert.

The big question is whether this saves CSIDH. For the proposed parameters, the prime $p$ is at least 4000 bits, so that protocol messages in key exchange are now at least 4000 bits. This is much worse than the originally suggested 512 bits. The timings are of the order of a few seconds for each operation, which again is much worse than desired. In conclusion, CSIDH may be saved, but it does lose some of its advantages over lattices (e.g., small keys and messages).

3. One-way functions and malleability oracles: Hidden shift attacks on isogeny-based protocols, by Péter Kutas, Simon-Philipp Merz, Christophe Petit and Charlotte Weitkämper (accepted to EUROCRYPT 2021).

This paper relates to both the papers already discussed. It constructs a group action in the SIDH setting, which opens the door to a Kuperberg-type sub-exponential quantum attack on SIDH. Again let $E_0 : y^2 = x^3 + x$ be the elliptic curve over $\mathbb{F}_p$ with $j(E_0) = 1728$. Let $\iota(x,y) = (-x, iy)$ be an endomorphism of $E_0$. Let $G$ be a certain multiplicative subgroup of $\mathbb{Z}[\iota]$. We need $G$ to act on some set. Let $L$ be a subgroup of $E_0$ of order $A$. The action of group element $\theta=a+b\iota \in G$ is defined to be $E_0 / \theta(L)$. So $G$ is acting on a set of elliptic curves (essentially a set of $j$-invariants). Computing the group action is easy when $L$ is known, but the difficulty is to compute the group action on the challenge curve $E_1 = E_0/K$ in the SIDH protocol (when $K$ is not known). To do this, the authors use torsion point information, as already discussed in item 1 above. So suppose we are given $E_0, E_1, \phi(P), \phi(Q)$ where $\phi : E_0 \to E_1$ has kernel $K$ of order $A$ and where $P, Q$ generate the subgroup of order $B$. The attack requires $B > p A^4$. Since SIDH with such parameters can already be broken in classical polynomial time by Petit (or the first paper discussed above), one would not use Kuperberg’s algorithm in this case. Hence this paper does not weaken SIDH. Instead the merit of the paper is to explain how group actions can be introduced in the SIDH setting, which contradicts the conventional wisdom that “SIDH is not based on group actions”.

— Steven Galbraith

SQISign

This post is about SQISign, an exciting post-quantum signature scheme based on isogenies. The blog post is intended for people who understand SIDH well, but are not experts at quaternions and Eichler orders. I do not claim to explain (or understand) all the details. I am mainly trying to give an overview of what are the main technical achievements of the SQISign authors.

First some background. Previously there were three isogeny-based signature schemes. A scheme based on SIDH was given by Yoo, Azarderakhsh, Jalali, Jao and Soukharev. The basic idea is simple: Given a public key $(E_0, E_A)$ the prover wants to prove that they know the secret isogeny $\tau : E_0 \to E_A$. To do this, as in SIDH, commit to an isogeny $\psi : E_0 \to E_1$ and the value $j(E_2)$, where $\gcd( \deg(\tau), \deg(\psi)) = 1$ and $E_2 = E_0/\langle G_A, G_1 \rangle$ where $G_A = \ker( \tau )$ and $G_1 = \ker(\psi)$. The verifier sends a single bit, and the prover reveals either $G_1$ and its image on $E_A$, or $\psi(G_A)$.

A variant of this scheme was proposed by Galbraith, Petit and Silva. Their paper also included a completely different (and more complicated) scheme based on quaternions and endomorphism rings. Thirdly, the SeaSign signature scheme by De Feo and Galbraith is based on CSIDH. In all three cases, the basic form is a 3-move identification protocol (Sigma protocol) with single bit challenges. The SeaSign scheme manages to increase the challenge size by using a Merkle tree and some other ideas. But even with SeaSign, it is necessary to repeat the scheme a number of times in parallel to ensure that the probability a forger can guess the challenge is negligible.

The fundamental problem in isogeny signatures has been to create a scheme that has an exponentially large challenge set, so that the scheme doesn’t have to be repeated. SQISign achieves this in an ingenious way, by using the Kohel-Lauter-Petit-Tignol (KLPT) algorithm.
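The cost of small challenge spaces is easy to quantify; here is a minimal sketch (my own illustration, not from any of the papers) of why 1-bit challenges force many parallel repetitions while an exponentially large challenge set needs only one round.

```python
import math

def rounds_needed(challenge_bits, soundness_bits=128):
    # A forger who guesses the challenge in advance succeeds with
    # probability 2**-challenge_bits per round, so t parallel rounds
    # give soundness error 2**(-t * challenge_bits).  We need t with
    # t * challenge_bits >= soundness_bits.
    return math.ceil(soundness_bits / challenge_bits)

print(rounds_needed(1))    # single-bit challenges: many repetitions
print(rounds_needed(128))  # exponentially large challenge set: one round
```

This is why an exponentially large challenge set translates directly into short signatures: the whole protocol runs once instead of (say) 128 times.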

The idea is simple enough at first sight. Again we want to prove knowledge of the secret isogeny $\tau : E_0 \to E_A$. The commitment is a curve $E_1$ such that the prover/signer knows an isogeny $\psi : E_0 \to E_1$. However, instead of a single bit challenge, the challenge is now an isogeny $\phi : E_1 \to E_2$. The signer/prover, knowing the secret key, can compute the isogeny $\sigma' = \phi \circ \psi \circ \hat{\tau} : E_A \to E_2$. However it is insecure to publish this isogeny, since it would leak the kernel of $\hat{\tau}$ and hence reveal the private key. Instead, we’d like to use the KLPT algorithm to create an isogeny $\sigma : E_A \to E_2$ which is “independent” of $\sigma'$ (this is exactly what KLPT does). If there was an efficient way to do this then we’d have a cool signature scheme. See the below figure, taken from the SQISign paper.
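In case the figure does not reproduce here, a rough redrawing of the commutative diagram (curves as nodes, isogenies as arrows) is:

```
         psi
   E_0 -------> E_1
    |            |
 tau|            | phi
    v            v
   E_A -------> E_2
        sigma
```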

There are two big problems with this idea. The first is that we can’t run the KLPT algorithm on $E_A$. The KLPT algorithm relies on the norm in the quaternion order having a very special form that allows one to reduce solving norm equations to solving binary (two-variable) quadratic forms (using Cornacchia’s algorithm). The endomorphism ring of $E_A$ is not expected to be so well-behaved. So we can’t run KLPT with $E_A$. The second problem is that an attacker could easily forge signatures by choosing $E_1$ such that the attacker knows an isogeny $\phi' : E_A \to E_1$. Then, given the challenge $\phi : E_1 \to E_2$, the forger knows an isogeny from $E_A$ to $E_2$, and could win.

The SQISign paper solves both problems. Now would be a good time to mention that the authors of SQISign are an isogeny “dream team” of De Feo, Kohel, Leroux, Petit and Wesolowski. A deep understanding of KLPT is necessary to create the scheme, and the paper is a very impressive work.

We start with the special curve $E_0$ of j-invariant 1728. It has a “nice” endomorphism ring $\cal{O}$. The KLPT algorithm exploits the fact that isogenies from $E_0$ correspond to left-$\cal{O}$ ideals in $\cal{O}$. Equivalent ideals correspond to isogenies with the same image curve.

The most important thing a reader of this paper has to understand is that $E_0$ is the only curve we can work with, in terms of ideals and endomorphisms. So every computation has to be pulled back to $E_0$ in some way. This requires a whole bunch of sneaky constructions, starting with the private key. The private key is not one isogeny $\tau : E_0 \to E_A$, but two. Precisely, $\tau$ is an isogeny of “small” prime degree $N_A$ corresponding to an ideal $I_\tau$. That’s nice in quaternion-land, but we can’t compute the public key with it, so we need to compute an equivalent ideal whose norm is a power of small primes, and thus compute the corresponding isogeny $\tau' : E_0 \to E_A$. The public key of a signer is $E_A$ and the private key is both $I_\tau$ and $\tau'$.

The commitment is a “random” isogeny $\psi : E_0 \to E_1$ of degree coprime to the degrees of both $\tau$ and $\tau'$. The prover can compute a left $\cal{O}$-ideal $I_\psi$ corresponding to $\psi$.

The challenge (which will be created using the Fiat-Shamir transform in the signature scheme) is the isogeny $\phi : E_1 \to E_2$. This isogeny is fully known to the prover, for example its kernel is known. There is an exponentially large set of possible challenges, which is what makes the signature scheme interesting. The degree of $\phi$ is co-prime to everything else. The prover needs to compute an ideal corresponding to $\phi$, and to do this we pull back the kernel of $\phi$ under $\psi$, compute the ideal $I$ and then push forward (via Lemma 3 of the paper). This results in $I_\phi$, which is a left-$End(E_1)$-ideal.

Now comes the really subtle bit. We have the three isogenies $\tau$, $\psi$ and $\phi$ and three ideals $I_\tau, I_\psi, I_\phi$. The isogeny $\phi \circ \psi \circ \hat{\tau} : E_A \to E_2$ corresponds to $I = \overline{I_\tau} I_\psi I_\phi$. Somehow we need to compute an equivalent ideal, but this is a left-$End(E_A)$-ideal, which is no good to us. The trick is to pull back $I$ under $\tau'$, which we can do since the degree of $\tau'$ is coprime to the norm of $I_\tau$. (This is the part where it is necessary to have two different private keys.) This ideal corresponds to an isogeny $\varphi : E_0 \to E'$ for some curve $E'$. Then we would run KLPT on this ideal to get a random ideal of suitable smooth norm, then push forward via $\tau$ or $\tau'$ and we’re done, right? Wrong!

We have two different isogenies $E_0 \to E'$, and the pushforward of one of them to $E_A$ is an isogeny to $E_2$. But the problem is: the pushforward of the other does not necessarily have an image curve isomorphic to $E_2$, which is what we need. How to enforce this requirement? This is where the Eichler orders come in. Eichler orders are intersections of two maximal orders in a quaternion algebra. The Eichler order of interest in this paper is $End( E_0 ) \cap End( E_A )$. (This is the reason why the degree of $\tau$ is taken as small as possible.) The key result is Corollary 1 of the paper, which restricts the equivalence of ideals to multiplication by elements $\beta$ in the Eichler order. It is necessary to develop a variant of KLPT to ensure this restriction is possible.

The paper contains several optimisations and also some new computational assumptions. One assumption is that there is no meet-in-the-middle attack on $\tau$, which seems plausible (though surprising at first sight). Other assumptions (see Section 7.3) are about the randomness of the outputs of the KLPT algorithm in this setting.

What about the attack I mentioned earlier (the “second problem”), where the forger chooses $E_1$ by computing an isogeny $\phi' : E_A \to E_1$? This attack is prevented by imposing a condition on the degree of $\sigma$. The attacker can’t run the same algorithm as the signer, since they have no way to pull back isogenies to $E_0$ and run KLPT.

To conclude, the exciting thing about SQISign is the exponentially large set of challenges, which means the signing process does not need to be repeated. This is why the signatures are very short compared with other post-quantum signatures. I have no opinion on the practicality of the scheme.

— Steven Galbraith

Review of ECC 2020

One of the reasons I started this blog was to share information with people who were unable to attend conferences. So I’ve tried to maintain a tradition of conference reviews. I’ll continue, even though online conferences are more inclusive and there is no excuse not to attend.

ECC 2020 took place online last week. Recordings of the 4 panel discussions are available on YouTube here.

There were technical problems with the first panel (the “Pacific rim” group), but the audio is ok and the quality gets better after the first few minutes. The discussion in the first panel covered various attacks on signatures (including Minerva and TPM-Fail), groups of unknown order, the development of isogeny-based crypto, polynomial commitments, and computational records for factoring, finite field DLP and ECDLP. Finally we answered the question “why work on fancy crypto when we don’t even seem to be able to safely encrypt and sign data?”. Watch the recording to find out more!

The following three panels ran more smoothly and are also highly recommended. I found it particularly interesting to listen to the speculations on future developments in quantum computing in the panel “How long can we safely use pre-quantum ECC?”, but I also recommend “Formal verification of ECC” and “Is SIKE ready for prime time?”

There is also a curated list of talks which is a sort-of “greatest hits” of conference and seminar talks from the last year.

— Steven Galbraith

ECC 2020 Conference

The ECC 2020 conference will take the form of a curated collection of lecture recordings, and four panel discussions. Each panel will feature a number of experts in the area, and there will be an opportunity for audience members watching live to ask questions during the session using Zulip. Each panel is followed by a 30-minute social break on wonder.me. The conference is free; please come along.

To register, email Tanja Lange.

The four panels are:

• Wednesday 28th October, 01:00 UTC: Conference opening, and panel on recent trends in ECC
Moderator: Steven Galbraith.
Panel: Dan Boneh, Nadia Heninger, Kristin Lauter, Mehdi Tibouchi, and Yuval Yarom.
• Wednesday 28th October, 08:00 UTC: Formal verification of ECC protocols
Moderator: Benjamin Smith.
Panel: Karthik Bhargavan, Bas Spitters, and Bow-Yaw Wang.
• Thursday 29 October, 12:00 UTC: How long can we safely use pre-quantum ECC?
Moderator: Francisco Rodríguez Henríquez.
Panel: Sam Jaques, Manfred Lochter, and Michele Mosca.
• Thursday 29 October, 19:00 UTC: Is SIKE ready for prime time?
Moderator: Alfred Menezes.
Panel: David Jao, Christophe Petit, and Nick Sullivan.

See you there!

— Steven Galbraith

PQCrypto 2020 and other news

PQCrypto 2020 took place online on September 21-23. All sessions were recorded and can be viewed here. The conference had a varied programme across all of post-quantum crypto, including a number of isogeny papers. In this blog post I will discuss the three invited talks. (I was not able to participate in the conference in real-time, as the sessions took place in the middle of the night for me.)

• Benjamin Smith “Isogenies: what now, and what next?”
Ben’s talk was an overview of isogenies, SIDH and CSIDH. In his words, isogenies are “absolutely weird” but they are gradually becoming “less weird”. He spent time talking about how isogenies are computed, including mentioning his recent result with Bernstein, De Feo and Leroux on computing $\ell$-isogenies in $\tilde{O}( \sqrt{\ell} )$ time. He mentioned some open problems, such as the “insanely hard problem” of generating a random supersingular elliptic curve $E$ without knowing a trapdoor that would allow one to compute (for example) the endomorphism ring of $E$. He also stated his view, in regard to the $\sqrt{\ell}$ time result, that “square-root time is not the end of this story and we can probably do better than this”.
Regarding what’s next: Ben says we need to better understand the foundations and to speed up isogenies, and to do this we need to bring new brains to these problems.
• Frank Wilhelm-Mauch “Quantum computers – state of play and roadmap”
The talk gave an excellent overview of the current state of quantum computing, including some of the different models for building a quantum computer and some of the recent computational records for problems in various domains in science.
One of the main themes of the talk was that algorithmic progress is limited by the problem of fault-tolerance and lowering errors. If the error rate is too high (meaning > 0.3 percent) then computation seems to be more or less useless. The current challenge is to get an error rate around $10^{-4}$. Once that is achieved the community can consider “aggressively scaling” the number of qubits.
Frank’s speculation is that quantum computers will stay at the 50-100 qubit range for the near future, while researchers focus on lowering the error. He mentioned that IBM has a roadmap to 1000 qubit hardware in 3 years and hopes to reach $10^6$ qubits a little later, but they have not set themselves any error rate targets.
The conclusions of the talk: despite continual progress in hardware, “cryptanalysis still can only be addressed with fault-tolerant algorithms” and there is still a “long way to go” to achieve that.
• Dustin Moody/NIST “NIST PQC Standardization Update: Round 2 and beyond”
Dustin was representing the National Institute of Standards and Technology (NIST) in this talk, by giving an overview of how NIST selected the round 2 and round 3 candidates. In the talk, and during the questions afterwards, he addressed the extent to which the NSA has influenced the decision-making process. He stated that “NIST alone makes the decisions” based on their own research and input from the community, and that the NSA feedback did not change any of their decisions.
Regarding the “alternate” schemes in the third round (e.g. SIKE), Dustin stressed that these could be standardised in the future if they continue to hold up to scrutiny.
In terms of the ultimate “winners”, he mentioned that there will be at most one lattice-based KEM and at most one lattice-based signature to be standardised. So it looks like the final selection will contain relatively few schemes (perhaps only 4).

• The Raccoon Attack is a side-channel attack on Diffie-Hellman in prime fields in old versions of TLS. The idea of the attack is to reduce the problem to the hidden number problem (HNP), which can be solved using a lattice attack. To quote from the announcement: “The vulnerability is really hard to exploit and relies on very precise timing measurements and on a specific server configuration to be exploitable”. What is interesting for this blog is that the attack works for finite field Diffie-Hellman but seems to be hard to mount on elliptic curve Diffie-Hellman, since the analogous hidden number problem does not have such a good solution.
Maybe it is time to revisit elliptic curve hidden number problems? I believe the current state-of-the-art to be the paper “On the Bit Security of Elliptic Curve Diffie-Hellman” by my previous PhD student Barak Shani (published at PKC 2017). Barak’s PhD thesis contains a detailed discussion of hidden number problems in different “access models”.
• NIST is running a Virtual Workshop on Considerations in Migrating to Post-Quantum Cryptographic Algorithms, on October 7th. Registration is free. The workshop will be recorded and the content will be made available after the event.
• The Accepted papers at Asiacrypt 2020 are online. The conference will take place online on Dec 7-11. There are lots of isogeny papers and it looks like it will be an interesting programme. Congratulations to Luca De Feo, David Kohel, Antonin Leroux, Christophe Petit and Benjamin Wesolowski for their paper “SQISign: Compact Post-Quantum signatures from Quaternions and Isogenies”, which is one of the three papers to be awarded the title of Best Paper. I will report from the conference at the time.

— Steven Galbraith

CRYPTO 2020

CRYPTO ran as an online conference this year due to COVID. Most of the sessions were while I was sleeping, but I was able to catch up on some of the sessions I missed by looking at recordings. The conference programme with links to papers and YouTube recordings is here.

The highlight was the powerful and important invited talk Crypto for the People by Seny Kamara. It covered several themes. One theme was to ask the question “Who benefits from the crypto research that we are doing?” Seny argued that mostly our work benefits companies and governments, rather than citizens. This echoed some sentiments of Phil Rogaway’s talk “The Moral Character of Cryptographic Work” from ASIACRYPT 2015. It was fascinating to hear about the low-tech system used by the ANC in the 1980s for secure communication between operatives in South Africa and the leadership in exile in the UK. (Being old enough to remember saving Basic programs from my home computer onto cassette tape using an audio format, it all made sense to me.) The need for secure communication for freedom fighters served to illustrate his argument that at least some of us should work on systems that help the marginalised and disempowered, rather than companies and governments. Another theme of his talk was diversity, both of people and research topics. Seny gave some insight into his experiences as a black immigrant minority in the crypto research community.

The Best Papers were:

• Joseph Jaeger and Nirvan Tyagi, “Handling Adaptive Compromise for Practical Encryption Schemes”. Best Paper by Early Career Researchers Award.
• Susan Hohenberger, Venkata Koppula and Brent Waters, “Chosen Ciphertext Security from Injective Trapdoor Functions”.
• Wouter Castryck, Jana Sotáková and Frederik Vercauteren, “Breaking the decisional Diffie-Hellman problem for class group actions using genus theory”.

This is the paper of most interest to this blog. It is a wonderful paper, and I have already discussed it in this blog post. A well-deserved best paper award.

• Christof Beierle, Gregor Leander and Yosuke Todo, “Improved Differential-Linear Attacks with Applications to ARX Ciphers”.

The Rump Session did not contain any talks directly related to ECC. Yvo Desmedt gave a talk “40 years Advances in Cryptology: How Will History Judge Us?” which was a plea to the community to publish more research on cryptanalysis (he also said that “side channels are not real cryptanalysis” and that everyone should read Kahn’s book). There was also a version of the game show “Jeopardy!” but I’d rather not talk about it.

Some papers related to discrete logarithms were:

• Fabrice Boudot, Pierrick Gaudry, Aurore Guillevic, Nadia Heninger, Emmanuel Thomé and Paul Zimmermann, “Comparing the difficulty of factorization and discrete logarithm: a 240-digit experiment”.

The paper reports some recent large-scale factoring and finite field DLP experiments using the number field sieve. The results suggest that solving DLP in $\mathbb{F}_p^*$ is only about 3 times slower than factoring for the same bit sizes (previous estimates suggested a bigger gap).

• Gabrielle De Micheli, Pierrick Gaudry and Cecile Pierrot, “Asymptotic complexities of discrete logarithm algorithms in pairing-relevant finite fields”.

The paper is about the complexity of index calculus algorithms for the DLP in “medium prime” finite field settings, as arise in pairing-based crypto.

• Alex Lombardi and Vinod Vaikuntanathan, “Fiat-Shamir for Repeated Squaring with Applications to PPAD-Hardness and VDFs”.

The paper is about zero-knowledge proofs, but it features a sort of index calculus algorithm. Worth a look if you are curious.

There were plenty of papers on lattices. A common theme was algorithms for algebraic structured lattices. Two papers I want to mention are:

• Qian Guo, Thomas Johansson and Alexander Nilsson, “A key-recovery timing attack on post-quantum primitives using the Fujisaki-Okamoto transformation and its application on FrodoKEM”.

The paper shows that the equality check in the Fujisaki-Okamoto transformation should be implemented in constant time for lattice schemes. If not, then a timing attack can be used to mount an active attack based on deducing when the errors start to overflow (and hence learning the error terms in LWE instances).

Not mentioned in their paper, but discussed in the chat thread, is that a similar idea can be used to mount the GPST attack on SIDH if the FO transform is not constant time.
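To illustrate the implementation point with a generic Python sketch (my own, not the FrodoKEM code): the FO re-encryption check should compare the full buffers in constant time rather than with an early-exit equality test.

```python
import hmac

def fo_check_unsafe(reencrypted: bytes, ciphertext: bytes) -> bool:
    # A plain equality test may exit at the first differing byte, so its
    # running time can leak the length of the matching prefix -- the
    # signal exploited by this style of timing attack.
    return reencrypted == ciphertext

def fo_check_safe(reencrypted: bytes, ciphertext: bytes) -> bool:
    # hmac.compare_digest takes time independent of where (or whether)
    # equal-length inputs differ, which is what the FO check needs.
    return hmac.compare_digest(reencrypted, ciphertext)

print(fo_check_safe(b"\x00" * 16, b"\x00" * 16))  # True
```

The two functions return the same values; the difference is purely in how their running time depends on the inputs, which is exactly what the attacker measures.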

• Koen de Boer, Léo Ducas, Alice Pellet-Mary and Benjamin Wesolowski, “Random Self-reducibility of Ideal-SVP via Arakelov Random Walks”.

The paper shows how to transform an ideal lattice to a random ideal lattice (and hence transform ideal-SVP to the average case) by a combination of continuous (Arakelov class group) and discrete (sparsification) processes. I recommend the pre-conference video by Koen, as it has some really awesome animations. I also like the fact that the approach is analogous to the use of random walks on isogeny graphs to get uniform mixing, as done by Jao, Miller and Venkatesan in their reduction for the ECDLP.

— Steven Galbraith


The second half of ANTS 2020

This is a follow-up to my previous post on the Fourteenth Algorithmic Number Theory Symposium (ANTS-XIV).

First I want to announce that the Selfridge Prize for the best submitted paper as judged by the program committee was awarded to Jonathan Love and Dan Boneh for their paper Supersingular Curves With Small Non-integer Endomorphisms. The prize is funded by the Number Theory Foundation.

The prize for the best poster was awarded to Eric Bach and Jonathan Sorenson for Generating Smooth Numbers with Known Factorization Uniformly at Random.

David Jao’s invited talk introduced and surveyed the current state of isogeny-based crypto. In his opinion, signatures are the exciting new frontier in this area.

The other two invited talks for the second half of the conference were by Rachel Pries and Andrew Booker. Rachel talked about determining the Shimura datum of families of Abelian varieties coming from cyclic covers of the projective line. Andrew talked about his recent computations on sums of three cubes (in particular the 33 and 42 problems). I highly recommend watching Andrew’s talk for his descriptions of how the result about 33 was announced and his interactions with the press (and a suggestion for the conference T-shirt).

Among the contributed papers, there were several papers that will be of interest to cryptographers.

• Daniel J. Bernstein, Luca De Feo, Antonin Leroux and Benjamin Smith. Faster computation of isogenies of large prime degree

The paper gives an improved method to compute isogenies of high degree, using ideas about fast computation of products that go back to Pollard and Strassen.
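To give the flavour of the Pollard–Strassen product trick in its simplest setting, here is a sketch (my own illustration, not the paper's algorithm) that computes $n! \bmod m$ by packaging a block of $b \approx \sqrt{n}$ consecutive factors into a polynomial $g(x) = (x+1)\cdots(x+b)$ and evaluating it at the giant steps $0, b, 2b, \ldots$; the real speedup requires fast polynomial multiplication and multipoint evaluation, both done naively here:

```python
def factorial_mod(n: int, m: int) -> int:
    """Compute n! mod m by grouping factors into blocks of size ~sqrt(n)."""
    b = int(n ** 0.5) + 1
    # Build g(x) = (x+1)(x+2)...(x+b) mod m, coefficients in increasing degree.
    g = [1]
    for i in range(1, b + 1):
        new = [0] * (len(g) + 1)
        for k, c in enumerate(g):
            new[k] = (new[k] + c * i) % m      # constant part of (x + i)
            new[k + 1] = (new[k + 1] + c) % m  # x part of (x + i)
        g = new

    def eval_g(x: int) -> int:
        acc = 0
        for c in reversed(g):                  # Horner evaluation mod m
            acc = (acc * x + c) % m
        return acc

    result = 1
    full_blocks = n // b
    for j in range(full_blocks):               # g(j*b) = (jb+1)...(jb+b)
        result = (result * eval_g(j * b)) % m
    for i in range(full_blocks * b + 1, n + 1):
        result = (result * i) % m              # leftover factors
    return result
```

The point is that one long product is replaced by roughly $\sqrt{n}$ evaluations of a degree-$\sqrt{n}$ polynomial, which fast polynomial arithmetic can batch efficiently.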

• Novak Kaluderovic, Thorsten Kleinjung and Dusan Kostic. Cryptanalysis of the generalised Legendre pseudorandom function

The paper gives new algorithms to recover $f(x) \in \mathbb{F}_p[x]$ when given an oracle $O(x)$ that computes the Legendre symbol $(\tfrac{f(x)}{p})$. This problem goes back several decades in crypto, but recently gained new interest due to applications in multi-party computation.
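As an illustration of the object being attacked, here is a sketch of such an oracle (hypothetical helper names, not the paper's code): Euler's criterion computes the Legendre symbol, and the oracle evaluates the secret polynomial and returns its symbol.

```python
def legendre_symbol(a: int, p: int) -> int:
    """Euler's criterion: a^((p-1)/2) mod p is 1 for quadratic residues,
    p-1 (i.e. -1) for non-residues, and 0 when p divides a."""
    t = pow(a % p, (p - 1) // 2, p)
    return -1 if t == p - 1 else t

def legendre_oracle(f_coeffs, p):
    """Oracle O(x) = (f(x)/p) for a secret polynomial f,
    given by its coefficients in increasing degree."""
    def O(x: int) -> int:
        fx = 0
        for c in reversed(f_coeffs):   # Horner evaluation mod p
            fx = (fx * x + c) % p
        return legendre_symbol(fx, p)
    return O
```

Taking $f(x) = x + k$ recovers the classical Legendre PRF with key $k$; the generalised problem allows higher-degree $f$.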

• Thomas Espitau and Paul Kirchner. The nearest-colattice algorithm: time-approximation tradeoff for approx-CVP

The paper gives a “block” algorithm for solving the approximate closest vector problem in a lattice that generalises Babai’s nearest plane method.
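For context, here is a minimal sketch of the classical Babai nearest plane algorithm that the paper generalises (my own illustration, assuming the rows of `B` form a lattice basis; no reduction or numerical care is attempted):

```python
def gram_schmidt(B):
    """Gram–Schmidt orthogonalisation (no normalisation) of basis vectors B."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    Bs = []
    for row in B:
        v = [float(x) for x in row]
        for w in Bs:
            mu = dot(v, w) / dot(w, w)
            v = [vi - mu * wi for vi, wi in zip(v, w)]
        Bs.append(v)
    return Bs

def babai_nearest_plane(B, t):
    """Return a lattice vector (integer combination of the rows of B)
    close to the target t, via Babai's nearest plane algorithm."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    Bs = gram_schmidt(B)
    b = [float(x) for x in t]
    v = [0.0] * len(t)
    # Work from the last basis vector down, rounding to the nearest plane.
    for i in range(len(B) - 1, -1, -1):
        c = round(dot(b, Bs[i]) / dot(Bs[i], Bs[i]))
        v = [vi + c * Bi for vi, Bi in zip(v, B[i])]
        b = [bi - c * Bi for bi, Bi in zip(b, B[i])]
    return v
```

The "block" generalisation trades running time for approximation quality by replacing the one-dimensional rounding step with exact CVP solving in projected blocks of larger rank.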

• Carl Bootland, Wouter Castryck and Frederik Vercauteren. On the Security of the Multivariate Ring Learning With Errors Problem

The paper gives methods to solve the search m-RLWE problem and attack cryptosystems based on this problem.

• Jean-Sébastien Coron, Luca Notarnicola and Gabor Wiese. Simultaneous diagonalization of incomplete matrices and applications

The paper is about some computations that improve algorithms to solve the approximate common divisor problem and break CLT13 cryptographic multilinear maps.

• Daniel E. Martin. Short Vector Problems and Simultaneous Approximation

The paper gives a reduction from the approximate shortest vector problem to the problem of simultaneous Diophantine approximation (the reverse direction to the famous reduction by Lagarias from simultaneous Diophantine approximation to SVP).

As always, the pdfs of the preconference versions of the papers are here, and pre-recorded talks can be found at the ANTS conference YouTube channel. Even easier: Everything is in one place at researchseminars.org.

— Steven Galbraith

The first half of ANTS 2020

The Fourteenth Algorithmic Number Theory Symposium (ANTS-XIV), that was intended to take place at the University of Auckland in New Zealand, is currently taking place online in what Noam Elkies has named “New Zoomland”. Here is a report on the first half of the conference.

Note that the pdfs of papers are all here and that there are short videos of all talks on the ANTS 2020 conference YouTube channel.

Day one of the conference opened with two papers on isogenies.

• Supersingular Curves With Small Non-integer Endomorphisms, by Jonathan Love and Dan Boneh.

The motivation for this paper is the problem of “hashing” to a random supersingular curve over $\mathbb{F}_{p^2}$. A natural solution would be to construct a CM curve with small discriminant, as done in Broker’s algorithm. The paper calls such curves $M$-small (curves admitting a non-integer endomorphism of degree at most $M$); equivalently, the endomorphism ring contains an order of discriminant at most $M$ in absolute value. For each fundamental discriminant $D$ such that $-M \le D < 0$, let $T_D$ be the set of all $M$-small curves such that the order lies in $\mathbb{Q}(\sqrt{D})$. The paper proves a number of new results (both mathematical and algorithmic) relating to the structure of the subgraphs $T_D$ within the full isogeny graph, when $M < \sqrt{p}/2$.

The conclusion is that, given any $M$-small curves $E, E'$ one can efficiently compute an isogeny from $E$ to $E'$, which means this is not a useful way to hash to elliptic curves.

Note that Castryck, Panny and Vercauteren (EUROCRYPT 2020) present a complementary result for hashing into the CSIDH set.

• Computing endomorphism rings of supersingular elliptic curves and connections to pathfinding in isogeny graphs, by Kirsten Eisenträger, Sean Hallgren, Chris Leonardi, Travis Morrison and Jennifer Park.

This paper is about the problem of computing $End(E)$ when $E$ is a supersingular elliptic curve. There are two approaches to this problem in the literature:

1. To compute cycles in the isogeny graph (and hence endomorphisms), giving a subring of $End(E)$, and then to work out which of the maximal orders containing that subring is the right one. This idea first appears in Kohel’s thesis, but not all the details were worked out.
2. To compute an isogeny from a nice curve $E_0$ with known endomorphism ring to $E$, and then to deduce the endomorphism ring of $E$. The key step in such approaches is the paper by Kohel, Lauter, Petit and Tignol (from ANTS 2014). Several subsequent papers discuss details of computing $End(E)$, in particular the paper Supersingular isogeny graphs and endomorphism rings: reductions and solutions by Eisenträger, Hallgren, Lauter, Morrison and Petit.

This ANTS paper is the first to give all the details of an algorithm following the first approach (finding cycles) and to work out its complexity carefully. The main focus of the paper is a study of Bass orders. The paper pays close attention to the size of the representation of the endomorphism ring.

At a high level it is not obvious to me which of these two approaches is the better strategy, or whether they both should have basically the same complexity.

Next up was a superb invited lecture by David Harvey on Recent results on fast multiplication. A recording of the lecture is on the YouTube channel mentioned above.

The first day concluded with two more talks on superspecial curves.

• Counting Richelot isogenies between superspecial abelian surfaces, by Toshiyuki Katsura and Katsuyuki Takashima.

The paper gives a careful analysis of the isogenies of superspecial dimension 2 abelian varieties. The goal is to distinguish isogenies to a product of two elliptic curves from isogenies to the Jacobian of a genus 2 curve.

• Algorithms to enumerate superspecial Howe curves of genus four, by Momonari Kudo, Shushi Harashita and Everett Howe.

The paper is about a method to construct superspecial genus 4 curves out of elliptic curves and genus 2 curves.

Day two had a number of nice papers, but probably of less interest to a cryptographic audience (it was also a bit early in the morning for me). Isabel Vogt gave an invited lecture on Arithmetic and geometry of Brill–Noether loci of curves. There was a poster session comprising three posters. One of the posters had a crypto context: Samuel Dobson, Steven D. Galbraith, and Benjamin Smith, Trustless Construction of Groups of Unknown Order with Hyperelliptic Curves.

Day three started with three number theory talks (again, too early in the morning for me — who organised this??). The invited talk by Felipe Voloch on Commitment Schemes and Diophantine Equations discussed some ideas for a post-quantum commitment scheme based on Diophantine equations that are hard to solve but for which determining the number of solutions is easy.

Day three also featured a wonderful session by Joppe Bos and Michael Naehrig to remember Peter Montgomery. Peter had attended a number of ANTS conferences over the years, and died earlier this year. His name is legendary in computational number theory due to his contributions to factoring and fast arithmetic, such as Montgomery multiplication, the Montgomery model for elliptic curves, and the Montgomery ladder for elliptic curve point multiplication.

Finally on day three was the rump session, which was heavily biased towards isogenies.

• Everett Howe gave a hilarious talk on isogenies of superspecial curves in characteristic 2 (joint work with Bradley Brock).
• Enric Florit (reporting on joint work with Ben Smith) presented some results on the Ramanujan property (or not) of genus 2 isogeny graphs.
• Christophe Petit advertised the Isogeny-based cryptography winter school in Bristol in December.
• Péter Kutas talked about joint work on quantum attacks on unbalanced SIDH.
• Antonin Leroux sketched SQIsign (pronounced “ski sign”), a post-quantum signature scheme from quaternions and isogenies. It seems a nice idea and I am very curious to see the details.
• Wouter Castryck (joint work with Thomas Decru and Frederik Vercauteren) presented Radical isogenies, a major new idea to speed up CSIDH.

It has been a great 3 days. I will report back again on the rest of the conference, and also the workshop on post-quantum crypto that follows the conference.

— Steven Galbraith

Eurocrypt 2020

Eurocrypt 2020 took place online, being the first major IACR cryptography conference to be run online over Zoom. The chairs did an amazing job at short notice, assisted very ably by Kevin McCurley and Kay McKelly from the IACR as Virtual Conference Administrators.

I understand there were over 1000 registrations, making it by far the largest Eurocrypt of all time. It turns out that online events are more accessible. Certainly I had not planned to fly to Europe for it, but since it was online I was able to attend.

Short videos of the talks are on YouTube and are linked from the conference webpage, and the full live sessions are on the IACR YouTube channel. Unfortunately I did not attend any of the live sessions due to time-zones, but I did at least catch most of the rump session.

For the purposes of this blog, the most important session was the opening session on “Number-Theoretic Cryptography”. There was an unexpected theme shared by two of the papers: the algorithm by Becker, Coron and Joux for solving subset-sum (building on the beautiful idea of Howgrave-Graham and Joux to split vectors by weight rather than length).

I now discuss the papers in this session.

• “He Gives C-Sieves on the CSIDH” by Chris Peikert, and “Quantum Security Analysis of CSIDH” by Xavier Bonnetain and André Schrottenloher.

These two papers are about quantum algorithms to solve the group action problem (the computational assumption underlying the CSIDH isogeny-based post-quantum key exchange protocol). Both papers use Kuperberg’s second algorithm for the hidden shift problem. The papers advance our understanding of the security of CSIDH.

As outlined very clearly in Peikert’s talk, there are three components of such an algorithm:

1. A quantum circuit to compute the class group action $[a]*E$ for “arbitrary” ideal classes. This is used to create certain “labelled” quantum states (consisting of an integer, called the “label”, stored classically as bits, and a quantum state).
2. A sieve algorithm to combine “naughty” labelled states to form “nice” labelled quantum states.
3. A measurement process that allows one to compute some bits of the secret ideal class from a nice labelled state.

The main challenges are to get a small quantum circuit to do step 1 and to minimise the number of states needed for the sieving in step 2.

Peikert’s paper is mainly about items 2 and 3. The paper uses “quantumly accessible classical RAM”, which means there is a storage device that stores classical bits, and one can efficiently create quantum states that are superpositions of stored values. The paper then studies the sieving process (the “collimation sieve”, which is the “C-sieve” in the title of the paper). One aim is to minimise the total number of labelled states that need to be computed in step 1, and hence reduce the total number of quantum gates required.

Peikert claims that if the storage is (respectively) $2^{32}, 2^{40}, 2^{48}$ bits, then one can attack CSIDH-512 using (respectively) $2^{18.7}, 2^{15.7}, 2^{14.1}$ executions of step 1. He claims that in the case of $2^{40}$ bits of storage this gives a quantum algorithm to break CSIDH-512 that requires $2^{60}$ T-gates. Officially this is below NIST level 1, though it is still unclear to me that a quantum computer with $2^{60}$ gates is a realistic object in the foreseeable future (as far as I know, we do not currently have classical computers with $2^{60}$ gates).

Peikert discusses several optimisations and also reports on his simulations (his code is available).

Bonnetain and Schrottenloher study a different range of parameters for Kuperberg’s second algorithm. Regarding step 2 they focus on the case of small storage and aim to reduce the quantum cost of the algorithm by doing a large classical computation involving the labels. In other words, they reduce the quantum cost by increasing the classical cost. Their innovations in the sieve stage include using subset-sum algorithms (such as the one by Becker-Coron-Joux that I mentioned above) and list-merging algorithms (such as Schroeppel-Shamir). They also make progress on step 1, with new ideas to shrink the quantum circuit for computing the class group action.

For CSIDH-512, Bonnetain and Schrottenloher claim an attack that requires only about 40,000 logical qubits and a total circuit size of approximately $2^{71.6}$ T-gates, while also performing a $2^{86}$ classical computation. Again, this is technically below the required security for NIST level 1.

In the discussion it was noted that these papers do not “break” CSIDH, but both authors suggest the parameters should be increased. It was also mentioned that the attacks apply to CSURF. If I had been awake in the discussions I would have asked what is the security level if the Bonnetain-Schrottenloher quantum circuit is used with the Peikert sieve. Presumably this gives a further improvement of the attack.

• “Double-Base Chains for Scalar Multiplications on Elliptic Curves” by Wei Yu, Saud Al Musa and Bao Li.

There is an ongoing line of work on the task of computing low weight double-base chains, which are a tool to potentially speed up variable-point scalar multiplication on elliptic curves. This paper, which is mostly theoretical, gives some improvements in this area.

• “Rational Isogenies from Irrational Endomorphisms” by Wouter Castryck, Lorenz Panny and Frederik Vercauteren.

This lovely paper gives a reduction of the security of the CSIDH cryptosystem to the problem of computing endomorphism rings of supersingular elliptic curves.

For simplicity I discuss the case $p \equiv 3 \mod{4}$. Let $E_0 : y^2 = x^3 + x$ be the supersingular curve with $j$-invariant 1728. The quadratic twist $E_0^t$ of $E_0$ is isomorphic over $\mathbb{F}_p$ to $E_0$.

The paper exploits the fact that if $E = [a] * E_0$ for some ideal class $[a]$ then $E^t = [a^{-1}] * E_0$. Now suppose $\tau \in End(E)$ is an isogeny of degree $\ell$ such that $\tau \circ \pi = - \pi \circ \tau$, where $\pi$ is the $p$-power Frobenius (this is a typical case). Let $\phi : E \to E^t$ be the isomorphism over $\mathbb{F}_{p^2}$ from $E$ to its quadratic twist. Then $\phi \circ \tau : E \to E^t$ is an $\ell$-isogeny that is $\mathbb{F}_p$-rational. It follows that $\phi \circ \tau$ corresponds to an ideal class $[b]$ such that $E^t = [b] * E = [ab] * E_0$. Hence $[ab] = [a^{-1}]$, so $[a]^2 = [b]^{-1}$ and we can solve for $[a]$ by computing a square root of $[b]^{-1}$ in the class group.

In short, if we are given $E$ and want to compute $[a]$ such that $E = [a] * E_0$, then it suffices to find a suitable element $\tau \in End(E)$. The paper explains how to do this.

• “Low Weight Discrete Logarithms and Subset Sum in $2^{0.65n}$ with Polynomial Memory” by Andre Esser and Alexander May.

This paper gives the first progress on the low Hamming weight DLP for about 20 years and it is a major result. The paper is a total mind-bender, with random walks built from random walks forming rhos of rhos. A key idea is splitting by weight, which again takes us back to the Becker-Coron-Joux algorithm.

I am sorry, but I cannot begin to summarise this paper in a few lines. Read it. Then read it again when you are stoned. Then read it again when you are sober.

The invited talks were both excellent.

• “Mathematics and Cryptography: A Marriage of Convenience?” by Alice Silverberg.

I watched the 44-minute pre-recorded lecture. There was also a live session with a 30-minute presentation followed by Q&A.

Alice told various stories of her journey through mathematics and crypto, and some of the people she has worked with. She briefly talked about her work with Rubin on compression in finite fields (algebraic tori), her work with Boneh on multilinear maps, and her work with H.W. Lenstra Jr. on the Gentry-Szydlo algorithm. Most importantly she used these stories to illustrate the importance of community, curiosity, communication, kindness, listening, and doing the right thing.

• “Fine-Grained Cryptography: A New Frontier?” by Alon Rosen.

The talk gave an overview of recent work on fine-grained complexity and crypto. The basic idea is this: Suppose we have a computational problem that an honest user can set up in roughly linear time $O(n)$ but an attacker requires time $O(n^k)$ to solve. This is polynomial time, but if $k$ is large enough then such work may be infeasible, or at least extremely time-consuming, and that could be enough for crypto applications.

Alon explained the $k$-Orthogonal Vectors ($k$-OV) problem, which is: given $k$ lists $L_i$, each containing $n$ vectors in $\mathbb{F}_p^d$ (for small $p$ and $d = O(\log(n))$), find $k$ vectors $u_i \in L_i$ such that $\sum_{j=1}^d \prod_{i=1}^k u_{i,j} = 0$. In the case $k=2$ this becomes the problem of finding two vectors whose Euclidean inner product is zero (i.e., the vectors are orthogonal).

The obvious way to solve the $k$-OV problem is to consider all $n^k$ combinations of vectors. The $k$-OV assumption is that there is no algorithm that can do much better than this.
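The brute-force search is easy to write down; here is a naive sketch (illustrative only) that tries all $n^k$ tuples:

```python
from itertools import product

def k_ov(lists, p):
    """Naive k-OV solver over F_p: try every way of picking one vector
    from each list and test whether sum_j prod_i u_{i,j} = 0 (mod p)."""
    d = len(lists[0][0])
    for tup in product(*lists):       # n^k combinations in the worst case
        s = 0
        for j in range(d):
            term = 1
            for u in tup:
                term = (term * u[j]) % p
            s = (s + term) % p
        if s == 0:
            return tup                # a "k-orthogonal" tuple
    return None                       # no solution found
```

For $k = 2$ the inner sum is just the inner product mod $p$ of the two chosen vectors; the $k$-OV assumption says no algorithm beats this $n^k$ search by a polynomial factor.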

Impagliazzo, Paturi and Zane introduced the Strong Exponential Time Hypothesis (SETH), which roughly says that $k$-SAT cannot be solved significantly faster than exhaustive search. The SETH supports the $k$-OV assumption, in the sense that a faster algorithm for $k$-OV would yield faster algorithms for $k$-SAT than currently known.

Alon then described how to get proof-of-work and one-way functions from $k$-OV, and said many nice things about the CRYPTO 2019 paper “Public-Key Cryptography in the Fine-Grained Setting” by Rio LaVigne, Andrea Lincoln and Virginia Vassilevska Williams. The final part of the talk was a blur of philosophical musings (and some jokes).

As always, a highlight was the rump session. There was drinking. There were cigars. There were unusual zoom backgrounds. Contacts were traced. There was a challenging quiz.

In terms of this blog, there were no rump session talks of particular relevance to ECC. Two highlights were the songs “I will survive” by Women in Theory, and “Zoom zoom zoom zoom” by Chloe Martindale. Please click here for the