## Breaking the decisional Diffie-Hellman problem for class group actions using genus theory

This post is about the paper Breaking the decisional Diffie-Hellman problem for class group actions using genus theory by Wouter Castryck, Jana Sotáková and Frederik Vercauteren.

First let us recall the situation in the setting of $\mathbb{Z}_p^*$ where $p$ is prime. A Diffie-Hellman quadruple is $(g, g^a, g^b, g^{ab} )$ for $g \in \mathbb{Z}_p^*$. The Decision Diffie-Hellman problem (DDH) is to distinguish such a quadruple (for uniformly sampled $g \in \mathbb{Z}_p^*$ and $a,b \in \mathbb{Z}_{p-1}$) from a quadruple $(g, g^a, g^b, g^c)$ for uniformly sampled $g \in \mathbb{Z}_p^*$ and $a,b,c \in \mathbb{Z}_{p-1}$.

The Legendre symbol is multiplicative, which implies that $(\tfrac{g^a}{p} ) = ( \tfrac{g}{p} )^a$. If $(\tfrac{g}{p})= -1$ (which happens precisely when $g$ is a quadratic non-residue, and so has even order) then one can learn the parity of $a$ and $b$ from $g^a, g^b$ respectively, and hence test whether the fourth value of the quadruple has Legendre symbol consistent with $g^{ab}$. This well-known fact allows one to reject half of all uniformly sampled quadruples $(g, g^a, g^b, g^c)$ when $(\tfrac{g}{p})= -1$, which is sufficient to say that DDH is not a hard problem in $\mathbb{Z}_p^*$.
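This distinguisher fits in a few lines of Python. The following is a toy illustration (the prime, the generator, and all function names are my choices, not anything from the paper):

```python
# Toy Legendre-symbol DDH test: reject quadruples (g, g^a, g^b, g^c)
# whose fourth entry has the wrong Legendre symbol.

p = 1019  # a small prime, for illustration only (1019 ≡ 3 mod 8, so (2/p) = -1)

def legendre(x, p):
    """Legendre symbol (x/p) via Euler's criterion: x^((p-1)/2) mod p."""
    s = pow(x, (p - 1) // 2, p)
    return -1 if s == p - 1 else s

def consistent(g, ga, gb, gc, p):
    """Return False only if (g, ga, gb, gc) is provably not a DH quadruple.

    When (g/p) = -1, the symbols of ga and gb reveal the parities of a
    and b, so the symbol of g^(ab) is forced: it is -1 iff a and b are
    both odd.
    """
    if legendre(g, p) != -1:
        return True  # the test is uninformative when g is a square
    a_odd = legendre(ga, p) == -1
    b_odd = legendre(gb, p) == -1
    expected = -1 if (a_odd and b_odd) else 1
    return legendre(gc, p) == expected

g, a, b = 2, 5, 7
assert consistent(g, pow(g, a, p), pow(g, b, p), pow(g, a * b, p), p)
```

Every genuine quadruple passes the test, while a uniformly random fourth entry is rejected half the time, exactly as described above.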

One can increase the success rate beyond $1/2$ by using a random self-reduction of Diffie-Hellman quadruples, but one can never get a perfect DDH oracle (i.e., an oracle that only accepts $(g, g^a, g^b, g^{ab} )$) from this technique, as the Legendre symbol only “sees” the order-2 quotient of the group.

The Legendre symbol is just a group homomorphism $\mathbb{Z}_p^* \to \{ 1, -1\}$, and for any prime $\ell \mid (p-1)$ one can get a homomorphism $\mathbb{Z}_p^* \to H$ where $H$ is the subgroup of order $\ell$. Hence, if $(p-1)$ has several small prime factors then one gets an increasingly accurate algorithm to distinguish Diffie-Hellman quadruples from random quadruples (and hence solve DDH).
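Concretely, one such homomorphism is $x \mapsto x^{(p-1)/\ell} \bmod p$, whose image is the subgroup of $\ell$-th roots of unity. A minimal sketch (the prime and base here are toy values of my choosing):

```python
# For each prime l dividing p-1, the map x -> x^((p-1)/l) mod p is a
# homomorphism from Z_p^* onto the unique subgroup of order l,
# generalising the Legendre symbol (which is the case l = 2).

p = 1009  # toy prime: p - 1 = 1008 = 2^4 * 3^2 * 7

def character(x, p, l):
    """Image of x under the homomorphism onto the order-l subgroup."""
    assert (p - 1) % l == 0
    return pow(x, (p - 1) // l, p)

# The map is multiplicative, so character(g^a) = character(g)^a.
# Whenever character(g) != 1, this leaks information about a mod l.
g, a = 11, 123
for l in (2, 3, 7):
    assert character(pow(g, a, p), p, l) == pow(character(g, p, l), a, p)
```

Each such character leaks information about the exponent modulo $\ell$, which is exactly what drives the distinguishing attack.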

[As an aside: The amazing thing about the Legendre symbol, and the reason it is taught in all good number theory courses, is not the existence of a group homomorphism $\mathbb{Z}_p^* \to \{ 1, -1\}$. This is trivial. What is non-trivial is the quadratic reciprocity law, which gives a non-obvious and very efficient way to compute Legendre symbols.]

Similarly, for any finite group one can consider group homomorphisms to subgroups of small order. So one can also attack elliptic curve DDH in a similar way. This is one of the main reasons why the community works with groups of prime order, and in particular elliptic curves of prime order.

Now we turn to group actions. Let $G$ be a finite group acting on a set $X$. Write the action of $g \in G$ on $x \in X$ as $g*x$. For example, let $G = Cl( \mathcal{O} )$ be an ideal class group acting on the set $X$ of supersingular elliptic curves with j-invariant in $\mathbb{F}_p$. This is the setting of the CSIDH system in isogeny-based post-quantum crypto. The natural analogue of the DLP is: given $E$ and $a * E$, compute $a$. The natural analogue of DDH is to distinguish $(E, a*E, b*E, (ab)*E )$ from $(E, a*E, b*E, c*E )$, where $a, b, c$ are uniformly sampled from $G$.
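To fix ideas, here is a toy commutative group action in Python: exponentiation on a subgroup of prime order. This is emphatically not CSIDH (its DLP is easy); it only illustrates the shape of the definitions, with all parameters chosen by me:

```python
# Toy group action: G = (Z/q)^* acting on the order-q subgroup X of
# Z_p^* (the squares mod p, where p = 2q + 1) via a * x = x^a mod p.

p, q = 23, 11  # p = 2q + 1, so the squares mod p form a group of order q
x = 4          # a square mod 23, playing the role of the base curve E

def act(a, x):
    """The action a * x = x^a mod p; commutative since (x^a)^b = (x^b)^a."""
    return pow(x, a, p)

a, b = 3, 7
# The action commutes, which is what makes Diffie-Hellman-style key
# exchange possible in this setting:
assert act(b, act(a, x)) == act(a, act(b, x)) == act(a * b % q, x)

# A Diffie-Hellman quadruple for this action, analogous to
# (E, a*E, b*E, (ab)*E) in the post:
quad = (x, act(a, x), act(b, x), act(a * b % q, x))
```

The point of the paper under discussion is that, just as characters of $\mathbb{Z}_p^*$ leak exponent information in this toy setting, genus-theory characters of the class group leak information about the acting ideal class.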

Knowing about Legendre symbols, it is natural to speculate that one might be able to do something similar for group actions. We’d like a group homomorphism $\phi : G \to H$ for some group $H$ of small prime order, and to be able to compute $\phi(a)$ from $a*E$. That is what the paper of Castryck, Sotáková and Vercauteren does.

In some survey talks (such as at ANTS 2018 and at the Alice Silverberg birthday conference in 2018) I asked “Can subgroups of ideal class group be exploited?” While I thought this would be a good problem, I did not have any clue how to do this. I am surprised and delighted by the new results.

Without going into the details, what the paper shows is that one can get enough information about the degree of an isogeny from $E$ to $a*E$ (and hence the norm of the ideal $a$) from looking at pairings of points on the elliptic curve. It is a wonderful and surprising (at least, to me) result, and brings a new set of ideas and techniques into isogeny crypto. The paper is not too hard to follow (don’t be put off by the phrase “genus theory” — it is not as scary as it sounds).

I end with a few small comments. First, as with the case of $\mathbb{Z}_p^*$, this does not give a perfect algorithm to distinguish DDH quadruples from uniform ones. But it does allow one to reject some quadruples as being definitely not DDH, and this is enough to break the DDH assumption. Second, breaking the DDH assumption does not, as far as I know, break any isogeny cryptosystem. This is because isogeny cryptosystems are rather unsophisticated compared with DLP-based protocols. Third, the results do not apply to SIDH and have no impact on the SIKE submission to the NIST standardization process.

To conclude, this paper is a great theoretical result that brings new ideas to the field. What will be next for isogeny crypto? Whatever it is, I look forward to it!

— Steven Galbraith

## Algorithmic Number Theory (ANTS-XIV)

The Fourteenth Algorithmic Number Theory Symposium, ANTS-XIV, will take place at the University of Auckland, New Zealand on June 30 – July 4, 2020.

The deadline for submissions is February 25. To submit, please go to the call for papers on the conference website.

I decided to have a look at the history of the ANTS conferences, and in particular to identify the most highly cited papers (using Google Scholar). The ANTS conferences started in 1994, with the first conference held at Cornell.

First I want to mention that citation counts are not a good measure of research quality or difficulty or importance. Citations are biased towards subject areas that have a culture of writing lots of papers and citing widely. Citation counts are also biased to older papers, since they have had more time to accrue citations. Hence, we would expect the most highly cited papers at ANTS to be in cryptography, and from 16 or more years ago.

Nevertheless, citation counts do tell something about the interest in a paper, and are a reasonable proxy for impact on the field.

The most highly cited paper in the history of the ANTS conferences (with nearly 1700 citations according to Google Scholar) is:

• J. Hoffstein, J. Pipher and J. H. Silverman, NTRU: A ring-based public key cryptosystem, ANTS III, 1998.

There is no doubt that this is a massively influential paper on lattice cryptography. Several of the lattice-based submissions to the NIST Post-Quantum standardisation process build very closely on NTRU. The irony (if you can call it that) is that this paper was rejected from CRYPTO, and yet has had higher impact than most other papers published in CRYPTO around that time.

Here are the next 14 most highly cited ANTS papers:

• A. Joux, A One Round Protocol for Tripartite Diffie-Hellman, ANTS IV, 2000. 1369 cites.
• D. Boneh, The Decision Diffie-Hellman Problem, ANTS III, 1998. 1078 cites.
• S. D. Galbraith, K. Harrison and D. Soldera, Implementing the Tate Pairing, ANTS V, 2002. 689 cites.
• A. Joux, The Weil and Tate Pairings as Building Blocks for Public Key Cryptosystems, ANTS V, 2002. 287 cites.
• L. M. Adleman, J. DeMarrais and M.-D. A. Huang, A subexponential algorithm for discrete logarithms over the rational subgroup of the jacobians of large genus hyperelliptic curves over finite fields, ANTS I, 1994. 258 cites.
• P. Gaudry and R. Harley, Counting Points on Hyperelliptic Curves over Finite Fields, ANTS IV, 2000. 199 cites.
• P. Q. Nguyen and D. Stehlé, LLL on the Average, ANTS VII, 2006. 168 cites.
• O. Schirokauer, D. Weber and T. F. Denny, Discrete Logarithms: The Effectiveness of the Index Calculus Method, ANTS II, 1996. 167 cites.
• L. M. Adleman, The function field sieve, ANTS I, 1994. 165 cites.
• G.-J. Lay and H. G. Zimmer, Constructing elliptic curves with given group order over large finite fields, ANTS I, 1994. 163 cites.
• P. Q. Nguyen and J. Stern, Lattice Reduction in Cryptology: An Update, ANTS IV, 2000. 144 cites.
• N. D. Elkies, Shimura Curve Computations, ANTS III, 1998. 106 cites.
• M. Fouquet and F. Morain, Isogeny Volcanoes and the SEA Algorithm, ANTS V, 2002. 104 cites.
• A. Shallue and C. E. van de Woestijne, Construction of Rational Points on Elliptic Curves over Finite Fields, ANTS VII, 2006. 100 cites.

Of course, these rankings will change over time. But that is what it looked like in early 2020.

Looking at this list I see many important and favourite papers: Antoine Joux’s paper on One Round Tripartite Diffie-Hellman kick-started pairing-based crypto; the Adleman-DeMarrais-Huang paper was the first to show high genus curves are weak for DLP crypto; the Lay-Zimmer paper was influential in the early days of ECC; Fouquet and Morain introduced the phrase “Isogeny Volcano”; etc. It is also notable that several of the papers listed (e.g., those by Boneh, Elkies, Nguyen-Stern, and the second paper by Joux) are invited papers, which shows that the community does value survey/overview papers.

Anyway, I look forward to strong submissions to ANTS XIV in Auckland, including on elliptic curves, lattices and isogenies. Hopefully in 15-20 years the impact of some of those papers will be apparent.

— Steven Galbraith

## Various news items

1. The ECC 2019 conference took place in Bochum in December. Slides from some of the talks (including the rump session) are available here.

2. ASIACRYPT 2019 took place in Kobe, Japan in December. It was a very well organised conference.

The Best Paper award went to Thomas Debris-Alazard, Nicolas Sendrier and Jean-Pierre Tillich for the paper “Wave: A New Family of Trapdoor One-Way Preimage Sampleable Functions Based on Codes”. This paper gives a post-quantum signature scheme (of the “hash and sign” type) from error-correcting codes. Ward Beullens has written a blog post on this paper in the COSIC blog.

There were two conference sessions on isogenies, featuring these papers:

• Ward Beullens, Thorsten Kleinjung and Frederik Vercauteren “CSI-FiSh: Efficient Isogeny based Signatures through Class Group Computations”. This paper introduces an isogeny signature scheme of the type originally proposed by Stolbunov (also discussed by De Feo and me in the appendix of our “SeaSign” paper). The key step is a large class group computation for one specific parameter (i.e., one prime).

• Luca De Feo, Simon Masson, Christophe Petit and Antonio Sanso “Verifiable Delay Functions from Supersingular Isogenies and Pairings”. Computing a sequence of isogenies is a natural “delay function” (in the sense that the computation cannot be sped up using parallel computing). The paper shows how to get a delay function whose result can be checked efficiently, by using pairings. Carl Bootland has blogged about this talk.
• Xiu Xu, Haiyang Xue, Kunpeng Wang, Man Ho Au and Song Tian “Strongly Secure Authenticated Key Exchange from Supersingular Isogenies”. The paper gives an authenticated key exchange scheme based on isogenies, with very strong security properties in the CK+ model (and the random oracle model).
• Michael Naehrig and Joost Renes “Dual Isogenies and Their Application to Public-key Compression for Isogeny-based Cryptography”. The paper has some nice ideas about compression.
• Suhri Kim, Kisoon Yoon, Young-Ho Park and Seokhie Hong “Optimized Method for Computing Odd-Degree Isogenies on Edwards Curves”. The paper introduces some nice isogeny formulas, that are similar to formulas already known for Montgomery curves.
• Salim Ali Altuğ and Yilei Chen “Hard Isogeny Problems over RSA Moduli and Groups with Infeasible Inversion”. The paper has fascinating ideas about constructing something like a group with infeasible inversion.

The invited talks were both about blockchain, so I don’t mention them here.

You can read about several other papers on the COSIC blog. Recordings were made of the talks, and will eventually go on the IACR YouTube channel.

The rump session, hosted by Mehdi Tibouchi, featured a Samurai warrior to make sure speakers kept to time.

3. Recall the Multiparty Non-Interactive Key Exchange From Isogenies on Elliptic Curves (mentioned in this blog post). It relied on an invariant of products of elliptic curves. Recently Eric Rains, Karl Rubin, Travis Scholl, Shahed Sharif and Alice Silverberg have posted on arXiv the paper “Algebraic maps constant on isomorphism classes of unpolarized abelian varieties are constant”, which gives additional evidence that a useful invariant doesn’t exist.

4. Advance notice for ECC 2020 in Taiwan! You will find information here.

— Steven Galbraith

## Isogeny crypto

A long time ago, when pairing-based cryptography was new, cryptographers who did not fully understand the mathematics of pairings would sometimes make mistakes. They would assume that everything that can be done with discrete logarithms could also be done with pairings. Unfortunately, this would sometimes result in protocols that were insecure, or else un-implementable.

Indeed, such cases apparently still happen.

This situation is natural whenever a crypto tool that is technically subtle (and crypto tools always have technical subtleties) moves from “niche” into the mainstream. However it can result in incorrect schemes being published, for example because there are not enough experts to review all the papers.

Back in 2006, in response to those issues in pairing-based crypto, Kenny Paterson, Nigel Smart and I wrote the paper Pairings for Cryptographers. The abstract read:

Many research papers in pairing based cryptography treat pairings as a “black box”. These papers build cryptographic schemes making use of various properties of pairings. If this approach is taken, then it is easy for authors to make invalid assumptions concerning the properties of pairings. The cryptographic schemes developed may not be realizable in practice, or may not be as efficient as the authors assume. The aim of this paper is to outline, in as simple a fashion as possible, the basic choices that are available when using pairings in cryptography. For each choice, the main properties and efficiency issues are summarized. The paper is intended to be of use to non-specialists who are interested in using pairings to design cryptographic schemes.

This abstract exhibits the particular style of understated writing that is cultivated by British people. What we really meant was: Please read this and stop screwing up.

Rolling forward 15 years, isogeny-based cryptography is another area with many technical subtleties, but is moving into the mainstream of cryptography. Once again, not everything that can be done with discrete logarithms can necessarily be done with isogenies. It is therefore not surprising to find papers that have issues with their security.

It is probably time for an Isogenies for Cryptographers paper, but I don’t have time to write it. Instead, in this blog post I will mention several recent examples of incorrect papers. My hope is that these examples are instructional and will help prevent future mistakes. My intention is not to bring shame upon the authors.

• In 2014, D. Jao and V. Soukharev proposed an isogeny-based undeniable signature scheme. The security analysis of their scheme required the introduction of some new computational problems about isogenies. Recently, S.-P. Merz, R. Minko and C. Petit (Another look at some isogeny hardness assumptions) have broken those computational assumptions and formulated attacks on the scheme.

In this case, there is no reason for the original authors to be embarrassed. There has been considerable progress in isogeny crypto in the last 5 years, and it is natural that new cryptanalytic tools would become available that could break earlier schemes.

• Several papers, including this one, have argued that a certain decisional assumption related to the SIDH isogeny cryptosystem should be hard.

Without going into all the details, in SIDH there is a base curve $E$ and four points $P_A, Q_A, P_B, Q_B$ on it. An SIDH instance includes a triple $(E_A, \phi_A(P_B), \phi_A(Q_B))$ where $\phi_A : E \to E_A$ is an isogeny of degree $2^n$. One of the basic computational problems is to compute $\phi_A$ when given this information.

The decisional assumption is to distinguish a valid triple $(E_A, \phi_A(P_B), \phi_A(Q_B))$ from another triple $(E', P', Q')$ where $E'$ is a supersingular curve, and $P', Q'$ are points satisfying various conditions.

At Provsec 2019, S. Terada and K. Yoneyama (“Password-based Authenticated Key Exchange from Standard Isogeny Assumptions”) proposed a password-based authenticated key exchange scheme for SIDH. The security against offline dictionary attacks was based on the hardness of a decision problem, but it was not the above decision problem. Instead, the security of the scheme under such an offline dictionary attack relies on the difficulty of distinguishing the triple $(E', P', Q')$ from a uniformly random binary string of the same length. This problem is not hard at all, since a valid triple satisfies many properties (e.g., $E'$ is a supersingular elliptic curve and $P', Q' \in E'$) that a uniformly chosen binary string would almost never satisfy. Hence the scheme in the paper is not secure against offline dictionary attacks.
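A sketch of why such triples are easy to tell apart from random strings: a candidate must at least decode to a curve with two points on it, and random bits almost never do. The following toy check works over a small prime field of my choosing (real SIDH uses supersingular curves over $\mathbb{F}_{p^2}$, and there are further checks, e.g. supersingularity and point orders):

```python
# Toy membership test: does P = (x, y) lie on y^2 = x^3 + A*x + B over F_p?
# A triple decoded from uniformly random bits fails such checks with
# overwhelming probability (a random pair (x, y) lies on a given curve
# with probability roughly 1/p).

p = 431  # a small prime of SIDH shape: 431 = 2^4 * 3^3 - 1

def on_curve(A, B, P, p):
    """Check the short Weierstrass equation for P = (x, y) over F_p."""
    x, y = P
    return (y * y - (x * x * x + A * x + B)) % p == 0

A, B = 1, 0
assert on_curve(A, B, (0, 0), p)       # (0, 0) lies on y^2 = x^3 + x
assert not on_curve(A, B, (1, 1), p)   # a typical "random" pair does not
```

So a distinguisher simply runs these structural checks on the candidate triple, which is why indistinguishability from a uniform binary string cannot hold for this encoding.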

It is actually a really interesting open question to fix this, related to compression of SIDH protocol messages. If one could compress SIDH protocol messages down to the minimum number of bits, then one might actually be able to argue that the protocol message is indistinguishable from a uniform binary string. I don’t know any way to solve this problem and I think it is probably impossible. For the state-of-the-art in compression of SIDH messages see G. H. M. Zanon, M. A. Simplicio Jr, G. C. C. F. Pereira, J. Doliskani and P. S. L. M. Barreto, “Faster key compression for isogeny-based cryptosystems”.

• A very natural and desirable feature is to be able to hash to an SIDH or CSIDH public key. Unfortunately this is hard. Really hard.

D. Boneh and J. Love (Supersingular Curves With Small Non-integer Endomorphisms) show, among other things, that it is hard to hash to SIDH public keys. W. Castryck, L. Panny and F. Vercauteren (Rational isogenies from irrational endomorphisms) show that it is hard to hash to CSIDH public keys.

It would be great if someone can solve one of these problems, but I think they are both hard. In the meantime, cryptographers should not assume that it is possible to hash to public keys/protocol messages. This also limits the possibility to transport some protocols from the discrete-log world into the isogeny world.

• Due to the adaptive attacks on SIDH, one cannot get CCA1 or CCA2 secure encryption from SIDH without applying the Fujisaki-Okamoto transform (or something similar). Similarly, one cannot get non-interactive key exchange from SIDH. It is natural to try to get around this by some tweak to SIDH. R. Azarderakhsh, D. Jao and C. Leonardi gave a solution to this problem by running $k$ instances in parallel (e.g. for $k = 60$). S. Kayacan suggested two schemes that were hoped to be secure. However, adaptive attacks on both schemes have been shown by my students and collaborators.
• A. Fujioka, K. Takashima, S. Terada and K. Yoneyama proposed an authenticated key exchange scheme similar to some previous discrete-log-based schemes that required gap assumptions in the security proof. Gap assumptions are of the form: Problem X is hard, even when given an oracle to solve the decisional variant of problem X.

For the isogeny context it is dangerous to use a gap assumption, as there are known arguments that reduce the computational isogeny problem to a decisional isogeny problem in certain cases. I already warned about this in the key exchange setting in this note. The solution of Fujioka et al. was to introduce a “degree-insensitive” version of the problem, which is essentially to extend the protocol to $\ell$-isogeny chains of any length (rather than fixed length). It is an interesting idea.

However, my student S. Dobson and I have given evidence (see On the Degree-Insensitive SI-GDH problem and assumption) that the distribution of public keys in the degree insensitive case is close to uniform, and so it no longer makes sense to consider a gap problem. We do not have an attack on this protocol, but we conclude that the security proof is not correct. This shows again that one must be very careful to adapt ideas from discrete-log-based protocols into the isogeny setting.

• S. Furukawa, N. Kunihiro and K. Takashima (“Multi-party key exchange protocols from supersingular isogenies”) proposed an isogeny variant of the Burmester-Desmedt protocol for $n$-party key exchange in two rounds for any $n$. It is a nice paper, but Takashima (“Post-Quantum Constant-Round Group Key Exchange from Static Assumptions”) comments that:

Furukawa et al. [14] proposed an isogeny-based BD-type GKE protocol called SIBD. However, the security proof of SIBD (Theorem 4 in [14]) is imperfect, and several points remain unclear, for example, on how to simulate some public variables.

Once again, the scheme is not broken (as far as I know), but the security argument is not correct. Takashima gives a new security analysis in his paper (but I have not had time to check it).

What can authors do to avoid the dangers of isogeny crypto? There are some very good surveys of the basic ideas behind isogenies (for example see Mathematics of Isogeny Based Cryptography by Luca De Feo), but there is no good resource for cryptographers who want to use isogenies as a “black box”, and just want to know what is possible and what is not possible for building protocols. My best attempt so far is this note. In any case, I hope the present blog post can act as a cautionary tale: treating isogenies as a black box is risky!

— Steven Galbraith


## An attendee’s report: crypto means Crypto (2019)

Steven Galbraith, who maintains this blog, has been inviting me to write a blog post on several conferences for quite some time, and I’ve consistently postponed accepting the invitations for, well, too long, so here you go. Yet, the reader is kindly asked not to expect a masterpiece of literature in this very first attempt of mine at blogging (in other words: read on at your own peril; you won’t be able to unread it later).

Continuing the unavoidable trend for large conferences, Crypto 2019 offered two parallel tracks, and understandably I’ll report on but a few presentations of the one specific track I happened to choose at each segment of the program (I tried to vary my choice of track for every session block, though).

And yet, the dichotomy of parallel sessions got me into existential anguish (of sorts) right from the start for being unable to attend both. The very first parallel pair was on lattice-based ZK proofs on the one hand, and on certain symmetric constructions on the other. I chose symmetric constructions.
I found the notion of secure PRNGs that lack a random seed, introduced by S. Coretti et al. (“Seedless Fruit is the Sweetest: Random Number Generation, Revisited”), particularly intriguing (to say the least). The authors bypass the impossibility of attaining this by compromising: yes, the entropy source is still implicitly there despite the name, but instead of modeling the extraction procedure by feeding the PRNG a randomness seed, it assumes the underlying random oracle itself (called the “monolithic extractor”) is picked uniformly at random all at once. Building on this idea, the authors offer provably secure constructions and show how some existing ones are insecure. Unfortunately, delays between clicks and slide changes, coupled with a few other issues (including, I should say, a somewhat inordinate number of jokes), made it impossible to cover the extensive slide set in the allotted time… and to check if I got the ideas right.
My session choice meant I couldn’t attend the simultaneous presentation of the equally intriguing solution to the problem of constructing a non-interactive zero-knowledge (NIZK) proof system from the LWE assumption for any NP language, discovered by C. Peikert and S. Shiehian and described in their paper “Noninteractive Zero Knowledge for NP from (Plain) Learning with Errors”. That was a pity, but it was somewhat compensated by the work “Nonces Are Noticed: AEAD Revisited” by M. Bellare, Ruth Ng, and B. Tackmann. This work reveals an enormous gap between the usual theory of nonce-based schemes and the actual (sometimes even standardized) usage of those schemes in practice: nonces become a kind of metadata that can reveal a surprising amount of information about the users or devices originating them. Quite creepy, but the authors address it by providing new notions and constructions that hide the nonce as well, and also resist nonce misuse.

As usual, there was a session on FHE. In the work “On the Plausibility of Fully Homomorphic Encryption for RAMs” by A. Hamlin et al., the authors tackle the problem of hiding the sequence of memory addresses that are accessed when doing some processing on a large database. Using their notion of rewindable oblivious RAM, they obtain a preliminary single-hop scheme whose multiplicative running-time overhead is $O(\mathrm{polylog}\, N)$, where $N$ is the database size.
In the same session, Sri A. K. Thyagarajan talked about his joint work with G. Malavolta on “Homomorphic Time-Lock Puzzles and Applications”, whereby one can evaluate functions over puzzles without solving them. This amusing notion has nice applications like e-voting: in a simple setting, the voters create one encryption of 1 for the candidate they are voting for and distinct encryptions of 0 for all the others, so that summing up those sets over all voters yields the encrypted voting tally for all candidates (without revealing who voted for them), while adding all the encryptions, and independently the squares of all the encryptions, for each individual voter yields a proof that they voted for exactly one candidate. Transforming the encryptions into time-lock puzzles makes the decryptions publicly computable (after the time delay), and does away with the need for a trusted third party. Other applications were suggested, like sealed e-auction bidding, multiparty coin flipping, or multiparty contract signing.
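The plaintext arithmetic underlying the e-voting application can be sketched as follows. This is my own toy framing which ignores the encryption/puzzle layer entirely; the point is only the sum and sum-of-squares check on ballots:

```python
# Each voter submits a vector with a single 1 at their chosen candidate
# and 0 elsewhere. Over the integers, sum(v) == 1 and sum(v_i^2) == 1
# together force exactly one entry equal to 1 and the rest 0.

def valid_ballot(v):
    """Sum of entries and sum of squares both equal 1 <=> one honest vote."""
    return sum(v) == 1 and sum(x * x for x in v) == 1

ballots = [
    [1, 0, 0],  # vote for candidate 0
    [0, 0, 1],  # vote for candidate 2
    [0, 0, 1],
]
assert all(valid_ballot(v) for v in ballots)

# Componentwise sum gives the tally without revealing individual votes.
tally = [sum(col) for col in zip(*ballots)]
assert tally == [1, 0, 2]

assert not valid_ballot([1, 1, 0])   # double vote: sum of squares is 2
assert not valid_ballot([2, -1, 0])  # sum is 1, but the squares give it away
```

In the actual scheme these checks are carried out homomorphically on the time-lock puzzles, so the tally and the validity proofs become public only after the delay.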

The session on the communication complexity of multiparty computation (MPC), which I chose over malleable codes, was no less striking, in particular the presentation by Mark Simkin and the one by Abhi Shelat.
Mark, who presented his work with S. Ghosh (“The Communication Complexity of Threshold Private Set Intersection”), started with applications of private set intersection (like the intersection of fingerprints) where one only cares about large intersections. In that case, it pays to set up the protocol so that one actually learns the complement of the intersection instead. One can see this as MPC of the ratio between characteristic polynomials, so that common factors (that is, those corresponding to the intersection) cancel. I didn’t quite gather whether a trusted third party is essential or just a secondary concern for the proposed protocol, though.
Abhi delighted the audience with a long, slow-motion clip of radical acrobatic skiing and the associated adrenaline rush. This blogger is not really sure the subject of MPC communication complexity causes a similar physiological effect, although the presenter claimed it does. After a recapitulation of the milestones of the subject, the audience was finally rewarded with a quite detailed mathematical treatment of the contribution, though this time at a very, very fast pace. Perhaps the subject does cause an adrenaline rush after all. Anyway, the work covered adaptively secure MPC with sublinear communication cost, in a scenario where the adversary can corrupt parties at any time, even after the end of the protocol, at which time the adversary can potentially corrupt all parties.

The session on post-quantum security focused on the quantum random oracle model (QROM). Both papers in the first part of that session, “How to Record Quantum Queries, and Applications to Quantum Indifferentiability” by M. Zhandry, and “Quantum Security Proofs Using Semi-classical Oracles” by A. Ambainis, M. Hamburg and D. Unruh, were thickly theoretical. The talk on “Quantum Indistinguishability of Random Sponges” by J. Czajkowski, A. Hülsing, and C. Schaffner was more approachable in my opinion (TL;DR: the sponge construction can be used to build quantum-secure pseudorandom functions when the adversary has superposition access to the input-output behavior of the sponge but not to the sponge’s internal function or permutation function itself, assumed to be random in their model). Sure enough, the more theoretically-oriented results have a clear and welcome niche even here, since these results build upon Zhandry’s prior switching lemma for pseudo-random functions or permutations from 2015. Zhandry is also a co-author of another paper from that session, “Revisiting Post-Quantum Fiat-Shamir” (joint work with Q. Liu), which was presented together with the last one, “Security of the Fiat-Shamir Transformation in the Quantum Random-Oracle Model” by J. Don et al.

Several other works are worth mentioning; I’ll mention a few more, but alas, not a full list: hanc blogis exiguitas non caperet. I found the paper “Unifying Leakage Models on a Rényi Day” by T. Prest, D. Goudarzi, A. Martinelli, and A. Passelègue, whose presentation I could not attend for not being proficient at ubiquity, quite entertaining (I assure the reader that this has nothing to do with my living in the often rainy Seattle area). The paper “It Wasn’t Me! Repudiability and Claimability of Ring Signatures” by S. Park and A. Sealfon deals with the question of enabling repudiation for ring signature non-signers, and claimability for actual signers of ring signatures. The importance of the former is to deflect undue responsibility for ring signatures produced by another ring member, and the importance of the latter lies in taking due credit for signing when that turns out to be, or becomes, desirable; prior notions of security for ring signatures were ambivalent at best on such issues. Besides updated notions, the authors offer a repudiable scheme based on a variety of assumptions (for instance, bilinear maps), an unclaimable scheme based on the SIS assumption, and constructions for claimable or unrepudiable schemes that can be obtained from certain existing ring signatures.

Last but obviously not least, three papers got awards:

1. “Cryptanalysis of OCB2: Attacks on Authenticity and Confidentiality,” by A. Inoue, T. Iwata, K. Minematsu, and B. Poettering, got the Best Paper award;
2. “Quantum Cryptanalysis in the RAM Model: Claw-Finding Attacks on SIKE,” by S. Jaques and J. M. Schanck, got the Best Young Researcher Paper award;
3. “Fully Secure Attribute-Based Encryption for $t$-CNF from LWE,” by R. Tsabary, got the Best Young Researcher Paper award.

The papers are quite well written. The interested readers are encouraged to avail themselves of them for all of the fascinating details of these works. I was personally interested in the second of them and, to a smaller degree, the first, so I’ll try and briefly summarize those (I’m afraid the third falls somewhat outside my areas of expertise so I refer the interested reader to the corresponding paper).

Kazuhiko Minematsu began describing their work on OCB2 by showing how easy it is to obtain a minimal forgery with one single encryption query. The general attack follows the model previously applied against the EAX Prime mode of operation, which lacked a formal security analysis (so it was not really a big surprise that the scheme turned out to succumb to attacks). However, OCB2 was supported by a full-fledged security proof and remained unscathed for fifteen years. The attack described in the paper stems from a gap in that security proof which turned out to be a severe flaw. On the bright side, the attack extends to neither OCB1 nor OCB3, nor to certain suggested tweaks to OCB2. This shows that the overall structure of OCB is sound, but it also shows the necessity of active verification of proofs.

Sam Jaques explained that their claw-finding paper set forth three goals. The first goal was to fairly compare attacks using classical and quantum resources. The second goal was to view gates as processes (which is indeed the view suggested by current quantum technology). The third goal was to include error correction as part of the cost and effort of the attack, since it is essential to overcome the exquisite fragility (in the sense of susceptibility to decoherence) of quantum computations. Their main idea was thus to view quantum memory as a physical system acted upon by a memory controller. As such, it undergoes two kinds of time evolution: free (caused by noise) and costly (caused by the controller). The computation cost becomes the number of iterations (ignoring construction costs and focusing on the controller cost instead). Two cost models are covered: the so-called G (gate) cost, which assumes passive error correction and counts 1 RAM operation per gate, and the DW (depth-width) cost, which counts 1 RAM operation per qubit per time unit. This sets the framework for their analysis of the claw-finding algorithm, a meet-in-the-middle attack that recovers the path spelled out by the private key in the isogeny graph, between the initial curve and the final one (which is part of the public key). It can be realized by Tani’s collision-finding algorithm, by following random walks on two Johnson graphs, looking for a collision, and doing all computations in a quantum setting. The complexity is $\tilde{O}(p^{1/6})$. Despite the paper title, a quite surprising conclusion of their analysis is that SIDH and SIKE are actually harder to break than initially thought. In particular, it appears that the minimum SIKE parameter set (namely, SIKE434) cannot be broken by any known attack in less than the cost and effort needed to break AES128, specifically $2^{143}$. This scales to other parameter sets, to the effect that the revised SIKE parameters for the 2nd round of the NIST PQC process are smaller than their 1st round counterparts.
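The claw-finding task at the heart of this attack is easy to state: given two functions $f$ and $g$ with disjoint domains, find a pair $(x, y)$ with $f(x) = g(y)$. As a purely classical illustration of the meet-in-the-middle idea (with toy placeholder functions standing in for the two half-length isogeny walks, and nothing quantum about it), one can tabulate one side and scan the other:

```python
def find_claw(f, g, domain_f, domain_g):
    """Return (x, y) with f(x) == g(y), or None if no claw exists."""
    # Tabulate all values of f, then look each value of g up in the table.
    table = {}
    for x in domain_f:
        table[f(x)] = x
    for y in domain_g:
        x = table.get(g(y))
        if x is not None:
            return x, y
    return None

# Toy example: f(x) = x^2 mod 97 and g(y) = y^2 + 1 mod 97.
claw = find_claw(lambda x: x * x % 97,
                 lambda y: (y * y + 1) % 97,
                 range(97), range(97))
```

Tani’s quantum algorithm performs this search via quantum walks on Johnson graphs instead of explicit tabulation; the point of the paper is that, once memory and error-correction costs are accounted for honestly, the advantage over such classical (or parallel classical) approaches shrinks considerably.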

So, there you have it, a brief (and necessarily incomplete, but hopefully helpful) appraisal of Crypto 2019. Scripsi. Vale. (I have written; farewell.)

## SIAM Conference on Applied Algebraic Geometry 2019

So here we are in the nice city of Bern, in the German-speaking part of Switzerland, for the SIAM Conference on Applied Algebraic Geometry 2019, which this year counts more than 750 attendees. The weather is warm enough, but the isogenies topic has never been so hot! For this edition of the conference, Tanja Lange, Chloe Martindale and Lorenz Panny managed to organise a really great isogenies mini-symposium spread over 4 days.

Day #1

Day #1 started strong. After a quick overview of isogenies by Chloe Martindale and Lorenz Panny, including an introduction to SIDH and CSIDH, the invited speakers took the stage.

This concluded Day #1.

Day #2

• David Jao discussed recent progress in implementing isogeny-based cryptosystems in constant time to resist side-channel attacks. In particular, he presented results from his recent paper (joint work with Jalali, Azarderakhsh and Kermani). One of the interesting observations made was that isogeny computation over Edwards curves is relatively simple to implement in constant time (as expected), but is faster only for isogenies of degree 5 or more. He concluded his talk with some really great demos (as also reported by Thomas Decru in a second blog post).
• Christophe Petit surveyed known results on the security of isogeny-based protocols, including the celebrated active attack on SIDH.
• Frederik Vercauteren gave the first of two sessions dedicated to CSI-FiSh (joint work with Beullens and Kleinjung). This part focused on the new record class group computation they achieved while computing the class group structure of CSIDH. It seems they reused some of the code previously written by Kleinjung, and Léo Ducas lent a hand with the final computation of the closest vector in the lattice. While the technique used for the computation was standard, it was still a remarkably big task involving several core-years. He concluded the talk with a nice list of open problems.
• David Kohel presented joint work with his student Leonardo Colò that was recently published at NutMiC 2019. This construction, called OSIDH (which stands for oriented supersingular isogeny Diffie-Hellman), is built on top of O-oriented supersingular elliptic curves (as defined in the paper).
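To give a flavour of the closest-vector step mentioned in the CSI-FiSh talk: once the relation lattice of the class group is known, a long exponent vector can be replaced by a short equivalent one (representing the same ideal class) by subtracting a nearby lattice point. Below is a toy two-dimensional sketch using Babai rounding; the basis and vectors are made up for illustration, and the real computation works in much higher dimension with far more sophisticated CVP machinery:

```python
from fractions import Fraction

def babai_shorten(B, v):
    """Shorten exponent vector v by subtracting the Babai-rounded lattice point.

    B is a 2x2 relation-lattice basis given as rows; since lattice vectors
    correspond to trivial ideal classes, v and the result represent the same
    class, but the result has much smaller exponents.
    """
    (a, b), (c, d) = B
    det = a * d - b * c
    # Exact coordinates of v in the basis B, then round to the nearest integers.
    x = round(Fraction(v[0] * d - v[1] * c, det))
    y = round(Fraction(a * v[1] - b * v[0], det))
    closest = (x * a + y * c, x * b + y * d)
    return (v[0] - closest[0], v[1] - closest[1])

# Toy relation lattice and a large exponent vector:
print(babai_shorten([(7, 1), (1, 8)], (23, -17)))  # → (-2, 3)
```

Babai rounding is only an approximation to the closest vector, but for well-reduced bases it already brings the exponents down enough to make evaluating the group action practical.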

Day #3

Day #3 of isogenies opened with the plenary session delivered by Kristin Lauter. Her talk, as usual, was really inspiring and was about the history of supersingular isogeny graphs in cryptography. She basically covered the Charles-Goren-Lauter (CGL) hashing construction and the panorama of post-quantum cryptography. After a quick break and a commute to the other building, we were back at the isogenies mini-symposium:

• Thomas Decru presented a new CGL-type genus-two hash function (joint work with Wouter Castryck and Benjamin Smith). They reformulated a previous construction by Takashima (broken by Yan Bo Ti and Victor Flynn) by using genus-two superspecial subgraphs.
• Jean-François Biasse’s talk was about algorithms for finding isogenies between supersingular elliptic curves. He showed that under some circumstances the generic Grover algorithm might beat the more isogeny-specific Tani algorithm. This talk was also covered in a blog post by Thomas Decru.
• Benjamin Wesolowski talked about his systematic approach to analysing horizontal isogeny graphs for abelian varieties. He covered some neat theorems he proved (in a joint paper with Brooks and Jetchev) and concluded by saying that his results would not be enough to say anything about the CSIDH case but, as we will see in the next talk, they are extremely useful in the higher-genus cases.
• Dimitar Jetchev’s talk was a natural follow-up to the previous one. He focused instead on vertical isogenies and announced a possible solution to the DLP on genus-3 hyperelliptic curves.

Day #4

And here we arrived already to the last day:

• Ward Beullens delivered the second part of the CSI-FiSh paper (there is also a blog post about it). In this part he focused on the identification scheme and signature, including the zero-knowledge and optimization aspects.
• Florian Hess tried to give an answer to an open problem posed in a recent paper about multiparty non-interactive key exchange. Namely, his talk was about the possibility of building invariant maps from isogenies. His conclusions were not really positive, at least so far.
• Luca De Feo brought a new topic to the isogeny world: #blockchain! He presented a new verifiable delay function (VDF) construction based on supersingular isogenies and pairings (joint work with Simon Masson, Christophe Petit and Antonio Sanso). Despite using isogenies, the construction is not quantum resistant due to the usage of pairings. A blog post about this construction can be found here.
• Jeff Burdges talked about some real-world applications of isogenies, including a hybrid scheme that might be used in mix networks, consensus algorithms in blockchains, and encrypt-to-the-future schemes to be employed in auctions.
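For readers unfamiliar with verifiable delay functions: the core idea is a computation that takes a prescribed amount of inherently sequential time, yet whose result can be checked quickly. The snippet below sketches the classic iterated-squaring delay in an RSA-style group, with toy parameters. This is the time-lock-puzzle flavour of the idea, not the isogeny-and-pairing construction from the talk (which gets its fast public verification from pairings); here we only check correctness via the algebraic shortcut available to someone who knows the group order:

```python
def vdf_eval(x, T, N):
    """Sequential part: T repeated squarings, believed hard to parallelise."""
    y = x % N
    for _ in range(T):
        y = y * y % N
    return y

def vdf_shortcut(x, T, N, phi):
    """Fast recomputation of x^(2^T) mod N, possible only when the group
    order phi is known (a real VDF instead ships a succinct proof)."""
    return pow(x, pow(2, T, phi), N)

N, phi = 101 * 103, 100 * 102   # toy modulus with known factorisation
assert vdf_eval(5, 1000, N) == vdf_shortcut(5, 1000, N, phi)
```

The isogeny-based VDF replaces the squaring chain with a long walk in an isogeny graph and uses a pairing equation for verification, which is exactly why it is not post-quantum despite the isogenies.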

That’s all from SIAM AG. See you in 2 years.

— Antonio Sanso