ECC 2015, Bordeaux, France, September 28-30, 2015.

The conference took place at the University of Bordeaux, preceded by a well-attended summer school.

Slides of talks are available on the conference website.

The main conference began on Monday 28th. The first three talks were notable for being about finite field discrete logs (rather than elliptic curves).

Nadia Heninger (presenting joint work with a zillion authors) discussed the use of finite fields for Diffie-Hellman key exchange in internet protocols such as TLS and IKE. By scanning the internet, the researchers found a large number of servers that support 512-bit “export level” keys. They also identified primes that are shared by a large number of servers, which makes it worthwhile to attack those primes using the Number Field Sieve DLP algorithm. Their experiments using CADO-NFS showed that, once the precomputation for a given prime is completed, individual dlogs modulo that prime can be solved in 70 seconds. This allows practical attacks on supposedly secure internet sessions. The talk discussed “downgrade” attacks (forcing servers to use smaller keys which can then be broken). Blame was laid upon US government export controls from the 1990s and the lack of knowledge among security practitioners about DLP algorithms. For more information see
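
The economics of the attack are worth spelling out: the expensive sieving happens once per prime, after which each individual log is cheap, so a handful of widely shared primes expose a huge number of connections. A toy illustration of this precompute-once, solve-many structure, with baby-step-giant-step standing in for NFS and a small prime standing in for a shared 512-bit one:

```python
from math import isqrt

# Toy parameters: a tiny prime standing in for a widely shared 512-bit prime.
p, g = 1000003, 2
n = p - 1
m = isqrt(n) + 1

# One-time precomputation for this group (the role played by the expensive
# NFS sieving in the real attack): a table of baby steps g^j.
baby = {pow(g, j, p): j for j in range(m)}
giant = pow(g, -m, p)  # g^(-m) mod p (modular inverse, Python 3.8+)

def dlog(h):
    """Cheap per-target step: solve g^x = h (mod p), reusing the baby table."""
    for i in range(m):
        if h in baby:
            return i * m + baby[h]
        h = h * giant % p

# Once the table exists, each individual log costs at most m group operations.
for secret in (12345, 424242, 999999):
    h = pow(g, secret, p)
    assert pow(g, dlog(h), p) == h
```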

Aurore Guillevic talked about solving the DLP in finite fields F_{p^n} coming from pairing applications, in particular fields coming from polynomial families such as MNT and BN curves. The focus is on the “individual discrete logarithm” step. This work will appear at Asiacrypt 2015.

Cecile Pierrot presented joint work with Antoine Joux about the DLP in medium characteristic finite fields. The talk covered two techniques: (1) combining the Multiple Number Field Sieve and the Conjugation approach; (2) a way to accommodate some non-sparse columns in the (block) Wiedemann method.

I gave a survey of ECDLP algorithms, focussing on open problems in the baby-step-giant-step algorithm and summation polynomials. In case there is any confusion: elliptic curves in characteristic 2 are not broken; Pollard rho is still the fastest algorithm for elliptic curves over F_{2^n} where n is prime.

Michiel Kosters discussed summation polynomial attacks on the ECDLP. He explained the work of his three papers on the arXiv.
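
For readers meeting summation polynomials for the first time: Semaev’s n-th summation polynomial vanishes precisely on the x-coordinates of n points of E summing to zero, which is what turns point decomposition into polynomial system solving. A quick sanity check of the standard third summation polynomial for y^2 = x^3 + Ax + B, on a toy curve chosen purely for illustration:

```python
p, A, B = 10007, 2, 3  # toy curve y^2 = x^3 + 2x + 3 over GF(10007)

def lift_x(x):
    """Return a point (x, y) on the curve, or None if no such point exists."""
    t = (x**3 + A*x + B) % p
    if pow(t, (p - 1) // 2, p) != 1:
        return None
    return (x, pow(t, (p + 1) // 4, p))  # square root; valid since p % 4 == 3

def add(P, Q):
    """Affine addition of points with distinct x-coordinates."""
    (x1, y1), (x2, y2) = P, Q
    lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def S3(x1, x2, x3):
    """Semaev's third summation polynomial for y^2 = x^3 + A*x + B."""
    return ((x1 - x2)**2 * x3**2
            - 2 * ((x1 + x2) * (x1 * x2 + A) + 2 * B) * x3
            + (x1 * x2 - A)**2 - 4 * B * (x1 + x2)) % p

P, Q = [pt for pt in map(lift_x, range(1, 100)) if pt][:2]
R = add(P, Q)  # x(P + Q) = x(-(P + Q)), and P + Q + (-(P + Q)) = 0
assert S3(P[0], Q[0], R[0]) == 0
```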

Kim Laine talked about Diem’s algorithm for the DLP on plane quartic curves, with particular focus on how to implement the two-large-prime variant using a graph. He presented a way to build up the factor base together with relations of the desired type, and discussed the massive storage requirements.
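
A hypothetical miniature of that graph bookkeeping, with invented relation ids and “large primes” (a real computation has millions of vertices and edges, hence the storage problem; networkx is assumed merely for convenience): vertices are large primes, edges are partial relations, and every cycle combines into a relation involving only factor-base elements, since each large prime around a cycle occurs twice and cancels.

```python
import networkx as nx

# Each partial relation involves at most two large primes; vertex 1 is a
# placeholder meaning "no large prime on this side of the relation".
partials = [
    (1, 101, "r1"),
    (101, 103, "r2"),
    (103, 107, "r3"),
    (107, 1, "r4"),
    (1, 109, "r5"),  # dangling: 109 never reappears, so r5 is unusable
]

G = nx.Graph()
for u, v, rid in partials:
    G.add_edge(u, v, rel=rid)

# Each independent cycle yields one full relation.
for cycle in nx.cycle_basis(G):
    edges = zip(cycle, cycle[1:] + cycle[:1])
    print("combine:", sorted(G[u][v]["rel"] for u, v in edges))
# prints: combine: ['r1', 'r2', 'r3', 'r4']
```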

There was a reception followed by a rump session.

Tuesday September 29:

Enea Milio talked about “Computation of modular polynomials in dimension 2”. Modular polynomials are a fundamental tool for elliptic curves, with many computational applications. A long-standing problem is to get similar techniques for abelian varieties of dimension 2 (Jacobians of genus-2 hyperelliptic curves). The rather technical talk reported progress on this problem. Most interesting was the introduction of modular polynomials for cyclic subgroups of prime order p, compared with the usual situation of subgroups of order p^2. But these are only for some very special curves with real multiplication.

There were then two talks about computational problems in lattice cryptography. Leo Ducas presented his paper with Cramer, Peikert and Regev about the ideas of Bernstein and Campbell-Groves-Shepherd for computing short generators of principal ideals by decoding in the log-unit lattice. The talk was illustrated with some beautiful images. Then Katherine Stange talked about work of Eisentrager-Hallgren-Lauter, Elias-Lauter-Ozman-Stange and Chen-Lauter about an attack on a variant of Ring-LWE called Poly-LWE (meaning the errors are chosen using the polynomial basis rather than the canonical embedding representation).

Peter Schwabe gave a stimulating talk about the problem of using automated tools to prove the correctness and security of crypto software. He demonstrated how the valgrind profiling tool can be used on real crypto code, but emphasised that such tools create a massive overhead for software developers.

Michael Hamburg discussed some simple implementation tricks for ECC that are useful to obtain efficient and secure systems.

The last event of the day was a panel discussion about standardisation. The panel was chaired by Ben Smith. The panellists were: Daniel Bernstein, Joppe Bos, Jean-Pierre Flori, Michael Hamburg, Manfred Lochter, Dustin Moody.


The first question was about people’s experience with the recent IETF process and the ongoing NIST process.

DB: IETF looks at what is being done and tries to standardise and improve. NIST doesn’t necessarily start from current systems, and so it is less clear where it will end up.

JB: Things could have been better with IETF. It would have been better if more academics were involved. It is an important decision, so we want more involvement from industry and academia. It would be good to have backward compatibility (with earlier standards) when the NIST standard is introduced.

JPF: The ECC standard process was quite different to the competition process used to choose AES and the SHA3 hash function.

MH: IETF was an exercise in exhaustion.

ML: IETF was very emotional, and scientists were turned away from participating. I believe this is a bad thing. Academia should participate since they are the experts. Germany likes to follow international standards. The SHA3 hash competition was very helpful. Advice to conference attendees: participate and follow the process.

DM: NIST won’t be doing something like AES/SHA3 for the ECC process. Right now NIST is getting lots of feedback. In the next week or two there will be a request for comments from the public. There is likely to be another workshop to discuss requirements and criteria.

Ben then asked about the timeframe and lifetime of standards.

DM: Standards should be reviewed every 5 years or so. Standard curves should last a long time.

ML: NSA has recently announced a new policy for post-quantum crypto. We should not change too much about ECC. Instead, we should move to standardise post-quantum crypto. All these processes are very slow. It takes years for academic results to feed into standards, and then years for standards to feed into software. Maybe the main issue is not standards, but keeping software up to date.

MH: It depends on whether a quantum computer is built. If not, a standard should last 20-30 years.

JB: We shouldn’t be revising standards too often. I’m a big fan of crypto agility. It would be nice to support a family of curves. It would be good to have a common framework where most things stay the same and one can plug in new parameters if they need to change.

DB: If crypto is working then you shouldn’t touch it. If the old standard is working fine then don’t use the new standard. If you want to distract people then the best thing you can do is have a whole bunch of standards. How many crypto standards are there? Nobody can review the security of all of them and support them on many platforms. The best thing is to limit the number of standards. Problem: most things aren’t working and the internet is mostly not secure. We need to do what it takes to get security. We should figure out why the current standards are not doing the job and fix them. If a new standard is not doing the job then fix it.

BS: What tools do we need to check that we are adhering to standards? And what tools do we need to develop the right standards in the first place?

DB: Peter Schwabe talked about this, e.g., formal verification. There are a ton of problems to be solved.

JB: Peter’s talk was interesting and such tools would be very helpful. More traditional testing is still valid. It would be good to have a database of test cases.

JPF: Many people don’t trust NIST curves. How many people verified the curve generation? Open source tools would be nice.

MH: It is very difficult to make a set of test vectors without knowing the special cases in the implementation, so formal verification is more appropriate.

ML: Peter’s talk was great. There are many side-channel attacks other than timing. One solution may be optimal on one platform but not on others, so we need flexibility. It is not only correctness of the implementation that matters, but also physical errors and fault injection. So it is absolutely necessary to test points before and after operations.

DM: I agree with the need for tools. The more eyes looking over a standard the better. Regarding the earlier point: NIST has verified the curve generation process. We believe the NIST curves are secure despite concerns about their provenance. But NIST is open to new curves as well.

Ben asked if there are any theorems that can help.

MH: If you have a set of formulas for some elliptic curve operation, put them into Groebner basis algorithms and prove the formulas correct in an automated fashion. It would be nice if there were an automated way to prove that formulas are complete.

DB: Anyone doing formulas should write Sage scripts and make them public online, so people can run them for themselves. This would minimise the number of typos etc.
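
As a concrete example of the kind of script MH and DB have in mind, here is a sketch using sympy (my own illustration, not a script from the panel). It checks symbolically that the Edwards addition law on x^2 + y^2 = 1 + d x^2 y^2 outputs a point on the curve: after clearing denominators, the claimed identity should lie in the ideal generated by the curve equations of the two input points.

```python
from sympy import symbols, expand, groebner

x1, y1, x2, y2, d = symbols('x1 y1 x2 y2 d')

# Curve equations for the two input points on x^2 + y^2 = 1 + d*x^2*y^2.
E1 = x1**2 + y1**2 - 1 - d*x1**2*y1**2
E2 = x2**2 + y2**2 - 1 - d*x2**2*y2**2

# Edwards addition law with denominators kept aside: x3 = X/Zx, y3 = Y/Zy.
t = d*x1*x2*y1*y2
X, Zx = x1*y2 + y1*x2, 1 + t
Y, Zy = y1*y2 - x1*x2, 1 - t

# Claim: x3^2 + y3^2 - 1 - d*x3^2*y3^2 = 0 on the curve. Clear denominators:
claim = X**2 * Zy**2 + Y**2 * Zx**2 - Zx**2 * Zy**2 - d * X**2 * Y**2

G = groebner([E1, E2], x1, y1, x2, y2, d, order='grevlex')
print(G.contains(expand(claim)))  # True: the sum satisfies the curve equation
```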

JB: When using side-channel countermeasures, such as dedicated formulas and large precomputed tables of points, not many people have looked at the corner cases and proved things about them.

ML: There are dozens or hundreds of implementations of ECC. To avoid patents companies choose their own implementations. When one cryptanalyses them then one runs into all sorts of mathematical problems, e.g., lattice problems or statistics. Need tools to extract secret from the data coming from side-channels etc.

Panellists were asked about the NSA announcement that users should consider moving to post-quantum crypto.

DB: We’ve been saying for more than 10 years that quantum computers are coming and we need to be prepared. It is nice that NSA finally understands that message. But the details are really puzzling. Suite B’s lowest security level was 384-bit ECC, alongside other gigantic security levels. Now it has 3000-bit RSA, which we know is breakable in like 2^{128} operations. I am puzzled by the details. Everyone I have talked to at NSA said “we didn’t see this in advance”.

JB: Puzzling announcement, especially the timing during the NIST ECC standard process. In my opinion it is premature to consider post-quantum systems, as lots more work is needed on post-quantum crypto.

MH: The NSA is concerned with 50-year security. Highest assurance data should consider quantum computing a threat. I also don’t know why 3072-bit RSA; is it for people who were thinking of moving to ECC? You should be implementing elliptic curves. If you are worried about quantum computers in 10 years then perhaps build with ECC plus post-quantum. Signatures are less of a concern. Adding post-quantum key exchange is a good idea.

ML: If NSA says it then we take it seriously. For top secret information, 30 years is the regulation, and it can be extended a further 30 years. Health data and voting information have unlimited lifetime. One case where quantum-secure systems are mandated is satellite communications. We are using Merkle signatures. I don’t yet see mature post-quantum systems.

DM: NSA knows transitions are painful. It is very surprising how much they didn’t say. It was surprising to the NSA guys I know; they also don’t know where it came from. I’m confident NSA still believes in the security of ECC, but NIST is also working on post-quantum crypto. NSA has been less engaged than usual in the curve selection process this time.

Ben then asked whether standards can give a false sense of security.

DB: I prefer a de-centralised approach, where implementors do good work that becomes a de-facto standard. If it isn’t what the devices are using then it is not a useful standard.

JB: We (NXP) have to follow standards. We go to evaluation labs who do a lot of testing and they try to break it in physical ways. I guess a lot of software implementations do not follow standards.

JPF: Standards make it easier for people to avoid mistakes. We need better education of implementors.

MH: Try to make sure the entire crypto in the system is secure. Too often someone has a product with “military grade security AES”, but has not considered buffer overruns and side-channels etc. Often they are using AES in an insecure way. We need to find some way to get better quality.

DM: Better off with standards than without them.

ML: Companies have lifecycle management. But with open source, people come along and add things. Then you start to find bugs. You can’t blame that on standards; blame it on process. I don’t know how to mitigate that. Common Criteria requires evaluation-lab testing.

DB: You CAN blame the standard. It can be too complicated and encourage implementors to have to deal with things that are likely to go wrong. Current NIST ECC standards allow, for example, invalid curve attacks thanks to choices of elliptic curves. It is hard to tell the difference between an implementation that follows the standard and one that doesn’t. This leads to major security problems. We know how to fix this at the standardisation level.
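
A toy sketch of the invalid-curve problem DB is alluding to (parameters invented for illustration): the textbook short-Weierstrass formulas never use the coefficient B, so an implementation that skips point validation silently does its arithmetic on whatever curve the attacker’s point actually lies on; in a real attack that curve is chosen to have smooth order.

```python
p, A = 10007, 2  # intended curve is y^2 = x^3 + A*x + B, but note that the
                 # doubling formula below never looks at B

def double(P):
    x1, y1 = P
    lam = (3*x1*x1 + A) * pow(2*y1, -1, p) % p
    x3 = (lam*lam - 2*x1) % p
    return (x3, (lam*(x1 - x3) - y1) % p)

def curve_B(P):
    """The unique B such that P lies on y^2 = x^3 + A*x + B (mod p)."""
    x, y = P
    return (y*y - x*x*x - A*x) % p

evil = (3, 5)     # attacker-supplied point, never validated against B
D = double(evil)
assert curve_B(D) == curve_B(evil)  # arithmetic stayed on the attacker's curve
```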

DB: It is not enough to ask for more academics to be involved in the standardisation process. An example of a scientist in the standardisation process is Ron Rivest in the 1990s. In 1992 Rivest sent several pages of comments stating “This standard has enough rope for the user to hang himself, which is not something a standard should do”. The expert said the standard was not good enough, but NSA was behind DSA and it got standardised anyway. So it is not clear that expert opinion will help.

JB: Standards can be simplified, and this is a good thing. But don’t want a situation where everyone is expected to write crypto. Crypto should be done by crypto experts. Blame people over-extending their expertise.

Ben asked about the difference between hardware/commercial software/open source.

JB: Implementations on servers have a different security model than smart cards, which leads to different requirements and different decisions about curves. We should try to find a balance. A list of requirements at the start might be a better idea.

JPF: Different needs. So maybe different curves?

ML: If you want to read the BSI position then read our requirements document on the web.

DM: The NIST position is not too many curves. Originally there were 15, and not many are being used. If software/hardware need different curves then we will support that.

We then all crossed the river for a fine conference dinner at Cafe du Port.

Wednesday September 30:

The conference ended with three talks. Christian Grothoff talked about rebuilding the internet on secure foundations. Arnaud Tisserand talked about general-purpose and open-source hardware accelerators for ECC and HECC. Juliane Krämer gave a very clear talk about Fault Attacks on Pairing-Based Cryptography (in particular, glitch attacks, which seem to be very powerful).

Andreas Enge, Damien Robert and their team are to be thanked for an excellent conference.

— Steven Galbraith


NIST workshop and Chinese mirror site

For discussion of what took place at the NIST workshop, see the threads on the curves@moderncrypto mailing list.

Readers may also be interested to know that this blog now has an official mirror site in China, since WordPress is not accessible there.

— Steven Galbraith


Latincrypt accepted papers

Here’s a great reason to visit Guadalajara, Mexico: The list of accepted papers to Latincrypt 2015 contains quite a few papers about elliptic curves.

— Steven Galbraith


Elliptic Curve Cryptography conference, September 28-30, Bordeaux, France.

This year’s ECC conference will take place in Bordeaux. The conference webpage has a list of invited speakers. There will also be a panel discussion on the standardisation of elliptic curves for cryptography.

— Steven Galbraith


Accepted papers at CRYPTO

The list of accepted papers for CRYPTO 2015 is now online. Some papers may change their title.

Of relevance for elliptic curve crypto are these ones:

  • Mike Hamburg, “Decaf: Eliminating Cofactors Through Point Compression”
  • Ming-Deh A. Huang, Michiel Kosters and Sze Ling Yeo, “Last Fall Degree, HFE, and Weil Descent Attacks on ECDLP”

There are also some interesting papers on multilinear maps.

— Steven Galbraith


Public Key Cryptography 2015, Gaithersburg, USA

PKC 2015 was held at the Gaithersburg Campus of the National Institute of Standards and Technology (NIST), USA, March 30th to April 1st. There were 36 accepted papers and two invited talks. The venue was quite impressive. However, our overall impression is that the conference held few surprises.

In the cryptanalysis session, I (Ludovic) had two papers: `A Polynomial-Time Key-Recovery Attack on MQQ Cryptosystems' (joint work with Jean-Charles Faugère, Danilo Gligoroski, Simona Samardjiska and Enrico Thomae) and `Algebraic Cryptanalysis of a Quantum Money Scheme — the Noise-Free Case' (joint work with Marta Conde Pena and Jean-Charles Faugère). The first paper, presented by Simona, describes a polynomial-time attack against the multivariate schemes which use quasi-groups (until now, such systems provided the fastest signature schemes on eBACS). The second paper, presented by Marta, proposes a heuristic polynomial-time attack against a quantum-money scheme of Scott Aaronson and Paul Christiano (STOC 2012). The situation for the quantum-money scheme is not as bad as for the MQQ cryptosystems: a tweak, already proposed by Aaronson and Christiano, circumvents the attack presented by Marta, so your quantum money is still safe for now.

Ayoub Otmani gave a nice talk about a `Polynomial-Time Attack on the BBCRS Scheme’ (joint work with Alain Couvreur, Jean-Pierre Tillich, and Valérie Gautier Umaña). This is yet another attack using the square code distinguishing technique. The target was a McEliece scheme using somewhat `hidden’ GRS codes (you can find a description of the BBCRS scheme in your favourite Journal of Cryptology).

Antoine Joux gave a very accessible invited talk on `Recent Advances in Algorithms for Computing Discrete Logarithms', focusing primarily on his recent result with Cecile Pierrot (presented at Asiacrypt 2014), which enables one to compute the logs of the factor base elements with lower complexity than before in small characteristic fields.

Sanjam Garg gave an invited talk on `New Advances in Obfuscation and its Applications', covering his latest results (EC’14) on obfuscation. In reply to a question from the audience, he said that obfuscation with rigorous security proofs is not yet practical.

Nico Döttling gave a clear presentation about `Low Noise LPN: KDM Secure Public Key Encryption and Sample Amplification'. In particular, he showed a connection between solving LPN with small error and a bounded number of samples, and solving LPN with an unbounded number of samples. This result can be used to construct a KDM-secure public-key cryptosystem from LPN with small noise.

Vadim Lyubashevsky gave a well motivated and very clear presentation on `Simple Lattice Trapdoor Sampling from a Broad Class of Distributions’ (joint work with Daniel Wichs). The timely question that the authors investigated is whether we really need Gaussian distributions in lattice-based cryptography. According to Vadim, Gaussians could be replaced in most situations by different (suitable) distributions without harming the security. However, from a practical point of view, Gaussians lead to the most practical schemes. So, `we can view Gaussian distributions as an optimisation parameter’.

There were only a couple of talks directly relevant to ECC.

Allison B. Lewko gave a well motivated and very clear presentation on `A Profitable Sub-Prime Loan: Obtaining the Advantages of Composite Order in Prime-Order Bilinear Groups’ (joint work with Sarah Meiklejohn). As the title indicates, this work shows how one can obtain for prime order bilinear groups, several useful features which occur naturally for composite order bilinear groups.

I (Rob) presented `Faster ECC over GF(2^521 - 1)' (joint work with Mike Scott), which gave improved timings for point multiplication on the NIST curve P-521 and the Edwards curve E-521. The latter is now under serious consideration for several new ECC standards (including NIST’s).
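
A remark on why this field shape is attractive (my gloss, not a summary of the paper’s techniques): 2^521 - 1 is a Mersenne prime, so modular reduction needs no division at all, just a mask, a shift and an add. A toy Python sketch:

```python
import random

E = 521
p = (1 << E) - 1  # the Mersenne prime 2^521 - 1

def reduce_mod_p(x):
    """Reduce 0 <= x < p^2 modulo p without division."""
    r = (x & p) + (x >> E)  # fold the high bits down, since 2^521 mod p == 1
    while r >= p:           # at most a couple of conditional subtractions
        r -= p
    return r

a, b = random.randrange(p), random.randrange(p)
assert reduce_mod_p(a * b) == (a * b) % p
```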

After PKC, NIST organised a workshop on `Cybersecurity in a Post-Quantum World'. The event was a success, with 140 participants (more than attended the PKC conference itself), many of them from industry: CISCO, Microsoft, Security Innovation, even RSA Security. It seems that quantum-safe cryptography is going to be a very hot topic in the future.

— Rob Granger and Ludovic Perret


Accepted presentations to NIST workshop on ECC standards

The NIST Workshop on Elliptic Curve Cryptography Standards takes place in June. The list of accepted presentations is now available.

— Steven Galbraith
