Very Basic Intro to Elliptic Curve Cryptography
qvault.io
Big mistake in the article: the discrete log is not a trapdoor function, as far as we know, and elliptic curve crypto does not rely on trapdoors.
A trapdoor function is a function that is hard to invert (for any x, given y = f(x), find x), but whose inversion becomes very easy once you know some additional information. For instance, in RSA, given c = m^e mod N, it is hard to find m. But if you know d such that e*d = 1 mod phi(N), you can easily find m by computing m = c^d mod N.
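To make that concrete, here is a minimal sketch of the RSA trapdoor with deliberately tiny, insecure textbook numbers (the classic p = 61, q = 53 example; real keys are thousands of bits):

```python
# Toy RSA: knowing d (the trapdoor) makes inverting m -> m^e mod N easy.
p, q = 61, 53
N = p * q                 # 3233, the public modulus
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # trapdoor info: e*d = 1 mod phi(N)

m = 42
c = pow(m, e, N)          # encrypt: c = m^e mod N, hard to invert without d
recovered = pow(c, d, N)  # easy to invert once you know d
assert recovered == m
```

Without knowing p and q (and hence phi(N)), computing d is believed to be as hard as factoring N; that extra piece of information is exactly what makes it a trapdoor.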
There is no known way of easily inverting exponentiation on finite groups.
To quote Wikipedia: "Functions related to the hardness of the discrete logarithm problem [...] are not known to be trapdoor functions, because there is no known 'trapdoor' information about the group that enables the efficient computation of discrete logarithms."
Author here, thanks for bringing this up. I'll be looking into this and updating the article
Doesn't elliptic curve crypto rely on trapdoors because it uses finite fields? The wrap-around caused by the finite field is a trapdoor, isn't it?
I found this to be a much better article than the original post. It also explains how the elliptic curve Diffie-Hellman works.
Thanks!
> This is a great trapdoor function because if you know where the starting point (A) is and how many hops are required to get to the ending point (E), it is very easy to find the ending point. On the other hand, if all you know is where the starting point and ending point are, it is nearly impossible to find how many hops it took to get there.
> Public Key: Starting Point A, Ending Point E
> Private Key: Number of hops from A to E
So basically the number of hops is the secret. Couldn't I simply "brute force" the hop count? Does this mean that a higher hop count is equal to a better private key?
You can't; this is called the ECDLP (Elliptic Curve Discrete Logarithm Problem) and it is the cornerstone of elliptic curve cryptography's security.
Currently the ECDLP can only be solved by Pollard's rho, which requires exponential time, though recent advances by Barbulescu et al. on the Tower Number Field Sieve significantly reduced the security of ECC in 2017 (by 10~30 bits, i.e. what was thought to be 128-bit secure, i.e. 256-bit curve keys, is now somewhere between 100~115-bit secure).
Note that the DLP (Discrete Logarithm Problem), which is the cornerstone of RSA security, has significantly more efficient algorithms than Pollard's rho, as it can use techniques taken from integer factorization.
https://en.wikipedia.org/wiki/Discrete_logarithm
It is known, however, that asymmetric ECC can be broken by quantum computers in polynomial time (via Shor's algorithm), and so new techniques, for example isogeny-based ECC, are being actively researched.
The Tower Number Field Sieve is only applicable to pairing-based ECC though, right? It doesn't impact the security of the curves used in most mainstream crypto (Curve25519, NIST P-256 etc).
Indeed, though if the authors are saying
> The number field sieve algorithm is still far from being fully understood, in particular for extension fields that are so important for pairing-based cryptography.
I won't say I understand it either. But it indeed seems to only be applicable in towers of extension fields which are only used for pairing-based cryptography.
Nitpicking, but RSA does not rely on the DLP, rather on factorisation and modular roots. I guess you meant Elgamal or Diffie-Hellman?
The discrete logarithm is the inverse operation of modular exponentiation. Being able to find discrete logarithms quickly would break RSA.
Shor's algorithm does exactly that by using quantum computation to solve discrete logarithms.
I don't think so. Computing the discrete log of g^x assumes g to be known. In RSA the ciphertext is c = m^e and by definition, m is unknown.
The reverse operation of RSA is the modular e-th root, not the discrete log.
Pick a base k.
Taking the log base k of both sides yields:

log_k(c) = e * log_k(m)
log_k(c) * e^-1 = log_k(m)
k^(log_k(c) * e^-1) = m

I don't think this works mod N. Since N is not a prime, you have no guarantee that for your base k, c belongs to the multiplicative group generated by k.
Take for instance N = 15: you can't find a log base 12 of 7, since the powers of 12 cycle through 12 -> 9 -> 3 -> 6 -> 12. So you'd first need to find a fitting base, and I'm not sure how hard that is.
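A quick loop confirms that cycle, i.e. that 7 is never a power of 12 mod 15:

```python
# Enumerate the distinct powers of 12 mod 15 until they repeat.
N, k = 15, 12
seen, x = [], 1
for _ in range(N):
    x = (x * k) % N
    if x in seen:          # the powers have started cycling
        break
    seen.append(x)
print(seen)                # [12, 9, 3, 6] -- 7 never appears
assert 7 not in seen
```

So 7 has no discrete log base 12 in Z/15Z: it simply isn't in the subgroup that 12 generates.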
It has been proven [0] that factoring is as hard as solving DLP, but RSA also relies on the modular e-th root assumption, which we don't know to be as hard as, or easier than, factoring.
In the end, because of [0], if you solve DLP you solve factoring, yes, but we usually don't talk about DLP for RSA, even if, as I said, I was nitpicking.
[0] https://www2.eecs.berkeley.edu/Pubs/TechRpts/1984/5973.html
> Couldn't I simply "brute force" the hop count?
Well, yes and no.
Yes, the brute-force attack is a valid one. No, because usually the private key is around 256 bits, so you need to compute 2^255 hops on average to find the key.
The current biggest supercomputer can do about 10^17 flops; assume you can do as many hops per second. It would still take you about 10^52 years to find the key.
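The arithmetic behind that figure checks out:

```python
# Back-of-the-envelope check of the ~10^52-year brute-force estimate.
hops_needed = 2**255           # expected work for a 256-bit private key
hops_per_sec = 10**17          # optimistic supercomputer rate (hops/sec)
seconds = hops_needed / hops_per_sec
years = seconds / (3600 * 24 * 365)
print(f"{years:.1e}")          # ~1.8e52 years
```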
The same is true for RSA: given a number N, find the two prime numbers p, q such that N = p * q. Trivial for N = 15; a computer can do 13201225834171 = 1564571 * 8437601 in a matter of a second. However, current RSA moduli N have ~617 digits and we don't expect them to be factored in a reasonable amount of time.
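Even naive trial division handles that 14-digit example instantly; the point is that the cost grows with the square root of N, which is hopeless at RSA sizes:

```python
# Trial division: fast for a 14-digit N, infeasible for a ~617-digit one.
def smallest_factor(n):
    f = 3 if n % 2 else 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1 if f == 2 else 2  # after 2, only try odd candidates
    return n                     # n is prime

N = 13201225834171
p = smallest_factor(N)
print(p, N // p)                 # 1564571 8437601
```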
As the other commenter said, there are much more efficient algorithms than brute force, but for current RSA keys (2048 bits) and ECC keys (256 bits), they are not reasonably efficient.
> Does this mean that a higher hop count is equal to a better private key?
So yes, the size of the key is a very important indicator of its strength (but long keys can still be very weak if you don't generate them correctly). However, bigger keys also result in longer encryption/decryption times.
> No, because usually the private key is around 256 bits, so you need to compute 2^255 hops on average to find the key.
So if my key is 255 bits long, don't I need to do 2^255 hops anyway to actually decrypt something? And in running 2^255 hops, wouldn't I have also cracked the key if it were any number less than that?
> bigger keys also result in longer encryption/decryption times.
So, there's a reasonable upper bound for the key length? Does that mean 256 bits aren't reasonable, and wouldn't this reasonable upper bound be more brute-forceable?
You don't need to do that many hops when using your own key, because you can double repeatedly: instead of adding 1 to itself 256 times to get 256, you just double it 8 times.
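That trick is the double-and-add algorithm. A sketch, again using modular arithmetic as a stand-in for curve-point addition (the loop structure is identical for real curves):

```python
# Double-and-add: k "hops" in O(log k) group operations instead of k.
# Modular addition stands in for elliptic-curve point addition here.
def scalar_mult(k, P, p):
    result, addend = 0, P          # 0 plays the role of the identity
    while k:
        if k & 1:                  # this bit of k is set: add in addend
            result = (result + addend) % p
        addend = (addend + addend) % p  # "double" for the next bit
        k >>= 1
    return result

p, P = 9973, 5
k = 2**255 + 12345                 # a 256-bit "private key"
# ~256 doublings plus a few additions, instead of ~2^255 hops
assert scalar_mult(k, P, p) == (k * P) % p
```

This asymmetry is the whole game: the key's owner pays O(log k) work, while an attacker who only sees the start and end points faces the exponential-time search discussed above.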