NIST’s Post-Quantum Cryptography Program Enters ‘Selection Round’ (nist.gov)
107 points by xoa on July 24, 2020 | 77 comments


Here is the documentation: https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf

All 15 of the following are moving on to the 3rd round.

Third-Round Finalists:

Public-Key Encryption/KEMs

- Classic McEliece (Code)

- CRYSTALS-KYBER (Lattice)

- NTRU (Lattice)

- SABER (Lattice)

Digital Signatures

- CRYSTALS-DILITHIUM (Lattice)

- FALCON (Lattice)

- Rainbow (Multivariate)

Alternate Candidates:

Public-Key Encryption/KEMs

- BIKE (Code)

- FrodoKEM (Lattice)

- HQC (Code)

- NTRU Prime (Lattice)

- SIKE (Supersingular Elliptic Curve Isogeny)

Digital Signatures:

- GeMSS (Multivariate)

- Picnic (Zero-knowledge)

- SPHINCS+ (Hash)

All of Bernstein's submissions (except the joke one) are on either the finalist or the alternate list.

A note regarding the alternatives:

> The alternate candidates are regarded as potential candidates for future standardization, most likely after another round of evaluation. Some of the alternate candidates have worse performance than the finalists but might be selected for standardization based on NIST’s high confidence in their security. Others have acceptable performance but require additional analysis or other work to inspire sufficient confidence in their security for NIST to standardize. In addition, some alternate candidates were selected based either on NIST’s desire for diversity in future post-quantum security standards or on their potential for further improvement.

I am quite a big fan of SPHINCS+, Picnic (these two reduce their security to that of their underlying hash functions), and Classic McEliece myself. SIKE is interesting too, but as far as I know it needs to be used interactively. Rainbow is also interesting.


I think McEliece and SPHINCS+ are both excellent fits for typical PGP use cases-- and have great security properties. Their weaknesses don't matter as much in low-throughput applications.

(e.g. McEliece's slow key generation is a big problem when you're creating 1000 connections per second and want good perfect forward secrecy. It's a non-issue for a long term key that is rotated at most once a year.)

I'm sad that sphincs+ is only an alternate.

Things like software signing and human scale communication don't care if signing is somewhat slow and the signatures are long-- but they do have extremely long lived keys that are difficult to rotate.


Signing and verification speed are both important for things like TLS/SSH/etc., and sadly SPHINCS+ cannot compete on either; see page 55 of https://sphincs.org/data/sphincs+-round2-specification.pdf

But I agree that it can be used in the OpenPGP use-case. Anyway, Rainbow seems quite solid for TLS.

> McEliece's slow key generation is a big problem when you're creating 1000 connections per second and want good perfect forward secrecy

I personally think that if you want forward secrecy you should derive the session key from both the static McEliece key and the ephemeral ntru/whatever key.
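
For illustration, a minimal sketch of that kind of combination (the labels and transcript input are hypothetical placeholders; a real protocol would use a vetted KDF such as HKDF and bind the actual ciphertexts into the derivation):

    # Minimal sketch: derive one session key from a static and an ephemeral KEM
    # shared secret. The secrets here are stand-ins for real McEliece/NTRU
    # decapsulation outputs; the label and transcript are illustrative only.
    import hashlib, hmac, secrets

    def combine_shared_secrets(ss_static, ss_ephemeral, transcript):
        # HKDF-style extract-then-expand using HMAC-SHA256.
        prk = hmac.new(b"hybrid-kem-v1", ss_static + ss_ephemeral, hashlib.sha256).digest()
        return hmac.new(prk, transcript + b"\x01", hashlib.sha256).digest()

    ss_mceliece = secrets.token_bytes(32)   # from the static Classic McEliece key
    ss_ntru = secrets.token_bytes(32)       # from the ephemeral NTRU key
    session_key = combine_shared_secrets(ss_mceliece, ss_ntru, b"handshake transcript")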


SSH could probably use a much simpler hash-based signature than SPHINCS+ -- e.g. you wouldn't really care if the signature (which is just sent once at connection time) was 40 kbytes.

Nothing in the competition does that. (Too bad, because it would also be the fastest algorithm in the competition to verify and one of the fastest to sign.)
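
For a sense of how simple "simpler" can be, here's a toy Lamport one-time signature, the most basic hash-based building block (illustration only; one key pair must never sign more than one message, and schemes like SPHINCS+ exist precisely to remove that limitation):

    # Toy Lamport one-time signature: security reduces to the hash function alone.
    import hashlib, secrets

    H = lambda b: hashlib.sha256(b).digest()

    def keygen():
        # 256 pairs of random preimages; the public key is their hashes.
        sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def bits(msg):
        d = H(msg)
        return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

    def sign(msg, sk):
        return [sk[i][b] for i, b in enumerate(bits(msg))]   # reveal one preimage per bit

    def verify(msg, sig, pk):
        return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(msg)))

    sk, pk = keygen()
    sig = sign(b"ssh host authentication", sk)   # ~8 KB signature, very fast to verify
    assert verify(b"ssh host authentication", sig, pk)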


The problem with Classic McEliece is the size of the public keys, which is measured in megabytes.


Well, at least people are used to downloading MBs of junk when they browse the internet these days anyway...


We also want crypto systems that work on embedded/constrained environments


And that's fine. These systems can use NTRU or whatever else. These systems being weak is not a reason to drag everyone else down.


Yeah I’m not actually stoked on big keys, just being snarky.


1.3 MiB for the most secure parameter set (mceliece8192128). The other sets are less than 1 MiB.


> All of Bernstein's submissions (except the joke one) are in the finalist or in the alternative list.

What was the joke one?


Post-quantum RSA, 1-terabyte keys: https://eprint.iacr.org/2017/351.pdf


Note, the submission was a joke and the paper was humorous but not a joke itself. It presents a new factorization method for example.


40-year-old Classic McEliece made it into the second round. It's well proven but not fit for all uses due to its large public key size.

> Classic McEliece has a stable specification--the only significant change in the second round is the addition of additional parameter sets. As such, NIST selected Classic McEliece as a finalist and believes it could be ready for standardization (should NIST choose to select it) at the end of the third round


I have heard that too, but is it really a big deal to have 65KB+ keys these days?


As someone who works on embedded devices, yes. The whole system can have 256KB of flash or less. 1 or 2KB for keys hurts, but is manageable. 65KB means that much of the IoT will not be upgraded to post-quantum security.


IoT devices should be behind private networks rather than being directly accessible to the internet anyway. Said devices can use a "weaker" algorithm like NTRU.


> IoT devices should be behind private networks rather than being directly accessible to the internet anyway.

Hogwash. You just reduced my reliability by a dramatic amount. Not only does my IoT device have to be functional, but it now has to have a gateway that is functional at the same time in order to be useful. Uh, yeah, no.

And, besides, a private network is only private until one of the devices gets compromised. Then it's not private anymore.

Better to make your device capable of living on the real, hostile Internet.


> Not only does my IoT device have to be functional, but it now has to have a gateway that is functional at the same time in order to be useful

Consider it differently. Your IoT stove, IoT coffee maker, IoT washing machine, IoT lights, etc do not need to be secure if they are behind a secure gateway.

> And, besides, a private network is only private until one of the devices gets compromised. Then it's not private anymore.

The gateway makes sure that nobody but you will be able to access (and thus compromise) your IoT freezer.

> Better to make your device capable of living on the real, hostile Internet.

Sure, but considering the number of IoT botnets out there, I do not think that this is going to happen.


So why not use the other alternatives for IoT, or have a separate standard for them? One size fits all is good, but not if it means making costly sacrifices, in my opinion.


I hope it makes it through the selection. Either way I'll be using it.


What application do you have where huge public keys are no problem, but you fear imminent (at least within a lifetime where you can't upgrade) availability of large quantum computers capable of Shor's algorithm?


> imminent

The arrival of quantum computers doesn't have to be imminent for it to cause harm. The Utah datacenter (and likely others) stores vast amounts of encrypted traffic, waiting for the key exchange to become decryptable. Currently, almost all communication we consider encrypted is not safe from quantum computers.

How great would it be in 30 years if you had solid evidence that people who are in the prime of their careers have done embarrassing things in their youth. You could extort them or use it to get them out of the way if they turn out to be a problem. The value of such information actually increases over time as those people rise to more important positions in society.

So ideally, we'd switch to quantum-resistant key exchange as soon as possible so that such an attack can't be done.


> How great would it be in 30 years if you had solid evidence that people who are in the prime of their careers have done embarrassing things in their youth.

Yes, Facebook does exist. I've written about this before. People who don't have a past with vaguely embarrassing things they did, goofy photos, the wrong friends, and so on are not future politicians; they are ghosts, and their future is as foreign intelligence agents. Do you know what sank Prime Minister "Call me Dave" Cameron in the United Kingdom? Was it the unverified and likely fictitious stories about a pig - similar to events from a popular TV show? The family tax scandal? Something he did at school or as a young man? Photographs of him dressed like someone from a different century? Nope.

What sank Dave was assuming that voters wouldn't destroy their own country to spite people they don't like (including him). A depressingly ordinary political tale, that could come from any era in history.


Wow, that story is really irrelevant. Take the time to read about cancel culture.


With Classic McEliece, at least right now, the computation time for a keypair is a bigger problem than the size of the public key. It will get faster with better implementations and hardware support, but right now it takes many seconds on a Ryzen 9 3950 to generate a keypair. I had to build a system with background threads computing keys and having them ready for a quick response when a connection is made (a rough sketch of that pattern is below). If anyone is interested, see: https://github.com/kareldonk/QuantumGate
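
Roughly, the pattern looks like this (generate_keypair() here is a stand-in for the real, slow McEliece key generation call):

    # Sketch of pre-computing expensive keypairs on a background thread so that
    # a fresh pair is available immediately when a connection arrives.
    import queue, secrets, threading

    def generate_keypair():
        # Placeholder for the real (slow) Classic McEliece key generation.
        return secrets.token_bytes(32), secrets.token_bytes(32)

    class KeyPool:
        def __init__(self, size=8):
            self._pool = queue.Queue(maxsize=size)
            threading.Thread(target=self._fill, daemon=True).start()

        def _fill(self):
            while True:
                self._pool.put(generate_keypair())   # blocks while the pool is full

        def get(self):
            return self._pool.get()                  # instant when the pool is warm

    pool = KeyPool()
    public_key, private_key = pool.get()             # handed out per connection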


You might want to check out https://groups.google.com/a/list.nist.gov/forum/#!topic/pqc-...

This project that you linked seems quite nice. I especially like its Code of Conduct.


Thanks for the link. It seems the implementations are already improving. From the posted benchmarks it looks like keygen got twice as fast, which is still slow for some use cases (like ephemeral use for forward secrecy) and for slower devices. The best solution will be hardware support.


Pretty much anything really. HTTPS, private communications, etc. I am using a computer, not an IoT device.


Classic McEliece is not good for HTTPS or anything where you need to download new public keys frequently.


You do not need to download a new public key frequently in HTTPS (unless you use forward secrecy, in which case a mixed scheme could be used); most people visit only a few specific sites. In addition, public keys could be cached. Anyway, a lot of web pages are already much bigger than the McEliece public key.


How about a web browser?

The nytimes.com front page is 4.66 MB. A quantum-resistant McEliece key is 8,373,911 bits (per Wikipedia), or almost exactly 1 MB. Therefore, if my connection to nytimes.com were over McEliece-based TLS, it would be a 21% bandwidth increase. I doubt I would notice.


How much of that do you have to download before the browser does its first paint? With a key handshake you would have to do it before any content is displayed. On much of the modern web (for people blessed with fast connections), latency is much more important than bandwidth. An extra 1 MB right up front might be very different from an extra MB overall, especially when TCP slow start is taken into account.

Even more so if the webpage is loading stuff from many different domains


You have to consider that web browsers run on IoT devices now and shouldn’t be excluded from security.


I honestly doubt that an IoT device capable of displaying a modern fancy site and running bloated things like webkit would have trouble using Classic McEliece.


This is the 3rd round, not the second.


This is the 2nd round that selected 3rd round finalists.


"made into the second round" means that it passed the 1st round and entered the second.


It should be noted that Bernstein has been complaining about the lack of transparency and about NSA involvement: https://twitter.com/hashbreaker/status/1285922808392908800


Of course, NIST compliance is a circus.

I use it as a list of what not to use.

Once, a senior cryptographer in Citi's R&D lab was grilling me for using 25519 without NIST's blessing. I asked him what he would use; he said, "cryptographically speaking, 25519". Ok.jpg.


> I use it as a list of what not to use.

Bernstein's own submissions (he's the one who made 25519) passed to round 3, though. Do you also avoid AES/SHA3/BLAKE because they were NIST finalists?


> Of course, NIST compliance is a circus.

The problem is when customers start demanding it. Then you don't really have much of a choice. Especially when your competitors are offering it – what is the more successful sales strategy – try to convince the customer they don't really need what they think they need, or just give them what they want? You'll win more deals with the second approach than the first.

That doesn't mean you can't use NIST-unapproved algorithms that are technically superior to the NIST-approved ones. You can just have some "NIST mode" (or "FIPS mode" to use the older terminology) that you can turn on for those customers that ask for it, and then only NIST-approved algorithms get used for those customers. (It is more than just the choice of algorithms, you also need to have a certified implementation – in Java, one can replace BouncyCastle with BouncyCastle FIPS, and then have a flag to turn on approved mode for the instance; once in approved mode, use of non-certified algorithms is blocked.)
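
In the abstract, that "NIST mode" switch is just an algorithm allowlist. A hedged sketch (the names below are illustrative, not any particular library's API):

    # Sketch of a per-deployment "approved algorithms only" switch.
    APPROVED = {"AES-256-GCM", "SHA-256", "RSA-PSS", "ECDSA-P256"}     # NIST/FIPS-approved set
    DEFAULT = APPROVED | {"ChaCha20-Poly1305", "Ed25519", "X25519"}    # everything we support

    def select_algorithm(requested: str, fips_mode: bool) -> str:
        allowed = APPROVED if fips_mode else DEFAULT
        if requested not in allowed:
            raise ValueError(f"{requested} is not permitted in this mode")
        return requested

    print(select_algorithm("Ed25519", fips_mode=False))   # fine for ordinary customers
    # select_algorithm("Ed25519", fips_mode=True)         # would raise for FIPS-mode customers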


Interesting. Arjen Lenstra used to be at Citi's R&D Lab; his brother is Hendrik Lenstra, who was Bernstein's Ph.D advisor.


One thing that bothers me about most of these algorithms is how conceptually difficult they are for me to understand. I don't have that strong a background in cryptography, but there is something special about how with RSA you can do it at a small scale by hand. With very limited notes I can explain to a group of high schoolers almost exactly how this vital part of encryption functions, and even vulnerabilities in it (small exponent values, etc.). While rolling your own crypto is bad, with a bit of work I'd expect almost any programmer to be able to implement RSA without too much difficulty.

I'm not sure how important being able to explain encryption algorithms in detail to high school students is, but I'll definitely be sad when RSA and other non-quantum-resistant algorithms join the likes of the Enigma machine and exist for demonstration purposes only. I'm hopeful that in time less crypto-minded people (like me) will better understand module learning with errors or lattice-based cryptography, and maybe we'll get better at explaining it. It would be a shame if even more of cryptography turns into an unintelligible black box to eke out performance gains.


RSA isn't that easy to understand. It's a trap: it seems simple on the surface because the core mathematical operation is easy, but the actual complexity (padding) often gets omitted.

RSA-KEM (key encapsulation mechanism) is easy. There's no padding; the only catch is you can't encrypt a message, only a random number less than the modulus. You then run that number through a Key Derivation Function (KDF) and use the result with an Authenticated Encryption with Associated Data (AEAD) cipher. That's just simple enough that I'd teach it to high-school students.
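
A toy sketch of that flow, with deliberately tiny numbers (illustration only; real RSA-KEM uses a large modulus, a proper KDF, and constant-time arithmetic):

    # Schoolbook RSA-KEM: encapsulate a random number, hash it into a symmetric key.
    import hashlib, secrets

    p, q, e = 61, 53, 17                      # toy primes; real keys are >= 2048 bits
    n = p * q                                 # public modulus
    d = pow(e, -1, (p - 1) * (q - 1))         # private exponent (Python 3.8+)

    def encapsulate(n, e):
        m = secrets.randbelow(n - 2) + 2      # random value in [2, n)
        c = pow(m, e, n)                      # raw RSA, no padding needed
        key = hashlib.sha256(m.to_bytes(8, "big")).digest()   # KDF -> key for an AEAD
        return c, key

    def decapsulate(c, n, d):
        m = pow(c, d, n)
        return hashlib.sha256(m.to_bytes(8, "big")).digest()

    c, key_sender = encapsulate(n, e)
    assert key_sender == decapsulate(c, n, d)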

RSA encryption is hard. It requires OAEP padding, which is by no means simple. I'd not try to teach it to high-school students.

RSA signing is hard. It requires PSS padding, which is by no means simple. I'd also not try to teach it to high-school students.

Every other use of RSA is insecure, often in very subtle and hard to understand ways. Using RSA as a teaching tool about public-key cryptography does a disservice to students, since they (like you) tend to think that RSA alone is useful for cryptography.

And if you can treat an operation like the padding as a "black box" and not worry about understanding it, then you can understand code-based cryptography the same way: treat the code as a black box, and the math is very simple.


I sympathize but don't entirely agree.

The old version 1.5 signature padding remains widespread. Unlike PSS it doesn't have a security proof reducing to RSA, but it does have decades of successful use in practice in one of the harshest environments (the Web, where clients merrily run code written by a potential adversary).

"Prepend this fixed data to your hash" which is the central idea of v1.5 padding is definitely something you could teach to high school students. You can even show them why it's necessary pretty easily for a small exponent.

Making people check padding is again not too hard for high school students, and I think "Do all the things on the checklist. All of them" is a worthwhile lesson not just in cryptography.
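
For reference, the "fixed data prepended to your hash" looks like this (a sketch of the layout for SHA-256; the DER prefix is the standard fixed bytes from PKCS#1, and a verifier must check every one of these bytes):

    # PKCS#1 v1.5 signature padding layout: 0x00 0x01 FF..FF 0x00 DigestInfo || hash
    import hashlib

    # Fixed DER prefix identifying SHA-256.
    SHA256_PREFIX = bytes.fromhex("3031300d060960864801650304020105000420")

    def pkcs1_v15_encode(message: bytes, modulus_len: int) -> bytes:
        t = SHA256_PREFIX + hashlib.sha256(message).digest()
        ps = b"\xff" * (modulus_len - len(t) - 3)      # must be at least 8 bytes of 0xFF
        return b"\x00\x01" + ps + b"\x00" + t

    encoded = pkcs1_v15_encode(b"hello", 256)          # 256 bytes for a 2048-bit modulus
    print(encoded[:4].hex(), "...", encoded[-52:].hex())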


It is possible to implement v1.5 padding securely, but I'd not say it's easy, since I'm not aware of any implementation not written by an experienced cryptographer/cryptographic programmer that got it right on the first try (no padding oracles or other vulnerabilities). It's conceptually simple, but still has plenty of footguns. The problem with RSA is all the footguns! Avoiding them reduces the conceptual simplicity to the point where you might as well teach a slightly more complex scheme, like EdDSA or McEliece (with black-box code).


The most common mistake in implementations of RSA signature verification seems to be just plain not validating the padding at all. That does not require an "experienced cryptographer"; it requires actually doing everything on the checklist, a reflex that's worthwhile in many pursuits.

Historically sometimes people would say "Well nobody would make an error like that in a modern elliptic curve scheme" and then Microsoft turns out to have shipped a very popular operating system named "Windows" which didn't do curve validation - so much for that belief.

Padding oracles aren't a thing for signature verification. You can work this out for yourself: everybody can do signature verification using the public key, so if there were a way to do it "badly" that somehow gave away the private key, you'd do it that way yourself and skip all the effort.

What does exist and might have confused you is an oracle for RSA PKCS#1 v1.5 decryption. In this case the party doing decryption knows the private key and so it makes sense that if this is done poorly they can leak vital information. It doesn't seem fair to call this a flaw in the signature scheme.


> Historically sometimes people would say "Well nobody would make an error like that in a modern elliptic curve scheme" and then Microsoft turns out to have shipped a very popular operating system named "Windows" which didn't do curve validation - so much for that belief.

I am pretty sure that CryptoAPI does not support (or at least did not at the time) any modern elliptic curve signature scheme, and by modern I am referring to things like ed25519.


Not to mention the most difficult issue, which even mainstream libraries (like libgcrypt, which is used by GPG) get wrong: implementing RSA in constant time so as to avoid timing side-channel attacks. I would argue that understanding and implementing elliptic curves (the modern ones especially) correctly is much easier.


Do you have a reference to any practical side-channel attack on, say, 2048-bit RSA using the current version of libgcrypt? If so, it should be filed as a bug.



The current version of libgcrypt is not practically exploitable using the technique described in that 3-year-old paper.


I remember that their changes <https://git.gnupg.org/cgi-bin/gitweb.cgi?p=libgcrypt.git;a=c... were still variable-time and thus able to leak secrets. I presume that nobody has bothered exploiting it yet.

Anyway, even if this were fixed (which I doubt), the point is that they had vulnerable code for 18 years.


You can intuitively reduce RSA to model problems that you can do by hand, or trivially implement it in Python, but it's debatable how much your intuition is actually serving you; do you really have your head around why RSA works, what its limitations are, and how it can be applied? I think you have to get that through careful study, not from the kind of "machine sympathy" you often acquire through implementing algorithms and systems code.

In reality, most programmers cannot implement RSA safely without much difficulty; rather the opposite is the case. There's a reason the term "schoolbook RSA" has the connotations it does.


But there still is value in "schoolbook RSA."

The OP thinks we lose something if we don't have "schoolbook $NextGenCrypto."

I would disagree, because we can still use "schoolbook RSA" for its primary purpose: education on the general concept. At the high-school, or even 101 uni level, either we are teaching basic, generalizable skills, or we are providing high-level surveys of general knowledge.

The use case isn't "How do I implement and use RSA as a practitioner in the field," but rather it is "How does this thing, that seems like magic, work?"

The risk is that the simplified model gives some kind of intuition that will do a disservice later in life, but the benefit is just the opposite. Which is why showing how even small errors in the toy implementation can render it insecure is also a useful tool.


I dispute that; I think schoolbook RSA has probably done more harm than good. It's not showing you how this thing that seems like magic works; it's just replacing one magic trick with another.

It's not a hypothetical concern; you see it in every instance where someone has implemented schoolbook RSA in production, which was a not-infrequent occurrence when people were still using RSA in new designs.


Unlike Thomas I don't believe Textbook RSA is necessarily unhelpful for thinking about RSA. However I do think it's unhelpful for thinking about other public key algorithms. And even ignoring Post-Quantum Cryptography you probably want other algorithms than RSA in new systems.

What you will still see in brand-new by-laymen, for-laymen material today is an assumption that everything else is just like RSA but with trickier maths, and that's nowhere close to true. For example, RSA can be used for signatures, and so can Ed448 using fancy elliptic curve maths; I know how to use RSA to encrypt this GUID†... so presumably now I can just encrypt the GUID with Ed448 instead of RSA? And the answer is just "No", because Ed448 is a signature scheme; it isn't for encrypting things.

This happens across all of IT, whether it is programmers bemused that git doesn't "lock" the remote repository when they check code out, or surprised that Strings aren't actually an array of Characters in a new language they're learning, or network engineers who still haven't seen CIDR notation... but in cryptography your mistakes from false over-generalization can blow up in other people's faces really badly.

† There's an excellent chance you were doing this wrong, but that's not the point here.


Because post-quantum crypto algorithms are based on a completely different breed of hard problems: lattice problems, which are not as straightforward to visualize as RSA. Visualizing a high-dimensional lattice, calculating its shortest vector, or working out various other properties is not straightforward.

My bachelor's was in math and my master's thesis was about lattice problems and post-quantum cryptography. For my Ph.D. I switched to a completely different subject because I couldn't write good papers in this area. So I understand your concern.


2D lattices are easy to visualize though. I've given brown bags to general engineers on NTRU-like systems, and have found that some find it even easier to understand than RSA as it's more graphical.


I am actually not sure if this isn't more of a problem of RSA than a feature.

I have seen many simple introductions to cryptography that try to explain RSA. The problem is: they're almost always wrong and explain some insecure variation of RSA. (Most of the time they talk about a "message" as the input to the RSA function without ever explaining the concept of padding and why it is important in RSA.)

The thing is: Implementing some form of RSA is relatively easy. Implementing RSA in a secure way is far from trivial.


I think it's likely that your understanding of RSA is mostly fake. E.g. you could do some 'RSA' on paper, but it would be totally insecure.

Eventually people will write intuitive explanations of other algorithms (it's fairly easy to do for SPHINCS+ and McEliece, for example) and you'll feel you understand them too. (And, in fact, I think those understandings will actually be less dangerous than the RSA ones!)

> with a bit of work I'd expect almost any programmer to be able to implement RSA without too much difficulty.

Implement RSA incorrectly without too much difficulty.

You should be dubious that even world class cryptographers will get it totally right, as many RSA implementations -- even ones written by experts-- have had serious flaws.


Would you consider elliptic curve cryptography conceptually difficult? It's certainly more complex than RSA, but over time great educational resources have been developed. The same will happen for these post-quantum schemes.


Personally, I don't think the ECC tutorials are very good-- they are mostly analogs of the RSA tutorials, which are extremely mechanical and do not impart much of any useful intuition about the system.

Someone versed in the monthly HN "how ECC works" submissions will find themselves able to implement a completely naive, side-channel-vulnerable, and unusably slow implementation of pubkey generation and ECDH (and maybe ECDSA, if they cribbed from Wikipedia while doing it). Yet they should on no account be doing this, because there are already many excellent, well-written (or at least battle-hardened) implementations.

By contrast they are unable to:

1) recognize vulnerabilities, even trivial ones, in implementations (including their own)

2) Invent new useful applications (which would be the best reasons to not use an existing piece of code-- it doesn't do what you need)

3) Come up with or implement performance optimizations.

The reason I've found for that is that focusing on a mechanical understanding of how to apply the group law gets in the way of an algebraic understanding of how the group can be used.

For all but (3), the specifics of the group law are irrelevant, and even for (3), everything except for specialized cracking code uses projective coordinates, so the procedure people learned for affine coordinates isn't useful.

If people are having fun they have my blessing :)... but it's my informed view that overwhelmingly ECC tutorials give a lot more "feeling of understanding" than useful understanding.


I second that. I've helped teach ECC to high schoolers (granted, it was a summer program for exceptionally mathematical students) and they found it quite approachable.


Lattice-based cryptography can actually be made pretty accessible and can be taught to high schoolers and fresh undergraduates (I have done this with students at my job), as long as you give them a couple of math tools they may not have: matrix transforms, the Gaussian distribution, the vector dot product, and the one-time pad. While this is a longish list, none of the topics are particularly hard. I'll take a crack at an explanation (with schoolbook Regev-like LWE) and you can let me know how I did; there's also a toy code sketch after the list. It is better with pictures and probably can be shaped over the years, but I do think that people can adapt this explanation to turn it into something as easy as, if not easier than, RSA.

1. Start with a random (secret) n-dimensional vector, s, that lives on a lattice (a square grid like pattern). This is your (small) secret key.

2. Hit that vector with a randomly generated matrix A (written As for multiplying matrix A with vector s), which basically just scales, skews, rotates, and projects that lattice into m-dimensional space (we want m to be much bigger than n). For those in the know, the new vector, As, is almost a "pseudorandom", "random-looking", or hard-to-predict value. This is a way of taking your small key and turning it into a large key.

Intuition: Note that each component of As is basically an individual dot product. On a simple level, you can think of the dot product as a generalization of XOR, parity, or something that creates a checkerboard pattern using that lattice. This basically creates a high-dimensional grid (lattice) with many colors that alternate. The end goal is to get some samples of these colors and reconstruct what the grid pattern looks like. If you have ever tried to learn a checkerboard (XOR, parity) pattern with a linear classifier, you know that this is hard. Here, we are generalizing this concept into something even harder.

3. Add (discrete) Gaussian noise to As to get As + e. This just adds "small" random noise to each of the components. You do this to turn the almost "random-looking" large key into an actually "random-looking" large key.

Intuition: The reason for doing this is that if you have A and y where As = y, you can solve for s using Gaussian elimination. This is the same process grade schoolers use to solve systems of equations with multiple variables (where they subtract one row from another, or solve for and substitute variables).

It turns out that Gaussian elimination and other algorithms are terrible at getting the right answer if you have noise or small sources of error. The process gets thrown off wildly and gets nowhere close to s. The situation is so bad that even quantum computers are assumed to not be able to do it. This is what makes your large key resistant to being reverse engineered by powerful computers.

4. Use As + e as a one-time pad. To encrypt a message m, your ciphertext is the pair A and As + e + m. To decrypt, someone takes the key s, computes As, then subtracts that off. This leaves you with m + e.

5. Decryption in (4) is bad because the small error, e, is left over. This makes the process noisy and makes you lose a little information when you encrypt or decrypt. You fix this by using a noise-correcting encoding that makes the e go away.

Your new ciphertext is (A, As + e + encode(m))

EDIT: Note that to make this actually secure you need to pick good parameters for your lattices and your noise distributions. If your error is too small, it won't fool attacks and is easy to solve. If your error is too large, you can't decrypt. The problem is only hard in high enough dimensions (for instance, LWE is easy in 2 dimensions, which is what we use to visualize things).
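
Here's a toy code version of steps 1-5 above (tiny parameters chosen only so the numbers are readable, uniform noise standing in for the discrete Gaussian; a one-shot sketch, not secure):

    # Toy LWE secret-key encryption following steps 1-5 (one-shot, insecure parameters).
    import numpy as np

    n, m, q = 8, 16, 97                      # small dimensions/modulus for readability
    rng = np.random.default_rng(0)

    s = rng.integers(0, q, n)                # 1. secret key: random n-dim vector s
    A = rng.integers(0, q, (m, n))           # 2. random m x n matrix A ("large key" is As)
    e = rng.integers(-1, 2, m)               # 3. small noise (stand-in for discrete Gaussian)

    def encode(bits):                        # 5. noise-tolerant encoding: 0 -> 0, 1 -> q/2
        return np.array(bits) * (q // 2)

    def decode(x):
        return [1 if q // 4 < v < 3 * q // 4 else 0 for v in x % q]

    def encrypt(bits):
        return A, (A @ s + e + encode(bits)) % q      # 4. use As + e as a one-time pad

    def decrypt(A_pub, c):
        return decode((c - A_pub @ s) % q)            # subtract As; decode soaks up e

    A_pub, c = encrypt([1, 0] * 8)           # one bit per row of A
    print(decrypt(A_pub, c))                 # -> [1, 0, 1, 0, ...]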


Do you have any YouTube videos of your lectures? If not, consider making one and uploading it. As lattice based cryptography comes into maturity ELI15 for us non-mathematicians on the ground floor will be essential.


Watch the YouTube videos of Chris Peikert presenting his paper "Lattice Cryptography for the Internet":

https://www.youtube.com/results?search_query=lattice+based+c...


DJB seems to have a disagreement with some statements that NIST made: https://twitter.com/hashbreaker/status/1288039119805747200

"Misrepresentations of security proofs are starting to cause serious damage. The embarrassingly wrong idea that there's a proof that the "security of NewHope is never better than that of KYBER" is the centerpiece of NIST removing NewHope from #NISTPQC. See https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf

Known hybrid attacks are faster against Kyber than against NewHope. (Kyber missed this because its security analysis was oversimplified; I'm writing a paper on this.) More to the point, it's clear that there isn't and won't be a proof that Kyber is at least as secure as NewHope.

Cryptographers who condone exaggerations of what has been proven share responsibility for any resulting security failures. We cannot simply ignore the increased influence of dangerously oversimplified "provable security" claims upon standards and deployment. This is not a game."


This all seems premature. If someone manages to invent a computer that works on atomic effects, then there will be a lot of development of new algorithms based on whatever that computer can do.

There is a very good chance that regular classical computing would be greatly sped up by such an invention. Just the ability to do things at atom scale would be a huge breakthrough.


It's entirely possible that QC will arrive but will be more of a special purpose accelerator for certain algorithms (think of the way custom ASICs can accelerate specific operations) rather than a general purpose thing. Some form of general purpose QC may happen eventually but that might take quite a bit longer, and it may take longer still for it to be commonplace.

In the meantime there will be a strong need for classically computable cryptographic algorithms that are strong against attack by any known quantum computable algorithm.

Symmetric encryption (AES, ChaCha) and hashing (SHA, Blake2) are mostly safe as long as at least 256 bit keys are used and the algorithm doesn't have weaknesses that QC might open wider. Asymmetric encryption based on conventional Diffie-Hellman, RSA, or elliptic curve methods is absolutely not safe. There are known quantum algorithms that on paper can crack them in reasonable amounts of time if you have the right type and size of quantum computer, and at this point we have reason to believe that such a quantum computer can probably be built.

Post-quantum cryptography efforts are about developing asymmetric encryption (key exchange and signatures) that can be computed on a classical computer but for which there are no known quantum attacks.

The biggest danger by far is that somebody figures something out and builds such a quantum computer in secret, allowing them to snoop on all kinds of secret communication and steal vast amounts of money before anyone figures out something funny is going on. Unlike an atomic bomb or a new kind of aircraft or satellite, such a quantum computer could be built and operated without creating physical effects that anyone could see at a distance. Unless somebody talks it could remain secret for quite a while.


> we have reason to believe that such a quantum computer can probably be built

What are those reasons? Quantum computers could just as well be nonsensical bullshit that has been funded for 40 years.


There are a few QC skeptics with various arguments.

If you were the head of NIST (or the NSA), would you be willing to bet the entire security of your civilian and military communications infrastructure that these few skeptics are right? There were atomic bomb skeptics too, and it took Einstein to convince the US government to ignore them and move forward.

Seems silly to make such a high stakes bet against the scientific consensus, especially if classically computable algorithms that are both classically and quantum strong can be found and deployed and if doing so is not that expensive.

Reminds me of Asimov's saying (paraphrasing): "When a distinguished but elderly scientist says something is possible, they are probably right. When a distinguished but elderly scientist says it is not, there's a non-trivial chance they're wrong." In this case quite a few distinguished elderly physicists are saying you probably can build a (useful, non-toy) quantum computer. If history is any guide, they're much more likely to be right than wrong.

Edit: it's not a bad idea to develop new algorithms anyway, just to have them around. We don't think the trapdoor functions behind current asymmetric crypto are classically reversible (in any practical amount of compute time), but there is no mathematical proof of this. It's a strong conjecture that's held up so far, but it's still a conjecture.


Yes, my heuristically rational denial of the possibility of useful quantum computing does not imply that we should not prepare even for the lowest risks, especially when they're an existential threat. That being said, I do not understand the need for this NIST competition. I was under the belief that current SHA256/512 is quantum proof. Is that wrong? Why?


> That being said, I do not understand the need for this NIST competition. I was under the belief that current SHA256/512 is quantum proof. Is that wrong? Why?

SHA2, SHA3, and AES-256 are all quantum proof, yes. This competition is about asymmetric cryptography though; it replaces non-quantum-proof algorithms such as RSA, DSA, ECC, etc.


> be a lot of development of new algorithms based on whatever that computer can do.

This "whatever" should only change the performance by constant factors.

In theory, cryptographic algorithms give the defender an exponential advantage against the attacker, so any small constant factor can be overcome by adding a few bits of security margin.


Good




