Post-Quantum RSA

Interesting research on a version of RSA that is secure against a quantum computer:

Post-quantum RSA

Daniel J. Bernstein, Nadia Heninger, Paul Lou, and Luke Valenta

Abstract: This paper proposes RSA parameters for which (1) key generation, encryption, decryption, signing, and verification are feasible on today's computers while (2) all known attacks are infeasible, even assuming highly scalable quantum computers. As part of the performance analysis, this paper introduces a new algorithm to generate a batch of primes. As part of the attack analysis, this paper introduces a new quantum factorization algorithm that is often much faster than Shor's algorithm and much faster than pre-quantum factorization algorithms. Initial pqRSA implementation results are provided.

Posted on May 31, 2017 at 6:31 AM • 40 Comments

Comments

Who? • May 31, 2017 7:12 AM

I haven't read the paper yet, just took a quick look at it, so I may be wrong here. I will read the full document later.

It seems that using carefully chosen RSA parameters we can have quantum-resistant RSA cryptography on our current hardware. So it should be possible, let us say, to generate a pair of RSA certificates on a computer that implements these parameters and move them to a smart card (obviously smart cards themselves will not generate quantum-computing-resistant certificates yet).

Will a firmware upgrade on the readers add this feature to hardware?

Well, at least libraries like LibreSSL should implement the new prime generation algorithms.

Who? • May 31, 2017 8:33 AM

@ MrC

Indeed, I stopped reading the article when I found that "[their] main result is successful generation of a 1-terabyte exponent-3 RSA key consisting of 4096-bit primes," and "the final multiplication of two 512 GB integers took 176,223 seconds in wall-clock time, using 3.166TB of RAM and 2.5 TB of swap storage." Nonsense. This is not only impossible to store on smart cards, it is beyond the reach of most computing systems. I do not want to copy a one-terabyte key to my ~/.ssh/authorized_keys file, let alone multiple keys.

Sad; I was expecting something like a method to generate an attack-resistant Diffie-Hellman moduli file. This paper is something I would never have expected from these authors; it is certainly not their best research.

Another piece of research targeted at the "fake news media" (I am sorry for the pun), not a serious scientific publication.

MrC • May 31, 2017 8:43 AM

@Who?

Indeed, this seems to be pretty far into "if your key is that big, why not just use a one-time pad?" territory...

Andrew • May 31, 2017 8:57 AM

I am not a crypto expert, but until a new algorithm is found (or a new approach, like QKD), huge storage, huge memory, or longer processing time may be the only way to protect against quantum processing power.

Other • May 31, 2017 9:06 AM

"Our batch prime-generation algorithm suggests that, to help reduce energy consumption and protect the environment, all users of RSA (including users of traditional pre-quantum RSA) should delegate their key-generation computations to NIST or another trusted third party."
The paper does not contain any publication date. In the PDF file's properties, the creation date is 2017 April 19. I guess that the extra "9" is a typo.

Clive Robinson • May 31, 2017 9:17 AM

@ All,

I started reading the paper, but stopped when I got to page 4, section "Future Work", 2nd paragraph, and read,

    Our batch prime-generation algorithm suggests that ... users ... should delegate their key-generation computations to NIST...

I do not know about anybody else, but I don't trust anyone else to make my keys. I use a couple of sources of entropy: a TRNG I designed, built and tested, and a physical generator using a number of dice. These are used to generate large quantities of random bits that are both filtered and de-biased prior to a mixing process. The output is then tested by software I wrote and tested, then used to search for primes with an algorithm I designed, wrote and tested, and the results then go into further tests...

I'm sorry, but NIST is untrustworthy because their standards process is flawed security-wise, and this appears to have been abused on more than one occasion by NSA "finessing"...

Would other people take seriously the suggestion that NIST, or any other third party that a Government has influence over, could be trusted?

The simple answer is no, we fought the Key-Escrow battle some years ago, and the reasons then have not changed for the better. In fact they have changed for the worse, a lot worse.

Clive Robinson • May 31, 2017 9:21 AM

@ Other,

We appear to have hit the same stumbling point at about the same time...

Clive Robinson • May 31, 2017 9:46 AM

@ MrC,

Indeed, this seems to be pretty far into "if your key is that big, why not just use a one-time pad?" territory...

Because in some respects the OTP is very inferior to PubKey.

For instance the OTP, like all current symmetric cryptography, requires a second out-of-band channel to securely transfer Key Material (KeyMat), otherwise it offers no security. Worse, unlike other forms of symmetric crypto, the OTP KeyMat needs to be transferred in advance and in a quantity sufficient for all envisioned communications in any given KeyMat transfer window. Further, the OTP KeyMat is not reusable, unlike that of many symmetric crypto systems, which gives rise to real problems for Key Generation (KeyGen). All of which means that there are a lot of Key Management (KeyMan) issues, both physical and logistical, with OTP systems that rule it out for most (but not all) crypto work.

The advantage of PubKey is that it allows secure in-band KeyMat transfer, thus it does not require the highly problematic secure second channel for secure communications to be established. Further, KeyGen for the symmetric crypto usually used behind it has considerably fewer problems and can be done as and when required, on demand.

The major issue with Internet security is that it really will not work with a "secure second channel", because the only way you can securely transfer KeyMat for symmetric encryption boils down to a face-to-face transfer in a secure location. You would have to do this for every secure communications channel you wished to establish, and that is in no way practical.

So, like it or not, we are stuck with PubKey, because the maths behind it is far more trustworthy than any third party or out-of-band communications channel.

Dirk Praet • May 31, 2017 9:52 AM

@MrC, @ Who

"a 1-terabyte public key"?!

It will probably fit on a thumb drive or in an expansion slot in your head in a couple of years from now.

@ Clive, @ Other

Our batch prime-generation algorithm suggests that ... all users of RSA (including users of traditional pre-quantum RSA) should delegate their key-generation computations to NIST or another trusted third party.

Err, no. NIST can go covfefe itself.

MrC • May 31, 2017 10:27 AM

@ Clive

How exactly do you plan to do an in-band transfer of a 1TB file? That would take my ISP days to deliver, at which point they'd probably terminate my service. Maybe someone somewhere owns a wire that can handle that in a reasonable timeframe, but for mere mortals like me, transmitting a 1TB file is so massively inconvenient that it's just faster and easier to hand deliver a hard drive instead. At which point, since I'm already forced to do a face-to-face meeting, why not just exchange a symmetric key? The one-time pad bit was slight hyperbole -- my point being that exhausting a TB of pad would take months if not years for most purposes. I'm not seriously advocating that anyone start using 1TB OTPs; rather I'm pointing out that the practical problems with a 1TB public key probably exceed the practical problems with OTP.
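
For concreteness, here is the back-of-the-envelope arithmetic behind the "days to deliver" complaint, as a small Python sketch; the link speeds are assumed examples of my own, not figures from the comment:

    # Rough transfer-time arithmetic for a 1 TB key at a few assumed link speeds.
    def transfer_days(size_bytes: float, link_bits_per_s: float) -> float:
        """Days needed to move size_bytes over a link of the given speed."""
        return size_bytes * 8 / link_bits_per_s / 86400

    one_tb = 1e12  # a 1 TB public key, in bytes
    for mbps in (10, 100, 1000):
        print(f"{mbps:>5} Mbit/s: {transfer_days(one_tb, mbps * 1e6):.2f} days")
    # Roughly 9.3 days at 10 Mbit/s, 0.9 days at 100 Mbit/s, ~2.2 hours at 1 Gbit/s.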

Anon • May 31, 2017 11:01 AM

I sort of fail to see the point of this.
The problem with RSA is that factoring is polynomial time with quantum computing, and this does nothing to change that. Sure, it ups the exponent on the polynomial a little, from my cursory understanding, but it's still polynomial.

As for the suggestion that we leave key generation to NIST, the same jokers responsible for the backdoor in Dual_EC_DRBG: it is completely laughable.

And of course, this says nothing of the fact that if their key sizes are such that a single multiply takes more than 2 days on a cluster of computers, even the "simple" operations like modular exponentiation used for encrypting/decrypting will take months.

The only safety this system provides is against anyone ever using it.

Sandro • May 31, 2017 11:02 AM

Nice shot Bruce, now you have to confess:
did you want to check how much attention we pay to your posts?

Perhaps the paper could be interesting as speculation, but if we have understood correctly, a 1-terabyte exponent-3 RSA key consisting of 4096-bit primes, multiplication of two 512 GB integers, and so on don't fit well with our (secure signature creation) devices ;-)

Milo M. • May 31, 2017 12:26 PM

"Post-quantum RSA is not what one would call lightweight cryptography: the cost of each new encryption or decryption is on the scale of $1 of computer time, many orders of magnitude more expensive than pre-quantum RSA."

ATM fees are gonna wipe out our bank accounts.

Anura • May 31, 2017 12:36 PM

A 1-terabyte key is capable of fitting on a hard drive or being downloaded over the internet. This has practical applications; it's just not a general-purpose replacement for RSA. For highly secure systems that need to communicate over the internet, where you need to be certain the key isn't compromised in transit, this is potentially very useful. Systems that might currently rely on a very large one-time pad, for example, might benefit from switching to this key exchange.

Anon • May 31, 2017 12:43 PM

Just one more note about threat model.

When quantum computers that can factor RSA are eventually created, the only people who will have one in the near term immediately afterwards will probably be nation states (due to the cost and difficulty of building one). Thus, the threat any post-quantum algorithm needs to secure against is nation-state-level attacks; anything else can be adequately secured by existing pre-quantum algorithms.

Thus a post-quantum algorithm that outsources its key generation to the government is like outsourcing the design and security of the hen house to the foxes.

Me • May 31, 2017 1:53 PM

@Anon

I agree, but consider that those able to afford this will be those needing to protect themselves from nation states. I think the implication is that those people will be in our government.

Jesse Thompson • May 31, 2017 3:04 PM

Just to be sure, don't we already have post-quantum algorithms more easily handled by ordinary computers today? SIDH, for example.

In light of this I see projects like "What would it take to make RSA post-quantum?" more as academic explorations than as any kind of serious search for a usable cryptosystem (such as one that might recommend involving NIST in *anything*?!)

@Me • May 31, 2017 1:53 PM

I'm really not even kidding: having NIST make your keys for you is not even appropriate *for government agencies*. Just imagine trying to get your CIA fieldwork done without getting killed, while NIST is sharing all of your key material with your foreign target just because cheeto thought that would be funny.

Yeah, no.

Nick P • May 31, 2017 3:14 PM

@ All

"until a new algorithm is found (or a new approach, like QKD) a huge storage, huge memory or longer processing time may be the only way to protect against quantum processing power." (Andrew)

"It will probably fit on a thumb drive or in an expansion slot in your head in a couple of years from now." (Dirk)

This is why I figure he posted it and why I sent it. Everyone should remember that there are only a few post-quantum methods available. They haven't had the peer review RSA has had. One already has ridiculous key sizes. Another is patented, with a version that fell to classical attacks. Any of them might fail in the future.

So, we have work like this going on. The situation with asymmetric crypto is Poe's Law: you get solutions that might be a necessary evil or just a joke. Hard to tell, as bad as things are. Worst case, we go OTP-like, as another commenter suggested, where we transfer the key material on encrypted HDs periodically. Since this is cumbersome, the public will opt for trusted third parties like they do now for free services, but maybe with HSMs. NSA does that in EKMS, btw.

ab praeceptis • May 31, 2017 3:58 PM

Clive Robinson

I started reading the paper, but stopped when I got to page 4, section "Future Work", 2nd paragraph, and read,

Our batch prime-generation algorithm suggests that ... users ... should delegate their key-generation computations to NIST...

Exactly. Same with me, plus: nist? Seriously? After nist's collusion with nsa in tainting crypto, nist not only isn't trustworthy but is to be considered an enemy.

Moreover, TB keys (and the time needed) are evidently not practically usable.

On the other hand I understand the authors in that they look at it from an academic perspective, and as such their work *does* make sense. Moreover the part about *considerably* faster (than Shor) algorithms is interesting and of importance.

jer • May 31, 2017 4:12 PM

Given the stakes, current computing capabilities, no feasible attack angles, and adding a projection of five to ten or even twenty years, I would say this is how you go about hedging your bets. Most entities don't need that, but this isn't aimed at most entities.

Thoth • May 31, 2017 5:15 PM

@Clive Robinson, ab praeceptis

Can we still trust these authors, especially DJB, after such a weird publication and the bad suggestion of "trusting NIST"?

Or maybe DJB is just trolling us all?

Do we need a DJB golden sticker?

Anura • May 31, 2017 5:33 PM

Well, it does say "Make RSA Great Again", and that means making sure your keys are used by Real Americans™ and not terrorists.

call girl • May 31, 2017 5:52 PM

The post-quantum speculation is highly premature. We don't even know what "quantum computation" is supposed to do or if it's any more achievable than the "artificial intelligence" that is always 50 years away.

No one seems to take the trouble to clarify whether we are talking about Geordie Rose's "quantum annealing" or one of the various proposed quantum logic gates, or whether any of these gates are achievable in practice.

We do not have clear details on "quantum error correction" other than that it is absolutely essential to any possible realizable quantum computation of non-trivial size.

There is a more subtle philosophical issue underlying this, which concerns the so-called "collapse of the wave function", which in reality is not a collapse but an irreversible dissipation of computational information into waste heat, as described by Landauer's bound,

k_B T ln 2

where k_B is Boltzmann's constant, which might be more familiar to chemists as k_B = R/N_A, the ratio of the gram-molar ideal gas constant R to the gram-molar Avogadro's number N_A, and T is the absolute temperature of the waste heat reservoir.

Thus to convert entropy from information-theoretic units such as bits to a more familiar engineering unit such as Joules/Kelvin, we may multiply by

k_B ln 2

as long as we have Boltzmann's constant k_B in the desired units, and take the natural logarithm of the base of the information-theoretic unit.
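
As a quick numeric check of the Landauer figure above (my own arithmetic, using the standard SI value of Boltzmann's constant and an assumed room-temperature reservoir):

    import math

    k_B = 1.380649e-23      # Boltzmann's constant, J/K
    T = 300.0               # assumed room-temperature waste-heat reservoir, K

    energy_per_bit = k_B * T * math.log(2)   # minimum heat dissipated per erased bit, J
    entropy_per_bit = k_B * math.log(2)      # entropy of one bit expressed in J/K

    print(f"Landauer limit at {T} K: {energy_per_bit:.3e} J per bit")   # ~2.87e-21 J
    print(f"One bit of entropy: {entropy_per_bit:.3e} J/K")             # ~9.57e-24 J/K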

Thoth • May 31, 2017 6:02 PM

@all

The published pqRSA paper seems to be a troll paper. Notice the keyword tag "Make RSA Great Again" under the abstract. I guess they have nothing better to do than troll around with some weird pqRSA. I guess we can take that paper just for the laughs and nothing too serious.

A dedicated pqRSA golden sticker might look pretty.

Clive Robinson • May 31, 2017 6:18 PM

@ MrC,

How exactly do you plan to do an in-band transfer of a 1TB file?

I don't, but that is not the point.

The size of the file is time relative; currently a 4Kbit key size is considered adequate and will remain so for some time to come.

The problem, security-wise, is not the PubKey maths, but the way we currently use PubKey.

If we accept that at some point 4Kbit Pk will be broken, then we have to accept that, due to "collect it all", any plaintext, including symmetric keys, sent under it will become available to the NSA et al.

Even increasing the Pk size will not feasibly resolve this problem.

There are two known solutions to this problem.

The first is to only send information that has a short temporal period. Thus most financial transactions on the Internet use a Credit Card. However, within six months the records of the sale are a "matter of record" and available to the NSA et al anyway as "business records". The solution in this case is to also make Internet Credit Cards have a very short temporal life, preferably only a single transaction; we already know ways to pursue this in a way that Quantum Computing is not likely to affect.

The second known way is to use a shared secret and amplify it in such a way that each party can derive a key for a symmetric algorithm, with neither being significantly affected by Quantum Computing. As an overly simple idea, for explanation only, each side could send a random number that they then transform via an asymmetric algorithm keyed by the shared secret. The problem is obviously establishing the shared secret.
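
To make that overly simple idea a little more concrete, here is a loose sketch of one way the amplification step could look. The choice of HMAC-SHA-256 as the keyed transform and the nonce handling are my own assumptions, not anything specified in the comment above:

    import hashlib
    import hmac
    import os

    def derive_session_key(pre_shared_secret: bytes,
                           nonce_a: bytes,
                           nonce_b: bytes) -> bytes:
        """Derive a fresh symmetric key from a long-term shared secret plus two
        exchanged random numbers, HKDF-style (extract then expand)."""
        # Sort the nonces so both parties compute the same key regardless of role.
        salt = b"".join(sorted([nonce_a, nonce_b]))
        prk = hmac.new(salt, pre_shared_secret, hashlib.sha256).digest()
        return hmac.new(prk, b"session-key\x01", hashlib.sha256).digest()

    if __name__ == "__main__":
        psk = os.urandom(32)                       # the pre-established shared secret
        n_a, n_b = os.urandom(16), os.urandom(16)  # random numbers exchanged in the clear
        assert derive_session_key(psk, n_a, n_b) == derive_session_key(psk, n_b, n_a)
        print(derive_session_key(psk, n_a, n_b).hex())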

There may, however, be other ways, currently unknown, to perform the same functions as current Pk that are Quantum Computer proof for the next century to millennium, but without the excessive key size.

But there may be another way: Quantum Key Distribution, if the distance, switching, and one or two other problems can be overcome, may render the current need for Pk or other Quantum-Computing-sensitive methods irrelevant, by, for instance, enabling a shared secret to be securely and easily transferred without the two parties having to meet geographically.

Making both a possibility is, after all, what research is about.

But I suspect making those two possible may actually be easier, or achieved before, Quantum Computing can be scaled up sufficiently to be of use against Pk of a more moderate key size than 1TB.

call girl • May 31, 2017 6:25 PM

@Thoth

The published pqRSA paper seems to be a troll paper.

I have that concern as well about the work of people so apparently full of themselves. These papers are coming out of a back office somewhere with the force of gunfire.

There's a frat house. Drugs, sex, and alcohol all lie. Guns generally do not.

However, I must add that mathematical truth does not proceed out of the barrel of a gun.

The math needs to be proven so that ordinary folks can be convinced of its truth.

Something is not right about this, and I am concerned about the freedom of others to criticize this work — if it is anything but a troll paper. We seem to have lost a critical mass of bona fide academic research somewhere about the time college tuition began to exceed prudent levels of debt.

We need to get real about this. If you can't work your way through college debt-free with a summer job like your parents or grand-parents could, your education is way too expensive and you aren't getting what you paid for. Profs should definitely not be funding their own "research" with students' tuition monies, especially when the students are not participating in such jealously guarded research.

Anura • May 31, 2017 6:35 PM

Trolling aside, a centralized Diffie-Hellman parameter generator might not be a bad idea. Use something like SHAKE256 as a DRBG to generate Diffie-Hellman parameters from a counter and test them using a verifiable process. Then have the servers post the counter value as well as the parameters online, signing each set as they find valid ones.

You can have dozens or more servers cranking out parameters, and then other systems can regularly get new ones for use in an ephemeral Diffie-Hellman key exchange. This allows you to mitigate the issue of reusing Diffie-Hellman parameters, while also ensuring the safety of the parameters and minimizing overhead. Then we can have a secure ephemeral key exchange without the concerns of using ECC.
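
A rough sketch of how such a verifiable, counter-driven generator could work, with my own choices filled in (SHAKE256 as the DRBG, a safe-prime construction, toy bit sizes); none of this is a specification from the comment above:

    import hashlib

    def is_probable_prime(n: int, rounds: int = 20) -> bool:
        """Miller-Rabin with bases derived from n itself, so any verifier
        re-running the check gets the same verdict."""
        small_primes = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
        if n < 2:
            return False
        for p in small_primes:
            if n % p == 0:
                return n == p
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        n_bytes = n.to_bytes((n.bit_length() + 7) // 8, "big")
        for i in range(rounds):
            digest = hashlib.shake_256(n_bytes + bytes([i])).digest(8)
            a = 2 + int.from_bytes(digest, "big") % (n - 3)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def dh_params_from_counter(counter: int, bits: int = 256):
        """Derive a candidate safe prime p = 2q + 1 (and generator g = 2) from a
        public counter via SHAKE256, so anyone can reproduce and verify it."""
        seed = counter.to_bytes(8, "big")
        q = int.from_bytes(hashlib.shake_256(seed).digest(bits // 8), "big")
        q |= (1 << (bits - 1)) | 1          # force full bit length and oddness
        p = 2 * q + 1
        if is_probable_prime(q) and is_probable_prime(p):
            return p, 2                     # g = 2 is a common generator choice
        return None                         # caller moves on to the next counter

    if __name__ == "__main__":
        counter = 0
        while True:                         # 256-bit toy search; real parameters are 2048+ bits
            params = dh_params_from_counter(counter)
            if params:
                print("counter", counter, "-> p =", hex(params[0]))
                break
            counter += 1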

Werner Almesberger • May 31, 2017 6:52 PM

The NIST reference alone makes it pretty clear that this is not serious.

As far as I understand what the state of the art is, quantum computers of suitable size should be able to defeat any RSA. Thus anything that resists would be a breakthrough, even if it required a ridiculously large key. So that's the one thing I actually wouldn't consider a red flag :-)

More: Make RSA Great Again. Reference [1].

Why? No idea. "Because we can" is sometimes plenty.

- Werner

Thoth • May 31, 2017 7:06 PM

@all

Despite the paper being almost obviously a troll paper, they are not proposing anything too out of the way, except for the requirement of very large computing capabilities due to the 1 TB of exponent required.

As noted in the paper, a 4096-bit modulus can still be used, whereas what they are asking for is 1 TB of exponent.

It is known that currently most RSA operations use a fixed 3-byte exponent of 0x010001. If a fixed 1 TB modulus is used and the private modulus is kept secret, it is still not as bad.

We can simply assume a fixed exponent and then only need to generate the 4096-bit private modulus and public modulus and send the public modulus, which might turn out to be rather efficient as we don't have to consider the exponent if using a fixed exponent.

To avoid storing 1 TB worth of exponent, we can simply use 0x0100......01 (we have been using 0x010001 as the exponent for ages): fill a large enough block of RAM with 0x010000..01, do the BigInteger math, and get the results, hopefully without needing to spend too much time and money on RAM.
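
As a small-scale illustration of that memory-saving idea (my own sketch, not something from the paper or the comment): an exponent of the 0x0100...01 form equals 2^m + 1, so the modular exponentiation can be done with m squarings plus one multiply, and the exponent never needs to be materialized at all:

    def pow_sparse(base: int, m: int, modulus: int) -> int:
        """Compute base**(2**m + 1) mod modulus without ever building 2**m + 1."""
        acc = base % modulus
        for _ in range(m):                 # m repeated squarings give base**(2**m)
            acc = (acc * acc) % modulus
        return (acc * base) % modulus      # one extra multiply for the trailing 1 bit

    if __name__ == "__main__":
        # Toy check with the 3-byte exponent 0x010001 = 2**16 + 1 standing in
        # for a hypothetical much longer 0x0100...01 pattern.
        n = 3233                           # 61 * 53, the textbook RSA modulus
        assert pow_sparse(42, 16, n) == pow(42, (1 << 16) + 1, n)
        print(pow_sparse(42, 16, n))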

All in all, the troll algorithm isn't too impossible to implement :).

Clive Robinson • May 31, 2017 7:31 PM

@ ab praeceptis, Thoth,

nist? Seriously? After nist's collusion with nsa in tainting crypto, nist not only isn't trustworthy but is to be considered an enemy.

Which naturaly gives rise to,

Can we still trust these authors, especially DJB, after such a weird publication and the bad suggestion of "trusting NIST"?

I think we can all agree various NIST standards committees got owned by the NSA, and we can also probably agree that it was the way the standards committees were formed that allowed the capture to happen.

To be honest, all the standards organisations I've had any dealings with get owned/captured by non-benign interests, be they commercial or governmental agencies.

As I've explained before it's the "Health and safety" version of the political "Think of the children". If you are sociopathic enough to frame your hidden interests behind "Health and Safety" you will outflank ordinary people who have either consciences or a fear of future consequences. It is the reason why we are still wasting trillions on the various "Wars on xxx".

So whilst I can understand intellectually why NIST or any other standards body could get "owned", especially as I warned about the NSA et al "owning standards" long before that, I still feel a sense of betrayal, even though it was altogether inevitable.

The thing about trust is there are two ways of looking at it, the social way and the technical way. Humans as social animals tend to trust implicitly, then get betrayed. On the technical side you assume that nothing is trustworthy until you have either verified it as such or have mitigated it if it is not.

The reality of modern systems is that there is no practical way to verify they are trustworthy. It was coming to this realisation last century that made me start down the road to the "mitigation approach" that gave rise to Castles-v-Prisons.

Thus my viewpoint on NIST is that the safe way of behaving is to take the technical approach. That is, to assume by default they are not trustworthy, therefore verify, or better still, mitigate.

Which brings us back to the second question about the authors of the paper. I will have to read through the paper a couple of times before I make my mind up but it is odd behaviour.

Why say NIST when saying "a trusted third party" would have been more than sufficient?

Maybe one or more of the authors are trying to curry favour with NIST or some other US Gov agency.

Whatever the reason, I would find it hard to believe that any of the authors are completely unaware of community feelings about NIST.

Clive Robinson • May 31, 2017 7:58 PM

@ Dirk Praet,

Err, no. NIST can go covfefe itself.

Is it an anagram, is it an acronym, no it's a Trumpism...

On the anagram front could it be V-coffee, or is the Doh-gnarled trying to rework the "We for coffee" joke?

Clive Robinson • May 31, 2017 8:21 PM

@ Call Girl,

Profs should definitely not be funding their own "research" with students' tuition monies, especially when the students are not participating in such jealously guarded research.

That's not what is happening to students' tuition fees. In fact, the spending on Uni staff is such that some tutors qualify for food stamps and other assistance.

Apparently a number of US Unis have in effect become investment houses or hedge funds... Hence certain people at the head of the Uni administration are actually taking the tuition fees and gambling with them, to pay themselves very, very excessive incomes...

ab praeceptis • May 31, 2017 8:48 PM

Thoth, Clive Robinson, call girl (et al)

I see two main aspects re. nist and djb.

- there are different groups such as academia, nist/nsa/..., large corps and gov'ments, and us, the security people (keep in mind that our perspective can be quite different from that of cryptologists).

djb and his colleagues, being in academia, are in one way or another (as professionals) living and working with a construct called academia which is certainly not "free" and which is in major part financed by state and large-corp money.
Moreover *the* major driving - and paying - forces in the field are again states/agencies and large corps.

Which leads to a simple and tough situation: at the least you don't attack them (in a professional capacity), and if you want your work to continue (being financed) and be accepted you will be pretty much bound to look halfway neutral.

Furthermore we all know that something without a nist/fips/eal/etc stamp won't be accepted by state agencies and large corps.

*We* security people are mainly interested in the net result ("security"), plus being mistrusting is part of our healthy professional attitude. A cryptologist, however, usually is mainly a mathematician and academician. Both his definition of and measure for security are often quite different from ours (e.g. attacks being of no less complexity than brute force).

It is hence much less strange for a cryptologist to use nist as an example of a "neutral" and widely accepted third party (also with access to considerable computing resources) than it seems to us.

In other words: When *we* judge (as I among others did) we naturally do that from our perspective and the result is pretty much bound to be a clear "nist? No!".

If we, however, want to judge someone like djb then we must apply the norms and usances of *his* field!

Another aspect - one that is more important for us anyway - when asking "can we still trust djb (or Bruce Schneier, or ...)?" is that we do not even need to trust them a lot: a) we ourselves can examine the result of their work, the algorithms, and b) quite commonly their own colleagues rigorously examine each other's work.

So, as far as I'm concerned: no, no golden sticker for Chacha, 25519, etc. I *do* trust those algorithms, with good reason (and btw I also trust djb, on solid grounds).

I suggest not boiling that paper down to "nist" but studying it. Many years of experience tell me that a djb/Lange paper shouldn't be ignored.

--------

I found call girl's thoughts and remarks interesting. I myself also don't expect quantum computers (not 4-qubit experiments but actually usable systems) anytime soon. In fact, I would not even bet more than a pizza on quantum computing, for diverse reasons, some of which call girl also mentioned.

However: as security people we must assume the worst. About half a century of "don't worry, that won't be a problem" premises has brought us into the ugly situation we're in now. So, again: let us assume the worst-case scenario, maybe quantum computers, maybe aliens landing, maybe who knows what.

Which leads me back to djb's paper. Forget that (from our perspective) unhealthy "nist" remark and let's look at the beef.

PK still pretty much means algorithms based either on factorization or on its cousin, the discrete log (for ECC). Now, make a little - and frightening - jump in your minds: if djb and colleagues have found some non-pq Shor competitor, then we must assume that nsa/gchq and accomplices have that, too.

djb just took the big step to the conclusion, namely: so, we have to assume that rsa 4K isn't good enough anymore. But how far do we want to go? Is 8k good enough? 128k?
That pretty much comes down to: how much time will ever-bigger rsa buy us?

From what I see djb asks the right question in driving it to the point of assuming that sooner or later we will be at the point that we currently label as "pq" (post-quantum). We must understand that quantum computing is, in a way, just the only known realistic-looking incarnation of a problem that, as djb demonstrates, may well find other incarnations, some of which, btw, will evolve out of nsa's ever-rising computing-power obsession.

I also found call girl's thoughts interesting as they touch a major *real* boundary for nsa: energy and the related factors (e.g. cooling). We should understand that that means no less than that some major leap in quite different fields (such as cooling live silicon, for instance) might create the situation, the threat, that we currently paint and imagine as "quantum computing".

In other words: it is not quantum computing that is threatening us on the horizon. QC, again, is but one incarnation of the real problem, the monster behind it: massively increased computing power, no matter how (and in the wrong hands -> nsa and accomplices).

pq-secure crypto must be an urgent target, and I agree with djb that it's no longer good enough to step from 512 to 1k to 2k to 4k rsa. We must prepare for the monster because it will come. Whether that happens in the QC incarnation or in another one is not the relevant factor. But QC is what we always had in mind as the image of the monster, and it serves the purpose well.

And please, let us also learn to be humble as scientists and engineers! Not only because djb's paper is humbling but also because arrogance and ignorance have led us to where we are. Let us be humble and let us be mistrusting. Both, in my mind's eye, are sine qua nons for good security people.

Andrew • May 31, 2017 10:50 PM

To be honest, a 1 TB long key is not really a terrible idea if you have to set up secure communication, like in diplomacy, for example. It may take a few hours, maybe days, to transfer it, but then you have a secure quantum-resistant channel.
On the other hand, there are quantum processors out there and the progress is really fast. I think today's supercomputers can do the job too; RSA is far from being safe.

Some reading:
http://www.nature.com/news/ibm-s-quantum-cloud-computer-goes-commercial-1.21585
