Accustomed to imagining worst-case scenarios, many cryptography experts are more concerned than usual these days: one of the most widely used schemes for safely transmitting data is poised to become obsolete once quantum computing reaches a sufficiently advanced state.
The cryptosystem known as RSA underpins a host of privacy and communication protocols, from email to internet retail transactions. Current standards rely on the fact that no classical computer has the power to factor the enormous numbers that protect your data once encrypted, but a mature quantum computer could work through that factoring problem in a matter of hours.
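To see why factoring is the whole ballgame, here is a toy sketch of RSA with deliberately tiny, insecure numbers (real moduli are 2,048 bits or more). The point it illustrates: anyone who can factor the public modulus — which is exactly what a large quantum computer running Shor's algorithm could do — can reconstruct the private key and decrypt at will.

```python
# Toy RSA (illustration only — tiny numbers, no padding, not secure).
p, q = 61, 53                 # secret primes; in real RSA these are huge
n = p * q                     # public modulus (3233)
e = 17                        # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+ modular inverse)

msg = 65
cipher = pow(msg, e, n)       # anyone can encrypt with the public key (e, n)

# An attacker who factors n back into 61 * 53 can rebuild d from scratch:
d_recovered = pow(e, -1, (61 - 1) * (53 - 1))
print(pow(cipher, d_recovered, n))  # prints 65 — the original message
```

Classically, factoring a 2,048-bit modulus is believed to take longer than the age of the universe; Shor's algorithm collapses that barrier, which is why the recovery step above is the crux of the quantum threat.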
It should be stressed that quantum computers haven’t yet hit that level of maturity — and won’t for some time — but when a large, stable device is built (or if it’s built, as a dwindling minority argue), its unprecedented ability to factor large numbers would essentially leave the RSA cryptosystem in tatters. Thankfully, the technology is still a ways away — and the experts are on it.
“Don’t panic.” That’s what Mike Brown, CTO and co-founder of quantum-focused cryptography company ISARA Corporation, advises anxious prospective clients. The threat is far from imminent. “What we hear from the academic community and from companies like IBM and Microsoft is that a 2026-to-2030 timeframe is what we typically use from a planning perspective in terms of getting systems ready,” he said.
Cryptographers from ISARA are among several contingents currently taking part in the Post-Quantum Cryptography Standardization project, a contest of quantum-resistant encryption schemes. The aim is to standardize algorithms that can resist attacks levied by large-scale quantum computers. The competition was launched in 2016 by the National Institute of Standards and Technology (NIST), a federal agency that helps establish tech and science guidelines, and is now gearing up for its third round.
Indeed, the level of complexity and stability required of a quantum computer to launch the much-discussed RSA attack is “very extreme,” according to John Donohue, scientific outreach manager at the University of Waterloo’s Institute for Quantum Computing. Even granting that timelines in quantum computing — particularly in terms of scalability — are points of contention, “the community is pretty comfortable saying that’s not something that’s going to happen in the next five to 10 years,” he said.
When Google announced that it had achieved “quantum supremacy” — that is, it used a quantum computer to run, in minutes, an operation that would take thousands of years to complete on a classical supercomputer — that machine operated on 54 qubits, the basic units of quantum computation. While IBM’s Q 53 system operates at a similar level, many current prototypes operate on as few as 20 — or even five — qubits.
But how many qubits would be needed to crack RSA? “Probably on the scale of millions of error-tolerant qubits,” Donohue told Built In.
Scott Aaronson, a computer scientist at the University of Texas at Austin, underscored the same point last year on his popular blog after presidential candidate Andrew Yang tweeted that “no code is uncrackable” in the wake of Google’s proof-of-concept milestone.
That’s the good news. The bad news is that, while cryptography experts gain more time to keep our data secure from quantum computers, the technology’s numerous potential upsides — ranging from drug discovery to materials science to financial modeling — are also largely forestalled. And that question of error tolerance continues to stand as quantum computing’s central, Herculean challenge. But before we wrestle with that, let’s get a better elemental sense of the technology.