The number that changed everything
For years, the working assumption was that breaking modern encryption would require somewhere between 9 and 10 million qubits. That estimate made quantum threats feel safely distant. Two papers published on consecutive days last week retired that number entirely.
On March 30, Google Quantum AI published a paper showing that elliptic curve cryptography, the math securing Bitcoin and most financial infrastructure, could be broken with fewer than 500,000 physical qubits, a roughly 20-fold reduction from prior projections. Then, on March 31, Caltech researchers and a new startup called Oratomic published a separate breakthrough that cut the requirement further still: 10,000 to 20,000 qubits. Not millions. Thousands.
Two papers. Two days. The threat window shifted from the mid-2030s to the late 2020s.
How the math collapsed
Quantum computers are inherently error-prone. Every qubit makes mistakes, so engineers must bundle many unstable physical qubits together to produce a single reliable logical one. Until now, that ratio sat at roughly 1,000 physical qubits per logical qubit. The Caltech and Oratomic team, using neutral atoms arranged by laser-guided optical tweezers, brought that ratio down to about 5. A 200-fold reduction in overhead is what collapses the total hardware requirement from millions of qubits to something closer to what a well-funded startup could plausibly build by 2029.
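The overhead arithmetic above can be sketched in a few lines. The logical-qubit count used here is an illustrative assumption, not a figure from either paper; only the two overhead ratios (roughly 1,000:1 before, roughly 5:1 after) come from the text.

```python
# Back-of-the-envelope sketch of the error-correction overhead math.
# LOGICAL_QUBITS_NEEDED is a hypothetical requirement chosen for
# illustration; the 1,000:1 and 5:1 ratios are the ones discussed above.

LOGICAL_QUBITS_NEEDED = 2_500  # illustrative assumption, not from the papers

def physical_qubits(logical: int, overhead: int) -> int:
    """Total physical qubits = logical qubits x physical-per-logical overhead."""
    return logical * overhead

old = physical_qubits(LOGICAL_QUBITS_NEEDED, 1_000)  # pre-breakthrough ratio
new = physical_qubits(LOGICAL_QUBITS_NEEDED, 5)      # neutral-atom ratio

print(f"old overhead: {old:,} physical qubits")  # 2,500,000
print(f"new overhead: {new:,} physical qubits")  # 12,500
print(f"reduction:    {old // new}x")            # 200x
```

Note how the same logical requirement lands in the millions under the old ratio and in the low tens of thousands under the new one, which is exactly the collapse the two papers describe.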
Oratomic co-founder Manuel Endres has already trapped arrays of 6,100 neutral atoms in his lab. The company's CEO, Dolev Bluvstein, was direct about what that implies:
"It is quite plausible, although not guaranteed, that we will have a fault-tolerant quantum computer by the end of the decade. Although exciting and opening the door to a broad range of applications, such advances would also put modern cryptography at-risk."
This is not a researcher hedging for grant money. Oratomic's founding team, drawn from Caltech, Berkeley, Harvard, Amazon, and Google, openly states that their own research changed their minds about how close this is.
What breaks first
RSA encryption and elliptic curve cryptography underpin nearly every sensitive system on the internet: banking, military communications, and the private keys that control every Bitcoin wallet. Quantum computers running Shor's algorithm are specifically designed to dismantle these systems.
The Bitcoin exposure is unusually concrete. According to Google's Quantum AI whitepaper, roughly 6.9 million Bitcoin already have exposed public keys on-chain, meaning a sufficiently capable quantum machine could derive the corresponding private keys and drain those wallets. At current prices, that is not a rounding error. Satoshi Nakamoto's estimated 1.1 million BTC, untouched since 2010, sit among them.
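Why only some coins are exposed comes down to what the blockchain actually stores. Modern Bitcoin addresses hold a hash of the public key, and Shor's algorithm needs the public key itself; the key is only revealed when the coins are first spent. The sketch below illustrates that distinction. It uses SHA-256 alone as a stand-in for Bitcoin's real address hash, RIPEMD160(SHA256(pubkey)), so it runs without the optional ripemd160 backend, and the public key bytes are a placeholder, not a real key.

```python
import hashlib

# Placeholder 33-byte compressed public key (illustrative, not a real key).
pubkey = bytes.fromhex("02" + "11" * 32)

# What a hash-based address puts on-chain: a digest of the key, not the key.
# (Stand-in for Bitcoin's HASH160; real addresses use RIPEMD160(SHA256(key)).)
address_hash = hashlib.sha256(pubkey).digest()

# Before the first spend: only address_hash is public, so Shor's algorithm
# has no public key to attack.
# After a spend (or for old pay-to-pubkey outputs): pubkey itself appears
# on-chain, and a sufficiently capable quantum machine could derive the
# matching private key from it.
print(address_hash.hex())
```

The 6.9 million exposed coins cited above are the ones in the second category: outputs where the public key is already sitting on-chain in the clear.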
There is also a narrower attack vector that gets less attention. As Bitcoin Magazine reported, advanced quantum systems may be capable of executing attacks within Bitcoin's 10-minute transaction confirmation window, targeting unconfirmed transactions before the network can settle them. Google researcher Craig Gidney puts the odds at 10% that a cryptographically capable quantum machine exists by 2030. That is not a comfort. Ten percent on an outcome this consequential is not a small number.
The coordination problem nobody wants to solve
Post-quantum cryptography exists. NIST has been standardizing it. Global guidelines recommend migrating by 2035. Google has set an internal deadline of 2029. The gap between those two dates is the real problem.
For centralized systems, migration is painful but manageable. For Bitcoin, it is a different category of challenge. Any migration to quantum-resistant cryptography requires voluntary coordination across thousands of independent stakeholders: miners, exchanges, wallet providers, node operators, and individual holders. There is no central authority to issue a mandatory upgrade. Satoshi sketched out a response to this scenario back in 2010, suggesting users could re-sign coins into new address formats if cryptography weakened. Voluntary re-signing across millions of wallets, against a threat that has not yet materialized, is exactly the kind of coordination that never happens fast enough.
Alex Thorn, head of research at Galaxy Digital, noted that Google's paper "describes much more efficient circuits that significantly reduce the requirements for a quantum computer to be capable of breaking classical cryptography, such as those that secure blockchains like Bitcoin." Bitfinex analysts have called quantum computing "a genuine engineering challenge for the cryptocurrency industry, but far from an existential threat in its current form." That qualifier, "current form," is doing a lot of work in that sentence.
Why it matters
The story everyone is telling about these papers is about qubits. Fewer qubits required, timeline moves up, encryption at risk. That is accurate but incomplete.
The deeper pattern is about the gap between physics and coordination. Physics moves in research cycles measured in months. Institutional coordination, especially across decentralized systems, moves in cycles measured in years. Two papers just compressed a decade of assumed runway into something closer to three or four years. The coordination machinery has not changed speed.
Post-quantum cryptography standards exist. Quantum-safe algorithms are available. The tools are not the bottleneck. The bottleneck is the assumption that there is still time to move gradually, and these papers are the first serious evidence that assumption was wrong.
When the lock breaks, it will not send a warning. The question is not whether the migration needs to happen. It is whether the systems that most need it are capable of moving fast enough to actually do it.
Originally published as an Instagram carousel on @recul.ai.