Blockchain Quantum Security Research Report: A Comprehensive Analysis of Quantum Computing Threats, Current Quantum Security Status, Preparation Recommendations, and Timeline Projections

Original author: Bob, Web3Caff Research researcher

At the start of 2026, Coinbase, the largest publicly listed blockchain company in the United States, announced the formation of a Quantum Advisory Council. The Ethereum Foundation elevated quantum security to its highest strategic priority and set up a dedicated quantum security team. Meanwhile, the U.S. National Institute of Standards and Technology (NIST) has published a timeline for migrating to quantum-secure cryptography. Taken together, these signs indicate that the blockchain industry is about to face a major security challenge.

Moving further along the timeline, on March 30, 2026, a paper released by Ryan Babbush, head of Google’s Quantum AI division, together with researchers associated with the Ethereum Foundation and Stanford University, sounded the alarm for a quantum apocalypse. The paper, “Protecting Elliptic Curve Cryptocurrencies from Quantum Vulnerability Attacks: Resource Estimates and Mitigation Measures,” states that under the latest quantum resource estimates, a quantum attack could be completed within minutes using fewer than 500,000 qubits, shrinking the resource estimate 20x compared with prior industry figures. Google has also officially moved the post-quantum migration schedule up to 2029 and issued a “final” warning to the entire industry.

We know that the core foundation of blockchain is public-key cryptography. In recent years, the computing power of quantum computers has grown at an exponential pace, increasingly threatening traditional public-key cryptography. The media often attaches very urgent dates to the quantum threat, as if quantum computing could instantly destroy the old digital world, but that is not the case. In response to quantum computing’s potential challenges, the blockchain industry is actively developing quantum-secure solutions: the Bitcoin community has proposed BIP-360 (Pay to Merkle Root) as an anti-quantum proposal; Ethereum is preparing the anti-quantum upgrade EIP-8141; and Optimism, an Ethereum Layer 2 network, has published an anti-quantum roadmap for the next 10 years. Alongside these complex network upgrades, the developer community is also building simpler anti-quantum infrastructure, such as YellowPages, an anti-quantum tool that aims to ensure the quantum security of Bitcoin users’ private keys.

Of course, as quantum hardware scales up the number of stable qubits, the risk that quantum computers can break traditional blockchain cryptography is indeed increasing. Just how severe is this threat? How is the Web3 industry responding? How far away is the anti-quantum future? Avoiding esoteric physics, this report starts from the most basic ideas of the quantum world, analyzes the current state of quantum security for blockchain, and offers a time projection for this “quantum apocalypse clock,” in order to comprehensively dissect the systemic risks it poses to the blockchain industry and the mitigation measures being adopted today.

Table of contents

  • Theoretical introduction to quantum computing
  • Principles of quantum computing (superposition, entanglement, decoherence)
  • History of the development of quantum computers
  • Applications of quantum computing
  • Threats of quantum computing
  • Quantum algorithm: Shor
  • Quantum algorithm: Grover
  • Analysis of the impact of quantum computing on blockchain
  • The impact of quantum computing on digital finance
  • Current state of quantum security
  • Development of post-quantum cryptography
  • Anti-quantum progress in the blockchain industry
  • Preparation recommendations and timeline projections for anti-quantum in the blockchain industry
  • Migration plans at the national level
  • Substantive deployment at the enterprise level
  • Quantum security preparation timeline for the blockchain industry
  • Conclusion
  • Key points structure diagram
  • References

Theoretical introduction to quantum computing

Quantum mechanics is the theoretical foundation of quantum computing. As an academic theory, quantum mechanics began in the early 20th century and is a very important part of modern physics. The term originally comes from the German “Quantenmechanik,” coined by a group of physicists from Germany and Austria working at the University of Göttingen. Quantum mechanics emerged from the need to explain systems that “classical physics” could not. Classical physics, the early understanding of the fundamental laws of nature, covers mechanics, electromagnetism, and thermodynamics; in the microscopic world, however, its theories ran into hard limits, and modern theories such as quantum mechanics emerged. Unlike classical mechanics, quantum mechanics describes the behavior of matter in terms of probability, thereby providing a completely new theoretical framework for the microscopic world.

The dispute between classical and quantum physics is aptly summarized by the question “Does God play dice?” More than a hundred years ago, mainstream scientists of the era believed the universe was governed by “determinism.” The legendary physicist Albert Einstein (hereinafter Einstein) famously questioned the randomness of quantum mechanics with the line “God does not play dice,” while the quantum school retorted that God not only plays dice, but sometimes throws them where we cannot see. Einstein held that quantum mechanics was incomplete: the universe exists objectively, all phenomena are governed by necessity, and there is no “true randomness.” The Danish physicist Niels Henrik David Bohr (hereinafter Bohr), representing the new probabilistic quantum school, believed the essence of the world is probability, and proposed the “complementarity principle” (particle-like and wave-like behavior are complementary and cannot be precisely measured at the same time, a notion related to the uncertainty principle). This academic dispute began in 1925 and lasted for over a decade. In the following decades, experiment after experiment vindicated Bohr’s viewpoint. Although Einstein was a critic of the probabilistic interpretation, his challenges indirectly propelled the development of quantum theory. More than a hundred years later, quantum physics has penetrated every aspect of modern technology, from semiconductor electronics to medical imaging, and people have gradually, if belatedly, accepted that the underlying foundation of the world is quantum.

Bohr–Einstein controversy, Image source: wikipedia

Quantum computing uses the non-classical rules of quantum mechanics to perform computations. To put the difference in everyday terms: traditional computing solves a problem like a single detective following clues one by one, step by step, whereas quantum computing sends out many detectives at once, investigates clues across multiple dimensions simultaneously, and cross-links what the detectives find, allowing it to reach the answer faster.

We all know that traditional computers use binary 0 or 1, while qubits can exist in “superposition states” that are simultaneously 0 and 1 until a “measurement” makes them definite. In plain terms, in a traditional computer each bit of information can only be 0 or 1, like a light switch: off is 0, on is 1. You either see the light on or off; there is no third state. In quantum computing, this light can be on and off at the same time (superposition) until you look at it, at which point it “decides” whether it is on or off. Quantum superposition is not a mathematical trick but a feature of nature itself: electrons (Electron, one of the basic particles that make up matter) and photons (Photon, the basic unit of light and all electromagnetic radiation) really are in multiple possible states before they are measured.

Although the quantum world looks very different from the reality we experience in everyday life, classical experiments have verified its existence—this is the famous “double-slit experiment” (Double-slit Experiment). In the experiment, scientists let electrons or photons pass through a screen with two slits, and then record their positions on a detection screen behind it. The result shows that when electrons or photons pass through both slits at the same time, the screen displays interference fringes, as if the particles simultaneously take two paths and “interfere” with each other. Even more intriguingly, if you try to observe which slit they actually pass through, the interference fringes disappear, and only two separate peaks remain on the screen, as if the particles can only take one path. This experiment shows that quantum particles, when not observed, truly exist in superposition states—with multiple possible states coexisting.

For easier understanding, you can compare it to tossing a coin: in the quantum world, when a coin spins in the air, it is not simply in heads or tails; it is in a state where heads and tails exist simultaneously. Only when you catch it and look does it “decide” whether it is heads or tails. The principle of quantum superposition states is similar—before being observed, a particle can be in multiple possible states at the same time. This is a phenomenon that classical physics cannot explain, and precisely because of this, quantum mechanics is considered one of the most imaginative breakthroughs across disciplines and industries in the future.

Double Slit Experiment, Image source: Science Notes

In simple terms, a quantum computer is a new type of computer that performs computations based on quantum principles. Whereas traditional computers store and process bits (Bit: the smallest unit of information, representing only 0 or 1), a quantum computer uses “qubits” (Qubit) to store data. Since qubits can represent multiple states at the same time, namely the superposition states described above, multiple qubits combine into an exponentially growing number of possibilities: each additional qubit doubles the computation space. That is why, in certain specific areas such as breaking complex cryptographic systems, optimizing huge combinatorial problems, and simulating molecular structures, a quantum computer may hold enormous potential advantages over traditional computers.

Principles of quantum computing (superposition, entanglement, decoherence)

To understand how quantum computing works, you first need to understand a completely new set of terminology. The core principles include 3 important concepts: superposition (Superposition), entanglement (Entanglement), and decoherence (Decoherence).

In the previous section, we mentioned that quantum computers use qubits (Qubit) to store and process information. A qubit is a special kind of information unit: it is not limited to being 0 or 1, but can occupy a blend of both states at once. This property is called superposition (Superposition).

In quantum systems, you can add multiple quantum states together to form another valid quantum state; conversely, a single quantum state can be expressed as a combination of two or more other states. Superposition gives quantum computers their parallelism, letting them explore many computational paths at once. For example, 10 classical bits in an ordinary computer can represent only one state at a time (e.g., 0000011010), while 10 qubits in a quantum computer can simultaneously represent up to 1024 possible states (2 to the 10th power). Compared with an ordinary computer that holds a single state at a time, a quantum computer can effectively probe more than 1000 states at once. The superposition of qubits is the most central feature of quantum computing.
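The 10-qubit comparison above can be sketched in a few lines of Python (a toy illustration, not a real quantum program): a classical 10-bit register holds exactly one value, while merely describing a 10-qubit superposition already requires tracking 2^10 amplitudes.

```python
# Toy illustration: a classical 10-bit register holds exactly one value,
# while a 10-qubit register's state is a vector of 2**10 amplitudes.
n = 10

# Classical register: one definite state out of 1024.
classical_state = 0b0000011010

# Quantum register in uniform superposition: every one of the 1024
# basis states carries an equal amplitude of 1/sqrt(2**n).
amplitude = (1 / 2**n) ** 0.5
statevector = [amplitude] * (2**n)

print(len(statevector))                 # 1024 amplitudes tracked at once
print(sum(a**2 for a in statevector))   # probabilities sum to 1.0
```

Note how each extra qubit doubles the length of `statevector`, which is the doubling of the computation space described above.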

The second important concept is quantum entanglement (Entanglement). In simple terms, when two qubits are “entangled,” no matter how far apart they are, measuring one instantly fixes the correlated outcome of the other. This is the most astonishing part of quantum mechanics, as if an invisible connection existed between them. The phenomenon occurs in tiny particles such as photons (Photon) and electrons (Electron). When several particles interact, they form a single joint system, like several dance partners holding hands and spinning together: push one partner and the others move too.

Here is another intuitive everyday picture: imagine you and a friend in another city each hold one of a pair of “entangled” magic coins. When you flip yours and it shows heads, your friend’s coin instantly shows the correlated face, regardless of the distance between you. (Note that this correlation cannot by itself be used to send messages faster than light.) Quantum entanglement is one of the key traits behind quantum computers’ strong parallel-computation potential and behind quantum communication, and it is something traditional computers cannot replicate.

Quantum entanglement is extremely important in quantum computing and quantum communication. It allows quantum computers to solve complex problems faster. Without quantum entanglement, quantum computers cannot make qubits (Qubit) cooperate, thereby losing their quantum computing advantage. The “multi-particle state” property of quantum entanglement allows multiple qubits to work together; then, through algorithms, the computer can achieve “exponential” acceleration.

The third important concept is quantum decoherence (Decoherence). Quantum decoherence means that once a qubit is disturbed by the external environment, the original quantum properties such as superposition states and entanglement gradually disappear—like a coin spinning in midair that, after being gently tapped, immediately lands as heads or tails. Therefore, one of the core difficulties in quantum computers is to extend the stable time of this “spinning state” as much as possible, ensuring that computation can be completed smoothly. For example, when creating qubit superposition states on a device platform, environmental noise causes qubits (Qubit) to decohere, and it usually requires constructing extreme physical environments such as extremely low temperatures and vacuum.

The first step in a quantum computation is “initialization.” Its purpose is to bring the qubits (Qubit) from an arbitrary state to a known ground state (the state with the lowest energy), ensuring that the quantum algorithm starts from well-defined inputs. The qubits are then evolved through a series of “quantum gate” operations (analogous to logic gates in classical computers), and the final result is obtained by measurement. However, quantum states are extremely fragile; even tiny disturbances from the environment can destroy superposition and entanglement, so quantum computers require extremely stringent environmental support.
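The initialize, gate, measure pipeline can be sketched with a minimal pure-Python statevector simulation (an illustrative toy: the helper names are our own, and real hardware does not work by multiplying small lists). It initializes two qubits to |00>, applies a Hadamard and then a CNOT to entangle them, and measures.

```python
import random

# Minimal two-qubit statevector sketch (illustrative only): initialize to the
# ground state |00>, apply a Hadamard then a CNOT to entangle, then measure.
# The state is a list of 4 amplitudes for |00>, |01>, |10>, |11>.
s = 2 ** -0.5  # 1/sqrt(2)

def hadamard_q0(state):
    # Hadamard on the first qubit: mixes the |0x> and |1x> amplitudes.
    a00, a01, a10, a11 = state
    return [s*(a00 + a10), s*(a01 + a11), s*(a00 - a10), s*(a01 - a11)]

def cnot(state):
    # CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes.
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

def measure(state):
    # Measurement collapses the superposition to a single basis state.
    probs = [abs(a)**2 for a in state]
    return random.choices(["00", "01", "10", "11"], weights=probs)[0]

state = [1, 0, 0, 0]              # initialization: lowest-energy state |00>
state = cnot(hadamard_q0(state))  # gate sequence produces a Bell state
print(state)                      # ~[0.707, 0, 0, 0.707]
print(measure(state))             # always "00" or "11", never "01" or "10"
```

The measurement step shows the fragility discussed above from the other side: once you look, the superposition is gone and only one outcome remains.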

Because of all this, quantum computing has huge potential in many fields. For example, in cryptography (breaking encryption systems), materials science (simulating and analyzing material behavior), artificial intelligence, and weather forecasting. As quantum computing continues to develop, the future world may undergo tremendous changes due to quantum computing.

History of the development of quantum computers

After understanding the basic concepts of quantum computing, let’s learn about quantum computers.

Quantum computers appear mysteriously in the news with some regularity, because quantum supremacy is one of the top scientific races among nations. Quantum computers have been physically built for only a little over 20 years, but as the technology advances, access to them has gradually opened up to the public. The idea of quantum computing hardware was first proposed in 1969 by the U.S.-born Israeli physicist Stephen J. Wiesner. In 1981, Richard Phillips Feynman proposed using quantum systems themselves to perform computation, laying the theoretical foundation for early quantum computing prototypes. In 1994, Peter Shor proposed the famous Shor algorithm, after which the world grasped the tremendous potential of quantum computing for breaking traditional encryption. From 2000 to today, large technology companies such as Google and Microsoft have been developing quantum computing products and services.

Like ordinary computers, quantum computers divide into hardware and software. The hardware has three core components: the quantum data plane, the control and measurement plane, and the control processor. The quantum data plane is the “heart” of a quantum computer: it holds the qubits (the basic units a quantum computer uses to store and process information) and the structures that keep them in place. Current mainstream approaches include superconducting qubits, topological qubits, and others. IBM and Google have chosen the superconducting route, whose advantage is ease of manufacturing; topological qubits are more stable but harder to realize, and Microsoft has chosen that route.

A quantum computer is like a factory. Its “heart,” the quantum data plane, stores the qubits (Qubit). The control and measurement plane converts digital signals into the waveforms that steer the qubits, while the control processor orchestrates the computation. On the software side, algorithms run as quantum circuits, and programmers can write quantum programs using IBM’s Qiskit, Google’s Cirq, or Microsoft’s Q#.

Google CEO and quantum computers, Image source: NYTimes

Applications of quantum computing

With the evolution of quantum algorithms and the “commercialization” of quantum computers, quantum technology is gradually integrating into every aspect of our lives.

Driven by the entry of major commercial players and capital investment, quantum computing is thriving across subdomains such as drug R&D and risk-model design in finance. Traditional drug discovery relies on classical computers to simulate molecular interactions, but quantum computers can simulate chemical reactions more precisely. For example, on January 11, 2021, Google partnered with the German pharmaceutical company Boehringer Ingelheim to use quantum algorithms to simulate molecular structures, helping design drugs targeting cardiovascular disease and greatly shortening trial cycles. In finance, quantum computing optimizes risk management and investment portfolios. JPMorgan Chase, one of the first financial institutions globally to adopt IBM Q System One (the first circuit-based commercial quantum computer), uses it to run Monte Carlo simulations for assessing market risk and pricing derivatives, helping the bank make more accurate market decisions. Although quantum computing still faces skepticism and challenges around commercial scale, these cases show that its move from the lab to real-world applications is accelerating.

Threats of quantum computing

The unique advantages of quantum computers allow them, for specific problems, to achieve exponential speedups. Quantum decryption algorithms therefore pose a significant potential threat to blockchain technologies built on cryptography. The most mainstream blockchain architectures today (such as Bitcoin and Ethereum) rely mainly on public-key systems (such as the Elliptic Curve Digital Signature Algorithm, ECDSA) and hash functions (such as SHA-256) for security. In the foreseeable future, quantum computing could break through this security barrier. The current quantum threat to blockchain security stems mainly from the two most iconic quantum algorithms: the Shor algorithm proposed by Peter Shor in 1994, and the Grover algorithm proposed by Lov Grover in 1996.

Quantum algorithm: Shor

The Shor algorithm is a quantum algorithm proposed by Peter Williston Shor, a mathematics professor at MIT. It is also called the “quantum integer factorization algorithm.” In plain terms, it can quickly decompose the huge integers used in RSA encryption into the product of two large primes. What makes the Shor algorithm so powerful is that a quantum computer can complete this task in an extremely short time compared with a classical computer. Its core idea is also clever: rather than searching for prime factors directly, the algorithm first finds a repeating numerical pattern (the period) and then derives the prime factors from it.

A simple analogy: if a traditional computer breaks a large number apart like a team searching through a huge warehouse for items, a quantum computer is like having many copies of itself, trying every path at once, so it can find the answer quickly.
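Shor’s classical reduction, find the period of a^x mod N and then take greatest common divisors, can be shown with a toy number. The quantum speedup lives entirely in the period-finding step, which this sketch simply brute-forces (the demo values N = 15, a = 7 are our own choice):

```python
from math import gcd

# Classical sketch of Shor's reduction (toy N only): factoring N reduces to
# finding the period r of f(x) = a**x mod N. A quantum computer finds r
# exponentially faster; here we brute-force it to show the arithmetic.
def find_period(a, N):
    x, val = 1, a % N
    while val != 1:
        x += 1
        val = (val * a) % N
    return x

N, a = 15, 7                 # tiny demo values
r = find_period(a, N)        # 7**4 = 2401 = 1 (mod 15), so r = 4
assert r % 2 == 0            # Shor retries with a different 'a' if r is odd

# The factors hide in gcd(a**(r/2) ± 1, N).
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)               # 4 3 5
```

For a classical computer, `find_period` is exactly as hard as the warehouse search in the analogy; the quantum Fourier transform is what lets Shor’s algorithm extract the period without walking every aisle.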

As early as 2001, IBM demonstrated an instance of the Shor algorithm, factoring the number 15, on a liquid-state nuclear magnetic resonance quantum computer. The algorithm has caused a huge stir in cryptography ever since, because it demonstrates the potential power of quantum computers: in the future, it could have far-reaching impacts on traditional encryption technologies and internet security.

This means that, among traditional cryptographic systems, elliptic curve cryptography (Elliptic Curve Cryptography) and RSA, such as the signatures behind HTTPS/TLS, SSH keys, and website certificates, will face a direct threat. Elliptic curve cryptography is especially closely tied to everyday life: it secures identity authentication in smartphone apps and software, making it one of the most widely deployed encryption technologies on the modern internet. Although today’s quantum computers cannot yet break 2048-bit RSA (in theory this requires thousands of stable logical qubits), as the technology matures it may breach this security boundary in the coming years.

Quantum algorithm: Grover

Two years after the Shor algorithm appeared, Lov Kumar Grover, an Indian-American computer scientist then at Bell Labs, proposed a new quantum algorithm: the Grover algorithm, also known as the quantum search algorithm. In quantum computing, Grover’s algorithm is a very practical algorithm for searching unstructured databases.

If an ordinary computer needs to find an answer in a database with on the order of 2 to the power of several dozen entries, the basic classical approach is to scan it one entry at a time, like flipping through a library’s books page by page, which is very time-consuming. Grover’s algorithm instead leverages “quantum superposition” and “amplitude amplification” to find the answer in roughly √N attempts. This is called a “quadratic speedup” (Quadratic Speedup).

In simple terms, if a traditional computer needs to run 10^12 times (i.e., one trillion times), Grover’s algorithm theoretically needs only about 1 million times—an enormous efficiency gap.

Its core principle: first, place all possible answers into superposition, so the qubits represent N candidate states at once, each initially with probability 1/N. Then a mechanism called an “oracle” marks the correct answer (by flipping its phase). Finally, repeated iterations keep raising the probability of the correct answer while suppressing the probabilities of the incorrect ones.

A comparison: imagine a pitch-black room with countless doors, and only one door hides a treasure. A traditional computer can only try doors one by one. Grover’s algorithm is like letting all doors be tried “simultaneously” first, and then, with each round, increasing the “brightness” of the correct door by a little until it becomes increasingly brighter in the darkness, so that you can tell at a glance. When the probability of the correct answer is amplified to nearly 100%, the measurement system can obtain the correct result with high probability.

You may ask: since it tries all doors at the same time at the start, why doesn’t it simply tell us which door has the treasure? The reason is: when you actually “look” at the result (measure), you can only see one door. If you look at the start, since all doors have the same probability, the chance of seeing the treasure door is like randomly drawing lots—almost the same as guessing blindly. Therefore, Grover’s algorithm must make the correct door increasingly brighter round by round. When the correct door is already clearly brighter than the others in the dark, then when you “look,” you will almost certainly see the correct answer. In other words, a quantum computer can explore all possibilities at the same time, but it cannot present all answers simultaneously. It can only “amplify the probability of the correct answer,” so that your measurement yields the correct result with high probability.
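The mark-and-brighten loop described above can be simulated classically for a tiny search space (N = 16, with an arbitrarily chosen marked “door”). The simulation just tracks the amplitudes a real quantum computer would hold in superposition:

```python
import math

# Classical simulation of Grover's amplification loop (illustrative): N = 16
# entries, one marked "door". Each round phase-flips the marked amplitude
# (oracle) and then inverts every amplitude about the mean (diffusion),
# brightening the target a little more.
N = 16
marked = 11                                  # arbitrary demo target
amps = [1 / math.sqrt(N)] * N                # uniform start: each prob = 1/16

for _ in range(round(math.pi / 4 * math.sqrt(N))):   # ~sqrt(N) rounds, here 3
    amps[marked] = -amps[marked]             # oracle: mark the answer
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]      # diffusion: invert about the mean

print(round(amps[marked] ** 2, 3))           # ~0.961: measurement almost surely hits it
```

After only three rounds the marked door carries about 96% of the probability, which is the “almost certainly see the correct answer” behavior described above; running more rounds would actually start dimming it again.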

Grover’s algorithm can also power brute-force attacks in cryptography, posing a substantive threat to symmetric keys. Current industry guidance is to use AES-256 (the Advanced Encryption Standard with 256-bit keys), because under Grover’s algorithm a 128-bit key provides only about 64 bits of effective security, so the industry needs the larger margin. Grover’s algorithm has its limits, however: it provides only a quadratic speedup, meaning that although it is much faster than classical search, the speedup is not unbounded. A simple analogy: if you originally had to run 100 kilometers, Grover’s algorithm might let you get there in 10 kilometers, but you still pay the physical cost of running, and because quantum computers are very expensive to build and operate, it is like needing an extremely costly treadmill for that 10-kilometer run. In real applications, therefore, Grover’s algorithm cannot break every encryption system without limit; longer keys and other security measures still matter.

Analysis of the impact of quantum computing on blockchain

The core of blockchain design is a distributed ledger built on a foundation of cryptography. Most blockchain protocols, including Bitcoin, use ECC (elliptic curve cryptography) to generate public and private keys and to produce digital signatures. Secp256k1, the specific elliptic-curve parameter standard that Bitcoin and Ethereum use with ECDSA (Elliptic Curve Digital Signature Algorithm), is valued for its high security, efficiency, and relatively short keys, and is widely used for generating key pairs and signing on-chain.

SHA-2 (Secure Hash Algorithm 2) is a family of cryptographic hash functions that includes SHA-256, the hash broadly adopted across blockchains. Hash functions map data of any length to a fixed-length value (the hash); the mapping is one-way and hard to reverse, which is why it anchors proof-of-work and transaction verification. Once quantum computers reach a sufficiently large qubit scale, they could run quantum algorithms that break asymmetric encryption such as elliptic curve cryptography within a short window of continuous computation (on the order of a month), meaning these blockchain components will face a direct challenge.
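The fixed-length, one-way behavior of SHA-256 is easy to see with Python’s standard hashlib module (the inputs here are arbitrary demo bytes):

```python
import hashlib

# SHA-256 maps inputs of any length to a fixed 256-bit (32-byte) digest,
# and a one-bit change in the input scrambles the entire output.
short = hashlib.sha256(b"tx").hexdigest()
long_ = hashlib.sha256(b"tx" * 100_000).hexdigest()

print(len(short), len(long_))                       # 64 64 (hex chars = 256 bits each)
print(short == hashlib.sha256(b"tx").hexdigest())   # True: deterministic
print(hashlib.sha256(b"Tx").hexdigest() == short)   # False: avalanche effect
```

Against hashes like this, Grover’s algorithm only offers a quadratic speedup, which is why SHA-256 is considered far more quantum-robust than the elliptic-curve signatures discussed above.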

Different algorithms’ impact on encryption components, Image source: Web3Caff Research researcher Bob’s own creation

In addition, quantum computers enable “HNDL attacks” (Harvest Now, Decrypt Later): attackers collect encrypted data today and launch decryption attacks once their quantum capabilities mature. HNDL is a surveillance strategy built on long-term interception and storage of ciphertext that cannot be broken now but can be decrypted once quantum technology arrives. The industry calls the hypothetical date when this becomes possible Y2Q or Q-Day. The blockchain industry is responding actively: in January 2026, Coinbase, the well-known U.S. publicly listed company, established an independent committee on quantum computing and blockchain to address the potential threats quantum computing may pose to blockchain cryptography and to research anti-quantum solutions. In the same year, Ethereum’s Layer 2 network Optimism also began introducing anti-quantum algorithms to meet the bigger challenges ahead.

HNDL explanation image, Image source: Paloalto Networks

The impact of quantum computing on digital finance

Of course, the potential impact of quantum computing is not limited to blockchain finance; it also reaches the broader digital finance sector, including the banks people rely on daily. From a risk and security perspective, the cryptographic infrastructure banks depend on would be threatened first: the Shor algorithm can quickly break the RSA and elliptic curve encryption banks commonly use, compromising customer information. Moreover, the “harvest now, decrypt later” pattern means financial data leaked today may still be decrypted by quantum computers in the future. Facing the quantum threat, top global financial enterprises have already begun entering the “post-quantum era.” In 2024, the U.S. National Institute of Standards and Technology (NIST) released the first batch of quantum security standards, and banks and financial institutions have begun planning gradual migrations to post-quantum cryptography (PQC) algorithms in preparation for the quantum era.

But quantum computers bring banks and other financial institutions more than challenges; they have a positive side as well, transforming finance by accelerating complex computations. In risk modeling, quantum computing can speed up Monte Carlo simulations, enabling banks to evaluate risk more precisely and quickly. Real-world deployments in banking have multiplied in recent years: in 2025, HSBC partnered with IBM’s quantum computing program and used quantum processors to assist bond-trading predictions, improving accuracy by 34%, while Turkey’s Yapi Kredi Bank worked with Canada’s D-Wave to rapidly flag high-risk enterprises through its risk-control models.

Current state of quantum security

In fact, since people became aware of quantum threats, post-quantum cryptography (Post-Quantum Cryptography, abbreviated as PQC) has made active progress in recent years. Especially after NIST (U.S. National Institute of Standards and Technology) released three post-quantum cryptography standards in 2024, industries related to data security have been working intensively on quantum security migration. Financial banking industries and large platform enterprises like electronic communications have brought anti-quantum security measures onto their agendas, planning to upgrade to quantum-resistant algorithms over the coming years.

Development of post-quantum cryptography

According to the Global Risk Institute’s Quantum Threat Timeline report (based on surveys of dozens of experts), the probability that RSA encryption will be cracked by quantum attack within 8 years (by 2034) is approximately 19–34% (2024/2025 data). Compared with previous years, this timeline has accelerated slightly. Post-quantum cryptography (Post-Quantum Cryptography) emerged in response to growing concern about Q-Day and has since developed into the cornerstone of anti-quantum research.

Expert estimates of the probability that a quantum computer cracks RSA-2048 within 1 day, Image source: Global Risk Institute

Post-quantum cryptography is also known as "quantum-resistant cryptography" or "quantum-secure cryptography." Most quantum attacks target public-key algorithms. Research directions in post-quantum cryptography include lattice-based cryptography, learning with errors (LWE), multivariate polynomials, and more. These algorithms are designed to keep private data secure even in future quantum computing environments.

The standardization of anti-quantum measures has taken a decade so far. Starting in 2016, when NIST launched its post-quantum cryptography project, the program went through multiple rounds of evaluation, and in August 2024 NIST officially published the first batch of post-quantum cryptography standards. Their purpose is singular: to address the threat that future quantum computers pose to existing public-key algorithms (RSA and elliptic curve encryption). The three post-quantum cryptography standards are:

  • ML-KEM: used for key encapsulation, mainly responsible for "securely exchanging keys." In simple terms, when you access a secure website (such as one using HTTPS), both sides must first secretly agree on an encryption key; ML-KEM is the tool used to transmit that key securely. It is fast and efficient;
  • ML-DSA: a digital signature algorithm based on the module-lattice scheme CRYSTALS-Dilithium, used to ensure that data is not tampered with in transit and to confirm the sender's identity. Think of it as putting a "tamper-proof seal" on a file or message so that others can verify the seal is genuine and the content unaltered;
  • SLH-DSA: a stateless hash-based digital signature algorithm, originally named SPHINCS+. It is the more conservative, "robust" scheme, with the tradeoff that signatures take longer to generate and are larger. Think of it as the extra-insurance signature method: stronger assurances, slower signing.

They are called "anti-quantum algorithms" mainly because they no longer rely on mathematical problems that Shor's algorithm can efficiently crack (such as integer factorization or the elliptic curve discrete logarithm problem). Instead, they are built on mathematical foundations that no known quantum algorithm can break efficiently.

The security of traditional encryption algorithms (such as RSA and ECC) rests on assumptions like "integer factorization is hard" or "recovering the private key from an elliptic curve public key is hard." But a sufficiently large quantum computer running Shor's algorithm solves these problems with an exponential speedup, so in theory such algorithms are no longer secure.
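
To make the contrast concrete, here is a toy sketch (names and parameters are mine, chosen only for illustration): classically, the generic way to recover the exponent k from g^k mod p is exhaustive search, whose cost grows with the group size, while Shor's algorithm solves the same problem in polynomial time on a quantum computer.

```python
# Toy illustration of the discrete logarithm problem behind ECC-style
# security. Classically, recovering k from g^k mod p takes up to |group|
# steps; Shor's algorithm does it in polynomial time on a quantum machine.
def brute_force_dlog(g, h, p):
    """Find k with g^k ≡ h (mod p) by exhaustive search."""
    x = 1
    for k in range(p):
        if x == h:
            return k
        x = (x * g) % p
    return None

p, g = 101, 2                 # tiny toy group; real curves use ~256-bit groups
k_secret = 57
h = pow(g, k_secret, p)       # "public key"
assert brute_force_dlog(g, h, p) == k_secret  # feasible only because p is tiny
```

Doubling the group size roughly doubles the classical search here, which is why 256-bit curves are classically safe; Shor's algorithm removes that exponential barrier entirely.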

ML-KEM and ML-DSA are based on “lattice cryptography.” In simple terms, it is like searching for a specific solution inside an extremely complex, high-dimensional mathematical maze. Currently, there is no quantum algorithm like Shor’s that can significantly speed up breaking lattice problems. Therefore, even in a quantum computing environment, these problems are still considered highly difficult.
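
The "noisy linear equations" idea behind lattice schemes can be sketched with a toy Learning-With-Errors (LWE) bit encryption. This is my own minimal illustration, not any standardized algorithm, and the parameters are far too small to be secure; it only shows why small random errors make the public data hard to invert without the secret.

```python
import random

# Toy LWE bit encryption: public data is A and b = A·s + e (mod q),
# where e is a small random error vector. Recovering s from (A, b)
# is the hard lattice problem; decryption with s just cancels A·s
# and rounds away the accumulated noise.
random.seed(1)
q, n, m = 257, 8, 40   # modulus, secret length, number of public samples

s = [random.randrange(q) for _ in range(n)]                       # secret key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]                 # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

def encrypt(bit):
    # sum a random subset of public samples; embed the bit in the high half
    subset = [i for i in range(m) if random.random() < 0.5]
    c1 = [sum(A[i][j] for i in subset) % q for j in range(n)]
    c2 = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2):
    # total noise is at most m = 40 < q/4, so rounding recovers the bit
    v = (c2 - sum(c1[j] * s[j] for j in range(n))) % q
    return 0 if min(v, q - v) < q // 4 else 1

c1, c2 = encrypt(1)
assert decrypt(c1, c2) == 1
```

The design choice to keep errors tiny relative to q/4 is what guarantees correct decryption here; real schemes like ML-KEM make the same correctness-versus-noise tradeoff at cryptographic scale.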

SLH-DSA (originally SPHINCS+) is built on hash functions. Quantum computers can only use Grover’s algorithm to achieve quadratic speedup against hash functions, not exponential speedup. This means that as long as you appropriately increase security parameters (for example, using a longer hash length), you can offset the speedup advantage brought by quantum computing. As a result, its security is more robust, but the tradeoff is a larger signature size and slower generation speed.
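
The parameter arithmetic behind this claim is simple enough to state in a few lines (a sketch of the standard back-of-the-envelope reasoning, with a helper name of my own):

```python
# Grover's algorithm reduces a 2^n preimage search to roughly 2^(n/2)
# quantum evaluations: a quadratic, not exponential, speedup.
def security_bits(output_bits, quantum=False):
    """Approximate preimage-search cost, in bits of work."""
    return output_bits // 2 if quantum else output_bits

assert security_bits(256) == 256                # classical SHA-256 search
assert security_bits(256, quantum=True) == 128  # Grover still leaves 128 bits
assert security_bits(512, quantum=True) == 256  # a longer hash restores margin
```

This is why hash-based designs answer the quantum threat by lengthening parameters rather than changing their mathematical foundation.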

In summary, these anti-quantum algorithms are secure because they are based on mathematical problems that currently known quantum computers cannot efficiently solve (lattice problems or hash problems), rather than on traditional integer factorization problems that are easy for Shor’s algorithm to break.

Based on the above standards, developers have explored three categories of mainstream anti-quantum algorithm technical routes:

  1. The lattice-based route. This route has the best overall performance and is the most favored. In non-technical terms, the encryption mechanism builds a high-dimensional "lattice" structure in which no known quantum algorithm can take a "shortest path" shortcut to the solution. Its advantages are the most obvious: moderate key sizes, versatility (encryption, signatures, and zero-knowledge proofs), and high security. Most TLS (Transport Layer Security) deployments are likely to choose this route. NIST-standardized algorithms such as ML-KEM (originally Kyber) and ML-DSA (originally CRYSTALS-Dilithium) are based on it;
  2. The hash-based route. This route offers the highest security level. It uses hash functions exclusively for signatures, so a quantum computer can only attack it by brute-force search. On February 27, 2026, Ethereum founder Vitalik Buterin publicly suggested that the future should rely on hash-based signature schemes, with signature-switching capability added through the EIP-8141 upgrade. The NIST-standardized SLH-DSA algorithm is a representative of this path, but its disadvantages are large signatures and slow signing, making it unsuitable for scenarios with frequent signing;
  3. The code-based route. Selected by U.S. NIST as a "backup algorithm," this route has nearly 50 years of history. Its security relies on error-correcting codes, widely used in communications, to hide private-key information: an attacker must recover the original message from deliberately error-laden codewords, which is extremely difficult both classically and quantumly. Shor's algorithm is ineffective against these coding problems, and Grover's algorithm has only limited impact. It currently serves as an alternative key-encapsulation algorithm that can hedge against potential vulnerabilities in lattice-based cryptography. Its drawback is that public keys are usually very large, from tens of thousands up to hundreds of thousands of bytes, so its use cases are limited.
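
The hash-based route above is simple enough to demonstrate end to end. Below is a minimal Lamport one-time signature, the simplest ancestor of SLH-DSA-style designs; this is a textbook construction in my own wording, not NIST's algorithm, and each key pair must sign exactly once.

```python
import hashlib
import secrets

# Minimal Lamport one-time signature: security rests only on the hash
# function, which is exactly why this family resists Shor's algorithm.
H = lambda data: hashlib.sha256(data).digest()

def keygen():
    # secret key: 256 pairs of random preimages; public key: their hashes
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(x) for x in pair] for pair in sk]
    return sk, pk

def sign(sk, msg):
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # reveal one preimage per bit of the message digest
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(pk, msg, sig):
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][bits[i]] for i in range(256))

sk, pk = keygen()
sig = sign(sk, b"quantum-safe")
assert verify(pk, b"quantum-safe", sig)
```

Note the tradeoff the text describes: this signature is 256 × 32 bytes (8 KB), versus about 64 bytes for an ECDSA signature, which is why hash-based schemes are considered robust but unsuitable for high-frequency signing.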

Beyond these three mainstream routes, there are niche routes for particular use cases, such as multivariate cryptography and supersingular isogeny key exchange. Multivariate cryptography is often used in blockchain for fast signature generation and verification; its fast verification makes it better suited to signatures than to encryption. Supersingular isogeny key exchange was once trialed for establishing quantum-secure session keys, but it was broken by a classical attack in 2022, and U.S. NIST removed the route from its standardization process.

Overall, the core of post-quantum cryptography is replacing the old mathematical protection mechanisms that future quantum computing will break with new ones, thereby protecting data privacy. For industries like finance that rely heavily on encryption, deploying and migrating to post-quantum cryptography is therefore especially important. Leading technology companies (Google, Microsoft, Amazon, and others) are already integrating these algorithms into browsers and operating systems, so ordinary users need not be overly anxious: the major platforms are planning the migration to quantum-secure algorithms on their behalf.

Anti-quantum progress in the blockchain industry

For a blockchain industry that depends so heavily on cryptographic security, the potential threat of quantum computing is not a recent concern; it has been the subject of forward-looking research and technical preparation for years. A basic "consensus" among leading industry institutions and core practitioners is that the quantum threat is a solvable engineering problem, not an unmanageable systemic risk.

Therefore, although “native anti-quantum blockchains” have not yet become mainstream, as governance mechanisms continue to advance, some mainstream companies and public chains have already started preparing for migration to a quantum-secure environment. A recent series of developments indicates that the industry is gradually moving from theoretical discussions into deployment planning.

For example, Coinbase—one of the largest publicly listed crypto companies globally—founded an independent quantum advisory council in January 2026. It invited quantum computing professors and security experts from U.S. academia to participate, and it is expected to publish quantum risk assessment reports and an anti-quantum migration roadmap. Its plan includes upgrading Bitcoin address handling mechanisms, strengthening internal key management systems, and gradually supporting post-quantum signature schemes such as module-lattice-based digital signature algorithms (ML-DSA), etc.

Meanwhile, the Ethereum Foundation has also formed a dedicated anti-quantum research team and listed quantum security as a strategic priority for 2026. These actions suggest that 2026 may become the planning starting point for the blockchain industry to enter the “anti-quantum era,” with quantum security shifting from theoretical topics to engineering implementation (introduced in detail below).

In contrast, the Bitcoin community has taken a more cautious approach. Its challenges are not just technical but also governance-related. Because Bitcoin’s governance mechanism heavily depends on community consensus, upgrade cycles typically progress on the scale of “years,” so the main difficulties for quantum security migration come more from decision coordination and consensus formation rather than pure technical implementation.

Currently, the Bitcoin community is mainly discussing three technical routes:

  1. Gradually introduce new functionality via soft forks;
  2. Retire old address formats to reduce the risk of public key exposure;
  3. Integrate post-quantum signature schemes.

However, it remains difficult for a unified solution to emerge in the short term.

The most recently discussed proposal attracting attention is BIP-360 (also known as Pay-to-Tapscript-Hash, P2TSH). First put forward in 2024 and substantially updated at the end of 2025, it is still in the draft stage, but discussion has been extensive. Its core idea draws on the output mechanism of the 2021 Taproot upgrade: by removing key-path spending (spending directly via the public key), it reduces the risk of public keys being exposed on-chain in early address formats, leaving room to integrate anti-quantum signature algorithms later.

However, there is still another voice within the community: some believe the quantum threat is still at an early stage and there is a long time before real attacks. Therefore, whether a large-scale upgrade is needed immediately remains controversial.

That is to say, the blockchain industry is not passively waiting for quantum shocks. Instead, under different governance rhythms and risk perception frameworks, it is gradually advancing anti-quantum transformation. The real challenge is not only technical implementation, but how to achieve upgrade consensus across communities and across stakeholder structures in an open network.

Although Bitcoin has not yet suffered a real quantum attack, a small portion of Bitcoin is already at quantum risk. CoinShares, a digital-asset investment firm based in New Jersey, states that addresses using the early Pay-To-Public-Key (P2PK) format, which exposes the public key on-chain, are the most likely targets for quantum attacks. Approximately 1.6 million addresses (about 8% of the total) are more exposed to the threat, and the amount most likely to trigger market volatility is around 10,000 Bitcoins.

Number of Bitcoins affected by quantum threats, Image source: CoinShares

Given blockchain networks' long upgrade cycles, the developer community is also building "quick" stopgap solutions ahead of the Bitcoin community's formal anti-quantum upgrade. For example, Yellow Pages, an anti-quantum key generation tool developed by the Project Eleven team in 2025, lets Bitcoin users link their Bitcoin directly to anti-quantum addresses and prove ownership.

The mechanism of Yellow Pages is straightforward. The product generates post-quantum signature keys (supporting NIST standards); after a user signs, it associates the user's address with the post-quantum key. If the quantum threat materializes, the user can prove ownership and transfer their Bitcoin to a quantum-secure address. Beyond Bitcoin, Project Eleven is collaborating with Solana and other major blockchains on a series of tools for the infrastructure needed in the post-quantum era.

Compared with Bitcoin’s upgrade cycle, the Ethereum community is more forward-looking. In November 2025, Ethereum founder Vitalik Buterin warned at Devconnect that quantum computing might have enough computing power to break Ethereum security before the 2028 U.S. election. Vitalik Buterin actively pushed the Ethereum community to complete a full system migration for quantum security within four years. Two months later, in January 2026, the Ethereum Foundation listed quantum security as the top strategic priority for this year and formed a dedicated post-quantum team, funding and helping to develop related quantum security upgrade software. In February 2026, Vitalik Buterin updated the Ethereum anti-quantum roadmap on X. Ethereum will replace the current BLS digital signatures by adopting a hash-based cryptography technical route (see above), and use STARKs for aggregation to reduce overhead—addressing Ethereum’s quantum weaknesses earlier. The goal is, within a year, through the EIP-8141 upgrade, to completely solve the account abstraction problem and move away from a single ECDSA signature (which is vulnerable to quantum attacks). At that time, users can freely switch signature schemes, including anti-quantum signatures (based on hash-based technical routes).

Additionally, the Ethereum Foundation will invest $2 million to incentivize related R&D. Ethereum researcher Justin Drake also said that Ethereum is transitioning from the research stage to the engineering implementation stage, including hosting anti-quantum developer conferences and releasing multi-client anti-quantum testnets, etc.

Meanwhile, Ethereum’s Layer 2 network Optimism also released its Superchain / OP Stack anti-quantum roadmap strategy in January 2026. It plans to abandon vulnerable ECDSA-based EOA (External Owned Accounts) across the OP mainnet and the entire Superchain by 2036, transitioning from the Account Abstraction (AA) layer into the post-quantum era. External wallets can delegate their permissions to smart contract accounts. In 2036, OP mainnet and its ecosystem will no longer accept transactions signed purely with ECDSA. Users must interact on-chain via smart contract accounts that support post-quantum signatures, though users do not need to transfer assets. As an L2 (Layer 2) network, Optimism will become a pioneer for Ethereum’s quantum security. Over the following years, Optimism will support both ECDSA and post-quantum PQ signatures in parallel, mobilizing its ecosystem such as dApps (decentralized applications) to migrate to smart contract accounts and ultimately retire ECDSA, which is vulnerable to quantum attacks.

For a blockchain industry built on cryptographic infrastructure, 2026 is the inflection point at which anti-quantum work moves from theory to concrete implementation. The Ethereum ecosystem's quantum migration will be smoother and more tightly scheduled than Bitcoin's. Although the Bitcoin community has not yet officially begun its upgrade, proposals are already under wide discussion, and the network's design is upgradeable: as more anti-quantum proposals mature, the community can coordinate synchronized upgrades through soft forks before quantum threats arrive on schedule. In the meantime, before the network upgrades, users can try open-source tools (such as the Yellow Pages tool mentioned earlier) to keep their assets "quantum secure."

Preparation recommendations and timeline projections for anti-quantum in the blockchain industry

As of now, quantum computing is advancing from the scale of hundreds of qubits toward thousands. In 2025, Fujitsu and RIKEN jointly built a 256-qubit superconducting machine, aiming to exceed 1,000 qubits by 2026. On March 25, 2026, Google updated its post-quantum timetable to 2029 and called on the industry to complete secure migration by then. Although today's qubit counts are not enough to break traditional encryption quickly, qubit scale is expected to grow rapidly as quantum computers iterate. Both governments and enterprises have therefore published timelines for quantum security migration.

National-level migration planning

In 2022, the U.S. released the “Commercial National Security Algorithm Suite 2.0” (CNSA 2.0), which clearly defines the roadmap and standards for migrating national security systems to post-quantum cryptography. This framework is intended to provide long-term protection for national security systems and sensitive information to cope with the risk of cryptographic cracking caused by future quantum computing.

In March 2025, the UK National Cyber Security Centre released an anti-quantum cryptography migration timeline, with a plan to:

  • complete comprehensive risk assessments by 2028;
  • complete full migration of critical systems by 2035.

Under CNSA 2.0 arrangements, the U.S. National Security Agency sets 2030–2033 as the critical migration window period. The UK and Australia also view 2035 as the final milestone for completing migration.

In addition, the National Institute of Standards and Technology (NIST) has published post-quantum cryptography standards and clearly requires that, starting in 2030, federal agencies and critical infrastructure gradually deprecate traditional algorithms vulnerable to quantum attacks. The European Union's "Quantum Europe Strategy" likewise proposes that most critical infrastructure complete quantum security upgrades by 2035.

Overall, 2025–2035 is becoming the policy window period for global quantum security migration.

Substantive deployment at the enterprise level

At the enterprise level, industries such as finance, communications, and cloud infrastructure are considered to face potential risks from “Harvest Now, Decrypt Later” (HNDL) attacks—meaning attackers steal encrypted data today and decrypt it after quantum computing matures. Therefore, long-term sensitive data (such as bank transaction records and identity data) becomes a priority protection target.

In May 2024, the global banking giant JPMorgan Chase announced the deployment of a quantum-secured, crypto-agile network (Q-CAN) to strengthen its network encryption resilience for the quantum era.

In the communications domain, the CDN and cloud service provider Cloudflare began deploying hybrid post-quantum TLS protocols as early as 2022, and supports post-quantum key exchange mechanisms on the server side—laying groundwork ahead of time for the future full migration of the internet to post-quantum encryption environments.
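
The "hybrid" idea behind such deployments can be sketched in a few lines. This is my own conceptual toy, not Cloudflare's implementation; real deployments combine something like X25519 with ML-KEM inside the TLS key schedule, but the principle is the same: derive the session key from both a classical and a post-quantum shared secret, so an attacker must break both.

```python
import hashlib
import hmac
import secrets

# Hybrid key derivation sketch: the session key depends on BOTH the
# classical and the post-quantum shared secret. If either exchange
# remains unbroken, the derived key stays secret.
def hybrid_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    # concatenate-then-KDF over both inputs (label is illustrative)
    return hmac.new(b"hybrid-kdf", classical_secret + pq_secret,
                    hashlib.sha256).digest()

k = hybrid_session_key(secrets.token_bytes(32), secrets.token_bytes(32))
assert len(k) == 32
```

The design rationale: during the transition years, hybrids hedge against both a quantum break of the classical algorithm and an unforeseen classical break of the still-young post-quantum one.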

On mobile operating systems, Google’s latest Android 17 has integrated ML-DSA-based post-quantum digital signature protection technology and complies with NIST standards.

Currently, most enterprises in critical infrastructure follow NIST standards and have moved from the technical validation phase into small-scale pilots and hybrid deployment phases.

Quantum threat and preparation schedule, Image source: Palo Alto Networks

Timeline for quantum security readiness in the blockchain industry

If we take the quantum security migration timelines from U.S. NIST and the EU as references, the migration deadline is 2035. However, according to Google's latest quantum threat assessment in 2026, the threat is arriving much earlier: Google's new official timeline is 2029, leaving the relevant industries only about 3 years to complete "quantum security migration." This means blockchain must urgently begin its post-quantum upgrades.

In fact, different blockchain ecosystems are already planning around this deadline. Ethereum, the most "aggressive" anti-quantum pioneer among blockchain ecosystems, may complete deployment of the EIP-8141 proposal, which includes anti-quantum signature schemes, by the end of 2026. Optimism, in the Ethereum ecosystem, announced it will eliminate all ECDSA-based external accounts (EOAs) across the OP mainnet and the entire Superchain within 10 years. The Bitcoin community merged BIP-360 into the official BIPs repository as a draft in February 2026, and BTQ Technologies completed deployment of a testnet. On March 31, 2026, Bitcoin Magazine published a developer notice that the anti-quantum proposal is under review. Clearly, the blockchain industry's quantum migration is accelerating from 2026 onward.

In addition, to better pin down when "Q-Day" will actually arrive, the quantum security infrastructure team Project Eleven recently launched the Q-Day Prize, a public quantum cryptography challenge, and designed a dynamic evaluation model called the Q-Day Clock to measure the window in which quantum computing poses a substantive threat to elliptic curve encryption. Project Eleven will use data from both to calibrate the industry's understanding of how close the "Q-Day" quantum attack is. Overall, before that day arrives, the industry needs to pass through roughly three preparation stages: planning and experimentation, large-scale migration, and quantum security.

  • Planning and experimentation stage: around 2026–2027, blockchain enterprises and public-chain development teams complete preliminary quantum risk assessments, stand up testnets, deploy hybrid encryption models, and protect long-lived data against HNDL attack risks;
  • Large-scale migration stage: around 2028–2029, mainstream chains ship optional post-quantum signature schemes, while infrastructure such as trading platforms, asset custodians, and cross-chain bridges completes hybrid encryption deployments; public chains' Layer 2 networks can verify operational stability first;
  • Quantum security stage: 2030–2035, mainstream chains, trading platforms, and related infrastructure deprecate or upgrade the vulnerable ECDSA algorithm, adopt anti-quantum algorithms or other anti-quantum solutions, and achieve quantum security. By 2035, industry institutions meet the NIST and EU targets and complete the quantum security migration.

Conclusion

At the beginning of 2026, the world's top blockchain enterprises and teams independently and almost simultaneously elevated quantum security to a strategic priority, from the industry's infrastructure to public chains and even regulators, where relatively clear quantum security timelines have now been put forward.

In the 17 years since Bitcoin's birth, blockchain built on cryptography has nurtured a huge digital finance ecosystem that runs to this day. Over the same period, computer science has advanced rapidly, and the quantum threat to blockchain has evolved from an early "wild fantasy" into the era of post-quantum migration. Whether in traditional banking or the internet technology industry, encryption infrastructure will face this threat.

Although the industry and market will still face enormous macro uncertainty in 2026, blockchain's cryptographic security can still be maintained for now, and the response plans for the post-quantum era are relatively complete, arguably more forward-looking than in traditional industries. The quantum era will ultimately arrive. Even though the media often paints that moment as crisis-ridden, objectively speaking, when the so-called "Q-Day" comes, the disruptive computing power it brings may open a grander computing era, and the opportunities it contains may run far deeper than the challenges.

References

[1] Quantum mechanics, Wikipedia

[2] Quantum computing, Wikipedia

[3] Quantum entanglement, Wikipedia

[4] What is quantum computing, Amazon AWS

[5] Shor's algorithm, Wikipedia

[6] Grover's algorithm, Wikipedia

[7] U.S. NIST releases post-quantum cryptography standards, Strategic Consulting Research Institute of the Chinese Academy of Sciences

[8] Quantum Threat Timeline Report 2024, Global Risk Institute

[9] Harvest Now, Decrypt Later (HNDL): The Quantum-Era Threat, Palo Alto Networks

[10] NIST Role and Activities Relative to the Post Quantum Cryptography White House Memo, NIST

Disclaimer: This report is written by Web3Caff Research. The information contained herein is for reference only and does not constitute any prediction or investment advice, proposal, or offer. Investors should not rely on such information to buy, sell any securities, cryptocurrencies, or take any investment strategies. The viewpoints and explanations of terms and expressions used in this report are intended to help understand industry trends and promote responsible development of Web3, including the blockchain industry. They should not be interpreted as explicit legal opinions or as the views of Web3Caff Research. The opinions in this report only reflect the author’s personal views as of the stated date, and are not related to Web3Caff Research’s position, and may change with subsequent circumstances. The information and viewpoints in this report come from proprietary and non-proprietary sources that Web3Caff Research deems reliable, and may not cover all data. It also does not guarantee the accuracy of such information. Therefore, Web3Caff Research makes no guarantee of any kind regarding the accuracy and reliability of the information, and assumes no responsibility for any errors or omissions arising in any other way (including responsibility for any harm caused to any person due to negligence). This report may contain “forward-looking” information, which may include predictions and forecasts. This report does not constitute any guarantee of any predictions. Whether to rely on the information contained in this report is entirely up to the reader. This report is for reference only and does not constitute investment advice, proposals, or offers of any kind to buy or sell any securities, cryptocurrencies, or take any investment strategies. Please strictly comply with relevant laws and regulations in your country or region.
