IBM and Google expect to build full-scale quantum computers by the end of the decade, following recent technical breakthroughs. IBM claims a new blueprint puts such a machine within reach, while Google says it cleared a major remaining hurdle last year and is on the same timeline. Full-scale quantum computers could revolutionize computing, with significant implications for industries such as finance and healthcare.
IBM and Google are making significant strides in their quest to develop full-scale quantum computers, aiming to complete their projects by the end of the decade. Both companies have recently announced substantial technical breakthroughs that address the key challenges of scaling quantum systems.
IBM, in particular, has released a new blueprint that it claims fills in the missing pieces from earlier designs, putting it on a path to a workable system by 2030. Similarly, Google, which last year cleared what it called one of the most difficult remaining hurdles, has set the same target. Both companies assert that the most fundamental physics problems have been solved, leaving mainly engineering challenges to reach scale [1].
However, the road to large-scale quantum computing is fraught with obstacles. Increasing the number of qubits is not as simple as stacking more onto a chip. Qubits are inherently unstable and susceptible to environmental disturbances, or noise, and their delicate quantum states typically survive for only fractions of a second. As more qubits are added, interference between them grows, sometimes rendering systems unmanageable [1].
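To see why scale makes this so hard, here is a rough back-of-envelope sketch (illustrative, not from the article): assuming an idealized model in which each qubit independently survives each operation with probability 1 − p, the chance of a completely error-free run collapses as qubit counts grow. The error rate and circuit depth used below are assumptions chosen only to show the trend.

```python
# Back-of-envelope sketch (illustrative assumptions, not measured figures):
# without error correction, the probability that an entire computation finishes
# with zero qubit errors shrinks exponentially with machine size and depth.

def survival_probability(num_qubits: int, circuit_depth: int, error_rate: float) -> float:
    """Probability that no error occurs anywhere in the circuit."""
    total_operations = num_qubits * circuit_depth
    return (1.0 - error_rate) ** total_operations

if __name__ == "__main__":
    p = 1e-3  # assumed per-qubit, per-operation error rate (roughly today's order of magnitude)
    for n in (50, 1_000, 100_000, 1_000_000):
        prob = survival_probability(num_qubits=n, circuit_depth=100, error_rate=p)
        print(f"{n:>9,} qubits, depth 100: P(no error) ~ {prob:.3e}")
```

Even at a modest depth, the uncorrected survival probability is already negligible for a few thousand qubits, which is why error correction dominates the scaling discussion.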
IBM encountered this problem with its Condor chip, whose 1,121 qubits suffered from crosstalk between components; the company has since switched to a new type of coupler to reduce interference. Google, for its part, aims to cut component costs tenfold to keep the price of a full machine near $1 billion, while emphasizing that error correction is what will make large-scale quantum computers reliable [2].
Error correction strategies diverge between the two companies. Google uses a method called the surface code, which arranges qubits into a two-dimensional grid. IBM, however, is pursuing a different class of error-correction codes, known as low-density parity-check (LDPC) codes, which it claims will cut qubit requirements by up to 90 percent. While IBM argues its approach is more efficient, experts note that the design is still theoretical and must be proven at scale [1].
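To make the overhead claim concrete, the following sketch compares rough physical-qubit budgets using commonly cited textbook figures rather than either company's actual specifications: a distance-d surface code needs roughly 2d² − 1 physical qubits per logical qubit, and the LDPC estimate simply applies the roughly 90 percent reduction IBM claims. The code distance and logical-qubit target are assumptions for illustration.

```python
# Illustrative overhead comparison (assumed figures, not company specifications).
# A distance-d surface code uses roughly 2*d**2 - 1 physical qubits per logical
# qubit; the LDPC estimate below simply applies the ~90% reduction IBM claims.

def surface_code_overhead(logical_qubits: int, distance: int = 25) -> int:
    """Rough physical-qubit budget for the surface code at code distance d."""
    physical_per_logical = 2 * distance ** 2 - 1
    return logical_qubits * physical_per_logical

def ldpc_overhead(logical_qubits: int, claimed_reduction: float = 0.90) -> int:
    """Apply the claimed reduction to the surface-code budget (pure assumption)."""
    return round(surface_code_overhead(logical_qubits) * (1.0 - claimed_reduction))

if __name__ == "__main__":
    logical = 1_000  # assumed logical-qubit target for a "useful" machine
    print(f"Surface code estimate: ~{surface_code_overhead(logical):,} physical qubits")
    print(f"LDPC estimate (claimed): ~{ldpc_overhead(logical):,} physical qubits")
```

Under these assumptions, a thousand logical qubits would need on the order of a million physical qubits with the surface code, versus roughly a hundred thousand if the claimed LDPC savings hold, which is why the choice of code matters so much for the 2030 timelines.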
Beyond the qubits themselves, companies are rethinking the overall architecture of their machines. Today's superconducting systems are wired together in dense bundles of cabling, a setup that cannot be replicated at million-qubit scale. The leading approach is to integrate many components onto single chips and then connect those chips into modules, which will require larger, more sophisticated refrigerators capable of operating near absolute zero.
The choice of qubit technology could determine which companies can scale fastest. Superconducting qubits, used by IBM and Google, have seen the largest practical gains but are difficult to control and require ultra-cold temperatures. Other approaches, such as trapped ions, neutral atoms, and photons, are more stable but have their own scaling barriers, including slower computation and difficulty linking multiple clusters into a single system [1].
Government funding is increasingly shaping the direction of quantum computing development. Agencies such as DARPA are reviewing the fastest paths to practical systems, while analysts suggest that public investment could narrow the field to a few leading contenders. That strategic backing reflects the expectation that quantum computing could reshape industries such as finance and healthcare.
A large-scale quantum computer could deliver exponential speedups on certain calculations, enabling rapid design of new materials, optimization of complex systems, and the breaking of current encryption schemes. That potential has drawn billions in private and public investment worldwide, even as practical uses remain years away. Still, optimism is running high; as IBM's Jay Gambetta put it, “It doesn’t feel like a dream anymore. I really do feel like we’ve cracked the code and we’ll be able to build this machine by the end of the decade” [1].
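As one textbook illustration of that speedup claim (not drawn from the article), the sketch below contrasts heuristic cost formulas for factoring a cryptographic key: the classical general number field sieve grows sub-exponentially with key size, while Shor's algorithm on a fault-tolerant quantum computer grows roughly polynomially. These are asymptotic scaling estimates, not benchmarks of any real system.

```python
# Heuristic scaling comparison (textbook asymptotics, not benchmarks): classical
# factoring via the general number field sieve (GNFS) versus a rough polynomial
# gate-count estimate for Shor's algorithm on a fault-tolerant quantum computer.

import math

def gnfs_cost(key_bits: int) -> float:
    """Heuristic GNFS cost: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = key_bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def shor_cost(key_bits: int) -> float:
    """Rough gate-count scaling for Shor's algorithm: cubic in the key size."""
    return float(key_bits ** 3)

if __name__ == "__main__":
    for bits in (1024, 2048, 4096):
        print(f"{bits}-bit key: classical ~{gnfs_cost(bits):.2e} ops vs quantum ~{shor_cost(bits):.2e} ops")
```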
References:
[1] https://thequantuminsider.com/2025/08/12/quantum-leaders-tell-ft-quantum-computing-race-enters-final-stretch-but-scaling-challenges-still-loom/
[2] https://www.ainvest.com/news/ibm-google-push-full-scale-quantum-systems-2030-2508/