Nord Quantique has successfully developed bosonic qubit technology with multimode encoding, outlining a path to a significant reduction in the number of qubits required for quantum error correction. The encoding protects the system against many common types of errors, including bit flips, phase flips, and control errors. Another key advantage over single-mode encoding is that leakage errors, which remove the qubit from the encoding space, can now be detected and corrected. The Tesseract code allows for increased error detection, which is expected to translate into additional quantum error correction benefits as more modes are added, making these results a key stepping stone in the development of this hardware-efficient approach. The core concept of the multimode approach is to use multiple quantum modes simultaneously to encode each qubit. Each mode corresponds to a different resonance frequency inside an aluminium cavity and offers additional redundancy that protects the quantum information; the number of photons populating each mode can also be increased for even more protection. The result is additional error correction capacity and extra means of detecting errors without increasing the number of qubits, with benefits that compound at scale and open new avenues for fault-tolerant quantum computing. Through this advance, Nord Quantique now has a clear path to delivering fault tolerance at utility scale, and the team will continue to improve its results by leveraging systems with additional modes to push the boundaries of quantum error correction.
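For readers unfamiliar with bosonic codes, the sketch below gives the textbook single-mode Gottesman-Kitaev-Preskill (GKP) stabilizers and the standard way a multimode code such as the Tesseract generalizes them; this is the generic formalism, not Nord Quantique's specific implementation.

```latex
% Single-mode GKP code: two commuting displacement stabilizers
S_q = e^{\,i 2\sqrt{\pi}\,\hat{q}}, \qquad S_p = e^{-\,i 2\sqrt{\pi}\,\hat{p}},
% with logical Paulis given by half-lattice displacements:
\bar{Z} = e^{\,i\sqrt{\pi}\,\hat{q}}, \qquad \bar{X} = e^{-\,i\sqrt{\pi}\,\hat{p}}.
% A multimode GKP code on N modes replaces the square lattice in
% two-dimensional phase space with a symplectic lattice in 2N
% dimensions; its 2N stabilizers are displacements along the lattice
% generators. A well-chosen lattice (e.g., the four-dimensional one
% underlying the two-mode Tesseract code) tolerates larger shift
% errors per encoded qubit, which is the source of the added
% protection described above.
```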
Oxford researchers’ new technique identifies and verifies intrinsic topological superconductivity in materials capable of supporting scalable, fault-tolerant quantum computers
Oxford researchers have developed a powerful new technique to identify materials capable of supporting stable quantum states, marking a major step toward scalable, fault-tolerant quantum computing. In this new study, the Oxford researchers verified that the known superconductor uranium ditelluride (UTe2) is an intrinsic topological superconductor. The researchers used a scanning tunneling microscope (STM), which uses an atomically sharp superconducting probe to obtain ultra-high-resolution images at the atomic scale, without using light or electron beams. The experiments used an entirely new operating mode invented by Professor Séamus Davis, called the Andreev STM technique. This method is attuned only to electrons in a special quantum state (the topological surface state) that is predicted to cover the surface of intrinsic topological superconductors. When implemented, the method performed exactly as theory suggested, enabling the researchers not only to detect the topological surface state but also to identify the intrinsic topological superconductivity of the material. The results indicated that UTe2 is indeed an intrinsic topological superconductor, but not exactly the kind physicists have been searching for: based on the reported phenomena, Majorana quantum particles are believed to exist in this material, but they occur in pairs and cannot be separated from each other. The technique now enables researchers to efficiently screen other materials for topological superconductivity, potentially replacing complex and costly synthetic quantum circuits with simpler, crystalline alternatives.
Iskay Quantum Optimizer outperforms popular classical optimization solvers in financial use cases such as portfolio optimization
IBM Quantum partners are attending the IBM Quantum Partner Forum 2025 in London, England, to hear from IBM leadership and researchers about the latest in quantum hardware, quantum algorithm discovery, and powerful new software tools. The company is proud to introduce two new application functions on the Qiskit Functions Catalog: the QUICK-PDE function by French quantum startup ColibriTD, and the Quantum Portfolio Optimizer by Spanish startup Global Data Quantum. These functions provide a ready-made quantum pipeline for researchers and developers to harness the power of utility-scale quantum computers in researching and developing new quantum use cases. Application functions are services that abstract away the complexities of the quantum workflow to accelerate quantum algorithm discovery and application prototyping. They take the same classical inputs as a typical classical workflow and return domain-familiar classical outputs, making it easy to integrate quantum methods into pre-existing application workflows. As the hunt for quantum advantage progresses, more researchers will use application functions to tackle problems that are challenging or impossible for the most powerful high-performance computing (HPC) systems. Other application functions on the catalog include the Iskay Quantum Optimizer by Kipu Quantum, which outperforms popular classical optimization solvers in financial use cases such as portfolio optimization, and the Singularity Machine Learning function by Multiverse Computing, which addresses classification problems that benefit from ensemble learning and complex model optimization. The new QUICK-PDE and Quantum Portfolio Optimizer functions are available now, and users can request a free trial through the Qiskit Functions Catalog homepage.
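To make the application-function workflow concrete, here is a minimal sketch using the qiskit-ibm-catalog client; the function identifier, input schema, and backend name shown are illustrative assumptions, so consult each provider's page on the Qiskit Functions Catalog for the real interface.

```python
# Sketch: loading and running an application function from the Qiskit
# Functions Catalog (pip install qiskit-ibm-catalog). Requires an IBM
# Quantum account with access to the function.
from qiskit_ibm_catalog import QiskitFunctionsCatalog

catalog = QiskitFunctionsCatalog(token="<IBM_QUANTUM_API_TOKEN>")

# Functions are addressed as "provider/function-name"; the exact Iskay
# identifier below is an assumption for illustration.
optimizer = catalog.load("kipu-quantum/iskay-quantum-optimizer")

# Application functions take domain-level classical inputs (here, a
# hypothetical optimization-problem payload) and return classical
# outputs, so no circuit construction is needed on the user side.
job = optimizer.run(
    problem={"linear": {"0": -1.0, "1": 0.5}, "quadratic": {"0,1": 2.0}},
    backend_name="ibm_torino",  # illustrative backend choice
)
print(job.result())  # domain-familiar classical result
```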
Xanadu is building a modular and networked quantum computer using complex photonic states (known as GKP states) with extremely low optical losses on a silicon-based chip platform to achieve scalable fault-tolerant quantum computing
Xanadu has taken a key step toward scalable fault-tolerant quantum computing by demonstrating the generation of error-resistant photonic qubits — known as GKP states — on a silicon-based chip platform, a first-of-its-kind achievement. The milestone positions the Toronto-based quantum startup closer to building a modular and networked photonic quantum computer, a device that uses photons, rather than electrons, to perform calculations, according to the paper and a company statement. By encoding quantum information into complex photon states that can withstand noise and loss, the work directly addresses one of the central obstacles to quantum scalability: preserving fragile quantum data as systems grow in size and complexity. Xanadu’s researchers generated what are known as Gottesman–Kitaev–Preskill (GKP) states — structured quantum states made of many photons arranged in specific superpositions. These states encode information in a way that makes it possible to detect and correct small errors, such as phase shifts or photon loss, using well-known quantum error correction techniques. Xanadu’s experiment demonstrates that GKP states can be produced directly on-chip using integrated photonics, paving the way for scalable manufacturing. The system is based on silicon nitride waveguides fabricated on 300 mm wafers, a format common in commercial semiconductor manufacturing. These waveguides exhibit extremely low optical losses, a critical requirement for preserving quantum coherence over time. In addition to the waveguide platform, the setup included photon-number-resolving detectors with over 99% efficiency, developed in-house by Xanadu. These detectors can distinguish between one photon and many, a capability essential for preparing and verifying complex photonic states like GKP. High-precision alignment, custom chip mounts, and loss-optimized fiber connections ensured that the quantum states could be routed and measured without degrading the delicate information they carried.
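For a sense of how GKP states are handled in software, the sketch below uses Xanadu's open-source Strawberry Fields library and its bosonic backend to simulate a finite-energy GKP qubit classically; it illustrates the kind of state the chip produces but is not the on-chip experiment itself.

```python
# Sketch: simulating a finite-energy GKP logical |0> state with
# Strawberry Fields' bosonic backend (pip install strawberryfields).
import numpy as np
import strawberryfields as sf
from strawberryfields.ops import GKP

prog = sf.Program(1)
with prog.context as q:
    # state=[theta, phi] picks the GKP Bloch-sphere state; epsilon is
    # the finite-energy parameter (smaller = higher-quality state).
    GKP(state=[0, 0], epsilon=0.1) | q[0]

eng = sf.Engine("bosonic")
state = eng.run(prog).state

# The Wigner function shows the characteristic GKP grid: a lattice of
# positive and negative peaks spaced by sqrt(pi) in phase space.
grid = np.linspace(-4, 4, 200)
W = state.wigner(mode=0, xvec=grid, pvec=grid)
print(W.shape)  # (200, 200) array of quasi-probability values
```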
Study finds a quantum-enhanced algorithm on a photonic circuit with a small-sized quantum processor can outperform classical systems in specific machine learning tasks
A study published in Nature Photonics demonstrates that small-scale photonic quantum computers can outperform classical systems in specific machine learning tasks. Researchers from the University of Vienna and collaborators used a quantum-enhanced algorithm on a photonic circuit to classify data more accurately than conventional methods. The goal was to classify data points using a photonic quantum computer and single out the contribution of quantum effects, in order to understand the advantage with respect to classical computers. The experiment showed that even small-sized quantum processors can perform better than conventional algorithms. “We found that for specific tasks our algorithm commits fewer errors than its classical counterpart,” explains Philip Walther from the University of Vienna, lead of the project. “This implies that existing quantum computers can show good performance without necessarily going beyond state-of-the-art technology,” adds Zhenghao Yin, first author of the publication in Nature Photonics. Another interesting aspect of the new research is that photonic platforms can consume less energy than standard computers. “This could prove crucial in the future, given that machine learning algorithms are becoming infeasible due to their high energy demands,” emphasizes co-author Iris Agresti.
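The Vienna team's exact photonic algorithm is not reproduced here; as a hedged stand-in, the sketch below implements a generic quantum-kernel classifier on PennyLane's qubit simulator, illustrating the general idea of letting a quantum circuit define the feature map for a classical classifier.

```python
# Sketch: a generic quantum-kernel classifier (qubit-simulator stand-in,
# not the photonic circuit from the paper).
# pip install pennylane scikit-learn
import numpy as np
import pennylane as qml
from sklearn.svm import SVC

n_wires = 2
dev = qml.device("default.qubit", wires=n_wires)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    # Embed x1, then apply the inverse embedding of x2; the probability
    # of returning to |0...0> equals the kernel value
    # k(x1, x2) = |<phi(x2)|phi(x1)>|^2.
    qml.AngleEmbedding(x1, wires=range(n_wires))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_wires))
    return qml.probs(wires=range(n_wires))

def kernel(x1, x2):
    return kernel_circuit(x1, x2)[0]

# Toy data: two noisy classes in 2D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.5, 0.2, (20, 2)), rng.normal(2.0, 0.2, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# Precompute the Gram matrix and train a kernel SVM on it.
gram = np.array([[kernel(a, b) for b in X] for a in X])
clf = SVC(kernel="precomputed").fit(gram, y)
print("train accuracy:", clf.score(gram, y))
```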
IBM reveals roadmap to world’s first large-scale, fault-tolerant quantum computer in 2029, plans new architectural components that correct errors in real time to achieve exceptional fault tolerance
IBM revealed its expected roadmap for building the world’s first large-scale, fault-tolerant quantum computer, which would enable scaling up quantum computing for real-world practical results. The technology giant said it expects to deliver the platform in 2029. The new computing system, dubbed IBM Quantum Starling, will be built at the company’s campus in Poughkeepsie, New York, and is expected to perform 20,000 times more operations than today’s quantum computers. According to the company, representing the computational state of this new platform would require the memory of more than a quindecillion (a 1 followed by 48 zeros) of the world’s most powerful supercomputers. IBM already operates a large, global fleet of quantum computers and released a new Quantum Roadmap that outlines its intent to build out practical quantum solutions. The company’s most recent processor, the 156-qubit IBM Heron, released in 2024, demonstrated high fidelity with error correction. The company said Starling will access the computational power required to solve monumental problems by running 100 million operations on 200 logical qubits, and it intends to use this as the foundation for IBM Blue Jay, which will be capable of executing 1 billion quantum operations over 2,000 logical qubits. To reach the fault tolerance needed for large scale, the roadmap calls for new architectural components that assist with correcting errors in real time. These include “C-couplers,” which connect qubits over longer distances within IBM Quantum Loon, a processor expected this year. Another processor, IBM Kookaburra, expected in 2026, will be the company’s first modular processor designed to store and process encoded information, combining quantum memory with logic operations as a basic building block for scaling fault-tolerant systems beyond a single chip. In 2027, IBM Quantum Cockatoo will entangle two Kookaburra modules using “L-couplers” to link quantum chips together like nodes in a larger system, marking the final advancement toward building Starling in 2029.
NIST-led team uses quantum mechanics to make a factory for random numbers; Bell test measures pairs of “entangled” photons whose properties are correlated
NIST and the University of Colorado Boulder have launched CURBy, a publicly available random number generator based on quantum nonlocality, offering verifiable, truly random numbers. At the heart of this service is the NIST-run Bell test, which provides truly random results. This randomness acts as a kind of raw material that the rest of the researchers’ setup “refines” into random numbers published by the beacon. The Bell test measures pairs of “entangled” photons whose properties are correlated even when separated by vast distances. When researchers measure an individual particle, the outcome is random, but the properties of the pair are more correlated than classical physics allows, enabling researchers to verify the randomness. Einstein called this quantum nonlocality “spooky action at a distance.” This is the first random number generator service to use quantum nonlocality as a source of its numbers, and the most transparent source of random numbers to date. That’s because the results are certifiable and traceable to a greater extent than ever before. CURBy uses entangled photons in a Bell test to generate certifiable randomness, achieving a 99.7% success rate in its first 40 days and producing 512-bit outputs per run. A novel blockchain-based system called the Twine protocol ensures transparency and security by allowing users to trace and verify each step of the randomness generation process. CURBy can be used anywhere an independent, public source of random numbers would be useful, such as selecting jury candidates, making a random selection for an audit, or assigning resources through a public lottery.
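To make the Bell-test logic concrete, the sketch below computes the CHSH correlation value for an ideal entangled photon pair; quantum mechanics predicts S = 2*sqrt(2) ~ 2.83, exceeding the classical bound of 2, and it is this violation that certifies the measurement outcomes contain genuine randomness. This is a textbook illustration, not NIST's CURBy pipeline.

```python
# Sketch: CHSH value for an ideal singlet state (textbook illustration,
# not NIST's actual setup). For a polarization singlet, the correlation
# between measurements at angles a and b is E(a, b) = -cos(2(a - b)).
import numpy as np

def E(a, b):
    return -np.cos(2 * (a - b))

# Standard CHSH-optimal measurement angles (radians).
a, a2 = 0.0, np.pi / 4
b, b2 = np.pi / 8, 3 * np.pi / 8

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(f"S = {S:.3f}")  # ~2.828 = 2*sqrt(2), above the classical bound of 2
# Any S > 2 certifies nonlocal correlations, so individual outcomes must
# be unpredictable -- the raw randomness CURBy refines and publishes.
```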
New approach to quantum error detection uses dual-rail dimon qubit technology to detect and suppress errors at the individual qubit level, reducing hardware overheads
Oxford Quantum Circuits (OQC), a global leader in quantum computing solutions, has developed a new approach to quantum error detection that could accelerate the development of commercially viable quantum computers. The company’s breakthrough, the Dimon approach, uses dual-rail dimon qubit technology to detect and suppress errors at the individual qubit level, reducing the hardware overheads required for quantum error-corrected logical qubits. This has the potential to fundamentally change the economics of quantum computing by reducing the infrastructure and hardware costs needed for commercially useful quantum computation. The research demonstrates that superconducting qubits can be made more robust with minimal increase in size and complexity. OQC’s breakthrough represents a major step towards a paradigm shift in quantum technology, allowing for the development of affordable quantum computing infrastructure by 2028.
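As background on why dual-rail encoding reduces overhead, the note below sketches the standard dual-rail erasure argument; the specifics of OQC's dimon implementation are not reproduced here.

```latex
% Dual-rail encoding: one logical qubit lives in two modes (rails) with
% exactly one excitation shared between them:
|0_L\rangle = |0\rangle|1\rangle, \qquad |1_L\rangle = |1\rangle|0\rangle.
% The dominant hardware error -- loss of the excitation -- maps both
% codewords out of the code space, to |0\rangle|0\rangle. Measuring the
% joint excitation-number parity therefore flags the loss without
% revealing (or disturbing) the logical state, converting an unknown
% error into a heralded erasure at a known location. Erasures are far
% cheaper to correct: a distance-d code corrects d - 1 located erasures
% but only \lfloor (d-1)/2 \rfloor unlocated errors, which is where the
% hardware-overhead saving comes from.
```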
Microsoft’s new family of 4D geometric codes requires very few physical qubits, checks for errors in a single shot, and exhibits a 1,000-fold reduction in quantum error rates
Microsoft Quantum is advancing the global quantum ecosystem by developing powerful error-correction codes for various types of qubits. Its new family of 4D geometric codes requires very few physical qubits per logical qubit, supports efficient logical operations, can check for errors in a single shot, and exhibits a 1,000-fold reduction in error rates. Microsoft’s qubit-virtualization system, a core component of the Microsoft Quantum compute platform, enables the creation and entanglement of reliable logical qubits from high-quality physical qubits. The codes also reduce the number of steps required to diagnose errors, resulting in low-depth operations and computations. Incorporating them into the Microsoft Quantum compute platform will enable the creation and entanglement of 50 logical qubits in the near term, with the potential to scale to thousands of logical qubits in the future. Microsoft is accelerating the path to quantum advantage by coupling state-of-the-art quantum hardware with the Microsoft Quantum compute platform, which includes error correction, cloud high-performance computing, and advanced AI models; its team of experts is available to provide insight and technical expertise on use cases, industry challenges, and opportunities for innovation and collaborative research projects. Microsoft and Atom Computing have also co-designed a pairing of neutral-atom qubits with the Microsoft Quantum compute platform, offering the extensive scalability, low susceptibility to noise, and high fidelities needed for quantum error correction. The most groundbreaking use cases of quantum computing are likely to be achieved when quantum is used to improve and accelerate other technologies, such as high-performance computing and AI.
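For context on what a 1,000-fold reduction implies, the standard below-threshold scaling law for error-correcting codes is sketched here; the exact parameters of Microsoft's 4D geometric codes are not given in this summary, so the numbers below are illustrative only.

```latex
% Below threshold, the logical error rate of a distance-d code is
% suppressed exponentially in the distance:
p_L \approx A \left( \frac{p}{p_{\mathrm{th}}} \right)^{\lfloor (d+1)/2 \rfloor}
% Illustration (not Microsoft's published figures): at p / p_th = 1/10,
% each increase of d by 2 suppresses p_L by another factor of ~10, so a
% 1,000-fold reduction corresponds to roughly three such steps.
% "Single-shot" codes reach a correction decision from one round of
% syndrome measurement rather than ~d repeated rounds, which is what
% yields the low-depth operations described above.
```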
New magnet-free quantum states could support building topological quantum computers that are stable and less prone to errors
A new study published in Nature reports the discovery of over a dozen previously unseen quantum states in twisted molybdenum ditelluride, expanding the “quantum zoo” of exotic matter. Among them are states that could be used to create what is known, theoretically at the moment, as a topological quantum computer. Topological quantum computers would have unique quantum properties that should make them less prone to the errors that hinder current quantum computers, which are built with superconducting materials. But superconducting materials are disrupted by magnets, which have until now been used in attempts to create the topological states needed for this (still unrealized) next generation of quantum computers. The zoo assembled by lead author Xiaoyang Zhu, Howard Family Professor of Nanoscience at Columbia, solves that problem: the states he and his team discovered can all be created without an external magnet, thanks to the special properties of a material called twisted molybdenum ditelluride. These states, including magnet-free fractional quantum Hall effects, could support non-Abelian anyons, key building blocks for more stable topological quantum computers. The discoveries were made using a pump-probe spectroscopy technique that detects subtle shifts in quantum states with high sensitivity, revealing fractional charges and dynamic quantum behavior.