D-Wave has released a collection of offerings to help developers explore and advance quantum artificial intelligence (AI) and machine learning (ML) innovation, including an open-source quantum AI toolkit and a demo. Available now for download, the quantum AI toolkit enables developers to integrate quantum computers into modern ML architectures. Developers can use the toolkit to experiment with D-Wave™ quantum processors for generating simple images. By releasing this new set of tools, D-Wave aims to help organizations accelerate the use of annealing quantum computers in a growing set of AI applications. The quantum AI toolkit, part of D-Wave’s Ocean™ software suite, provides direct integration between D-Wave’s quantum computers and PyTorch, a production-grade ML framework widely used to build and train deep learning models. The toolkit includes a PyTorch neural network module for using a quantum computer to build and train an ML model known as a restricted Boltzmann machine (RBM). By integrating with PyTorch, the new toolkit aims to make it easy for developers to experiment with quantum computing to address computational challenges in training AI models. “With this new toolkit and demo, D-Wave is enabling developers to build architectures that integrate our annealing quantum processors into a growing set of ML models,” said Dr. Trevor Lanting, chief development officer at D-Wave.
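For orientation, the sketch below is illustrative only and does not use D-Wave’s actual toolkit API (whose module names are not reproduced here). It shows a restricted Boltzmann machine written as an ordinary PyTorch module and trained with contrastive divergence; a classical Gibbs step stands in where a QPU-backed sampler would supply the negative-phase samples.

```python
# Minimal sketch (not D-Wave's toolkit API): an RBM as a PyTorch module trained
# with contrastive divergence. The classical Gibbs step marked below is where a
# quantum-annealer-backed sampler would supply the model samples instead.
import torch
import torch.nn as nn

class RBM(nn.Module):
    def __init__(self, n_visible, n_hidden):
        super().__init__()
        self.W = nn.Parameter(torch.randn(n_visible, n_hidden) * 0.01)
        self.b_v = nn.Parameter(torch.zeros(n_visible))   # visible biases
        self.b_h = nn.Parameter(torch.zeros(n_hidden))    # hidden biases

    def sample_h(self, v):
        p = torch.sigmoid(v @ self.W + self.b_h)
        return p, torch.bernoulli(p)

    def sample_v(self, h):
        p = torch.sigmoid(h @ self.W.t() + self.b_v)
        return p, torch.bernoulli(p)

    def free_energy(self, v):
        # F(v) = -b_v . v - sum_j softplus(v . W_j + b_h_j)
        return -(v @ self.b_v) - torch.nn.functional.softplus(v @ self.W + self.b_h).sum(dim=1)

rbm = RBM(n_visible=64, n_hidden=16)
opt = torch.optim.SGD(rbm.parameters(), lr=0.05)
data = torch.bernoulli(torch.full((32, 64), 0.5))       # toy binary "images"

for step in range(100):
    # Negative phase: one classical Gibbs step; a QPU sampler would replace this.
    _, h = rbm.sample_h(data)
    _, v_model = rbm.sample_v(h)
    v_model = v_model.detach()                           # negative samples held fixed
    loss = rbm.free_energy(data).mean() - rbm.free_energy(v_model).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```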
New quantum framework for analysing higher-order topological data achieves linear scaling in signal dimension by using quantum linear systems algorithms that are compatible with the data’s native format, enabling efficient encoding and manipulation of multi-way signals
A team of researchers led by Professor Kavan Modi from the Singapore University of Technology and Design (SUTD) has taken a conceptual leap into this complexity by developing a new quantum framework for analysing higher-order network data. Their work centres on a mathematical field called topological signal processing (TSP), which encodes not only connections between pairs of points but also relationships among triplets, quadruplets, and beyond. Here, “signals” are information that lives on higher-dimensional shapes (triangles or tetrahedra) embedded in a network. The team introduced a quantum version of this framework, called Quantum Topological Signal Processing (QTSP): a mathematically rigorous method for manipulating multi-way signals using quantum linear systems algorithms. Unlike prior quantum approaches to topological data analysis, which often suffer from impractical scaling, the QTSP framework achieves linear scaling in signal dimension, an improvement that opens the door to efficient quantum algorithms for problems previously considered out of reach. The technical insight behind QTSP lies in the structure of the data itself. Classical approaches typically require costly transformations to fit topological data into a form usable by quantum devices. In QTSP, however, the data’s native format is already compatible with quantum linear systems solvers, thanks to recent developments in quantum topological data analysis. This compatibility lets the team circumvent a major bottleneck (efficient data encoding) while keeping the algorithm mathematically grounded and modular. Still, loading data into quantum hardware and retrieving results without overwhelming the quantum advantage remains an unsolved challenge: even with linear scaling, quantum speedups can be nullified by overheads in pre- and post-processing. The framework has been demonstrated through a quantum extension of the classical HodgeRank algorithm, with potential applications in recommendation systems, neuroscience, physics and finance.
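For context, the short sketch below implements classical HodgeRank, the algorithm QTSP extends, not the quantum version. Given noisy pairwise comparisons on the edges of a graph (the data here is an invented toy set), it recovers a global ranking by least squares.

```python
# Classical HodgeRank sketch: given observed score differences Y_ij on graph edges,
# find item scores s minimizing sum over edges of (s_i - s_j - Y_ij)^2.
import numpy as np

items = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]      # observed comparison pairs
y = np.array([1.0, 0.5, 0.3, 2.0])            # observed score differences per edge

# Incidence matrix B: one row per edge, +1 on the first endpoint, -1 on the second,
# so that (B @ s)[e] = s_i - s_j for edge e = (i, j).
B = np.zeros((len(edges), items))
for e, (i, j) in enumerate(edges):
    B[e, i], B[e, j] = 1.0, -1.0

# Least-squares solution (defined only up to an additive constant).
s, *_ = np.linalg.lstsq(B, y, rcond=None)
s -= s.mean()                                  # fix the constant by centering
print("recovered ranking scores:", s)
```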
Physicists leverage real-time AI control to assemble world’s largest 2,024-atom quantum array, paving the way for scalable, efficient quantum computing breakthroughs
A team led by Chinese physicist Pan Jianwei used artificial intelligence (AI) to help create an atom-based quantum computing component that dwarfs previous systems in size, raising hopes that neutral-atom machines could one day operate with tens of thousands of qubits. The team arranged 2,024 rubidium atoms, each functioning as a qubit, into precise two- and three-dimensional arrays. The feat reportedly marks a tenfold increase over the largest previous atom arrays and addresses one of the field’s most stubborn bottlenecks: how to scale beyond a few hundred qubits without prohibitive delays. Until now, researchers typically moved atoms into place one at a time, making large-scale arrays impractical. Pan’s team, working with the Shanghai Artificial Intelligence Laboratory, replaced this slow step with a real-time AI control system that shifts every atom in the array simultaneously. The setup uses a high-speed spatial light modulator to shape laser beams into traps that corral the atoms. The AI system calculates where each atom needs to go and directs the lasers to move them into position in just 60 milliseconds (60 thousandths of a second, about the time it takes a hummingbird to flap its wings five times), regardless of whether the array contains hundreds or thousands of atoms. In principle, the method could scale to arrays with tens of thousands of atoms without slowing down. If successful, scaling neutral-atom arrays to that size could allow them to run algorithms that are currently beyond the reach of classical computers and existing quantum prototypes. Applications could range from simulating complex molecules for drug discovery to solving optimization problems in logistics and materials science. The AI-guided control method, coupled with high-precision lasers, essentially removes the scaling penalty that has long plagued neutral-atom designs.
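The planning problem at the heart of such rearrangement can be illustrated with a small, purely classical sketch (this is not the team’s AI control system): deciding which loaded atom should travel to which target trap site so that total movement is minimized, posed here as a minimum-cost assignment.

```python
# Illustrative only: atom-to-site planning as a minimum-cost assignment problem,
# with randomly placed "loaded" atoms and a 5x5 grid of target trap sites.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
loaded = rng.uniform(0, 50, size=(30, 2))                         # current atom positions (um)
targets = np.stack(np.meshgrid(np.arange(5), np.arange(5)), -1).reshape(-1, 2) * 10.0

# Cost matrix: distance from every loaded atom to every target trap site.
cost = np.linalg.norm(loaded[:, None, :] - targets[None, :, :], axis=-1)

atom_idx, site_idx = linear_sum_assignment(cost)                  # optimal atom -> site pairing
print("total move distance:", cost[atom_idx, site_idx].sum())
```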
Discovery of “neglectons” boosts topological quantum computing—theorized quasiparticles enable robust, universal quantum logic by expanding computational power of special particles called anyons
A team of mathematicians and physicists in the US has discovered a way to exploit a previously neglected aspect of topological quantum field theory, revealing that topological states can be much more broadly useful for quantum computation than previously believed. The quantum bits in topological quantum computers are based on particle-like knots, or vortices, in the sea of electrons washing through a material. The advantage of this anyon-based quantum computing is that the only thing that can change the state of the anyons is moving them around in relation to each other, a process called “braiding” that alters their relative topology. However, not all anyons are up to the task. In the semisimple model, braiding the remaining anyons, known as Ising anyons, yields only a limited range of computational logic gates, ones that can be efficiently simulated by classical computers, which reduces their usefulness for truly ground-breaking quantum machines. The team solved this problem with ingenious workarounds created by Lauda’s PhD student, Filippo Iulianelli, which restrict the computational space to only those regions where anyon transformations work out as unitary.
Terra Quantum’s QMM-Enhanced Error Correction boosts quantum processor fidelity by reducing errors up to 35% without added complexity or mid-circuit measurements
Terra Quantum has introduced QMM-Enhanced Error Correction, a hardware-validated, measurement-free method that suppresses quantum errors and improves fidelity on existing processors without architectural changes. Validated on IBM’s superconducting processors, the QMM layer functions as a lightweight, unitary “booster” that enhances fidelity without mid-circuit measurements or added two-qubit gates, offering a powerful alternative to traditional surface codes. A single QMM cycle achieves 73% fidelity and is entirely unitary and feedback-free. When combined with a repetition code, logical fidelity increases to 94%, a 32% gain achieved without adding CX gates. In hybrid workloads such as variational quantum classifiers, QMM reduces training loss by 35% and halves run-to-run performance variance. Simulations show that three QMM layers can achieve error rates comparable to those of a distance-3 surface code while requiring ten times fewer qubits. QMM is especially relevant in environments where traditional error correction is impractical or cost prohibitive. It addresses core challenges across photonic and analog platforms where mid-circuit measurements are infeasible, cloud-based quantum systems that demand minimal gate depth and latency, and hybrid quantum-classical applications, where even marginal stability gains translate into significant performance benefits. Terra Quantum’s QMM layer introduces a new architectural class for quantum systems. Think of it as a quantum tensor core: a compact, circuit-level module that boosts fidelity and suppresses coherent errors without increasing circuit depth or gate count. With up to 35% error reduction, seamless integration, and no extra two-qubit operations, QMM enables more performance per qubit, per dollar, and per watt. For hardware vendors, system integrators, and developers, this provides a clear path toward scalable, fault-tolerant quantum computing without requiring a redesign of the stack.
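The QMM layer itself has not been published, so it is not reproduced here. For reference, the sketch below builds the conventional three-qubit repetition (bit-flip) code that the reported logical-fidelity figure combines QMM with, using Qiskit; the comment marks where a measurement-free layer of extra unitaries would conceptually sit.

```python
# Reference sketch of a 3-qubit repetition (bit-flip) code in Qiskit; QMM itself
# is not public and is not shown. The marked spot is where a measurement-free,
# unitary "booster" layer would conceptually be inserted.
from qiskit import QuantumCircuit

qc = QuantumCircuit(5, 2)        # 3 data qubits (0-2) + 2 syndrome ancillas (3-4)
qc.h(0)                          # prepare some logical state on the first data qubit
qc.cx(0, 1)
qc.cx(0, 2)                      # encode the state across three physical qubits

# (A unitary, measurement-free error-suppression layer would go here.)

qc.cx(0, 3)
qc.cx(1, 3)                      # ancilla 3 records the parity of data qubits 0 and 1
qc.cx(1, 4)
qc.cx(2, 4)                      # ancilla 4 records the parity of data qubits 1 and 2
qc.measure(3, 0)
qc.measure(4, 1)                 # syndrome bits flag which data qubit (if any) flipped
print(qc.draw())
```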
MIT says quantum computing is surging in USA with over 40 quantum processing units offered, a 5x increase in patents, and $2.2 billion in venture investment in 2024
Quantum computing is gaining significant business and commercial potential, according to a new report by researchers at the MIT Initiative on the Digital Economy. The “Quantum Index Report 2025” provides a comprehensive assessment of the state of quantum technologies, aiming to make them more accessible to entrepreneurs, investors, teachers, and business decision-makers. The report highlights the increasing interest in quantum computing, with the US leading the field with over 40 quantum processing units (QPUs) on offer. The report also notes that quantum technology patents have soared, with corporations and universities leading innovation efforts. Venture capital funding for quantum technology reached a new high in 2024, with quantum computing firms receiving the most funding ($1.6 billion), followed by quantum software companies at $621 million. Businesses are also buzzing about quantum computing, with the frequency of quarterly mentions rising steadily from 2022 to 2024. Demand for quantum skills has nearly tripled since 2018, prompting universities to establish quantum hubs and programs connecting business leaders with researchers. Taken together, the report points to rapid progress across a wide range of areas and to broad, deep development in the field.
Quantum-as-a-Service provides businesses with cloud-based, pay-as-you-go access to quantum computing, eliminating multimillion-dollar infrastructure costs and lowering expertise barriers for modeling and optimization
Current quantum computers are hugely expensive to own and difficult to maintain. This is where quantum-as-a-service (QaaS) comes in. Thanks to QaaS, businesses that want to experiment with quantum computing, or even start putting it to operational use, don’t need to spend millions of dollars on hardware and a dedicated facility to operate it from. Instead, QaaS providers let businesses and research organizations access their quantum computers through the cloud, using a pay-as-you-go model to minimize initial overheads. QaaS is an increasingly attractive option for businesses and organizations that want to experiment with quantum computing without incurring huge initial outlays. For instance, financial services companies are using it to model risk and to make sense of the seemingly chaotic behavior of markets, helping them make better investment decisions. And while multinational banks like JP Morgan and HSBC have money to invest in IT infrastructure, the quantum skills shortage still makes it difficult to find workers with the technical knowledge to maintain such systems. In short, QaaS hugely reduces (in principle) the financial cost, the logistical requirements, and the skills overhead of running quantum computing projects, potentially making the technology available to a much wider user base. If your business involves modeling complex systems or real-world environments, or tackling optimization challenges (such as finding the most efficient route between a large number of destinations), then quantum is certainly worth exploring; the sketch below shows what a pay-as-you-go workflow can look like.
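As a hedged illustration of that workflow, the snippet uses D-Wave’s open-source Ocean SDK (one QaaS option among several) to encode a tiny, invented route-selection problem as a binary quadratic model and hand it to a sampler. A local reference solver lets it run offline; with a QaaS account and API token, the commented-out cloud sampler is a drop-in replacement billed per use.

```python
# QaaS workflow sketch with D-Wave's Ocean SDK: build a small QUBO, then sample it.
# Runs offline with dimod's brute-force reference solver; the commented line shows
# where a cloud QPU sampler (pay-as-you-go, requires an API token) would slot in.
import dimod
# from dwave.system import DWaveSampler, EmbeddingComposite   # cloud QPU access

# Toy problem (invented data): reward choosing each delivery route, but penalize
# choosing overlapping pairs of routes.
bqm = dimod.BinaryQuadraticModel(
    {"route_a": -1.0, "route_b": -1.2, "route_c": -0.8},         # per-route payoff
    {("route_a", "route_b"): 2.0, ("route_b", "route_c"): 2.0},  # overlap penalties
    0.0,
    dimod.BINARY,
)

sampler = dimod.ExactSolver()
# sampler = EmbeddingComposite(DWaveSampler())                  # drop-in QaaS sampler
result = sampler.sample(bqm)
print(result.first.sample, result.first.energy)                  # best route selection found
```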
D-Wave’s study finds 27% of business leaders whose company has implemented quantum optimization or plans to do so within the next two years expect a return on investment of more than $5 million in the first 12 months
D-Wave Quantum study highlights the potential for quantum optimization to create value across industries. According to the study, 46% of surveyed business leaders whose company has implemented quantum optimization, or plans to do so within the next two years, expect a return on investment of between $1 million and $5 million, while 27% predict a return of more than $5 million in the first 12 months. A majority of the business leaders surveyed (81%) believe they have reached the limit of the benefits they can achieve through optimization solutions running on classical computers. Against that backdrop, many are starting to explore whether quantum technologies can help: 53% are planning to build quantum computing into their workflows and 27% are considering doing so, indicating a growing recognition of quantum computing’s real-world business value. Among those who have already adopted it, 22% are seeing quantum make a significant impact, while another 50% anticipate it will be disruptive for their industry. The results show that quantum computing is gaining recognition among business leaders for its potential to deliver major efficiencies on complex optimization problems and operational improvements. 60% of respondents expect quantum computing-based optimization to be very or extremely helpful in solving the specific operational challenges their companies face; among the respondents most familiar with quantum, this figure rises to 73%, including nearly a quarter who describe it as “a game changer.” The areas in which business leaders expect to benefit from an investment in quantum optimization include supply chain and logistics (50%), manufacturing (38%), planning and inventory (36%), and research and development (36%). Most respondents (88%), especially those in the manufacturing industry, believe that their company would go “above and beyond” for even a 5% improvement in optimization.