How Will AI and Quantum Work Together? Quantinuum’s View
There’s a great deal of discussion in the quantum computing (QC) world about running AI on quantum hardware and the development of so-called QuantumAI. The promise seems high, but so do the obstacles, such as the high volume of data used in AI model training.
Trapped-ion quantum computing specialist Quantinuum today posted a blog, “Quantum Computers will Make AI Better,” that briefly reviews progress on adapting AI techniques for quantum hardware and suggests bigger changes are ahead.
“Training models like ChatGPT requires processing vast datasets with billions, even trillions, of parameters. This demands immense computational power, often spread across thousands of GPUs or specialized hardware accelerators. The environmental cost is staggering—simply training GPT-3, for instance, consumed nearly 1,300 megawatt-hours of electricity, equivalent to the annual energy use of 130 average U.S. homes…
“Despite these challenges, the push to develop ever-larger models shows no signs of slowing down.
“Enter quantum computing. Quantum technology offers a more sustainable, efficient, and high-performance solution—one that will fundamentally reshape AI, dramatically lowering costs and increasing scalability, while overcoming the limitations of today’s classical systems.”
While the blog is promotional for Quantinuum’s AI work, emphasizing its deep expertise in that area, it’s also a good read for those unfamiliar with how AI tools and techniques are being adapted to run on quantum computers.
The blog’s main focus is on converting natural language processing (NLP) techniques for use on quantum hardware, an area where, Quantinuum argues, great progress has been made. Quantinuum showcases its work converting recurrent neural networks (RNNs) into parameterized quantum circuits (PQCs):
“In a recent experiment, the team used their quantum RNN to perform a standard NLP task: classifying movie reviews from Rotten Tomatoes as positive or negative. Remarkably, the quantum RNN performed as well as classical RNNs, GRUs, and LSTMs, using only four qubits. This result is notable for two reasons: it shows that quantum models can achieve competitive performance using a much smaller vector space, and it demonstrates the potential for significant energy savings in the future of AI,” report the blog writers.
“In a similar experiment, our team partnered with Amgen to use PQCs for peptide classification, which is a standard task in computational biology. Working on the Quantinuum System Model H1, the joint team performed sequence classification (used in the design of therapeutic proteins), and they found competitive performance with classical baselines of a similar scale. This work was our first proof-of-concept application of near-term quantum computing to a task critical to the design of therapeutic proteins, and helped us to elucidate the route toward larger-scale applications in this and related fields, in line with our hardware development roadmap.”
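The core idea behind the experiments quoted above can be illustrated with a minimal sketch: a variational circuit whose gate angles are trained like neural-network weights, with a measured expectation value serving as the classifier output. The two-qubit circuit, the toy dataset, and the training loop below are illustrative assumptions only, not Quantinuum’s actual models (which handle sequential text and peptide data on real hardware); finite-difference gradients stand in for the parameter-shift rule typically used on quantum devices.

```python
import math
import random

def ry(theta):
    """2x2 matrix for a single-qubit Y-rotation."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def apply_1q(state, g, q):
    """Apply 2x2 gate g to qubit q of a 2-qubit statevector (4 amplitudes)."""
    out = [0j] * 4
    for i in range(4):
        bit = (i >> q) & 1
        for b in (0, 1):
            src = (i & ~(1 << q)) | (b << q)
            out[i] += g[bit][b] * state[src]
    return out

def cnot(state):
    """CNOT with qubit 0 as control, qubit 1 as target."""
    out = list(state)
    for i in range(4):
        if i & 1:  # control bit set: flip target
            out[i] = state[i ^ 2]
    return out

def expect_z0(state):
    """Expectation of Pauli-Z on qubit 0; the classifier's output in [-1, 1]."""
    return sum((1 if (i & 1) == 0 else -1) * abs(a) ** 2
               for i, a in enumerate(state))

def predict(x, thetas):
    """Angle-encode feature x, apply the trainable layer, read out <Z>."""
    state = [1 + 0j, 0j, 0j, 0j]
    state = apply_1q(state, ry(x), 0)           # data encoding
    state = apply_1q(state, ry(x), 1)
    state = apply_1q(state, ry(thetas[0]), 0)   # trainable rotations
    state = apply_1q(state, ry(thetas[1]), 1)
    state = cnot(state)                         # entangling gate
    state = apply_1q(state, ry(thetas[2]), 0)
    return expect_z0(state)

def train(data, steps=200, lr=0.1, eps=1e-4):
    """Minimize squared error via finite-difference gradient descent."""
    random.seed(0)
    thetas = [random.uniform(-0.1, 0.1) for _ in range(3)]
    def loss(ts):
        return sum((predict(x, ts) - y) ** 2 for x, y in data) / len(data)
    for _ in range(steps):
        grads = []
        for k in range(3):
            up, dn = thetas[:], thetas[:]
            up[k] += eps
            dn[k] -= eps
            grads.append((loss(up) - loss(dn)) / (2 * eps))
        thetas = [t - lr * g for t, g in zip(thetas, grads)]
    return thetas

# Toy binary task (a stand-in for sentiment labels): +1 below pi/2, -1 above.
data = [(x, 1 if x < math.pi / 2 else -1)
        for x in [0.1, 0.4, 0.8, 1.2, 1.9, 2.3, 2.7, 3.0]]
thetas = train(data)
acc = sum((predict(x, thetas) > 0) == (y > 0) for x, y in data) / len(data)
print(f"training accuracy: {acc:.2f}")
```

The appeal reported in the blog comes from the small parameter count: here three angles play the role that thousands of weights would in a classical network, which is the source of the claimed memory and energy savings.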
Work on transformers and tensor networks is also briefly reviewed.
A more robust public discussion of how to blend AI and quantum computing is just starting to emerge. Clearly the quantum world doesn’t want to be overwhelmed or slowed by the AI revolution. Making sense of how to combine the two is likely to take a while. That said, Quantinuum, like many in the quantum development community, believes QuantumAI will turn out to be a big ($) thing.
The blog concludes, “As quantum computing hardware continues to improve, quantum AI models may increasingly complement or even replace classical systems. By leveraging quantum superposition, entanglement, and interference, these models offer the potential for significant reductions in both computational cost and energy consumption. With fewer parameters required, quantum models could make AI more sustainable, tackling one of the biggest challenges facing the industry today.”
Link to the Quantinuum blog: https://www.quantinuum.com/blog/quantum-computers-will-make-ai-better.
This article first appeared on sister site HPCwire.