
Influence of AI on Quantum Computing


Quantum computing represents a paradigm shift in information processing, offering a fundamentally distinct approach to problem-solving and computation. Unlike classical computers, whose bits occupy a single definite state at any given moment, quantum computers leverage the ability of their qubits to exist in many states concurrently. This unique characteristic has led many researchers to posit that quantum computers could deliver exponential speedups and tackle problems that are currently beyond the reach of classical computers.

Classical Computing: The Foundation

The advent of classical computers has propelled us into the Information Age, catalyzing a myriad of digital revolutions, including personal computing, internet communication, smartphones, machine learning, and the broader knowledge economy. Classical computers encode and manipulate data in units known as bits, using billions of semiconductor components called transistors to switch or amplify electrical signals. A classical bit, akin to the power switch on your electronic device, can exist in one of two states at any given time - 0 or 1. This binary nature of classical information processing forms the bedrock of our current digital landscape.

Quantum Computing: The Mechanics

Quantum computers, on the other hand, process information by harnessing the behaviours of subatomic particles such as electrons, ions, or photons. Data is stored in quantum registers composed of quantum bits, or qubits. Unlike classical bits, qubits are not confined to binary states; they are governed by the principles of superposition, entanglement, and interference.

Superposition

Superposition is a quantum property that allows qubits to exist in multiple states until an external measurement is made. For instance, an electron's state could be a superposition of "spin up" and "spin down". Drawing from the famous Schrödinger's cat analogy, a qubit in superposition is akin to the cat being both dead and alive until observed. A qubit can be in the 0 state, the 1 state, or any complex linear combination of 0 and 1. Upon measurement, the superposition collapses and the qubit yields a definite 0 or 1, just like a classical bit.
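
To make this concrete, here is a minimal classical simulation in Python with NumPy (an illustrative sketch, not a quantum program): it prepares the equal superposition (|0⟩ + |1⟩)/√2 with a Hadamard gate and then simulates the measurement collapse described above.

```python
import numpy as np

# Basis state |0> as a complex vector.
ket0 = np.array([1, 0], dtype=complex)

# Hadamard gate: sends |0> to the equal superposition (|0> + |1>) / sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                 # a qubit in superposition
probs = np.abs(psi) ** 2       # Born rule: measurement probabilities

print("amplitudes:", psi)      # ~[0.707, 0.707]
print("P(0), P(1):", probs)    # [0.5, 0.5]

# Measurement collapses the superposition to a definite classical bit.
outcome = np.random.choice([0, 1], p=probs)
print("measured:", outcome)
```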

Entanglement

Entanglement is another quantum phenomenon in which particles become so interconnected that they cannot be described independently, even across vast distances. This is in stark contrast to classical bits, which are independent of each other. In quantum computing, entangled qubits share a joint quantum state, and manipulating one qubit can influence the probability distribution of the entire system. The number of amplitudes needed to describe that state also doubles with each added qubit, growing exponentially and offering a significant advantage over classical computers.
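
The same style of toy simulation illustrates entanglement and the exponential growth just described: two qubits already require 2² = 4 amplitudes (n qubits require 2ⁿ), and a Hadamard followed by a CNOT produces a Bell state whose measurement outcomes are perfectly correlated. Again, this is a purely illustrative classical sketch using the standard textbook gate conventions.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# CNOT gate: flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>: a 2-qubit register needs 2**2 = 4 amplitudes (2**n for n qubits).
state = np.kron(ket0, ket0)
state = np.kron(H, I) @ state   # superpose the first qubit
state = CNOT @ state            # entangle the two qubits

print(state)                    # (|00> + |11>) / sqrt(2), a Bell state
print(np.abs(state) ** 2)       # [0.5, 0, 0, 0.5]: the two qubits always agree
```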

Interference

Interference, the third quantum property underpinning the operation of a quantum computer, arises when the amplitudes of the qubits' possible states add together, much like overlapping waves. Constructive interference increases the probability of measuring a correct solution, while destructive interference decreases it. Quantum algorithms are designed to orchestrate this interference to maximize the likelihood of useful measurement outcomes.
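
A tiny illustration of interference (again a classical NumPy sketch): applying a Hadamard gate twice returns a qubit to |0⟩, because the two computational paths leading to |1⟩ carry opposite amplitudes and cancel, while the paths to |0⟩ reinforce each other.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

once = H @ ket0            # equal superposition
twice = H @ once           # interference: the paths to |1> cancel out

print(np.round(once, 3))   # [0.707, 0.707]
print(np.round(twice, 3))  # [1, 0]: constructive on |0>, destructive on |1>
```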

In essence, quantum computing leverages the principles of superposition, entanglement, and interference to process information in ways that classical computers cannot. The potential is immense: a quantum computer with a few hundred ideal qubits, unaffected by decoherence and noise, could represent more states than there are atoms in the observable universe. However, the field is still nascent, and much research is needed to realize and harness this potential fully.

The Genesis of Quantum Computing

The inception of quantum computing can be traced back to the Soviet mathematician Yuri Manin, who first proposed the concept in his book, "Computable and Uncomputable," published in 1980. In the same year, American physicist Paul Benioff, affiliated with the French Centre de Physique Théorique, introduced a quantum mechanical model of a Turing machine in a scholarly paper.

In 1981, Benioff and American theoretical physicist Richard Feynman presented separate talks on quantum computing at MIT's inaugural Conference on the Physics of Computation. Feynman, in his lecture titled "Simulating Physics with Computers," underscored the necessity of a quantum computer for simulating a quantum system, famously stating, "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical."

Quantum Computing: Gaining Momentum

The pioneering work of Benioff and Feynman sparked a surge of interest in quantum computing during the final decades of the 20th century. British theoretical physicist David Deutsch, intrigued by the potential of a quantum computer to test the "many-worlds interpretation" of quantum physics, proposed the concept of a quantum Turing machine (QTM) in a 1985 paper.

By 1992, Deutsch, in collaboration with Australian mathematician Richard Jozsa, had identified a computational problem that could be solved efficiently on a universal quantum computer using their Deutsch-Jozsa algorithm, yet could not be solved as efficiently by any deterministic classical computer. For his significant contributions, Deutsch is often called the "father of quantum computing."

Quantum Speedup: Shor's and Grover's Algorithms

The late 20th century saw the development of several quantum computer models and algorithms. One of the most notable is Shor's algorithm, developed by Peter Shor, an applied mathematician at AT&T Bell Labs. In 1994, Shor introduced a quantum method for factoring large integers in polynomial time, the threshold at which a problem is generally considered "efficiently solvable" or "tractable."

Factoring, the process of decomposing a number into smaller numbers that multiply to give the original number, is a fundamental mathematical operation. While it is straightforward to multiply factors to produce the original number, finding the factors of a large number is challenging because the search space of possible factors is vast.

Shor's algorithm could break widely used public-key encryption systems by factoring their large composite keys, products of two large primes, in polynomial time, which has significant implications for cryptographic systems and data security. However, a quantum computer capable of running Shor's algorithm against current encryption schemes remains a distant reality, given the limited number and quality of qubits in today's machines.
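
For intuition, the sketch below shows the classical skeleton of Shor's idea (an illustration, not the quantum algorithm itself): factoring N is reduced to finding the period r of aˣ mod N, and that period finding is the step a quantum computer performs exponentially faster. Here the period is found by brute force, so this toy code gains no speedup; it only demonstrates the reduction on a small number.

```python
import math
import random

def find_period(a, N):
    """Brute-force stand-in for Shor's quantum period-finding subroutine:
    returns the smallest r > 0 with a**r % N == 1."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def factor(N):
    """Classical reduction: turn a period of a**x mod N into factors of N."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g, N // g                    # lucky guess already shares a factor
        r = find_period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = math.gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p                # a non-trivial factor of N

print(factor(15))                               # e.g. (3, 5)
```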

In 1996, Lov Grover, a researcher at Bell Labs, made another significant contribution to quantum computing. Grover introduced a quantum algorithm for database search, which provides a quadratic speedup for unstructured search problems typically solved by random or brute-force search, such as inverting a one-way function. Grover's algorithm uses superposition and interference to iteratively amplify the amplitude of the correct solution while suppressing the rest, so that a measurement returns the answer with high probability. It is particularly effective for computational problems where finding a solution is difficult but verifying one is relatively straightforward.
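
The following classical simulation of Grover's iteration (illustrative only; its matrices grow exponentially with the number of qubits, which is precisely what real quantum hardware avoids) searches 8 items for one marked index. The oracle flips the sign of the marked amplitude, a "diffusion" step reflects all amplitudes about their mean, and after roughly (π/4)·√N rounds most of the probability sits on the marked item.

```python
import numpy as np

n_items = 8                       # search space of size N = 2**3 (3 qubits)
marked = 5                        # index of the item the oracle recognizes

# Start in a uniform superposition over all candidate answers.
state = np.full(n_items, 1 / np.sqrt(n_items))

# Oracle: flips the sign (phase) of the marked item's amplitude.
oracle = np.eye(n_items)
oracle[marked, marked] = -1

# Diffusion operator: reflects every amplitude about the mean amplitude.
diffusion = 2 * np.full((n_items, n_items), 1 / n_items) - np.eye(n_items)

# About (pi/4) * sqrt(N) iterations maximize the success probability.
for _ in range(int(round(np.pi / 4 * np.sqrt(n_items)))):
    state = diffusion @ (oracle @ state)

print(np.round(state ** 2, 3))    # ~0.945 probability on index 5 after 2 rounds
```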

Quantum AI/ML

The advent of quantum computing has sparked a wave of excitement due to its potential to revolutionize information processing. This enthusiasm led to the establishment of initiatives for enhanced information sharing, policymaking, and prioritization of national and international research efforts.

In the mid-1990s, the National Institute of Standards and Technology (NIST) and the Department of Defense (DoD) hosted the first U.S. government workshops on quantum computing. By 2000, theoretical physicist David DiVincenzo had outlined the requirements for constructing a quantum computer, known as the DiVincenzo criteria. These criteria include well-defined qubits, initialization to a pure state, a universal set of quantum gates, qubit-specific measurement, and long coherence times.

In 2002, an expert panel convened by Los Alamos National Laboratory released a Quantum Information Science and Technology Roadmap. This roadmap aimed to capture the challenges in quantum computing, provide direction on technical goals, and track progress toward those goals through various technologies and approaches. The panel adopted the DiVincenzo criteria to evaluate the viability of different quantum computing approaches.

Significant milestones were achieved as evaluations of quantum computing models and approaches yielded physical hardware and valuable algorithms. In 1995, Christopher Monroe and David Wineland demonstrated the first quantum logic gate with trapped ions, an indispensable component for constructing gate-based quantum computers. A decade later, researchers at the University of Michigan created a scalable and mass-producible semiconductor chip ion trap, paving the way for scalable quantum computing.

In 2009, researchers at Yale University built the first solid-state gate-model quantum processor. Two years later, D-Wave Systems of Burnaby, British Columbia, became the first company to market a commercial quantum computer. D-Wave's machine, which uses an analogue approach known as quantum annealing, is not a universal quantum computer; it is specialized for problems with a discrete search space containing many local minima or plateaus, such as combinatorial optimization problems.
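
To give a flavour of the problems an annealer targets, here is a toy example (the values and target are made up for illustration) that phrases a small subset-sum question as a QUBO, the "minimize xᵀQx over binary x" format that annealing hardware such as D-Wave's accepts. A brute-force loop stands in for the annealer in this sketch.

```python
import itertools
import numpy as np

# Toy combinatorial problem: pick a subset of values whose sum hits a target.
values = np.array([3, 5, 7, 11])
target = 14

# Expanding (sum_i v_i * x_i - target)**2 and using x_i**2 == x_i for binary x_i
# gives off-diagonal terms v_i * v_j and diagonal terms v_i**2 - 2 * target * v_i.
Q = np.outer(values, values) - 2 * target * np.diag(values)

# Exhaustive search over all binary vectors stands in for the annealer.
best = min(itertools.product([0, 1], repeat=len(values)),
           key=lambda x: np.array(x) @ Q @ np.array(x))

print(best)                                      # (1, 0, 0, 1): pick 3 and 11
print(values[np.array(best, dtype=bool)].sum())  # 14
```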

The introduction of the original D-Wave machine highlighted the potential economic rewards and national security dividends of advances in quantum hardware and software, but the research involved would be expensive and risky. This led to partnerships between private-sector companies and government agencies in the early 2010s. Early adopters of D-Wave quantum computers included Google in alliance with NASA, Lockheed Martin Corporation in cooperation with the University of Southern California, and the U.S. Department of Energy's Los Alamos National Laboratory.

Recognizing the potential of quantum computers in solving intractable problems in computer science, especially machine learning, Google Research, NASA, and the Universities Space Research Association established a Quantum Artificial Intelligence Lab (QuAIL) at NASA's Ames Research Center in Silicon Valley. NASA aims to use hybrid quantum-classical technologies to tackle some of the most challenging machine learning problems, such as generative unsupervised learning. IBM, Intel, and Rigetti are also pursuing goals to demonstrate quantum computational speedups over classical computers and algorithms in various areas, sometimes called quantum supremacy or quantum advantage.

In 2017, University of Toronto assistant professor Peter Wittek founded the Quantum Stream in the Creative Destruction Lab (CDL). Quantum Stream encourages scientists, entrepreneurs, and investors to pursue commercial opportunities in quantum computing and machine learning. Quantum Stream's technology partners include D-Wave Systems, IBM Q, Rigetti Computing, Xanadu, and Zapata Computing. Numerous startups and well-established companies are also forging ahead to create their quantum computing technologies and applications.

In November 2021, IBM Quantum announced Eagle, a 127-qubit quantum processor. Around the same time, the University of Science and Technology of China reported results from a 66-qubit superconducting quantum processor called Zuchongzhi and an even more powerful photonic quantum computer called Jiuzhang 2.0.

Determining who has achieved primacy in quantum computing is challenging due to the murky process of verifying and benchmarking quantum computers and the inherent diversity in current approaches and models of quantum computers. There is excitement surrounding various models for manipulating qubits: gate model quantum computing, quantum annealing, adiabatic quantum computing (AQC), and topological quantum computing. There is also great diversity in methods for building physical implementations of quantum systems.

The physical implementation of quantum computers is crucial because quantum computers and qubits are notoriously difficult to control. Information stored in qubits can escape when they become accidentally entangled with the outside environment, the measurement device and controls, or the material of the quantum computer itself. This seepage of quantum information is called decoherence. Qubits must also be physically shielded from any noise: changing magnetic and electrical fields, radiation from other electronic devices, cosmic rays from space, radiation from warm objects, and other rogue particles and waves.

In 2018, President Donald Trump signed the National Quantum Initiative Act into law. The act is designed to plan, coordinate, and accelerate quantum research and development for economic and national security over ten years. Funded under the act is the Quantum Economic Development Consortium™ (QED-C™), with NIST and SRI International as lead managers.

Several critical online resources support quantum computer science. The Quantum Algorithm Zoo, a comprehensive catalogue of quantum algorithms, is managed by Stephen Jordan in Microsoft Research's Quantum Systems group. IBM hosts the Quantum Experience, an online interface to the company's superconducting quantum systems and a repository of quantum information processing protocols. Qiskit is an open-source software development kit (SDK) for anyone interested in working with OpenQASM (a programming language for describing universal physical quantum circuits) and IBM Q quantum processors. In collaboration with the University of Waterloo, the "moonshot factory" X, and Volkswagen, Google AI announced TensorFlow Quantum (TFQ) in 2020; TFQ is a Python-based open-source library and framework for hands-on quantum machine learning.
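
As a small taste of the Qiskit SDK mentioned above, the sketch below (assuming a local Qiskit installation; exact APIs vary between Qiskit versions) builds and prints a two-qubit Bell-state circuit. The same circuit object can also be serialized to OpenQASM with the SDK's exporters or submitted to IBM's hosted quantum systems.

```python
# Requires: pip install qiskit
from qiskit import QuantumCircuit

# Two-qubit Bell-state circuit: Hadamard on qubit 0, then a CNOT entangling 0 and 1.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

print(qc.draw())   # ASCII rendering of the circuit
```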

The Revolution in Information Science and Technology

Quantum computing, a field that leverages the principles of quantum mechanics, has already begun to influence various sectors, including machine learning (ML) and artificial intelligence (AI), genomics, drug discovery, and more. Quantum simulation, a notable application, could expedite the prototyping of materials and designs, potentially revolutionizing industries like manufacturing and aerospace.

However, current quantum computers can simulate only a few particles and their interactions. Despite this, researchers are uncovering promising insights that could help us understand complex phenomena such as superconductivity, environmentally friendly production methods, and the intricacies of aerodynamics.

A Game Changer in Encryption and AI

In recent years, there have been several groundbreaking developments in quantum computing. For instance, the National Security Agency's SIGINT initiatives, revealed by Edward Snowden in 2014, aimed to break strong encryption and gain access to secure digital networks. The agency planned to develop an $80 million quantum "god machine" for these purposes.

Moreover, researchers have made strides in understanding quantum Darwinism, a theory that explains how our classical physics world emerges from the quantum world. This theory suggests that the transition from quantum to classical is akin to the process of evolutionary natural selection.

A New Era of Information Technology

The convergence of quantum computing and AI, often called Quantum AI/ML (QAI), could drastically transform information science and technology, economic activities, social paradigms, and political arrangements. Some proponents even envision a post-scarcity golden age in which quantum AI democratizes access to seemingly limitless computational possibilities.

Johannes Otterbach, from the quantum-computer company Rigetti, has noted that quantum computing and machine learning are inherently probabilistic, making them natural partners. Quantum computers could significantly speed up training in machine learning, advancing all three primary subcategories of ML: supervised learning, unsupervised learning, and reinforcement learning.

Revolutionizing Various Industries

Quantum computing and AI have already begun to intersect in various applications. For instance, quantum algorithms have been developed for route and traffic optimization, computing the quickest route for each vehicle in a fleet and optimizing it in real time. Companies like Toyota Tsusho Corp and Volkswagen have demonstrated the potential of such quantum routing algorithms to reduce wait times and traffic congestion.

Predictive and risk analytic QAI technology could also aid in forecasting and managing hazards such as geopolitical events, financial panics, and future pandemics. Furthermore, quantum AI could revolutionize fields like seismology, geological prospecting, and medical imaging.

Quantum Ultra-intelligence: The Future of AI

The potential of quantum artificial intelligence has inspired a new literary subgenre called quantum fiction. While these works are purely fictional, they reflect the aspirations of computer scientists striving to engineer an artificial general intelligence (AGI) that possesses self-awareness.

However, it remains to be seen whether we could maintain control over a self-aware QAI or persuade it to enter a collaborative partnership with humanity. As we continue to develop these technologies, ensuring they remain beneficial to society is crucial.

Summary

The convergence of quantum computing and AI is set to revolutionize various aspects of our lives. As Max Tegmark, an MIT physicist and ML specialist, has said, "Everything we love about civilization is a product of intelligence, so amplifying our human intelligence with artificial intelligence has the potential of helping civilization flourish like never before—as long as we manage to keep the technology beneficial."