First demonstration of Shor’s historic factoring algorithm

SAN JOSE, Calif. — Scientists at IBM’s Almaden Research Center have performed the world’s most complicated quantum-computer calculation to date. They caused a billion billion custom-designed molecules in a test tube to become a seven-qubit quantum computer that solved a simple version of the mathematical problem at the heart of many of today’s data-security cryptographic systems.

“This result reinforces the growing realization that quantum computers may someday be able to solve problems that are so complex that even the most powerful supercomputers working for millions of years can’t calculate the answers,” said Nabil Amer, manager and strategist of IBM Research’s physics of information group.

In today’s issue of the scientific journal Nature, a team of IBM scientists and Stanford University graduate students report the first demonstration of “Shor’s Algorithm,” a method developed in 1994 by AT&T scientist Peter Shor for using a quantum computer to find a number’s factors, the numbers that are multiplied together to give the original number. Today, factoring a large number is so difficult for conventional computers, yet so simple to verify, that many cryptographic methods rely on that difficulty to protect data.
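The essence of Shor’s method is a reduction from factoring to a period-finding (“order-finding”) problem. The sketch below is a minimal classical illustration only, not the IBM team’s implementation: the function names are ours, and the order-finding step, which is the part a quantum computer accelerates, is done here by slow brute force purely to show how a period is turned into a factor.

```python
# Minimal classical sketch of the reduction behind Shor's algorithm.
# The quantum speed-up lies entirely in find_order(); everything else is
# ordinary arithmetic that a conventional computer handles easily.
import math
import random

def find_order(a, N):
    """Smallest r > 0 with a**r % N == 1 (brute-force stand-in for the quantum step)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    """Return a nontrivial factor of an odd composite N that is not a prime power."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:                         # lucky guess already shares a factor
            return g
        r = find_order(a, N)              # the step a quantum computer speeds up
        if r % 2 == 0:
            candidate = math.gcd(pow(a, r // 2, N) - 1, N)
            if 1 < candidate < N:
                return candidate          # otherwise loop and try another base

print(shor_factor(15))                    # prints 3 or 5
```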

A quantum computer gets its power by taking advantage of certain quantum properties of atoms or nuclei that allow them to work together as quantum bits, or “qubits,” which serve simultaneously as the computer’s processor and memory. By directing the interactions between qubits while keeping them isolated from the external environment, scientists enable a quantum computer to perform certain calculations, such as factoring, exponentially faster than conventional computers. When factoring a large number on a conventional computer, each added digit roughly doubles the time needed to find the factors. In contrast, the time a quantum computer would need grows only polynomially, so each additional digit adds a comparatively small amount of work.
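To picture that contrast in rough numbers (a toy model with made-up step counts, not measurements from the experiment), one can compare a method whose work doubles with every added digit against one whose work grows roughly as the cube of the number’s length:

```python
# Toy scaling comparison (illustrative only): exponential growth, which
# roughly doubles per extra digit, quickly dwarfs polynomial growth.
def exponential_work(digits):
    return 2 ** digits        # doubles with each added digit

def polynomial_work(digits):
    return digits ** 3        # grows roughly as the cube of the length

for d in (10, 20, 40, 80):
    print(f"{d} digits: exponential ~{exponential_work(d):.1e}, "
          f"polynomial ~{polynomial_work(d):.1e}")
```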

The simplest meaningful instance of Shor’s Algorithm is finding the factors of the number 15, which requires a seven-qubit quantum computer. IBM chemists designed and made a new molecule that has seven nuclear spins — the nuclei of five fluorine and two carbon atoms — which can interact with each other as qubits, be programmed by radio frequency pulses and be detected by nuclear magnetic resonance (NMR) instruments similar to those commonly used in hospitals and chemistry labs.

The IBM scientists controlled a vial of a billion billion (10¹⁸) of these molecules so they executed Shor’s algorithm and correctly identified 3 and 5 as the factors of 15. “Although the answer may appear to be trivial, the unprecedented control required over the seven spins during the calculation made this the most complex quantum computation performed to date,” Amer said.
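For readers who want to check the arithmetic behind that answer, here is a worked illustration using base 7 (one valid choice of base for this instance; the Nature paper details the bases actually used in the experiment): the powers of 7 repeat modulo 15 with period 4, and greatest common divisors then expose the factors.

```python
# Worked check of the smallest Shor instance, N = 15, with the illustrative base 7.
import math

powers = [pow(7, k, 15) for k in range(1, 5)]   # [7, 4, 13, 1] -> period r = 4
print(powers)
print(math.gcd(7**2 - 1, 15), math.gcd(7**2 + 1, 15))  # prints 3 5
```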

“Now we have the challenge of turning quantum computation into an engineering reality,” said Isaac Chuang, leader of the research team and now an associate professor at MIT. “If we could perform this calculation at much larger scales — say the thousands of qubits required to factor very large numbers — fundamental changes would be needed in cryptography implementations.”

While the potential for quantum computing is huge and recent progress is encouraging, commercial quantum computers are still many years away. NMR-based quantum computers are laboratory experiments. The first quantum computing applications would likely be co-processors for specific functions, such as solving difficult mathematical problems, modeling quantum systems and performing unstructured searches. Word processing or simple problem-solving tasks are more easily handled by today’s computers.

IBM’s demonstration of Shor’s algorithm also shows the value of quantum computing experiments using NMR, an approach pioneered independently in the mid-1990s by Chuang and Neil Gershenfeld of MIT and by David Cory and colleagues, also at MIT. “Our NMR experiments stimulated us to develop fundamental tools that can be used in many future types of quantum computers,” said Chuang. “Most important of these was a way to simulate and predict the signal degradation caused by ‘decoherence’ — unintended quantum fluctuations. This tool enabled us to minimize decoherence errors in our 7-qubit experiment.”

While NMR will continue to provide a testbed for developing quantum computing tools and techniques, it will be very difficult to develop and synthesize molecules with many more than seven qubits. As a result, new experiments at IBM and elsewhere are aimed at developing new quantum computing systems that can more easily “scale” to the large numbers of qubits needed for practical applications. Strong candidates today include electron spins confined in semiconductor nanostructures (often called quantum dots), nuclear spins associated with single-atom impurities in a semiconductor, and electronic or magnetic flux through superconductors. Atomic and optical implementations continue to be evaluated.

###

Co-authors of the Nature report with Chuang are Gregory Breyta, Mark Sherwood and Costantino Yannoni of IBM-Almaden and Stanford University graduate students Lieven M.K. Vandersypen and Matthias Steffen.

Quantum Computing History:

When quantum computers were first proposed in the 1970s and 1980s (by theorists such as the late Richard Feynman of the California Institute of Technology, Pasadena, Calif.; Paul Benioff of Argonne National Laboratory in Illinois; David Deutsch of Oxford University in England; and Charles Bennett of IBM’s T.J. Watson Research Center, Yorktown Heights, N.Y.), many scientists doubted that they could ever be made practical. But in 1994, Peter Shor of AT&T Research described a specific quantum algorithm for factoring large numbers exponentially faster than conventional computers, fast enough to defeat the security of many public-key cryptosystems. The power of Shor’s algorithm stimulated many scientists to work toward making quantum computers a reality. Significant progress has been made in recent years by numerous research groups around the world.

While at IBM, Chuang extended his reputation as one of the world’s leading quantum computing experimentalists. He led the team that demonstrated the world’s first 2-qubit quantum computer (in 1998, at the University of California, Berkeley). At IBM-Almaden, Chuang and colleagues were the first to demonstrate important quantum computing algorithms: Grover’s database-search algorithm in 1999 with a 3-qubit quantum computer, and order finding last year (August 2000) with a 5-qubit quantum computer. The factorization using Shor’s algorithm announced today is the most complex algorithm yet to be demonstrated by a quantum computer.

In addition to its ambitious experimental program, IBM Research is also noted for its many theoretical contributions to the emerging field of quantum information. IBM scientists pioneered quantum cryptography, quantum communications (including the concept of quantum teleportation) and efficient methods of error correction. David DiVincenzo, research staff member at IBM’s Watson lab, has promulgated the five criteria necessary for a practical quantum computer: 1) a scalable physical system with well-characterized qubits; 2) the ability to initialize the qubit state; 3) decoherence times much longer than the quantum gate operation time; 4) a universal set of quantum gates; and 5) the ability to measure specific qubits.