20 December 2001: See the IBM demonstration of Shor's factoring algorithm published today.
22 August 1999. Thanks to the author and The Sciences.
Source: Hardcopy The Sciences, July/August 1999, pp. 24-30.
BY LOV K. GROVER
LOV K. GROVER is a member of the technical staff in physical sciences research at Bell Labs in Murray Hill, New Jersey. He is the inventor of what has been proven to be the fastest possible search algorithm that could run on a quantum computer.
IN THE DARKEST HOUR OF THE SECOND World War, when France had fallen and German submarines prowled the Atlantic at will, the finest minds in Britain were clustered in a place known as Bletchley Park. There, in low wooden buildings thrown up on the grounds of a country estate fifty miles northwest of London, thousands of men and women worked furiously to decode the orders the Axis sent its ships and troops. As intercepted radio messages chattered out of the teleprinters, clerks would snatch them and copy them by hand onto standardized forms. Then the cryptanalysts would take over. Mostly, they worked by hand: comparing sheets of gridded paper, running through permutations, crossing off false starts, staring, shuffling, guessing. It worked. For days or weeks the fog of war would clear as the Bletchley Park team found the keys to the Red and Light Blue ciphers of the Axis air force, the naval and army Enigma ciphers and then, triumphantly, the super-secret Fish cipher of the German high command.
It seems astonishing that the fate of Europe may once have depended on pencil stubs and intuition. Today, when calculations become complex and the stakes are high, ordinary brainpower almost always gets an electronic assist from a computer -- and the faster and more powerful the better. Many of the hardest calculations still have to do with codes -- though they are intended mostly to test security, not compromise it. But today's codes make Fish seem like the simple letter-substitution ciphers beloved of puzzlers in children's magazines (a becomes b, b becomes c and so forth). One is the so-called RSA protocol, which makes electronic banking possible by assuring banks and their customers that a bogus transfer of funds or a successful forgery would take the world's fastest computer millions of years to carry out. Another is the widespread Data Encryption Standard (DES), which remains secure for most ordinary business transactions.
Other calculations -- searching the Internet, modeling the national economy, forecasting the weather -- likewise strain the capacities of even the fastest and most powerful computers. The difficulty is not so much that microprocessors are too slow; it is that computers are inherently inefficient. Modern computers operate according to programs that divide a task into elementary operations, which are then carried out serially, one operation at a time. Computer designers have tried for some time to coax two or more computers (or at least two or more microprocessors) to work on different aspects of a problem at the same time, but progress in such parallel computing has been slow and fitful. The reason, in large part, is that the logic built into microprocessors is inherently serial. (Ordinary computers sometimes appear to be doing many tasks at once, such as running both a word-processor and a spreadsheet program, but in reality the central processor is simply cycling rapidly from one task to the next.)
A truly parallel computer, in contrast, would have simultaneity built into its very nature. It would be able to carry out many operations at once, to search instantly through a long list of possibilities and point out the one that solves a problem.
Such computers do exist. They are called quantum computers -- not so much because they are inherently small, but because they operate according to the bizarre rules of quantum mechanics, which do indeed govern the world of the very small: the waves and particles of subatomic physics. One quantum rule in particular creates an enormous incentive to apply quantum mechanics to computing: the startling discovery by twentieth-century physicists that elementary particles such as protons, neutrons and electrons can persist in two or more states at once. That makes it possible, at least in principle, for them to be harnessed as processing units in a machine more efficient than any conventionally designed "classical" computer could ever be.
In the past few years, simple quantum computers have been built in the laboratory. Meanwhile, in my field, the theory of quantum computation, investigators are eagerly describing what the more sophisticated machines will be like if they can ever be constructed. We are, if you will, writing the software for a device that does not yet exist. Yet on paper, at least, the prospects are stunning: an algorithm that could factor 140-digit-long numbers a billion (10^9) times faster than is currently possible with the best nonquantum methods; a search engine that could examine every nook and cranny of the Internet in half an hour; a "brute-force" decoder that could unscramble a DES transmission in five minutes.
PERHAPS THE MOST SURPRISING THING ABOUT quantum computing is that it was so slow to get started. Physicists have known since the 1920s that the world of subatomic particles is a realm apart, but it took computer scientists another half-century to begin wondering whether quantum effects might be harnessed for computation. The answer was far from obvious.
Boiled down to its essentials, any computer must meet two requirements: it must be able to store information as strings of 1's and 0's, or bits, and it must have a way of altering the bits in accordance with instructions. A computer transforms its bits by means of gates, or devices designed to carry out simple operations in logic. For example, a NOT gate converts any input bit into its opposite (0 becomes 1, and 1 becomes 0). An OR gate, by contrast, converts two input bits into a single bit whose value is the higher of the two (0 OR 0 yields 0; any other combination gives 1). And an AND gate yields a 1 only if both input bits are 1's; otherwise, its output is a 0. Everything a computer does -- whether synthesizing speech, calculating the billionth digit of pi or beating Garry Kasparov at chess -- ultimately comes about through the transformation of bits by gates.
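The three gates just described can be sketched in a few lines of code. This is a minimal illustration, not anything from the article itself; the function names simply mirror the gate names in the text.

```python
# Illustrative sketch: the NOT, OR and AND gates described above,
# modeled as functions on bits (each bit is 0 or 1).

def NOT(a):
    """Converts a bit into its opposite: 0 becomes 1, 1 becomes 0."""
    return 1 - a

def OR(a, b):
    """Outputs the higher of the two input bits."""
    return max(a, b)

def AND(a, b):
    """Outputs 1 only if both input bits are 1; otherwise 0."""
    return min(a, b)

print(NOT(0), NOT(1))        # 1 0
print(OR(0, 0), OR(0, 1))    # 0 1
print(AND(1, 1), AND(1, 0))  # 1 0
```

Each line of output matches the truth-table behavior given in the text; in principle, every computation reduces to chains of such transformations.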
Could subatomic particles store bits? Could they form gates? Consider the electron. Every electron acts as if it were a little magnet, spinning about an axis, whose magnetic moment can point in only one of two directions, up or down. Thus the spin of the electron is quantized: it has just two possible states, which can readily be identified with the 0's and 1's of an ordinary computer processor. And you can flip the bit -- that is, change a down, or 0, to an up, or 1 -- by adding just a smidgen of energy.
Suppose, however, you give it less energy than that -- say, half a smidgen. Once again, when you observe the electron's spin state, you will find that the spin is quantized: it points either up or down. But now there is an important, though subtle, difference. According to the rules of quantum mechanics, the probability of observing the spin in one or the other state will change. That change arises from a qualitatively new state, with no analogue in the ordinary, nonquantum, laws of physics, called a superposition of the two spin states: a combined, in-between condition that can be, say, 60 percent up and 40 percent down, or 22 percent up and 78 percent down.
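The percentages above can be made concrete with a small sketch. In the standard quantum-mechanical description (not spelled out in the article), a superposition is a pair of amplitudes, one for each spin state, and the probability of observing each outcome is the squared magnitude of its amplitude. The function name here is purely illustrative.

```python
import math

# Illustrative sketch: a qubit state as a pair of amplitudes
# (alpha for spin-up/1, beta for spin-down/0). The probability of
# observing each outcome is the squared magnitude of its amplitude.

def measurement_probabilities(alpha, beta):
    p_up = abs(alpha) ** 2
    p_down = abs(beta) ** 2
    # The two probabilities must sum to 1 (a normalized state).
    assert math.isclose(p_up + p_down, 1.0), "state must be normalized"
    return p_up, p_down

# The "60 percent up and 40 percent down" superposition from the text:
alpha = math.sqrt(0.6)
beta = math.sqrt(0.4)
print(measurement_probabilities(alpha, beta))
```

Until the spin is observed, the electron occupies this in-between condition; observation yields up with probability 0.6 and down with probability 0.4.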
IN THE 1970s AND EARLY 1980s PHYSICISTS AND computer scientists began to investigate how the properties of quantum superpositions might be applied to computing. Early workers in the field -- including the physicists Charles H. Bennett of the IBM Thomas J. Watson Research Center in Yorktown Heights, New York, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of the University of Oxford and the late Richard P. Feynman -- showed that particles in superposed states can function as quantum bits, or qubits, and can undergo operations analogous to the NOT, OR and AND operations of classical computing. But that is not all. Quantum computers, if they can be built, could achieve results that would seem almost magical -- and quite different from anything a classical system has to offer.
Imagine a quantum computer made of two atomic nuclei acted on by an external magnetic field. Suppose the nuclei belong to the neighboring atoms of carbon and hydrogen in a single molecule of chloroform, CHCl3. Just as electrons do, the nuclei align their spins with the magnetic field in the direction up (1) or down (0). One can now begin to compute with this toy system by tickling the nuclei with radio waves. By tuning the frequency and duration of a radio pulse in just the right way, it is possible to make one or the other nucleus flip its spin. It is even possible to ensure that the hydrogen nucleus flips over only if the carbon nucleus is already pointing up. In that case the quantized behavior of the two nuclei functions as what computer scientists call a controlled-NOT gate, with the carbon nucleus as the control. In symbols, with carbon in the first place and hydrogen in the second, there are four possible inputs, (1,1), (1,0), (0,1) and (0,0). Controlled-NOT can then operate in one of four ways: (1,1) -> (1,0); (1,0) -> (1,1); (0,1) -> (0,1); (0,0) -> (0,0). Physicists and computer scientists have proved that, by stringing together single-qubit operations and two-qubit controlled-NOT gates, it is theoretically possible to build a quantum computer capable of doing anything a classical computer can do.
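The four mappings listed above can be verified with a short sketch of the controlled-NOT operation on the (carbon, hydrogen) pair. This is an illustration of the classical truth table only, not of the radio-pulse physics; the function name is an assumption of this sketch.

```python
# Illustrative sketch: controlled-NOT on the pair (carbon, hydrogen),
# with the carbon nucleus as the control. The hydrogen (target) spin
# flips only when the carbon (control) spin is up, i.e. 1.

def controlled_not(control, target):
    if control == 1:
        target = 1 - target  # flip the hydrogen spin
    return (control, target)

# Reproduce the four mappings given in the text:
for pair in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(pair, "->", controlled_not(*pair))
```

Running this prints exactly the four cases in the text: (1,1) -> (1,0), (1,0) -> (1,1), (0,1) -> (0,1) and (0,0) -> (0,0). Together with single-qubit operations, this two-qubit gate is a universal building block.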
BUT THE REAL MAGIC transpires when a two-qubit gate acts on a particle that is in a superposition of spin states. First, place the chloroform molecule in a strong external magnetic field that aligns both atomic nuclei into the down, or 0, position. Th