Such noise nearly drowned out the signal in Google's quantum supremacy experiment. Researchers began by setting the 53 qubits to encode all possible outputs, which ranged from zero to 2⁵³ − 1. They then implemented a set of randomly chosen interactions among the qubits that, in repeated trials, made some outputs more likely than others. Given the complexity of the interactions, the researchers said, a supercomputer would need thousands of years to calculate the pattern of outputs. So by measuring that pattern, the quantum computer did something no ordinary computer could match. But the pattern was barely distinguishable from the random flipping of qubits caused by noise. "Their demonstration is 99% noise and only 1% signal," Kuperberg says.
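How can a 1% signal be pulled out of 99% noise? The statistic Google reported rewards samples that land on the ideal distribution's likelier outputs (the linear cross-entropy benchmark). The sketch below is a toy model, not the real experiment: it assumes a hypothetical 12-qubit device rather than 53, a speckle-like ideal distribution standing in for the true circuit output, and a mixture parameter standing in for the device's fidelity.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 12                 # toy size; the real experiment used 53 qubits
dim = 2 ** n
fidelity = 0.01        # ~1% signal, 99% noise, per Kuperberg's description

# Toy "ideal" output distribution: random circuits produce speckle-like
# (exponentially distributed) output probabilities; normalize to sum to 1.
p_ideal = rng.exponential(size=dim)
p_ideal /= p_ideal.sum()

def sample_device(num_samples):
    """Noisy device model: with probability `fidelity` emit an ideal
    sample, otherwise emit a uniformly random bit string (pure noise)."""
    signal = rng.random(num_samples) < fidelity
    ideal = rng.choice(dim, size=num_samples, p=p_ideal)
    noise = rng.integers(dim, size=num_samples)
    return np.where(signal, ideal, noise)

samples = sample_device(1_000_000)

# Linear cross-entropy benchmark: dim * E[p_ideal(sample)] - 1 estimates
# the fidelity -- 0 for pure noise, 1 for a perfect device.
xeb = dim * p_ideal[samples].mean() - 1
print(f"estimated fidelity: {xeb:.4f}")   # ~0.01: the 1% signal survives
```

Even though 99 of every 100 samples are noise, the statistic averages over a million shots, so the faint bias toward the likelier outputs still shows up clearly.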
To realize their ultimate dreams, developers want qubits that are as reliable as the bits in an ordinary computer. “You want to have a qubit that stays coherent until you switch off the machine,” Neven says.
Scientists’ approach of spreading the information of one qubit—a “logical qubit”—among many physical ones traces its roots to the early days of ordinary computers in the 1950s. The bits of early computers consisted of vacuum tubes or mechanical relays, which were prone to flip unexpectedly. To overcome the problem, famed mathematician John von Neumann pioneered the field of error correction.
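Von Neumann's remedy, in its simplest form, was redundancy: store each bit several times and take a majority vote, so that a single unexpected flip is outvoted. The sketch below is a minimal illustration of that classical repetition idea; the quantum codes built on it are far more involved, since an unknown qubit state cannot simply be copied.

```python
import random

random.seed(1)

def noisy_copy(bit, p_flip):
    """Return the bit, flipped with probability p_flip (an unreliable relay)."""
    return bit ^ 1 if random.random() < p_flip else bit

def store_with_redundancy(bit, p_flip, copies=3):
    """Encode one logical bit as several noisy physical copies."""
    return [noisy_copy(bit, p_flip) for _ in range(copies)]

def majority_vote(copies):
    """Decode by majority: a single flipped copy is outvoted."""
    return 1 if sum(copies) > len(copies) / 2 else 0

# Compare raw vs. encoded error rates for a physical flip probability of 5%.
p_flip, trials = 0.05, 100_000
raw_errors = sum(noisy_copy(0, p_flip) for _ in range(trials))
logical_errors = sum(majority_vote(store_with_redundancy(0, p_flip))
                     for _ in range(trials))
print(f"raw error rate:     {raw_errors / trials:.4f}")      # ~0.05
print(f"logical error rate: {logical_errors / trials:.4f}")  # ~0.007
```

With three copies, the logical bit fails only when two or more copies flip at once, so a 5% physical error rate drops to roughly 3p² ≈ 0.7%, and adding more copies pushes it lower still.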