This month, Yale physicists announced a major breakthrough in monitoring quantum information, a form of data that could one day be used by a powerful new kind of computer.

In a Jan. 11 paper published in the journal Science, the study’s authors presented their observations of the state of a quantum bit — or qubit, the basic unit of quantum information — contained on a microscopic superconducting aluminum circuit that was cooled to nearly absolute zero to minimize interference, said lead authors Michael Hatridge and Shyam Shankar, postdoctoral researchers in the Yale Department of Applied Physics. The study showed the degree to which observing a quantum system alters the information it contains and proposed ways these errors can be corrected.

The physicists designed and monitored qubits, which are the foundation of a fundamentally different method of processing information. Conventional computers use a binary system, in which each bit of information is produced by a transistor switching on or off, yielding either a zero or a one. Quantum computing takes advantage of quantum-mechanical effects that become significant when a computer’s hardware is shrunk to extremely small scales. These effects allow a qubit to register as a zero and a one at the same time. This superposition of data would allow quantum computers to handle far more information than today’s computers.
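For readers who want the notation, a qubit’s superposition is conventionally written as a weighted combination of the two classical values (this is standard quantum mechanics, not a result specific to the Yale study):

    \[ |\psi\rangle \;=\; \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 \]

Here |\alpha|^2 and |\beta|^2 are the probabilities of reading out a zero or a one. Reading out the qubit forces it into one of those two values, which is why observation inevitably disturbs the stored information, the effect the Yale experiment set out to quantify.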

Paul Davies, a theoretical physicist at Arizona State University, has written that when completing certain tasks, quantum computers would be “an advance over current supercomputers as great as that of the electronic computer over the abacus.”

The Yale team will use their findings to design better qubits and quantum-limited amplifiers that would make the information more easily detectable while preserving its integrity.

“When you try to measure a quantum state, there is a fundamental limit on what quantum mechanics allows you to do. Usually you don’t reach that limit, but amplifiers allow you to reach that limit and measure a quantum system as well as you can,” Shankar said.
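The limit Shankar refers to can be stated quantitatively. A standard result in quantum measurement theory, the Caves bound (textbook physics, not a finding of the Yale paper), says that a phase-preserving linear amplifier with power gain G must add noise equivalent to at least half a photon referred to its input:

    \[ \mathcal{A} \;\geq\; \tfrac{1}{2}\,\Bigl|\,1 - \tfrac{1}{G}\,\Bigr| \;\longrightarrow\; \tfrac{1}{2} \quad \text{for } G \gg 1 \]

An amplifier operating at this bound is called quantum-limited; it lets experimenters read out a qubit as faithfully as quantum mechanics permits, which is the property the Yale team is pursuing in its amplifier designs.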

MIT mechanical engineering professor Seth Lloyd praised the Yale paper for answering longstanding questions in the field of quantum physics.

“This paper, and several other recent papers on quantum objects, are setting the record straight, and do so in an elegant and impressive fashion,” he said.

The study’s findings have provided the Yale researchers with a deeper understanding of the physics behind a quantum computer, but Hatridge said they still “don’t know the best way to put it together.”

Quantum computing research at Yale receives funding from the United States military because of its great potential for breaking codes and searching databases. A common form of internet encryption today is based on the difficulty of factoring large numbers, a task that quantum computers could in principle perform much faster than a binary computer — though so far, the largest number factored by a quantum computer is 21.
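To give a feel for why factoring underpins encryption, the sketch below is a minimal, illustrative Python implementation of classical trial division (hypothetical example code, not the method used by real codebreakers or by the Yale group). Its running time grows quickly with the size of the number being factored; that scaling is exactly the hardness a quantum factoring algorithm is expected to sidestep.

    def trial_division(n):
        """Factor n by testing divisors up to sqrt(n): simple, but slow for large n."""
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    # 21 = 3 x 7 is the largest number factored by a quantum computer to date.
    print(trial_division(21))  # prints [3, 7]
    # Encryption keys use numbers hundreds of digits long, far beyond this approach.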

Though scientists are progressing towards the development of practical quantum computers, Hatridge said there is still much more work to be done. In the future, the Yale Quantronics Lab hopes to monitor several qubits at once, an achievement that could enable quantum computing on a much larger scale.

“We don’t know what all the possibilities are,” Shankar said. Though the United States military supports the Yale Quantronics Lab, national security goals are not the sole motivating factor behind the researchers’ study of quantum computing.

“From our lab’s point of view, we are doing it because we don’t know what [quantum computing] can do,” Shankar said. “We are slowly building it up to see how far it can go.”

Funding for the study also came from the National Science Foundation, the Intelligence Advanced Research Projects Activity, the Agence Nationale de la Recherche and the Collège de France.

Correction: Feb. 11

A previous version of this article mistakenly stated that quantum computers that can factor large numbers have already been developed. In fact, 21 is the largest number to have been factored by a quantum computer.

JOSH MANDELL