Summary
In the new study, the team devised a system to continually and rapidly resupply qubits using “optical lattice conveyor belts” (laser waves that transport atoms) and “optical tweezers” (laser beams that grab individual atoms and arrange them into grid-like arrays). The system can reload up to 300,000 atoms per second.
“We’re showing a way where you can insert new atoms as you naturally lose them without destroying the information that’s already in the system,” said Elias Trapp, a co-author of the paper and a physics Ph.D. student in the Kenneth C. Griffin School of Arts and Sciences. “That really is solving this fundamental bottleneck of atom loss.”
The new system operated an array of more than 3,000 qubits for more than two hours, and in theory, the researchers said, it could continue indefinitely. Over those two hours, more than 50 million atoms cycled through the system.
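As a rough consistency check (my own back-of-envelope arithmetic, not a calculation from the paper), the reported figures imply an average atom-replacement rate far below the system's stated maximum reload rate, so the conveyor-belt supply has ample headroom:

```python
# Back-of-envelope check using the figures quoted in the article.
ATOMS_CYCLED = 50_000_000    # atoms cycled through over the run
RUN_SECONDS = 2 * 3600       # roughly two hours of continuous operation
MAX_RELOAD_RATE = 300_000    # stated maximum reload rate, atoms per second

avg_rate = ATOMS_CYCLED / RUN_SECONDS  # average replacement rate, atoms/second
print(f"average replacement rate: {avg_rate:,.0f} atoms/s")

# The average demand is only a few percent of the peak reload capability.
assert avg_rate < MAX_RELOAD_RATE
```

On these numbers the system replaces roughly 7,000 atoms per second on average, about 2% of the quoted 300,000-atoms-per-second reload capacity.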
…
In follow-up experiments, the team plans to apply this approach to perform computations.
…
The new study advances a fast-developing frontier of research. Just this week, a team from Caltech reported a 6,100-qubit system, but it ran for less than 13 seconds.
…
In a third paper published in Nature this week, the team demonstrates a quantum architecture with new methods for error correction. With this new body of research, Lukin believes that it is now possible to envision quantum computers that can execute billions of operations and continue running for days.
I’ve been assuming quantum computing won’t be practical for many years, but maybe it’s closer than I thought.