Good news! Very impressive (resulted in three papers)! This could be a breakthrough! We are getting there!
When will I get my hands on the first personal quantum computer, the way I got my hands on the IBM PC back when I was a very young adult?
I just blogged here a few days ago about another very recent breakthrough in quantum computing (also mentioned in the article quoted below): a 6,100-qubit system by Caltech!
"One often-repeated example illustrates the mind-boggling potential of quantum computing: A machine with 300 quantum bits could simultaneously store more information than the number of particles in the known universe.
Now process this: Harvard scientists just unveiled a system that was 10 times bigger and the first quantum machine able to operate continuously without restarting. ...
the team demonstrated a system of more than 3,000 quantum bits (or qubits) that could run for more than two hours, surmounting a series of technical challenges and representing a significant step toward building the supercomputers, which could revolutionize science, medicine, finance, and other fields. ...
In the new study, the team devised a system to continually and rapidly resupply qubits using “optical lattice conveyor belts” (laser waves that transport atoms) and “optical tweezers” (laser beams that grab individual atoms and arrange them into grid-like arrays). The system can reload up to 300,000 atoms per second. ... “That really is solving this fundamental bottleneck of atom loss.” ... Over two hours, more than 50 million atoms had cycled through the system. ...
The new study advances a fast-developing frontier of research. In fact, this week a team from Caltech published a 6,100-qubit system, but it could only run for less than 13 seconds. ...
The approach allows the connectivity of the processor to be changed during the process of computation. In contrast, most existing computer chips — like the ones in your cellphone or desktop — have fixed connectivity. ..."
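A quick back-of-the-envelope check on that oft-quoted 300-qubit claim from the article above: describing the state of n qubits classically takes 2^n complex amplitudes, and 2^300 is roughly 2 x 10^90, comfortably above the commonly cited estimate of about 10^80 particles in the observable universe (that estimate is itself a rough figure, not a measured number). A minimal Python sanity check:

# Rough sanity check of the "300 qubits vs. particles in the universe" comparison.
# Assumption: about 10**80 particles in the observable universe (the usual rough estimate).
amplitudes = 2 ** 300          # complex amplitudes needed to describe a 300-qubit state
particles = 10 ** 80           # rough estimate, not a measured number

print(f"2^300 is about {amplitudes:.2e}")                              # roughly 2.04e+90
print(f"that is about {amplitudes / particles:.0e} times the particle estimate")

Of course, "stores more information" here really means "takes that many amplitudes to describe classically"; you cannot read 2^300 numbers back out of 300 qubits.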
From the abstract (1):
"Neutral atoms are a promising platform for quantum science, enabling advances in areas ranging from quantum simulations and computation to metrology, atomic clocks and quantum networking.
While atom losses typically limit these systems to a pulsed mode, continuous operation could significantly enhance cycle rates, remove bottlenecks in metrology, and enable deep-circuit quantum evolution through quantum error correction.
Here we demonstrate an experimental architecture for high-rate reloading and continuous operation of a large-scale atom array system while realizing coherent storage and manipulation of quantum information.
Our approach utilizes a series of two optical lattice conveyor belts to transport atom reservoirs into the science region, where atoms are repeatedly extracted into optical tweezers without affecting the coherence of qubits stored nearby.
Using a reloading rate of 300,000 atoms in tweezers per second, we create over 30,000 initialized qubits per second, which we leverage to assemble and maintain an array of over 3,000 atoms for more than two hours.
Furthermore, we demonstrate persistent refilling of the array with atomic qubits in either a spin-polarized or a coherent superposition state while preserving the quantum state of stored qubits.
Our results pave the way for realization of large-scale continuously operated atomic clocks, sensors, and fault-tolerant quantum computers."
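What strikes me in abstract (1) is that continuous reloading turns atom loss from a hard stop into a steady-state balance. Here is a tiny expected-value toy model of that idea; the loss and refill numbers below are placeholders I made up for illustration, not the paper's measured parameters:

# Toy expected-value model: array occupancy with and without continuous reloading.
# All numbers below are illustrative placeholders, not the paper's measured parameters.
TARGET = 3000      # desired array size (the paper maintains an array of over 3,000 atoms)
LOSS   = 0.02      # assumed fractional atom loss per second (placeholder)
REFILL = 200       # assumed usable atoms reloaded into the array per second (placeholder)

pulsed = continuous = float(TARGET)
for t in range(1, 301):                        # 300 simulated seconds
    pulsed     -= LOSS * pulsed                # pulsed mode: lost atoms are never replaced
    continuous += REFILL - LOSS * continuous   # continuous mode: reloading offsets losses
    continuous  = min(continuous, TARGET)      # the array cannot exceed its target size
    if t % 100 == 0:
        print(f"t={t:3d}s  pulsed={pulsed:6.0f} atoms   continuous={continuous:6.0f} atoms")

Under the assumed loss rate the unreplenished array decays to almost nothing within a few minutes, while the refilled one sits at its target size indefinitely, which is the qualitative point of the conveyor-belt architecture.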
From the abstract (2):
"Fast, reliable logical operations are essential for realizing useful quantum computers. By redundantly encoding logical qubits into many physical qubits and using syndrome measurements to detect and correct errors, we can achieve low logical error rates.
However, for many practical quantum error correction codes such as the surface code, owing to syndrome measurement errors, standard constructions require multiple extraction rounds—of the order of the code distance d—for fault-tolerant computation, particularly considering fault-tolerant state preparation.
Here we show that logical operations can be performed fault-tolerantly with only a constant number of extraction rounds for a broad class of quantum error correction codes, including the surface code with magic state inputs and feedforward, to achieve ‘transversal algorithmic fault tolerance’.
Through the combination of transversal operations and new strategies for correlated decoding, despite only having access to partial syndrome information, we prove that the deviation from the ideal logical measurement distribution can be made exponentially small in the distance, even if the instantaneous quantum state cannot be made close to a logical codeword because of measurement errors.
We supplement this proof with circuit-level simulations in a range of relevant settings, demonstrating the fault tolerance and competitive performance of our approach.
Our work sheds new light on the theory of quantum fault tolerance and has the potential to reduce the space–time cost of practical fault-tolerant quantum computation by over an order of magnitude."
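To get a feel for the "over an order of magnitude" claim in abstract (2): in the usual surface-code picture a distance-d logical patch uses on the order of d^2 physical qubits, and each logical operation is followed by on the order of d rounds of syndrome extraction, so the space-time cost per operation scales roughly like d^3; with only a constant number of rounds it scales like d^2. A hedged back-of-envelope sketch, where the distance and the constant are numbers I picked for illustration:

# Back-of-envelope space-time cost per logical operation on a distance-d surface code.
# Assumptions (illustrative only): one logical patch uses ~d*d physical qubits,
# the standard construction uses ~d extraction rounds per logical operation,
# and the constant-rounds construction uses a small constant number of rounds.
D = 21                      # assumed code distance (placeholder)
C = 2                       # assumed constant number of extraction rounds (placeholder)

standard = D * D * D        # ~d^2 qubits x ~d rounds
constant = D * D * C        # ~d^2 qubits x O(1) rounds

print(f"standard  ~ {standard} qubit-rounds per logical op")
print(f"constant  ~ {constant} qubit-rounds per logical op")
print(f"reduction ~ {standard / constant:.1f}x")   # ~d/c, about an order of magnitude here

At a distance around 20 and a couple of extraction rounds, that crude ratio d/c already lands around a factor of ten, consistent with the abstract's claim.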
From the abstract (3):
"Quantum simulations of many-body systems are among the most promising applications of quantum computers.
In particular, models based on strongly correlated fermions are central to our understanding of quantum chemistry and materials problems, and can lead to exotic, topological phases of matter.
However, owing to the non-local nature of fermions, such models are challenging to simulate with qubit devices.
Here we realize a digital quantum simulation architecture for two-dimensional fermionic systems based on reconfigurable atom arrays.
We utilize a fermion-to-qubit mapping based on Kitaev’s model on a honeycomb lattice, in which fermionic statistics are encoded using long-range entangled states. We prepare these states efficiently using measurement and feedforward, realize subsequent fermionic evolution through Floquet engineering with tunable entangling gates interspersed with atom rearrangement, and improve results with built-in error detection.
Leveraging this fermion description of the Kitaev spin model, we efficiently prepare topological states across its complex phase diagram and verify the non-Abelian spin-liquid phase by evaluating an odd Chern number.
We further explore this two-dimensional fermion system by realizing tunable dynamics and directly probing fermion exchange statistics.
Finally, we simulate strong interactions and study the dynamics of the Fermi–Hubbard model on a square lattice.
These results pave the way for digital quantum simulations of complex fermionic systems for materials science, chemistry and high-energy physics."
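Abstract (3) says fermionic models are hard for qubit devices "owing to the non-local nature of fermions". The textbook way to see this is the Jordan-Wigner mapping, where a perfectly local hopping term between two lattice sites gets dressed with a string of Pauli-Z operators on every site in between, so vertical hops on a 2D lattice turn into long non-local operators; the paper sidesteps this with a Kitaev-honeycomb-based mapping and long-range entangled states. A minimal sketch of the Jordan-Wigner string problem (the lattice size and the row-major site ordering are arbitrary choices for illustration):

# Illustration of fermionic non-locality under the textbook Jordan-Wigner mapping:
# a hopping term c_i^dag c_j + h.c. maps to (X_i X_j + Y_i Y_j)/2 dressed with Z operators
# on every site strictly between i and j in the chosen 1D ordering.
# Lattice size and row-major ordering are arbitrary choices for illustration.
L = 4                                   # a 4x4 square lattice, sites ordered row-major

def site(x, y):
    return y * L + x

def jw_string(i, j):
    """Pauli-Z string for a hopping term between sites i and j (X/Y parts omitted)."""
    i, j = sorted((i, j))
    return [f"Z_{k}" for k in range(i + 1, j)]

# A horizontal neighbour stays local, a vertical neighbour drags a Z-string of length L-1.
print("horizontal (0,0)-(1,0):", jw_string(site(0, 0), site(1, 0)))   # []
print("vertical   (0,0)-(0,1):", jw_string(site(0, 0), site(0, 1)))   # ['Z_1', 'Z_2', 'Z_3']

On an L x L lattice the vertical strings have length L-1, so the overhead grows with system size, which is exactly the bottleneck a local fermion-to-qubit encoding avoids.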
1) Continuous operation of a coherent 3,000-qubit system (no public access)
Continuous operation of a coherent 3,000-qubit system (preprint, open access)
2) Low-overhead transversal fault tolerance for universal quantum computation (no public access)
Low-overhead transversal fault tolerance for universal quantum computation (preprint, open access)
