Resolving catastrophic error bursts in large arrays of superconducting qubits

Scalable quantum computing can become a reality with error correction, provided that coherent qubits can be constructed in large arrays. The key premise is that physical errors can remain both small and sufficiently uncorrelated as devices scale, so that logical error rates can be exponentially suppressed. However, energetic impacts from cosmic rays and latent radioactivity violate both of these assumptions.
We use the scale of Google’s Sycamore processor to directly observe high-energy-ray impacts and identify large bursts of quasiparticles that simultaneously and severely limit the energy coherence of all qubits, causing chip-wide failure.
We track the events from their initial localized impact to high error rates across the chip. Our results provide direct insights into the scale and dynamics of these damaging error bursts in large-scale devices, and highlight the necessity of mitigation to enable quantum error correction at scale.
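The exponential-suppression premise mentioned above can be illustrated with a toy model: below the error-correction threshold, the logical error rate shrinks exponentially as the code distance grows, but only while physical errors stay small and uncorrelated. A minimal sketch (the scaling formula and all parameter values here are illustrative assumptions, not numbers from the talk):

```python
# Toy model of error suppression in a distance-d code, assuming the
# common scaling p_L ≈ A * (p / p_th)^((d + 1) // 2).
# p_th (threshold) and A (prefactor) are illustrative assumptions.

def logical_error_rate(p: float, d: int, p_th: float = 0.01, A: float = 0.1) -> float:
    """Approximate logical error rate for physical error rate p at code distance d."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Uncorrelated, below-threshold errors: larger distance suppresses errors.
for d in (3, 5, 7):
    print(d, logical_error_rate(p=0.001, d=d))

# A chip-wide quasiparticle burst pushes p above threshold everywhere at
# once, so increasing d no longer helps while the burst lasts.
for d in (3, 5, 7):
    print(d, logical_error_rate(p=0.05, d=d))
```

This is only a back-of-the-envelope picture: a correlated burst breaks the independent-error assumption behind the formula itself, which is why such events cause outright chip-wide failure rather than merely a higher error rate.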

Speaker: Tom O’Brien

Watch more Google Quantum AI Talks → https://goo.gle/QuantumAITalks
Subscribe to Google Quantum AI → https://goo.gle/QuantumAI

#QuantumAITalks

Product: Quantum – General; Matthew McEwen
