Quantum error correction (QEC) is the framework for protecting fragile quantum information from the noise and decoherence that inevitably affect physical qubits. Unlike classical error correction, which can simply copy bits for redundancy, QEC must contend with the no-cloning theorem (quantum states cannot be copied) and the destructive nature of measurement (reading out a qubit collapses its superposition). QEC sidesteps both obstacles by encoding a single logical qubit across many entangled physical qubits and using indirect measurements (syndromes) to detect errors without revealing the encoded information.
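To make the contrast with copying explicit, here is the encoding map of the simplest case, the three-qubit repetition code (an illustrative example, not one named above), written out:

```latex
% Encoding is a linear map on the logical amplitudes, not cloning:
\[
  \alpha|0\rangle + \beta|1\rangle
    \;\longmapsto\;
  \alpha|000\rangle + \beta|111\rangle
  \;\neq\;
  \bigl(\alpha|0\rangle + \beta|1\rangle\bigr)^{\otimes 3}
\]
```

The forbidden product state on the right would contain cross terms such as α²β(|001⟩ + |010⟩ + |100⟩), which the encoded state lacks, so the encoding genuinely differs from making three copies.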
The basic principle is redundancy through entanglement. A logical |0⟩ is encoded not as a single physical |0⟩ but as an entangled state of many physical qubits. Errors on individual physical qubits are detected by measuring stabilizers: operators that check parity relationships among the physical qubits without revealing the logical state. When a stabilizer measurement yields an unexpected outcome, the resulting pattern of outcomes (the syndrome) identifies the error so it can be corrected.
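A minimal sketch of this cycle, using the three-qubit bit-flip code again (a pure-NumPy toy, not the full stabilizer formalism; the amplitudes and the injected error are illustrative choices):

```python
import numpy as np

# Three-qubit bit-flip code: qubit k is bit k of the basis-state index.
alpha, beta = 0.6, 0.8                  # arbitrary (normalized) logical amplitudes
state = np.zeros(8, dtype=complex)
state[0b000] = alpha                    # logical |0_L> = |000>
state[0b111] = beta                     # logical |1_L> = |111>

def apply_x(state, qubit):
    """Pauli-X (bit flip) on one physical qubit."""
    out = np.empty_like(state)
    for idx in range(8):
        out[idx ^ (1 << qubit)] = state[idx]
    return out

def measure_syndrome(state):
    """Outcomes of the stabilizers Z0Z1 and Z1Z2.  After at most one X error
    the state is an eigenstate of both checks, so the outcome is deterministic
    and reveals only parities, never alpha or beta."""
    support = int(np.flatnonzero(np.abs(state) > 1e-12)[0])
    return (((support >> 0) & 1) ^ ((support >> 1) & 1),   # Z0Z1 parity
            ((support >> 1) & 1) ^ ((support >> 2) & 1))   # Z1Z2 parity

state = apply_x(state, 1)               # inject an error on the middle qubit
syndrome = measure_syndrome(state)      # -> (1, 1): both checks fire
lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
if lookup[syndrome] is not None:
    state = apply_x(state, lookup[syndrome])   # apply the identified correction
print(syndrome, state[0b000], state[0b111])    # (1, 1) 0.6 0.8: state restored
```

The point the toy captures is that both parity checks commute with the encoded information: the syndrome (1, 1) depends only on where the flip happened, so the correction can be applied without ever measuring alpha or beta.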
QEC is essential for scaling quantum computing because physical error rates (typically 0.1-1% per gate) are far too high for the long computations needed for practical applications such as cryptanalysis or drug discovery, which may require billions of gates. With QEC, as long as the physical error rate stays below the code's error threshold, increasing the code distance (using more physical qubits per logical qubit) suppresses the logical error rate exponentially in the distance. Google's Willow chip demonstrated this below-threshold scaling in 2024, a watershed moment for the field.
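A back-of-the-envelope sketch makes the threshold behavior concrete, using the commonly quoted heuristic p_L ≈ A·(p/p_th)^((d+1)/2) for a distance-d code; the prefactor A, the ~1% threshold, and the sample rates below are assumed placeholders, not measured numbers:

```python
# Heuristic threshold model: p_L ~ A * (p / p_th) ** ((d + 1) // 2).
# A (prefactor), p_th (threshold), and the sample rates are assumed values.
A, p_th = 0.1, 0.01                     # illustrative prefactor, ~1% threshold

def logical_error_rate(p: float, d: int) -> float:
    """Heuristic logical error rate at physical rate p and code distance d."""
    return A * (p / p_th) ** ((d + 1) // 2)

for p in (0.005, 0.02):                 # one rate below, one above threshold
    row = ", ".join(f"d={d}: {logical_error_rate(p, d):.1e}" for d in (3, 5, 7))
    print(f"p={p:.1%} -> {row}")
```

Running this prints rates that shrink with d at p = 0.5% and grow with d at p = 2%: below threshold, each step up in distance multiplies the logical error rate by the same factor p/p_th < 1, while above threshold adding qubits only makes things worse.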