An illustration from QuiX of photons on a “conveyor belt.” (Image credit: QuiX Quantum)
Scientists have unveiled a groundbreaking approach to preemptively correct errors in light-based quantum computers.
This advance, achieved through a novel method known as photon distillation, brings physicists closer to realizing optical, or “photonic,” quantum computers that can outperform classical supercomputers, a milestone termed quantum advantage.
The research tackles what is widely considered the biggest obstacle to building fault-tolerant, universal quantum computers: the pervasive noise-induced errors that can derail computations.
Unlike their superconducting counterparts, which rely on electronic circuits to generate qubits (the quantum equivalent of classical bits), photonic quantum computers utilize light. Researchers direct beams of photons (light particles) through meticulously designed configurations of mirrors and beam splitters. These photons are then manipulated into intricate quantum states, enabling complex computations.
A key advantage of this quantum computing model is its ability to operate at ambient temperatures. However, the very reason for this operational efficiency is also the source of its primary challenge: photonic quantum computers generate minimal excess heat because light is in perpetual motion. This continuous movement allows computations to occur through photon interactions as they traverse the system. Paradoxically, this also results in a considerably higher rate of errors.
The fault tolerance problem
Superconducting quantum computers require energy input to their circuits to establish qubits — a process that generates heat. While photons do not face this heat issue, there’s a significant drawback: photonic quantum computers are inherently fragile. Photons, by their very nature, are not perfect, meaning a notable fraction of “faulty” photons often circulate, capable of corrupting a computation.
“Given that photons travel at the speed of light, you have qubits that are continuously moving through the architecture,” explained Jelmer Renema, chief scientist and co-founder of QuiX Quantum, to Live Science. “The computational process relies on interactions between these photons when they intersect on the chip.”
“Errors arise when a photon deviates from the expected behavior,” Renema stated. “Occasionally, a rogue photon emerges that disregards the established protocols of the other photons.”
This aberrant photon proceeds through the system without engaging with other photons, thereby introducing a distinct error. As this occurs before the photon is converted into a qubit for processing, it poses a challenge for conventional quantum error correction methods, which typically focus on rectifying qubit errors after they have manifested.

Due to their ability to exist in superposition, qubits are susceptible to errors. (Image credit: Jorg Greuel/Getty Images)
By employing a technique called quantum photonic distillation, QuiX implemented error mitigation strategies to address the fundamental causes of these errors before they could manifest.
“The interference is arranged such that the likelihood of your rogue photon reaching the output is diminished compared to the probability of photons behaving correctly reaching that same output,” Renema elaborated.
This probability is central to photonic quantum computing. As Renema observed, “Everything in photonics operates on probability.” When researchers direct photon beams through a series of mirrors and beam splitters, each photon has a certain chance of acting unpredictably. Without error mitigation, successful computations depend heavily on chance.
The probability of success further decreases for each photon as engineers incorporate additional quantum computing gates into the system.
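To see why, consider a minimal back-of-the-envelope model (a simplification for illustration, not the study's own math): if each photon passes each gate without error with some fixed, independent probability, its odds of surviving the whole circuit shrink exponentially with the number of gates. The per-gate probability below is an assumed, illustrative value.

```python
# Toy model (illustrative only): a photon survives each gate independently
# with probability p, so surviving g gates error-free has probability p**g.

def survival_probability(p_per_gate: float, num_gates: int) -> float:
    """Probability that a photon passes every gate without an error."""
    return p_per_gate ** num_gates

if __name__ == "__main__":
    p = 0.95  # assumed per-gate success probability, chosen for illustration
    for g in (1, 10, 50, 100):
        print(f"{g:>3} gates -> survival probability {survival_probability(p, g):.4f}")
```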
Below the threshold
In a superconducting quantum computer, physical qubits can be grouped into “logical” qubits to achieve fault tolerance. Each logical qubit is a cluster of physical qubits sharing identical data, so that if one or more of them fail, the data remains accessible elsewhere in the cluster and the calculation is not disrupted. In photonic quantum computing, however, increasing the system’s complexity has typically introduced more errors than it resolves.
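The redundancy idea behind logical qubits can be sketched with a toy repetition code: store one bit in several noisy physical qubits and recover it by majority vote. This is a deliberate simplification; real devices use far more sophisticated codes (such as surface codes), and the error rates here are made up for illustration.

```python
# A minimal sketch of the redundancy idea behind logical qubits: a repetition
# code with majority voting. Real hardware uses more sophisticated codes.
import random
from collections import Counter

def encode(bit: int, copies: int = 5) -> list[int]:
    """Store the same bit in several physical qubits."""
    return [bit] * copies

def apply_noise(qubits: list[int], flip_prob: float) -> list[int]:
    """Each physical qubit flips independently with probability flip_prob."""
    return [q ^ 1 if random.random() < flip_prob else q for q in qubits]

def decode(qubits: list[int]) -> int:
    """Recover the stored bit by majority vote across the cluster."""
    return Counter(qubits).most_common(1)[0][0]

random.seed(0)
trials = 10_000
failures = sum(decode(apply_noise(encode(1), 0.1)) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} (vs. 0.1 per physical qubit)")
```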
Photonic distillation also demonstrates “below-threshold error mitigation” — a metric used by the study’s authors to signify that their method reduces the number of errors as the system scales. This contrasts with the usual outcome where scaling a quantum computer leads to an increase in errors, as noted by the QuiX scientists in their paper.
Comparable fault tolerance achievements have been realized in superconducting and neutral-atom quantum computers. For instance, Google reported below-threshold error correction in its Willow quantum processing unit (QPU) in December 2024. Nonetheless, the current study marks the first instance of this accomplishment in light-based systems.
“The volume of qubits you must utilize to establish a single functional qubit is so substantial that the overall expense of the computer becomes prohibitively high,” Renema remarked. “Thus, there’s this inherent trade-off.”
Photonic distillation works by passing imperfect photons through a specialized optical circuit. This circuit utilizes “quantum interference” — a peculiar quantum mechanical phenomenon where the probability amplitudes of quantum states combine — to eliminate physical imperfections and output a single, high-quality photon. This entire process occurs prior to the photons being transformed into qubits.
These enhanced-quality photons then enter the system with a significantly reduced likelihood of exhibiting errant behavior. This improvement in quality results in a net enhancement of error correction, even when accounting for all errors introduced during the photon-to-qubit conversion and subsequent operations.
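As a rough illustration of why conditioning on interference can purify photons, the toy probability model below applies Bayes' rule: if a circuit signals (“heralds”) success more often when its inputs are good, then conditioning on that signal yields an output photon with higher fidelity than the inputs. The acceptance rates are invented for the example and do not describe QuiX's actual circuit.

```python
# Toy probabilistic model of distillation (assumed numbers, not QuiX's
# circuit): two imperfect photons enter, the circuit heralds success more
# often when both inputs are good, and one photon exits.

def distilled_fidelity(p_good: float, accept_if_good: float, accept_if_bad: float) -> float:
    """Posterior probability the output photon is good, given the herald fired."""
    p_both_good = p_good ** 2
    p_accept = p_both_good * accept_if_good + (1 - p_both_good) * accept_if_bad
    return p_both_good * accept_if_good / p_accept

p_in = 0.90  # assumed input photon fidelity (illustrative)
p_out = distilled_fidelity(p_in, accept_if_good=0.5, accept_if_bad=0.05)
print(f"input fidelity {p_in:.2f} -> heralded output fidelity {p_out:.3f}")
```

In this made-up example, conditioning on the herald lifts the fidelity from 0.90 to roughly 0.98, at the cost of discarding the runs where the herald does not fire.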
Given the probabilistic nature of photonic computers, this experimental work showcases a scalable method for error mitigation that is expected to yield below-threshold performance at scales sufficient for performing meaningful quantum computations, according to the study’s authors.
Source: www.livescience.com
