
Image by Gerd Altmann, from Pixabay.
By James Myers
Since Thomas Young’s famous “double-slit” experiment in 1801 demonstrated that light behaves like a wave when it passes through two slits in a barrier, it has become generally accepted that light’s energy exists in two forms. In one form, light behaves as a single particle called a photon; in the other, it behaves as a wave, producing interference patterns. Two recent experiments put the two-form theory of light to the test, and their results conflict.
The results differ on the question of measuring the quantum causes and effects of light, either as individual particles or combined in a wave. Resolving the conflict could have profound implications for the development of stable quantum computing circuits; achieving stability has been the biggest challenge to unleashing the potential power of the technology.

Illustration of light from a single source passing through two slits onto a screen (P). The light beam is calibrated to reach both slits at the same point in the wave cycle, so the waves emerging from the slits are coherent; they combine to produce an interference pattern on the screen that correlates equally to the inputs from both slits. Image: Wikipedia.
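The fringe pattern in the illustration can be sketched numerically. For idealized point slits a distance d apart, light of wavelength λ produces a far-field intensity proportional to cos²(π·d·sin θ / λ); bright fringes appear where the path difference between the slits is a whole number of wavelengths, dark fringes where it is an odd half-number. A minimal sketch (the wavelength and slit spacing below are illustrative, not taken from either experiment):

```python
import numpy as np

# Two-slit interference (idealized point slits, distant screen).
# Parameters are illustrative, not from either experiment discussed.
wavelength = 500e-9   # 500 nm (green light), in metres
d = 10e-6             # slit separation, 10 micrometres

def intensity(theta):
    """Relative intensity on a distant screen at angle theta.
    Constructive interference occurs where the path difference
    d * sin(theta) is a whole number of wavelengths."""
    phase = np.pi * d * np.sin(theta) / wavelength
    return np.cos(phase) ** 2

# Central bright fringe: zero path difference.
print(intensity(0.0))                    # 1.0 (maximum brightness)

# First dark fringe: path difference of half a wavelength.
theta_dark = np.arcsin(wavelength / (2 * d))
print(round(intensity(theta_dark), 6))   # 0.0 (complete cancellation)
```

The same cos² pattern emerges even when photons pass through the apparatus one at a time, which is why the two recent experiments disagree over what, exactly, is interfering.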
The question of how light behaves when passing through two slits was the subject of a friendly debate between Albert Einstein and 1922 Nobel Prize-winning physicist Niels Bohr. Einstein held that a photon acts as a particle passing through just one of the two slits, but in the process applies a slight force on the slit which causes what appears to be an interference pattern of light waves. Bohr argued that measuring photons as particles in a specified path would eliminate the possibility of measuring them as a combined wave, because the act of observation changes the outcome.
In one recent test using the novel idea of “dark photons,” the mathematics shows that light can only be measured as a quantum particle.
According to a mathematical analysis by Celso Jorge Villas-Boas, a physicist at the Federal University of São Carlos in Brazil, and colleagues, the scattering pattern of light passing through two slits establishes that light is only a particle – provided that some photons are “dark.” Dark photons would be inert and undetectable, incapable of interacting with other particles to cause change as they pass through the slits. If dark photons exist, only the active photons would create the wavelike scattering pattern.
Published in April in the journal Physical Review Letters, the mathematical model uses a single atom instead of a broader screen as the end point for light passing through the two slits. Supporting Einstein’s interpretation, the calculations demonstrate that a combination of dark and active photons landing as particles on a single atom could produce the scattering patterns that have long been taken as a sign of light’s wave-particle duality.
The mathematical framework assumes that the positions of the dark and bright patterns observed in the scattering of photons are determined by the geometry of the slits, whose shape and relative positioning cause the dark and active photons to bend through them in a particular way.
The result of the calculations is controversial, in part because they dispense with the generally accepted theory that, for reasons still unknown, observing a quantum process causes the process to change. As Villas-Boas told New Scientist, “The idea the observer can change the reality or the direction of the photon, this seems kind of mystical, but according to our theory this does not happen anymore.”
Another test supports measuring light as both waves and particles.
The mathematical theory of dark photons conflicts with the findings of a recent experiment by physicists at MIT, who created tiny slits with super-cooled atoms. The MIT team, led by 2001 Nobel Prize co-recipient Wolfgang Ketterle, used lasers to arrange more than 10,000 frozen atoms in an evenly spaced lattice, creating a crystal-like structure through which the researchers passed beams of light. The light beams were calibrated to be weak enough that each atom in the lattice could scatter at most one photon.

Wolfgang Ketterle (back row, second from right) and team from MIT used lasers to hold atoms in place in a new test of the double-slit experiment. Image: MIT Department of Physics.
The atoms in the lattice were prepared in different quantum states, allowing for measurement of the varying amount of information the atoms obtained from the photons as they passed through the lattice.
The researchers concluded that, as predicted by quantum theory, the more information that was collected about the particular path of the photons—and therefore about the quantum particle nature of light—the less information they were able to obtain about the scattering pattern from the photons acting as a wave. When a frozen atom was disturbed by a single passing photon acting as a particle, there was a reduction of the interference in the wave pattern, indicating the quantum duality of light as both a particle and a wave.
Ketterle’s team fine-tuned their experiment to direct half of the photons through the atomic lattice as particles and the other half as waves. They achieved this by adjusting the strength of the lasers, allowing some of the atoms to drift and making their positions uncertain, or “fuzzy.” With their wider range, the fuzzier atoms were more easily disturbed by the passing photons and provided a clearer measurement of the path of photons interacting as particles. In their October 2024 paper, entitled Coherent and incoherent light scattering by single-atom wavepackets, the researchers call this “which-way information,” and propose the use of optical lattices to test light scattering with two atoms.
The experiment with atom-sized slits supports Bohr’s interpretation of light as a quantum wave-particle duality, showing Einstein to have been wrong in thinking that a photon acts only as a particle in passing through one of two slits.
Resolving the conflicting results could yield valuable information about the connection of time, the path of light through time, and the timing of information transfers in quantum circuits.
One promising method for creating a quantum computing bit, or qubit, is with particles of light. Last month, in New Type of Quantum Bit, the Photonic GKP Qubit, Propels Development of Error-free Quantum Computing, The Quantum Record reported on the potential for a new type of light-based qubit to eliminate circuit errors and launch quantum computing into its full commercial and scientific potential.
What has so far held back full-scale quantum computing is the problem called “decoherence,” where the quantum connections in circuits break down before they can be maintained for any useful length of time. An effective quantum computer operating at its full potential is thought to require stable connections between millions of qubits, but current prototypes have managed to maintain stable connections between only a few thousand qubits at a time.
If qubits are decohering because the time lengths of their information transfers are misaligned with the distribution of probabilities, then it’s possible that photonics will provide a brighter path to stable quantum circuits.

RS Puppis is one of the largest Cepheid variable stars in the Milky Way galaxy. Image by NASA, ESA, on Wikipedia.
One challenge, however, is that the photon is also the basis for our definition of distance. Since 1983, the length of a metre in the metric system has been defined as the distance of light’s path in a vacuum during a time interval of 1/299,792,458 of a second, with the second defined by the oscillation frequency of a caesium atom. One of the most accurate methods for measuring large-scale galactic distances is with reference to light emitted by a particular type of star called a Cepheid variable, because of the consistent rate of change in its luminosity.
The “c” (the speed of light, a distance over time) in Einstein’s famous E=mc², which relates a body’s energy to its mass, provides the connection between distance and time in changes of energy and mass. Change involves a combination of cause, which initiates the change, and effect, which is the result of the change, and so untangling the question of whether light causes change as a particle or as a wave could theoretically lead to stable photonic quantum circuits. Stable quantum circuits, in turn, would unleash the full potential of the incredible speed and accuracy of quantum computers – far surpassing any supercomputer now in use.
What’s the measurable extent of quantum information from light?

Albert Einstein writing a density formula for the Milky Way galaxy on a blackboard in Pasadena, 1931. Image: Wikipedia.
Consider the nature of light. Nothing in the physical universe can exceed the rate at which light moves through space and time. Covering 299,792,458 metres per second in a vacuum, where it encounters no resistance, light sets the universal speed limit for components of physical matter like quarks, neutrons, protons, electrons, atoms, and molecules. Lightspeed is a universal constant, measured at the same 299,792,458 metres per second regardless of the speed of a light source (such as a Cepheid variable star) relative to the speed of an observer measuring the light source.
Since lightspeed is always the same, Einstein deduced that the rate of physical change in time for an observer moving at a high speed (in his thought experiment, an observer on a fast-moving train) is slower than for someone standing motionless. Einstein’s theories of special and general relativity establish an inverse relationship between space and time: accelerating to cover a greater distance in space slows the effects of time. Tests with extremely sensitive atomic clocks have confirmed the relationship.
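The slowing that the atomic-clock tests confirm follows from the Lorentz factor of special relativity, γ = 1/√(1 − v²/c²): a clock moving at speed v ticks off one second while γ seconds pass for a stationary observer. A sketch of the arithmetic (the speeds chosen below are illustrative):

```python
import math

c = 299_792_458.0  # speed of light in m/s (exact, by definition)

def time_dilation(proper_time_s, v):
    """Elapsed time measured by a stationary observer while a clock
    moving at speed v records proper_time_s, per special relativity:
    t = gamma * t0, with gamma = 1 / sqrt(1 - v^2/c^2)."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return proper_time_s * gamma

# A clock on a jet at ~250 m/s barely slows at all...
print(time_dilation(1.0, 250.0))                # ~1.0000000000003 s

# ...but at 86.6% of lightspeed, one second aboard
# corresponds to roughly two seconds for the observer.
print(round(time_dilation(1.0, 0.866 * c), 3))  # 2.0
```

The effect at everyday speeds is tiny but real, which is why only extremely sensitive atomic clocks can detect it.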
The same lightspeed limit applies to the transmission of information – including data in qubits – from one point to another in space and time.
Information on a change in space, time, or state includes information about the input – the cause of the change – and information about the output – the effect of the change. Quantum mechanics encodes this information in a wave function, which prevents observers from making a complete measurement of all possible connections between input and output. The Uncertainty Principle, which applies universally to quantum interactions and was demonstrated in the MIT experiment, says that the more we know of a particle’s position in space and time, the less we know of its momentum (the particle’s rate and direction of motion), and conversely the more we know of a particle’s momentum, the less we know of its position.
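The position–momentum trade-off can be checked numerically: a wavepacket narrow in position is, via its Fourier transform, broad in momentum, and the product of the two spreads never falls below ħ/2. A minimal sketch for a Gaussian wavepacket, in units where ħ = 1 (grid size and widths are illustrative choices):

```python
import numpy as np

# Numerical check of the Heisenberg trade-off for a Gaussian wavepacket,
# in units where hbar = 1. Narrower in position => wider in momentum.
N = 4096
x = np.linspace(-40.0, 40.0, N)
dx = x[1] - x[0]

def spreads(sigma):
    """Return (delta_x, delta_p) for a Gaussian wavepacket of width sigma."""
    psi = np.exp(-x**2 / (4 * sigma**2))           # position wavefunction
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize
    delta_x = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)
    # Momentum wavefunction via Fourier transform.
    phi = np.fft.fftshift(np.fft.fft(psi)) * dx
    p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, dx))
    dp = p[1] - p[0]
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)    # normalize
    delta_p = np.sqrt(np.sum(p**2 * np.abs(phi)**2) * dp)
    return delta_x, delta_p

for sigma in (0.5, 1.0, 2.0):
    dX, dP = spreads(sigma)
    # Halving the position spread doubles the momentum spread;
    # the product stays pinned at hbar/2 = 0.5.
    print(round(dX, 3), round(dP, 3), round(dX * dP, 3))
```

The Gaussian is the special case that saturates the bound exactly; for any other wavepacket shape the product of spreads comes out larger than ħ/2, never smaller.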
Position at a single time, and rate of change over time, are both measurements with respect to time. So what do the conflicting new double-slit tests say about the nature of time?

Pierre-Simon Laplace. Portrait by Johann Ernst Heinsius, 1775, on Wikipedia .
In a fully predictable universe, the present would be “the effect of its past and cause of its future,” as physicist Pierre-Simon Laplace (1749-1827) proposed in A Philosophical Essay on Probabilities. In that scenario, it would be possible to determine the future by knowing the causes of all past events. Laplace imagined an all-intelligent being (often now referred to as “Laplace’s demon”) able to “embrace in the same formula the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes.”
The Uncertainty Principle guarantees that no creature or technology like Laplace’s demon will ever be capable of measuring the connections of cause and effect in time with perfect certainty, and so we are left to deal with probabilities.
Whereas a conventional computer bit can exist in only one of two states at any time, the power of the quantum computer derives from the qubit existing in a superposition of states, each with its own probability. Increasing the accuracy with which the probable states of light in a photonic qubit can be measured, at any one time and over time, could help to unlock the quantum computer’s potential to provide a far more accurate record of connections between cause and effect than today’s binary computers.
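The contrast between bit and qubit can be sketched with a two-component state vector: the squared magnitudes of the qubit’s two complex amplitudes give the probabilities of measuring 0 or 1. A toy illustration using the standard Hadamard gate (real photonic qubits encode these amplitudes in properties of light such as polarization; this sketch only simulates the arithmetic):

```python
import numpy as np

# A classical bit is 0 or 1. A qubit's state is a vector of two complex
# amplitudes; their squared magnitudes are the measurement probabilities.
ket0 = np.array([1, 0], dtype=complex)   # the definite state |0>

# The Hadamard gate maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

probs = np.abs(psi) ** 2
print(probs)   # [0.5 0.5] -- half the measurements give 0, half give 1

# Measurement collapses the superposition to one definite outcome.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
print(outcome)
```

Until it is measured, the qubit carries both amplitudes at once; the measurement returns only a single classical bit, which is the trade-off the Uncertainty Principle builds into quantum information.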
While the quantum computer can’t become Laplace’s demon, for which we can be grateful, it stands to shed much light on the still mysterious nature of time. What’s the probability of Einstein being both right and wrong about light’s particle nature? Perhaps only time will tell whether Einstein had the last word or not on that important question.