Coincidence counting (physics)

In quantum physics, coincidence counting is used in experiments testing particle non-locality and quantum entanglement. In these experiments, two or more particles are created from the same initial packet of energy, inextricably linking (entangling) their physical properties. Separate particle detectors measure the quantum state of each particle and send the resulting signals to a coincidence counter. In any experiment studying entanglement, the entangled particles are vastly outnumbered by non-entangled particles that are also detected, producing patternless noise that drowns out the entangled signal. In a two-detector system, a coincidence counter alleviates this problem by recording only detection signals that strike both detectors simultaneously (more precisely, only signals that arrive at both detectors and correlate to the same emission time). In principle, this ensures that the data represent only entangled particles.
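The matching step a coincidence counter performs can be sketched in software. The following is a minimal illustration with made-up timestamps and window width; real systems implement this in fast electronics rather than in code like this.

```python
# Minimal sketch of coincidence counting: given sorted arrival timestamps
# (in ns) from two detectors, count pairs falling within a coincidence
# window. All numbers below are illustrative, not from a real experiment.

def count_coincidences(times_a, times_b, window_ns):
    """Count events in times_a with a partner in times_b within
    +/- window_ns. Both lists must be sorted ascending."""
    coincidences = 0
    j = 0
    for t in times_a:
        # Skip B events too early to match this A event.
        while j < len(times_b) and times_b[j] < t - window_ns:
            j += 1
        if j < len(times_b) and abs(times_b[j] - t) <= window_ns:
            coincidences += 1
            j += 1  # each B event pairs with at most one A event
    return coincidences

det_a = [10.0, 55.2, 130.7, 400.1]   # hypothetical arrival times, ns
det_b = [10.4, 90.0, 131.0, 250.0]
print(count_coincidences(det_a, det_b, window_ns=1.0))  # -> 2
```

Only the events near 10 ns and 131 ns pair up; the remaining detections are discarded as uncorrelated singles.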

However, since no detector/counter circuit has infinitely precise temporal resolution (owing both to limitations of the electronics and to fundamental physical limits), detections must be sorted into time bins: detection windows whose width equals the temporal resolution of the system. Detections in the same bin appear to occur at the same time because their individual detection times cannot be resolved any further. Thus, in a two-detector system, two unrelated, non-entangled particles may randomly strike both detectors, be sorted into the same time bin, and create a false coincidence that adds noise to the signal. Coincidence counters therefore improve the signal-to-noise ratio enough that the quantum behavior can be studied, but they cannot remove the noise completely.
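The rate of such false coincidences can be estimated. For two uncorrelated detectors with Poissonian singles rates R1 and R2 and a coincidence window of width w, a standard approximation for the accidental-coincidence rate is R_acc ≈ R1·R2·w. The rates below are illustrative numbers only:

```python
# Estimate of the false-coincidence ("accidentals") rate for two
# uncorrelated detectors, using the standard approximation
# R_acc ~ R1 * R2 * w for Poissonian singles rates and window width w.

def accidental_rate(r1_hz, r2_hz, window_s):
    """Expected rate of random (non-entangled) coincidences, in Hz."""
    return r1_hz * r2_hz * window_s

r1, r2 = 50_000.0, 50_000.0   # singles rates, counts/s (made-up values)
window = 2e-9                  # 2 ns coincidence window
print(accidental_rate(r1, r2, window))  # ~ 5 false coincidences/s
```

Narrowing the window suppresses accidentals linearly, which is why the temporal resolution of the detectors and electronics directly limits the achievable signal-to-noise ratio.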

History

As of 1951, coincidence counting was described as "an important tool in experimental physics for a long time."[1] In 1955, a seminal paper from the University of Glasgow suggested using "the coincidence counting technique of nuclear physics to measure the lifetime of excited atomic states."[2]

Every experiment to date that has tested Bell's inequalities, performed a quantum eraser, or used quantum entanglement as an information channel has relied on coincidence counters. This requirement unavoidably prevents superluminal communication: even if a random or deliberate choice appears to affect events that have already transpired (as in the delayed-choice quantum eraser), the signal from the past cannot be seen or decoded until the coincidence circuit has correlated both the past and the future behavior. The "signal" in the past thus becomes visible only after it is "sent" from the future, precluding the exploitation of quantum entanglement for faster-than-light communication or sending data backward in time.

Related Research Articles

Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical physics and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.

Superluminal communication is a hypothetical process in which information is conveyed at faster-than-light speeds. The current scientific consensus is that faster-than-light communication is not possible, and to date it has not been achieved in any experiment.

Coherence expresses the potential for two waves to interfere. Two monochromatic beams from a single source always interfere. Wave sources are not strictly monochromatic: they may be partly coherent. Beams from different sources are mutually incoherent.

In physics and electrical engineering, a coincidence circuit or coincidence gate is an electronic device with one output and two inputs. The output activates only when the circuit receives signals within a time window accepted as at the same time and in parallel at both inputs. Coincidence circuits are widely used in particle detectors and in other areas of science and technology.

In experimental and applied particle physics, nuclear physics, and nuclear engineering, a particle detector, also known as a radiation detector, is a device used to detect, track, and/or identify ionizing particles, such as those produced by nuclear decay, cosmic radiation, or reactions in a particle accelerator. Detectors can measure the particle energy and other attributes such as momentum, spin, charge, particle type, in addition to merely registering the presence of the particle.

A Bell test, also known as Bell inequality test or Bell experiment, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. Named for John Stewart Bell, the experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables to explain the behavior of particles like photons and electrons. The test empirically evaluates the implications of Bell's theorem. As of 2015, all Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems behave.

In quantum information science, the Bell states or EPR pairs are specific quantum states of two qubits that represent the simplest examples of quantum entanglement. The Bell states form an entangled, normalized basis; this normalization implies that the overall probability of the particles being found in one of these states is 1. Entanglement is a basis-independent result of superposition. Due to this superposition, measurement of a qubit will "collapse" it into one of its basis states with a given probability. Because of the entanglement, measurement of one qubit will "collapse" the other qubit to a state whose measurement will yield one of two possible values, where the value depends on which Bell state the two qubits were in initially. Bell states can be generalized to certain quantum states of multi-qubit systems, such as the GHZ state for three or more subsystems.
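For reference, the four Bell states have the standard textbook form:

```latex
\begin{align}
|\Phi^{\pm}\rangle &= \tfrac{1}{\sqrt{2}}\bigl(|00\rangle \pm |11\rangle\bigr)\\
|\Psi^{\pm}\rangle &= \tfrac{1}{\sqrt{2}}\bigl(|01\rangle \pm |10\rangle\bigr)
\end{align}
```

Each state's coefficients are ±1/√2, so the squared amplitudes sum to 1/2 + 1/2 = 1, which is the normalization mentioned above.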

For detection systems that record discrete events, such as particle and nuclear detectors, the dead time is the time after each event during which the system is not able to record another event. An everyday life example of this is what happens when someone takes a photo using a flash – another picture cannot be taken immediately afterward because the flash needs a few seconds to recharge. In addition to lowering the detection efficiency, dead times can have other effects, such as creating possible exploits in quantum cryptography.
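As a sketch, the standard non-paralyzable dead-time correction relates a measured count rate m to an estimate of the true event rate n via n = m / (1 − mτ), where τ is the dead time. The values below are illustrative:

```python
# Standard non-paralyzable dead-time correction: a detector with dead
# time tau (s) that records m counts/s has an estimated true event rate
# n = m / (1 - m * tau). Example values are made up for illustration.

def true_rate(measured_hz, dead_time_s):
    """Non-paralyzable dead-time correction for a measured count rate."""
    return measured_hz / (1.0 - measured_hz * dead_time_s)

measured = 90_000.0   # counts/s actually recorded
tau = 1e-6            # 1 microsecond dead time
print(round(true_rate(measured, tau)))  # -> 98901 events/s
```

At this rate the detector is blind about 9% of the time, so roughly 9,000 events per second go unrecorded.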

In quantum mechanics, a quantum eraser experiment is an interferometer experiment that demonstrates several fundamental aspects of quantum mechanics, including quantum entanglement and complementarity. The quantum eraser experiment is a variation of Thomas Young's classic double-slit experiment. It establishes that when action is taken to determine which of two slits a photon has passed through, the photon cannot interfere with itself. When a stream of photons is marked in this way, the interference fringes characteristic of the Young experiment will not be seen. The experiment also creates situations in which a photon that has been "marked" to reveal through which slit it has passed can later be "unmarked." A photon that has been "unmarked" will interfere with itself once again, restoring the fringes characteristic of Young's experiment.

A delayed-choice quantum eraser experiment, first performed by Yoon-Ho Kim, R. Yu, S. P. Kulik, Y. H. Shih and Marlan O. Scully, and reported in early 1999, is an elaboration on the quantum eraser experiment that incorporates concepts considered in John Archibald Wheeler's delayed-choice experiment. The experiment was designed to investigate peculiar consequences of the well-known double-slit experiment in quantum mechanics, as well as the consequences of quantum entanglement.

Wheeler's delayed-choice experiment describes a family of thought experiments in quantum physics proposed by John Archibald Wheeler, with the most prominent among them appearing in 1978 and 1984. These experiments illustrate the central point of quantum theory:

It is wrong to attribute a tangibility to the photon in all its travel from the point of entry to its last instant of flight.

Popper's experiment is an experiment proposed by the philosopher Karl Popper to test aspects of the uncertainty principle in quantum mechanics.

Quantum imaging is a sub-field of quantum optics that exploits quantum correlations, such as quantum entanglement of the electromagnetic field, to image objects with a resolution or other imaging criteria beyond what is possible in classical optics. Examples of quantum imaging are quantum ghost imaging, quantum lithography, imaging with undetected photons, sub-shot-noise imaging, and quantum sensing. Quantum imaging may someday be useful for storing patterns of data in quantum computers and transmitting large amounts of highly secure encrypted information. Quantum mechanics has shown that light has inherent "uncertainties" in its features, manifested as moment-to-moment fluctuations in its properties. Controlling these fluctuations—which represent a sort of "noise"—can improve detection of faint objects, produce better amplified images, and allow workers to position laser beams more accurately.

Ghost imaging is a technique that produces an image of an object by combining information from two light detectors: a conventional, multi-pixel detector that does not view the object, and a single-pixel (bucket) detector that does view the object. Two techniques have been demonstrated. A quantum method uses a source of pairs of entangled photons, each pair shared between the two detectors, while a classical method uses a pair of correlated coherent beams without exploiting entanglement. Both approaches may be understood within the framework of a single theory.

Hardy's paradox is a thought experiment in quantum mechanics devised by Lucien Hardy in 1992–1993 in which a particle and its antiparticle may interact without annihilating each other.

Photon counting is a technique in which individual photons are counted using a single-photon detector (SPD). A single-photon detector emits a pulse of signal for each detected photon. The counting efficiency is determined by the quantum efficiency and the system's electronic losses.

Quantum illumination is a paradigm for target detection that employs quantum entanglement between a signal electromagnetic mode and an idler electromagnetic mode, as well as joint measurement of these modes. The signal mode is propagated toward a region of space, and it is either lost or reflected, depending on whether a target is absent or present, respectively. In principle, quantum illumination can be beneficial even if the original entanglement is completely destroyed by a lossy and noisy environment.

Quantum microscopy allows microscopic properties of matter and quantum particles to be measured and imaged. Various types of microscopy use quantum principles. The first microscope to do so was the scanning tunneling microscope, which paved the way for development of the photoionization microscope and the quantum entanglement microscope.

Aspect's experiment was the first quantum mechanics experiment to demonstrate the violation of Bell's inequalities with photons using distant detectors. Its 1982 result provided further experimental validation of quantum entanglement and non-locality. It also offered an experimental answer to the paradox proposed about fifty years earlier by Albert Einstein, Boris Podolsky, and Nathan Rosen.

In particle physics, the coincidence method is an experimental design in which particle detectors register two or more simultaneous measurements of a particular event through different interaction channels. Detection can be made by sensing the primary particle and/or secondary reaction products. The method increases an experiment's sensitivity to a specific particle interaction and reduces conflation with background interactions by adding independent channels through which the particle in question must register. The first notable use of the coincidence method was the Bothe–Geiger coincidence experiment of 1924.

References

  1. Prohaska, Charles Anton (1951). Heavy Element Decay Schemes with Alpha-Gamma and Alpha-Electron Coincidence Counting. University of California. p. 1. Retrieved January 18, 2025.
  2. da Silva Neto, Climério Paulo (2023). Materializing the Foundations of Quantum Mechanics: Instruments and the First Bell Tests. pp. 66–67. Retrieved January 18, 2025.