Coincidence counting (physics)

In quantum physics, coincidence counting is used in experiments testing particle non-locality and quantum entanglement. In these experiments, two or more particles are created from the same initial packet of energy, entangling their physical properties. Separate particle detectors measure the quantum state of each particle and send the resulting signals to a coincidence counter. In any experiment studying entanglement, the entangled particles are vastly outnumbered by non-entangled particles that are also detected: patternless noise that drowns out the entangled signal. In a two-detector system, a coincidence counter alleviates this problem by recording only detection signals that strike both detectors simultaneously (more precisely, only signals that arrive at both detectors and correlate to the same emission time). Ideally, this ensures that the data represent only entangled particles.

However, no detector-counter circuit has infinitely precise temporal resolution, owing both to limitations in the electronics and to fundamental physical limits, so detections must be sorted into time bins: detection windows whose width equals the temporal resolution of the system. Detections in the same bin appear to occur at the same time because their individual detection times cannot be resolved any further. Thus, in a two-detector system, two unrelated, non-entangled particles may randomly strike both detectors, be sorted into the same time bin, and create a false coincidence that adds noise to the signal. A coincidence counter can therefore improve the signal-to-noise ratio enough for the quantum behavior to be studied, but it cannot remove the noise completely.
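The binning logic described above can be sketched in a few lines (a minimal illustration, not any particular laboratory's implementation; the function name and the timestamp values are invented for the example): each detector's timestamps are assigned to bins of width equal to the system's temporal resolution, and a coincidence is recorded whenever both detectors register an event in the same bin. Integer timestamps (e.g. clock ticks) are used so the binning is exact.

```python
def count_coincidences(times_a, times_b, bin_width):
    """Count time bins in which both detectors registered an event.

    times_a, times_b: detection timestamps from the two detectors,
    in integer clock ticks; bin_width: the temporal resolution of
    the detector/counter system, in the same ticks.
    """
    bins_a = {t // bin_width for t in times_a}
    bins_b = {t // bin_width for t in times_b}
    return len(bins_a & bins_b)

# Two entangled pairs land in shared bins (ticks 102/105 and 251/256),
# detector A also records a lone background hit (730), and two unrelated
# background hits (990 and 991) happen to share a bin -- producing one
# false coincidence on top of the two genuine ones.
a = [102, 251, 730, 990]  # detector A timestamps (clock ticks)
b = [105, 256, 991]       # detector B timestamps
print(count_coincidences(a, b, bin_width=10))  # prints 3
```

The false coincidence from the background hits at ticks 990 and 991 is indistinguishable, after binning, from the genuine pairs, which is exactly why coincidence counting reduces but cannot eliminate the noise.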

Every experiment to date that has tested Bell's inequalities, performed a quantum eraser, or otherwise used quantum entanglement as an information channel has relied on coincidence counters. This unavoidably prevents superluminal communication: even when a random or deliberate choice appears to affect events that have already transpired (as in the delayed-choice quantum eraser), the signal from the past cannot be seen or decoded until the coincidence circuit has correlated both the past and the future behavior. The "signal" in the past is thus visible only after it is "sent" from the future, precluding quantum entanglement from being exploited for faster-than-light communication or data time travel.

Related Research Articles

Quantum entanglement – Correlation between quantum systems

Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.

Superluminal communication is a hypothetical process in which information is conveyed at faster-than-light speeds. The current scientific consensus is that faster-than-light communication is not possible, and to date it has not been achieved in any experiment.

In physics, coherence expresses the potential for two waves to interfere. Two monochromatic beams from a single source always interfere. Physical sources are not strictly monochromatic: they may be partly coherent. Beams from different sources are mutually incoherent.

In physics and electrical engineering, a coincidence circuit or coincidence gate is an electronic device with one output and two inputs. The output activates only when the circuit receives signals within a time window accepted as at the same time and in parallel at both inputs. Coincidence circuits are widely used in particle detectors and in other areas of science and technology.
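The gate behavior just described can be modeled as a simple predicate (a toy sketch under the stated assumption that the two input pulses are represented by their arrival times; the function name and values are invented for illustration): the output activates only when the pulses arrive within the accepted window.

```python
def coincidence_gate(t1, t2, window):
    """Return True when two input pulses arrive within the acceptance
    window, i.e. the circuit treats them as simultaneous."""
    return abs(t1 - t2) <= window

print(coincidence_gate(100, 103, 5))  # True: pulses 3 ticks apart
print(coincidence_gate(100, 120, 5))  # False: pulses 20 ticks apart
```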

A Bell test, also known as Bell inequality test or Bell experiment, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. Named for John Stewart Bell, the experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables to explain the behavior of particles like photons and electrons. The test empirically evaluates the implications of Bell's theorem. As of 2015, all Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems behave.

In quantum information science, the Bell states or EPR pairs are specific quantum states of two qubits that represent the simplest examples of quantum entanglement. The Bell states are a form of entangled and normalized basis vectors; this normalization implies that the overall probability of finding the system in one of these states is 1. Entanglement is a basis-independent result of superposition. Due to this superposition, measurement of a qubit will "collapse" it into one of its basis states with a given probability. Because of the entanglement, measurement of one qubit will "collapse" the other qubit to a state whose measurement will yield one of two possible values, where the value depends on which Bell state the two qubits are in initially. Bell states can be generalized to certain quantum states of multi-qubit systems, such as the GHZ state for three or more subsystems.

The Afshar experiment is a variation of the double-slit experiment in quantum mechanics, devised and carried out by Shahriar Afshar in 2004. In the experiment, light generated by a laser passes through two closely spaced pinholes, and is refocused by a lens so that the image of each pinhole falls on a separate single-photon detector. In addition, a grid of thin wires is placed just before the lens on the dark fringes of an interference pattern.

In quantum mechanics, a quantum eraser experiment is an interferometer experiment that demonstrates several fundamental aspects of quantum mechanics, including quantum entanglement and complementarity. The quantum eraser experiment is a variation of Thomas Young's classic double-slit experiment. It establishes that when action is taken to determine which of two slits a photon has passed through, the photon cannot interfere with itself. When a stream of photons is marked in this way, the interference fringes characteristic of the Young experiment will not be seen. The experiment also creates situations in which a photon that has been "marked" to reveal through which slit it has passed can later be "unmarked." A photon that has been "unmarked" will interfere with itself once again, restoring the fringes characteristic of Young's experiment.

A delayed-choice quantum eraser experiment, first performed by Yoon-Ho Kim, R. Yu, S. P. Kulik, Y. H. Shih and Marlan O. Scully, and reported in early 1998, is an elaboration on the quantum eraser experiment that incorporates concepts considered in John Archibald Wheeler's delayed-choice experiment. The experiment was designed to investigate peculiar consequences of the well-known double-slit experiment in quantum mechanics, as well as the consequences of quantum entanglement.

Wheeler's delayed-choice experiment – Family of quantum physics thought experiments

Wheeler's delayed-choice experiment describes a family of thought experiments in quantum physics proposed by John Archibald Wheeler, with the most prominent among them appearing in 1978 and 1984. These experiments are attempts to decide whether light somehow "senses" the experimental apparatus in the double-slit experiment it travels through, adjusting its behavior to fit by assuming an appropriate determinate state, or whether light remains in an indeterminate state, exhibiting both wave-like and particle-like behavior until measured.

Quantum radar is a speculative remote-sensing technology based on quantum-mechanical effects, such as the uncertainty principle or quantum entanglement. Broadly speaking, a quantum radar can be seen as a device working in the microwave range, which exploits quantum features, from the point of view of the radiation source and/or the output detection, and is able to outperform a classical counterpart. One approach is based on the use of input quantum correlations combined with a suitable interferometric quantum detection at the receiver.

Popper's experiment is an experiment proposed by the philosopher Karl Popper to test aspects of the uncertainty principle in quantum mechanics.

Quantum imaging is a sub-field of quantum optics that exploits quantum correlations, such as quantum entanglement of the electromagnetic field, to image objects with a resolution or other imaging criteria beyond what is possible in classical optics. Examples of quantum imaging are quantum ghost imaging, quantum lithography, imaging with undetected photons, sub-shot-noise imaging, and quantum sensing. Quantum imaging may someday be useful for storing patterns of data in quantum computers and transmitting large amounts of highly secure encrypted information. Quantum mechanics has shown that light has inherent "uncertainties" in its features, manifested as moment-to-moment fluctuations in its properties. Controlling these fluctuations, which represent a sort of "noise," can improve detection of faint objects, produce better amplified images, and allow workers to more accurately position laser beams.

Ghost imaging is a technique that produces an image of an object by combining information from two light detectors: a conventional, multi-pixel detector that does not view the object, and a single-pixel (bucket) detector that does view the object. Two techniques have been demonstrated. A quantum method uses a source of pairs of entangled photons, each pair shared between the two detectors, while a classical method uses a pair of correlated coherent beams without exploiting entanglement. Both approaches may be understood within the framework of a single theory.

Photon counting – Counting photons using a single-photon detector

Photon counting is a technique in which individual photons are counted using a single-photon detector (SPD). A single-photon detector emits a pulse of signal for each detected photon. The counting efficiency is determined by the quantum efficiency and the system's electronic losses.

Quantum illumination is a paradigm for target detection that employs quantum entanglement between a signal electromagnetic mode and an idler electromagnetic mode, as well as joint measurement of these modes. The signal mode is propagated toward a region of space, and it is either lost or reflected, depending on whether a target is absent or present, respectively. In principle, quantum illumination can be beneficial even if the original entanglement is completely destroyed by a lossy and noisy environment.

Quantum microscopy allows microscopic properties of matter and quantum particles to be measured and imaged. Various types of microscopy use quantum principles. The first microscope to do so was the scanning tunneling microscope, which paved the way for development of the photoionization microscope and the quantum entanglement microscope.

Aspect's experiment – Quantum mechanics experiment

Aspect's experiment was the first quantum mechanics experiment to demonstrate the violation of Bell's inequalities with photons using distant detectors. Its 1982 result allowed for further validation of the quantum entanglement and locality principles. It also offered an experimental answer to Albert Einstein, Boris Podolsky, and Nathan Rosen's paradox which had been proposed about fifty years earlier.

Quantum secret sharing (QSS) is a quantum cryptographic scheme for secure communication that extends beyond simple quantum key distribution. It modifies the classical secret sharing (CSS) scheme by using quantum information and the no-cloning theorem to attain the ultimate security for communications.

In particle physics, the coincidence method is an experimental design in which particle detectors register two or more simultaneous measurements of a particular event through different interaction channels. Detection can be made by sensing the primary particle and/or by detecting secondary reaction products. Such a method is used to increase the sensitivity of an experiment to a specific particle interaction, reducing conflation with background interactions by creating more degrees of freedom by which the particle in question may interact. The first notable use of the coincidence method was the Bothe–Geiger coincidence experiment of 1924.