In quantum mechanics (and computation), a weak value is a quantity related to the shift of a measuring device's pointer when there is pre- and postselection. It should not be confused with a weak measurement, which is often defined in conjunction with it. The weak value was first defined by Yakir Aharonov, David Albert, and Lev Vaidman, published in Physical Review Letters in 1988, [1] and is related to the two-state vector formalism. There is also a way to obtain weak values without postselection. [2] [3]
There are many excellent review articles on weak values (see e.g. [4] [5] [6] [7]); here we briefly cover the basics.
We will denote the initial state of a system as $|\psi_i\rangle$, while the final state of the system is denoted as $|\psi_f\rangle$. We will refer to the initial and final states of the system as the pre- and post-selected quantum mechanical states. With respect to these states, the weak value of the observable $A$ is defined as:

$$A_w = \frac{\langle\psi_f|A|\psi_i\rangle}{\langle\psi_f|\psi_i\rangle}.$$
Notice that if $|\psi_f\rangle = |\psi_i\rangle$ then the weak value is equal to the usual expected value in the initial state $\langle\psi_i|A|\psi_i\rangle$ or the final state $\langle\psi_f|A|\psi_f\rangle$. In general the weak value is a complex number. The weak value of the observable becomes large when the post-selected state, $|\psi_f\rangle$, approaches being orthogonal to the pre-selected state, $|\psi_i\rangle$, i.e. $|\langle\psi_f|\psi_i\rangle| \to 0$. If $A_w$ is larger than the largest eigenvalue of $A$ or smaller than the smallest eigenvalue of $A$, the weak value is said to be anomalous.
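Since the definition is just a ratio of matrix elements, it is straightforward to evaluate numerically. The following minimal Python/NumPy sketch is our own illustration (the helper name `weak_value` is not from the literature):

```python
import numpy as np

def weak_value(psi_i, psi_f, A):
    """Weak value A_w = <psi_f|A|psi_i> / <psi_f|psi_i>; in general complex."""
    return (psi_f.conj() @ A @ psi_i) / (psi_f.conj() @ psi_i)

# Sanity check: with identical pre- and post-selection the weak value
# reduces to the ordinary expectation value <psi|A|psi>.
A = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X
psi = np.array([1.0, 1.0]) / np.sqrt(2)  # +1 eigenstate of X
print(weak_value(psi, psi, A))           # 1.0 (up to rounding)
```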
As an example consider a spin-1/2 particle. [8] Take $A$ to be the Pauli Z operator $A = \sigma_z$, with eigenvalues $\pm 1$. Using the initial state

$$|\psi_i\rangle = \cos\tfrac{\alpha}{2}\,|{\uparrow}\rangle + \sin\tfrac{\alpha}{2}\,|{\downarrow}\rangle$$

and the final state

$$|\psi_f\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle - |{\downarrow}\rangle\bigr)$$

we can calculate the weak value to be

$$(\sigma_z)_w = \frac{\langle\psi_f|\sigma_z|\psi_i\rangle}{\langle\psi_f|\psi_i\rangle} = \frac{\cos\frac{\alpha}{2} + \sin\frac{\alpha}{2}}{\cos\frac{\alpha}{2} - \sin\frac{\alpha}{2}} = \tan\!\left(\frac{\pi}{4} + \frac{\alpha}{2}\right).$$

For $0 < \alpha < \pi$ the weak value is anomalous: it exceeds the largest eigenvalue $+1$ for $0 < \alpha < \pi/2$ and lies below the smallest eigenvalue $-1$ for $\pi/2 < \alpha < \pi$, diverging at $\alpha = \pi/2$, where the pre- and post-selected states become orthogonal.
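A quick numerical check of this closed form, reusing the `weak_value` helper sketched above (again our own illustration):

```python
import numpy as np

def weak_value(psi_i, psi_f, A):
    return (psi_f.conj() @ A @ psi_i) / (psi_f.conj() @ psi_i)

sigma_z = np.diag([1.0, -1.0])            # Pauli Z, eigenvalues +1 and -1

alpha = 0.45 * np.pi                      # near pi/2: almost orthogonal states
psi_i = np.array([np.cos(alpha / 2), np.sin(alpha / 2)])
psi_f = np.array([1.0, -1.0]) / np.sqrt(2)

w = weak_value(psi_i, psi_f, sigma_z)
print(w, np.tan(np.pi / 4 + alpha / 2))   # both ~ 12.7, far outside [-1, +1]
```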
Here we follow the presentation given by Duck, Stevenson, and Sudarshan [8] (with some notational updates from Kofman et al. [4]), which makes explicit when the approximations used to derive the weak value are valid.
Consider a quantum system that you want to measure by coupling it to an ancillary (also quantum) measuring device. The observable to be measured on the system is $A$. The system and ancilla are coupled via the Hamiltonian

$$H = g(t)\, A \otimes \hat{p},$$

where the coupling constant $g(t)$ is integrated over an interaction time, $\int g(t)\,dt = g$, and $[\hat{x}, \hat{p}] = i$ is the canonical commutator (with $\hbar = 1$). The Hamiltonian generates the unitary

$$U = \exp\!\left(-i\int H\,dt\right) = \exp\!\left(-i g\, A \otimes \hat{p}\right).$$
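As a toy illustration (our own discretization, not taken from the references), $U$ can be built explicitly for a pointer truncated to a small momentum grid, on which $\hat{p}$ is diagonal:

```python
import numpy as np
from scipy.linalg import expm

A = np.diag([1.0, -1.0])                 # system observable (here Pauli Z)
p = np.diag(np.linspace(-2.0, 2.0, 5))   # truncated pointer momentum operator
g = 0.1                                  # integrated coupling constant

U = expm(-1j * g * np.kron(A, p))        # U = exp(-i g A (x) p)
print(np.allclose(U @ U.conj().T, np.eye(10)))   # unitarity check: True
```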
Take the initial state of the ancilla to have a Gaussian distribution

$$|\Phi\rangle = \left(2\pi\sigma^2\right)^{-1/4}\int dx\; e^{-x^2/4\sigma^2}\,|x\rangle;$$

the position wavefunction of this state is

$$\Phi(x) = \langle x|\Phi\rangle = \left(2\pi\sigma^2\right)^{-1/4} e^{-x^2/4\sigma^2}.$$
The initial state of the system is given by $|\psi_i\rangle$ above; the state jointly describing the initial state of the system and ancilla is then given by:

$$|\Psi\rangle = |\psi_i\rangle\otimes|\Phi\rangle.$$
Next the system and ancilla interact via the unitary $U = e^{-i g A\otimes\hat{p}}$. After this, one performs a projective measurement of the projectors $\{|\psi_f\rangle\langle\psi_f|,\; \mathbb{1} - |\psi_f\rangle\langle\psi_f|\}$ on the system. If we postselect (or condition) on getting the outcome $|\psi_f\rangle$, then the (unnormalized) final state of the meter is

$$\begin{aligned}
\langle\psi_f|\, e^{-i g A\otimes\hat{p}}\, |\psi_i\rangle\,|\Phi\rangle
&\overset{\text{(I)}}{\approx} \langle\psi_f|\left(\mathbb{1} - i g\, A\otimes\hat{p}\right)|\psi_i\rangle\,|\Phi\rangle \\
&= \langle\psi_f|\psi_i\rangle\left(1 - i g\, A_w\, \hat{p}\right)|\Phi\rangle \\
&\overset{\text{(II)}}{\approx} \langle\psi_f|\psi_i\rangle\, e^{-i g A_w \hat{p}}\,|\Phi\rangle.
\end{aligned}$$
To arrive at this conclusion, we use the first-order series expansion of $U$ on line (I), and we require that the higher-order terms in the expansion be negligible, [4] [8]

$$\frac{g^n}{(2\sigma)^n}\,\left|\frac{\langle\psi_f|A^n|\psi_i\rangle}{\langle\psi_f|\psi_i\rangle}\right| \ll 1 \qquad \text{for } n = 2, 3, \ldots
$$
On line (II) we use the approximation that $1 - i g A_w \hat{p} \approx e^{-i g A_w \hat{p}}$ for small $g A_w$. This final approximation is only valid when [4] [8]

$$\frac{g\,|A_w|}{2\sigma} \ll 1,$$

i.e. when the shift of the meter is small compared to its spread.
As $\hat{p}$ is the generator of translations, the ancilla's wavefunction is now given by

$$\Phi(x - g A_w).$$
This is the original wavefunction, shifted by an amount $g A_w$. In general $A_w$ is complex; for a Gaussian meter its real part appears as a shift of the pointer's position and its imaginary part as a shift of the pointer's momentum. By Busch's theorem [9] the system and meter wavefunctions are necessarily disturbed by the measurement. There is a certain sense in which the protocol that allows one to measure the weak value is minimally disturbing, [10] but there is still disturbance. [10]
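The pointer shift can be seen directly in a small simulation. The sketch below (our own construction, with $\hbar = 1$) reuses the spin-1/2 example from above: since $\sigma_z$ has eigenvalues $\pm 1$, the unitary $e^{-i g \sigma_z\otimes\hat{p}}$ simply translates the Gaussian meter by $\pm g$ on the two spin branches, and after postselection the mean pointer position lands near $g A_w$:

```python
import numpy as np

# Pointer grid and Gaussian meter wavefunction (hbar = 1, spread sigma = 1)
x = np.linspace(-12.0, 12.0, 4801)
sigma = 1.0
def gauss(center):
    return (2 * np.pi * sigma**2) ** -0.25 * np.exp(-(x - center)**2 / (4 * sigma**2))

g = 0.01                                 # weak coupling: g|A_w| << sigma
alpha = 0.45 * np.pi
ci = np.array([np.cos(alpha / 2), np.sin(alpha / 2)])  # pre-selected spin state
cf = np.array([1.0, -1.0]) / np.sqrt(2)                # post-selected spin state

# exp(-i g sigma_z (x) p) shifts the pointer by +g on the spin-up branch and
# by -g on the spin-down branch; then project the spin onto |psi_f>.
phi = cf[0] * ci[0] * gauss(+g) + cf[1] * ci[1] * gauss(-g)

prob = np.abs(phi) ** 2
mean_shift = (x * prob).sum() / prob.sum()             # postselected pointer mean

A_w = (cf @ np.diag([1.0, -1.0]) @ ci) / (cf @ ci)     # weak value (real here)
print(mean_shift, g * A_w)               # both ~ 0.127: shift ~ g * A_w
```

Making $g$ smaller tightens the agreement at the cost of a smaller raw shift; this trade-off is what weak-value amplification schemes exploit.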
At the end of the original weak value paper [1] the authors suggested weak values could be used in quantum metrology:
Another striking aspect of this experiment becomes evident when we consider it as a device for measuring a small gradient of the magnetic field ... yields a tremendous amplification.
Aharonov, Albert, Vaidman [1]
This suggestion was followed by Hosten and Kwiat [11] and later by Dixon et al. [12] It appears to be an interesting line of research that could result in improved quantum sensing technology.
Additionally, in 2011, weak measurements of many photons prepared in the same pure state, followed by strong measurements of a complementary variable, were used to perform quantum tomography (i.e. reconstruct the state in which the photons were prepared). [13]
Weak values have been used to examine some of the paradoxes in the foundations of quantum theory. This relies to a large extent on whether weak values are deemed relevant to describing properties of quantum systems, [14] a point which is not obvious, since weak values are generally different from eigenvalues. For example, the research group of Aephraim M. Steinberg at the University of Toronto confirmed Hardy's paradox experimentally using joint weak measurement of the locations of entangled pairs of photons. [15] [16] (see also [17])
Building on weak measurements, Howard M. Wiseman proposed a weak value measurement of the velocity of a quantum particle at a precise position, which he termed its "naïvely observable velocity". In 2010, the first experimental observation of trajectories of a photon in a double-slit interferometer was reported; these displayed the qualitative features predicted in 2001 by Partha Ghose [18] for photons in the de Broglie-Bohm interpretation. [19] [20] Following up on Wiseman's weak velocity measurement, Johannes Fankhauser and Patrick Dürr argued in a paper that weak velocity measurements constitute no new arguments, let alone empirical evidence, in favor of or against standard de Broglie-Bohm theory. According to the authors, such measurements could not provide direct experimental evidence displaying the shape of particle trajectories, even if it is assumed that some deterministic particle trajectories exist. [21]
Weak values have been applied in quantum computing to obtain a speed-up in time complexity. In a paper, [22] Arun Kumar Pati describes a new kind of quantum computer using weak value amplification and post-selection (WVAP), and implements a search algorithm which (given a successful post-selection) can find the target state in a single run, i.e. with time complexity $O(1)$, beating the well-known Grover's algorithm.
Criticisms of weak values include both philosophical and practical objections. Some noted researchers such as Asher Peres, Tony Leggett, David Mermin, and Charles H. Bennett have been critical of weak values. [citation needed]
More recently, it has been shown that the pre- and postselection of a quantum system recovers a completely hidden interference phenomenon in the measurement apparatus. Studying this interference pattern shows that what is interpreted as an amplification using the weak value is a pure phase effect, and that the weak value plays no role in its interpretation. This phase effect increases the degree of entanglement, which underlies the effectiveness of the pre- and postselection in the parameter estimation. [23]