In theoretical physics, quantum nonlocality refers to the phenomenon by which the measurement statistics of a multipartite quantum system do not admit an interpretation in terms of a local realistic theory. Quantum nonlocality has been experimentally verified under different physical assumptions. Any physical theory that aims at superseding or replacing quantum theory should account for such experiments and therefore must also be nonlocal in this sense; quantum nonlocality is a property of the universe that is independent of our description of nature.
Quantum nonlocality does not allow for faster-than-light communication, and hence is compatible with special relativity and its universal speed limit. Thus, quantum theory is local in the strict sense defined by special relativity and, as such, the term "quantum nonlocality" is sometimes considered a misnomer. Still, it prompts many of the foundational discussions concerning quantum theory; see Quantum foundations.
In 1935, Einstein, Podolsky and Rosen published a thought experiment with which they hoped to expose the incompleteness of the Copenhagen interpretation of quantum mechanics in relation to the violation of local causality at the microscopic scale that it described. Afterwards, Einstein presented a variant of these ideas in a letter to Erwin Schrödinger, which is the version that is presented here. The state and notation used here are more modern, and akin to David Bohm's take on EPR. The quantum state of the two particles prior to measurement can be written as

|ψ⟩ = (1/√2)(|0⟩_A ⊗ |1⟩_B − |1⟩_A ⊗ |0⟩_B)
Here, subscripts "A" and "B" distinguish the two particles, though it is more convenient and usual to refer to these particles as being in the possession of two experimentalists called Alice and Bob. The rules of quantum theory give predictions for the outcomes of measurements performed by the experimentalists. Alice, for example, will measure her particle to be spin-up in an average of fifty percent of measurements. However, according to the Copenhagen interpretation, Alice's measurement causes the state of the two particles to collapse, so that if Alice performs a measurement of spin in the z-direction, that is, with respect to the basis {|0⟩_A, |1⟩_A}, then Bob's system will be left in one of the states {|0⟩_B, |1⟩_B}. Likewise, if Alice performs a measurement of spin in the x-direction, that is, with respect to the basis {|+⟩_A, |−⟩_A}, where |±⟩ = (|0⟩ ± |1⟩)/√2, then Bob's system will be left in one of the states {|+⟩_B, |−⟩_B}. Schrödinger referred to this phenomenon as "steering". This steering occurs in such a way that no signal can be sent by performing such a state update; quantum nonlocality cannot be used to send messages instantaneously and is therefore not in direct conflict with causality concerns in special relativity.
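The claim that steering cannot be used for signalling can be checked directly. The following Python sketch (an illustration, not part of the original argument; it uses numpy) verifies that Bob's average state is the maximally mixed state I/2 regardless of whether Alice measures in the z-basis or the x-basis:

```python
import numpy as np

# Bohm's version of the EPR state, in the computational basis:
# |psi> = (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def bob_state_after_alice(proj):
    """Bob's (unnormalized) conditional state after Alice applies a projector."""
    P = np.kron(proj, np.eye(2))
    sub = P @ rho @ P
    # Partial trace over Alice's qubit (axes 0 and 2 index Alice).
    return sub.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

z0 = np.array([[1, 0], [0, 0]])   # |0><0| (z-basis projector)
plus = np.full((2, 2), 0.5)       # |+><+| (x-basis projector)

# Averaged over Alice's outcomes, Bob's state is I/2 for either choice:
avg_z = bob_state_after_alice(z0) + bob_state_after_alice(np.eye(2) - z0)
avg_x = bob_state_after_alice(plus) + bob_state_after_alice(np.eye(2) - plus)
print(np.allclose(avg_z, np.eye(2) / 2), np.allclose(avg_x, np.eye(2) / 2))
```

Since Bob's reduced state does not depend on Alice's measurement choice, no information is transmitted by the collapse.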
In the Copenhagen view of this experiment, Alice's measurement—and particularly her measurement choice—has a direct effect on Bob's state. However, under the assumption of locality, actions on Alice's system do not affect the "true", or "ontic" state of Bob's system. We see that the ontic state of Bob's system must be compatible with one of the quantum states |0⟩_B or |1⟩_B, since Alice can make a measurement that concludes with one of those states being the quantum description of his system. At the same time, it must also be compatible with one of the quantum states |+⟩_B or |−⟩_B for the same reason. Therefore, the ontic state of Bob's system must be compatible with at least two quantum states; the quantum state is therefore not a complete descriptor of his system. Einstein, Podolsky and Rosen saw this as evidence of the incompleteness of the Copenhagen interpretation of quantum theory, since the wavefunction is explicitly not a complete description of a quantum system under this assumption of locality. Their paper concludes:
While we have thus shown that the wave function does not provide a complete description of the physical reality, we left open the question of whether or not such a description exists. We believe, however, that such a theory is possible.
Although various authors (most notably Niels Bohr) criticised the ambiguous terminology of the EPR paper, the thought experiment nevertheless generated a great deal of interest. Their notion of a "complete description" was later formalised by the suggestion of hidden variables that determine the statistics of measurement results, but to which an observer does not have access. Bohmian mechanics provides such a completion of quantum mechanics, with the introduction of hidden variables; however, the theory is explicitly nonlocal. The interpretation therefore does not give an answer to Einstein's question, which was whether or not a complete description of quantum mechanics could be given in terms of local hidden variables in keeping with the "Principle of Local Action".
In 1964 John Bell answered Einstein's question by showing that such local hidden variables can never reproduce the full range of statistical outcomes predicted by quantum theory. Bell showed that a local hidden variable hypothesis leads to restrictions on the strength of correlations of measurement results. If the Bell inequalities are violated experimentally as predicted by quantum mechanics, then reality cannot be described by local hidden variables and the mystery of quantum nonlocal causation remains. According to Bell:
This [grossly nonlocal structure] is characteristic ... of any such theory which reproduces exactly the quantum mechanical predictions.
Clauser, Horne, Shimony and Holt (CHSH) reformulated these inequalities in a manner that was more conducive to experimental testing (see CHSH inequality).
In the scenario proposed by Bell (a Bell scenario), two experimentalists, Alice and Bob, conduct experiments in separate labs. At each run, Alice (Bob) conducts an experiment x (y) in her (his) lab, obtaining outcome a (b). If Alice and Bob repeat their experiments several times, then they can estimate the probabilities P(a,b|x,y), namely, the probability that Alice and Bob respectively observe the results a, b when they respectively conduct the experiments x, y. In the following, each such set of probabilities will be denoted by just P(a,b|x,y). In quantum nonlocality jargon, P(a,b|x,y) is termed a box.
Bell formalized the idea of a hidden variable by introducing the parameter λ to locally characterize measurement results on each system: "It is a matter of indifference ... whether λ denotes a single variable or a set ... and whether the variables are discrete or continuous". However, it is equivalent (and more intuitive) to think of λ as a local "strategy" or "message" that occurs with some probability when Alice and Bob reboot their experimental setup. EPR's criterion of local separability then stipulates that each local strategy defines the distributions of independent outcomes if Alice conducts experiment x and Bob conducts experiment y:

P(a,b|x,y,λ_A,λ_B) = P_A(a|x,λ_A) P_B(b|y,λ_B)
Here P_A(a|x,λ_A) (P_B(b|y,λ_B)) denotes the probability that Alice (Bob) obtains the result a (b) when she (he) conducts experiment x (y) and the local variable describing her (his) experiment has value λ_A (λ_B).
Suppose that λ_A, λ_B can take values from some set Λ. If each pair of values (λ_A, λ_B) has an associated probability q(λ_A, λ_B) of being selected (shared randomness is allowed, i.e., λ_A and λ_B can be correlated), then one can average over this distribution to obtain a formula for the joint probability of each measurement result:

P(a,b|x,y) = Σ_{λ_A,λ_B} q(λ_A, λ_B) P_A(a|x,λ_A) P_B(b|y,λ_B)
A box P(a,b|x,y) admitting such a decomposition is called a Bell local or a classical box. Fixing the number of possible values which a, b, x, y can each take, one can represent each box as a finite vector with entries P(a,b|x,y). In that representation, the set of all classical boxes forms a convex polytope. In the Bell scenario studied by CHSH, where a, b, x, y can take values within {0, 1}, any Bell local box must satisfy the CHSH inequality:

|E(0,0) + E(0,1) + E(1,0) − E(1,1)| ≤ 2, where E(x,y) = Σ_{a,b} (−1)^(a+b) P(a,b|x,y)
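The classical bound can be verified by brute force. Because the Bell local set is a convex polytope whose vertices are deterministic strategies, it suffices to maximize the CHSH expression over all deterministic assignments of ±1 outcomes; the short Python sketch below (an illustration, not from the original text) does exactly this:

```python
import itertools

# Deterministic local strategies: Alice outputs (a0, a1) for settings x = 0, 1
# and Bob outputs (b0, b1) for y = 0, 1, with outcomes +/-1.  Shared randomness
# only mixes these vertices, so the classical CHSH maximum is attained here.
best = max(
    a0*b0 + a0*b1 + a1*b0 - a1*b1
    for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4)
)
print(best)  # classical (Bell local) maximum: 2
```

Every term a0*(b0 + b1) + a1*(b0 − b1) has one factor equal to ±2 and the other equal to 0, which is why no deterministic (hence no classical) box can exceed 2.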
The above considerations apply to model a quantum experiment. Consider two parties conducting local polarization measurements on a bipartite photonic state. The measurement result for the polarization of a photon can take one of two values (informally, whether the photon is polarized in that direction, or in the orthogonal direction). If each party is allowed to choose between just two different polarization directions, the experiment fits within the CHSH scenario. As noted by CHSH, there exist a quantum state and polarization directions which generate a box with a CHSH value equal to 2√2. This demonstrates explicitly that a theory with local ontological states, local measurements and only local actions cannot match the probabilistic predictions of quantum theory, disproving Einstein's hypothesis. Experimentalists such as Alain Aspect have verified the quantum violation of the CHSH inequality as well as other formulations of Bell's inequality, to invalidate the local hidden variables hypothesis and confirm that reality is indeed nonlocal in the EPR sense.
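As an illustration of the quantum violation, the following Python sketch (assuming the standard singlet state and the usual optimal measurement angles, which are not spelled out in the text above) computes the CHSH value directly from the quantum formalism:

```python
import numpy as np

def obs(theta):
    """Dichotomic observable cos(theta) Z + sin(theta) X (eigenvalues +/-1)."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

# Singlet state |psi> = (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)

def E(ta, tb):
    """Correlator <psi| A(ta) (x) B(tb) |psi>; equals -cos(ta - tb) here."""
    return psi @ np.kron(obs(ta), obs(tb)) @ psi

# Standard optimal angles for the CHSH test (an assumption of this sketch):
a, ap, b, bp = 0, np.pi/2, np.pi/4, -np.pi/4
S = abs(E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp))
print(S, 2 * np.sqrt(2))  # the CHSH value saturates Tsirelson's bound
```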
The demonstration of nonlocality due to Bell is probabilistic in the sense that it shows that the precise probabilities predicted by quantum mechanics for some entangled scenarios cannot be met by a local theory. (For short, here and henceforth "local theory" means "local hidden variables theory".) However, quantum mechanics permits an even stronger violation of local theories: a possibilistic one, in which local theories cannot even agree with quantum mechanics on which events are possible or impossible in an entangled scenario. The first proof of this kind was due to Greenberger, Horne and Zeilinger in 1989. The state involved, |GHZ⟩ = (1/√2)(|000⟩ + |111⟩), is often called the GHZ state.
In 1993, Lucien Hardy demonstrated a logical proof of quantum nonlocality that, like the GHZ proof, is a possibilistic proof. It starts with the observation that the state |ψ⟩ defined below can be written in a few suggestive ways:

|ψ⟩ = (1/√3)(|00⟩ + |01⟩ + |10⟩) = (1/√3)(√2|+⟩|0⟩ + |01⟩) = (1/√3)(√2|0⟩|+⟩ + |10⟩)
where, as above, |±⟩ = (|0⟩ ± |1⟩)/√2.
The experiment consists of this entangled state being shared between two experimenters, each of whom has the ability to measure either with respect to the basis {|0⟩, |1⟩} or {|+⟩, |−⟩}. We see that if they each measure with respect to {|0⟩, |1⟩}, then they never see the outcome |11⟩. If one measures with respect to {|0⟩, |1⟩} and the other {|+⟩, |−⟩}, they never see the outcomes |0−⟩ or |−0⟩. However, sometimes they see the outcome |−−⟩ when measuring with respect to {|+⟩, |−⟩}, since

⟨−−|ψ⟩ = −1/(2√3) ≠ 0
This leads to the paradox: having obtained the outcome |−−⟩, we conclude that if one of the experimenters had measured with respect to the {|0⟩, |1⟩} basis instead, the outcome must have been |−1⟩ or |1−⟩, since |−0⟩ and |0−⟩ are impossible. But then, if they had both measured with respect to the {|0⟩, |1⟩} basis, by locality the result must have been |11⟩, which is also impossible.
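The possible and impossible outcomes invoked in this argument can be confirmed numerically. This Python sketch (illustrative; it simply evaluates the relevant amplitudes of Hardy's state) checks all four claims:

```python
import numpy as np

k0, k1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus, minus = (k0 + k1) / np.sqrt(2), (k0 - k1) / np.sqrt(2)

# Hardy's state |psi> = (|00> + |01> + |10>)/sqrt(3)
psi = (np.kron(k0, k0) + np.kron(k0, k1) + np.kron(k1, k0)) / np.sqrt(3)

# Amplitude <uv|psi> for product outcome |uv>
amp = lambda u, v: np.kron(u, v) @ psi

print(np.isclose(amp(k1, k1), 0))     # |11> never occurs
print(np.isclose(amp(k0, minus), 0))  # |0-> never occurs
print(np.isclose(amp(minus, k0), 0))  # |-0> never occurs
print(amp(minus, minus))              # |--> has nonzero amplitude -1/(2*sqrt(3))
```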
The work of Bancal et al. generalizes Bell's result by proving that correlations achievable in quantum theory are also incompatible with a large class of superluminal hidden variable models. In this framework, faster-than-light signaling is precluded. However, the choice of settings of one party can influence hidden variables at another party's distant location, if there is enough time for a superluminal influence (of finite, but otherwise unknown speed) to propagate from one point to the other. In this scenario, any bipartite experiment revealing Bell nonlocality can just provide lower bounds on the hidden influence's propagation speed. Quantum experiments with three or more parties can, nonetheless, disprove all such non-local hidden variable models.
The random variables measured in a general experiment can depend on each other in complicated ways. In the field of causal inference, such dependencies are represented via Bayesian networks: directed acyclic graphs where each node represents a variable and an edge from one variable to another signifies that the former influences the latter and not otherwise, see the figure. In a standard bipartite Bell experiment, Alice's (Bob's) setting x (y), together with the hidden variable λ, influences her (his) local outcome a (b). Bell's theorem can thus be interpreted as a separation between the quantum and classical predictions in a type of causal structure with just one hidden node λ. Similar separations have been established in other types of causal structures. The characterization of the boundaries for classical correlations in such extended Bell scenarios is challenging, but there exist complete practical computational methods to achieve it.
Quantum nonlocality is sometimes understood as being equivalent to entanglement. However, this is not the case. Quantum entanglement can be defined only within the formalism of quantum mechanics, i.e., it is a model-dependent property. In contrast, nonlocality refers to the impossibility of a description of observed statistics in terms of a local hidden variable model, so it is independent of the physical model used to describe the experiment.
It is true that for any pure entangled state there exists a choice of measurements that produces Bell nonlocal correlations, but the situation is more complex for mixed states. While any Bell nonlocal state must be entangled, there exist (mixed) entangled states which do not produce Bell nonlocal correlations (although, operating on several copies of some such states, or carrying out local post-selections, it is possible to witness nonlocal effects). In addition, reasonably simple examples of Bell inequalities have been found for which the quantum state giving the largest violation is never a maximally entangled state, showing that entanglement is, in some sense, not even proportional to nonlocality.
As shown, the statistics achievable by two or more parties conducting experiments in a classical system are constrained in a non-trivial way. Analogously, the statistics achievable by separate observers in a quantum theory also happen to be restricted. The first derivation of a non-trivial statistical limit on the set of quantum correlations, due to B. Tsirelson, is known as Tsirelson's bound. Consider the CHSH Bell scenario detailed before, but this time assume that, in their experiments, Alice and Bob are preparing and measuring quantum systems. In that case, the CHSH parameter S = E(0,0) + E(0,1) + E(1,0) − E(1,1) can be shown to be bounded by

|S| ≤ 2√2
Mathematically, a box P(a,b|x,y) admits a quantum realization if and only if there exist a pair of Hilbert spaces H_A, H_B, a normalized vector |ψ⟩ ∈ H_A ⊗ H_B and projection operators E^x_a acting on H_A and F^y_b acting on H_B such that

P(a,b|x,y) = ⟨ψ| E^x_a ⊗ F^y_b |ψ⟩
In the following, the set of such boxes will be called Q. Contrary to the classical set of correlations, when viewed in probability space, Q is not a polytope. On the contrary, it contains both straight and curved boundaries. In addition, Q is not closed: this means that there exist boxes which can be arbitrarily well approximated by quantum systems but are themselves not quantum.
In the above definition, the space-like separation of the two parties conducting the Bell experiment was modeled by imposing that their associated operator algebras act on different factors of the overall Hilbert space describing the experiment. Alternatively, one could model space-like separation by imposing that these two algebras commute. This leads to a different definition:
A box P(a,b|x,y) admits a field quantum realization if and only if there exist a Hilbert space H, a normalized vector |ψ⟩ ∈ H and projection operators E^x_a, F^y_b acting on H, with [E^x_a, F^y_b] = 0, such that

P(a,b|x,y) = ⟨ψ| E^x_a F^y_b |ψ⟩
Call the set of all such correlations Q_c.
How does this new set relate to the more conventional Q defined above? It can be proven that Q_c is closed. Moreover, Q̄ ⊆ Q_c, where Q̄ denotes the closure of Q. Tsirelson's problem consists in deciding whether the inclusion relation is strict, i.e., whether or not Q̄ = Q_c. This problem only appears in infinite dimensions: when the Hilbert space in the definition of Q_c is constrained to be finite-dimensional, the closure of the corresponding set equals Q̄.
In January 2020, Ji, Natarajan, Vidick, Wright, and Yuen claimed a result in quantum complexity theory (MIP* = RE) that would imply that Q̄ ≠ Q_c, thus solving Tsirelson's problem.
Tsirelson's problem can be shown equivalent to Connes' embedding problem, a famous conjecture in the theory of operator algebras.
Since the dimensions of H_A and H_B are, in principle, unbounded, determining whether a given box P(a,b|x,y) admits a quantum realization is a complicated problem. In fact, the dual problem of establishing whether a quantum box can have a perfect score at a non-local game is known to be undecidable. Moreover, the problem of deciding whether P(a,b|x,y) can be approximated by a quantum system with precision ε is NP-hard. Characterizing quantum boxes is equivalent to characterizing the cone of completely positive semidefinite matrices under a set of linear constraints.
For small fixed dimensions d_A, d_B, one can explore, using variational methods, whether P(a,b|x,y) can be realized in a bipartite quantum system H_A ⊗ H_B with dim(H_A) = d_A, dim(H_B) = d_B. That method, however, can only be used to prove the realizability of P(a,b|x,y), not its unrealizability with quantum systems.
To prove unrealizability, the best-known method is the Navascués–Pironio–Acín (NPA) hierarchy. This is an infinite decreasing sequence of sets of correlations Q¹ ⊃ Q² ⊃ Q³ ⊃ ⋯ with the properties that each set in the sequence can be characterized via semidefinite programming and that the intersection of all the sets in the sequence equals Q_c.
The NPA hierarchy thus provides a computational characterization, not of Q̄, but of Q_c. If Tsirelson's problem were solved in the affirmative, namely, Q̄ = Q_c, then the above two methods would provide a practical characterization of Q̄. If, on the contrary, Q̄ ≠ Q_c, then a new method to detect the non-realizability of correlations is needed.
The works listed above describe what the quantum set of correlations looks like, but they do not explain why. Are quantum correlations unavoidable, even in post-quantum physical theories, or, on the contrary, could there exist correlations outside Q̄ which nonetheless do not lead to any unphysical operational behavior?
In their seminal 1994 paper, Popescu and Rohrlich explore whether quantum correlations can be explained by appealing to relativistic causality alone. Namely, whether any hypothetical box outside Q̄ would allow building a device capable of transmitting information faster than the speed of light. At the level of correlations between two parties, Einstein's causality translates into the requirement that Alice's measurement choice should not affect Bob's statistics, and vice versa. Otherwise, Alice (Bob) could signal Bob (Alice) instantaneously by choosing her (his) measurement setting appropriately. Mathematically, Popescu and Rohrlich's no-signalling conditions are:

Σ_b P(a,b|x,y) = Σ_b P(a,b|x,y′), for all a, x, y, y′
Σ_a P(a,b|x,y) = Σ_a P(a,b|x′,y), for all b, y, x, x′
Like the set of classical boxes, when represented in probability space, the set of no-signalling boxes forms a polytope. Popescu and Rohrlich identified a box that, while complying with the no-signalling conditions, violates Tsirelson's bound, and is thus unrealizable in quantum physics. Dubbed the PR-box, it can be written as:

P(a,b|x,y) = 1/2 if a ⊕ b = xy, and P(a,b|x,y) = 0 otherwise
Here a, b, x, y take values in {0, 1}, and a ⊕ b denotes the sum modulo two. It can be verified that the CHSH value of this box is 4 (as opposed to the Tsirelson bound of 2√2). This box had been identified earlier, by Rastall and by Khalfin and Tsirelson.
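Both properties of the PR-box are easy to verify computationally. The Python sketch below (illustrative) computes its CHSH value and checks that its marginals satisfy the no-signalling conditions:

```python
def pr(a, b, x, y):
    """PR box: P(a,b|x,y) = 1/2 if (a XOR b) == (x AND y), else 0."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def E(x, y):
    """Correlator with outcomes mapped 0 -> +1, 1 -> -1."""
    return sum((-1) ** (a ^ b) * pr(a, b, x, y)
               for a in (0, 1) for b in (0, 1))

S = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)
print(S)  # 4: beyond Tsirelson's bound of 2*sqrt(2)

# No-signalling: Alice's marginal does not depend on Bob's setting y.
marg = lambda a, x, y: sum(pr(a, b, x, y) for b in (0, 1))
print(all(marg(a, x, 0) == marg(a, x, 1)
          for a in (0, 1) for x in (0, 1)))  # True
```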
In view of this mismatch, Popescu and Rohrlich pose the problem of identifying a physical principle, stronger than the no-signalling conditions, that allows deriving the set of quantum correlations. Several proposals followed:
All these principles can be experimentally falsified under the assumption that we can decide if two or more events are space-like separated. This sets this research program apart from the axiomatic reconstruction of quantum mechanics via Generalized Probabilistic Theories.
The works above rely on the implicit assumption that any physical set of correlations must be closed under wirings. This means that any effective box built by combining the inputs and outputs of a number of boxes within the considered set must also belong to the set. Closure under wirings does not seem to enforce any limit on the maximum value of CHSH. However, it is not a void principle: on the contrary, it has been shown that many simple, intuitive families of sets of correlations in probability space happen to violate it.
Originally, it was unknown whether any of these principles (or a subset thereof) was strong enough to derive all the constraints defining Q̄. This state of affairs continued for some years until the construction of the almost quantum set Q̃. Q̃ is a set of correlations that is closed under wirings and can be characterized via semidefinite programming. It contains all correlations in Q̄, but also some non-quantum boxes. Remarkably, all boxes within the almost quantum set have been shown to be compatible with the principles of NTCC, NANLC, ML and LO. There is also numerical evidence that almost quantum boxes comply with IC. It seems, therefore, that, even when the above principles are taken together, they do not suffice to single out the quantum set in the simplest Bell scenario of two parties, two inputs and two outputs.
Nonlocality can be exploited to conduct quantum information tasks which do not rely on knowledge of the inner workings of the preparation and measurement apparatuses involved in the experiment. The security or reliability of any such protocol depends only on the strength of the experimentally measured correlations P(a,b|x,y). These protocols are termed device-independent.
The first device-independent protocol proposed was device-independent quantum key distribution (QKD). In this primitive, two distant parties, Alice and Bob, are distributed an entangled quantum state, which they probe, thus obtaining the statistics P(a,b|x,y). Based on how non-local the box P(a,b|x,y) happens to be, Alice and Bob estimate how much knowledge an external quantum adversary Eve (the eavesdropper) could possess about the value of Alice and Bob's outputs. This estimation allows them to devise a reconciliation protocol at the end of which Alice and Bob share a perfectly correlated one-time pad of which Eve has no information whatsoever. The one-time pad can then be used to transmit a secret message through a public channel. Although the first security analyses on device-independent QKD relied on Eve carrying out a specific family of attacks, all such protocols have been recently proven unconditionally secure.
Nonlocality can be used to certify that the outcomes of one of the parties in a Bell experiment are partially unknown to an external adversary. By feeding a partially random seed to several non-local boxes and processing the outputs, one can end up with a longer (potentially unbounded) string of comparable randomness or with a shorter but more random string. This last primitive can be proven impossible in a classical setting.
Sometimes, the box P(a,b|x,y) shared by Alice and Bob is such that it only admits a unique quantum realization. This means that there exist measurement operators and a quantum state giving rise to P(a,b|x,y) such that any other physical realization of P(a,b|x,y) is connected to this canonical realization via local unitary transformations. This phenomenon, which can be interpreted as an instance of device-independent quantum tomography, was first pointed out by Tsirelson and named self-testing by Mayers and Yao. Self-testing is known to be robust against systematic noise, i.e., if the experimentally measured statistics are close enough to P(a,b|x,y), one can still determine the underlying state and measurement operators up to error bars.
The degree of non-locality of a quantum box can also provide lower bounds on the Hilbert space dimension of the local systems accessible to Alice and Bob. This problem is equivalent to deciding the existence of a matrix with low completely positive semidefinite rank. Finding lower bounds on the Hilbert space dimension based on statistics happens to be a hard task, and current general methods only provide very low estimates. However, a Bell scenario with five inputs and three outputs suffices to provide arbitrarily high lower bounds on the underlying Hilbert space dimension. Quantum communication protocols which assume a knowledge of the local dimension of Alice and Bob's systems, but otherwise do not make claims on the mathematical description of the preparation and measuring devices involved are termed semi-device independent protocols. Currently, there exist semi-device independent protocols for quantum key distribution and randomness expansion.