The Love numbers (h, k, and l) are dimensionless parameters that measure the rigidity of a planetary body or other gravitating object, and the susceptibility of its shape to change in response to an external tidal potential.
In 1909, Augustus Edward Hough Love introduced the values h and k, which characterize the overall elastic response of the Earth to the tides (Earth tides or body tides).[1] Later, in 1912, Toshi Shida added a third Love number, l, which was needed to obtain a complete overall description of the solid Earth's response to the tides.[2]
The Love number h is defined as the ratio of the body tide to the height of the static equilibrium tide;[3] it is also defined as the vertical (radial) displacement or variation of the planet's elastic properties. In terms of the tide-generating potential $V(\theta, \varphi)$, the displacement is $h\,V(\theta, \varphi)/g$, where $\theta$ is latitude, $\varphi$ is east longitude and $g$ is the acceleration due to gravity.[4] For a hypothetical rigid Earth, $h = 0$. For a fully fluid Earth, one would expect $h = 1$. However, the deformation of the sphere causes the potential field to change, and thereby deform the sphere even more. The theoretical maximum is $h = 2.5$. For the real Earth, $h$ lies between 0 and 1.
The Love number k is defined as the cubical dilation, or the ratio of the additional potential (self-reactive force) produced by the deformation to the deforming potential. It can be represented as $k\,V(\theta, \varphi)/g$, where $k = 0$ for a rigid body.[4]
The Love number l represents the ratio of the horizontal (transverse) displacement of an element of mass of the planet's crust to that of the corresponding static ocean tide.[3] In potential notation the transverse displacement is $l\,\nabla V(\theta, \varphi)/g$, where $\nabla$ is the horizontal gradient operator. As with $h$ and $k$, $l = 0$ for a rigid body.[4]
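A minimal numeric sketch of these three definitions (not from the cited sources; the Love number values and lunar parameters below are assumed round figures) evaluates the degree-2 lunar tide-generating potential and the three responses it scales:

```python
import numpy as np

# Illustrative, assumed round-number parameters (not from the cited sources).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 7.35e22          # lunar mass, kg
R = 6.371e6          # Earth radius, m
d = 3.844e8          # Earth-Moon distance, m
g = 9.81             # surface gravity, m/s^2
h2, k2, l2 = 0.6, 0.3, 0.08   # assumed degree-2 Love numbers

def tide_potential(psi):
    """Degree-2 tide-generating potential V (m^2/s^2) at angular
    distance psi (radians) from the sub-lunar point."""
    return (G * M * R**2 / d**3) * 0.5 * (3.0 * np.cos(psi)**2 - 1.0)

V = tide_potential(0.0)                                    # sub-lunar point
print(f"equilibrium tide height V/g : {V/g:.3f} m")        # ~0.36 m
print(f"radial body tide h2*V/g     : {h2*V/g:.3f} m")     # ~0.21 m
print(f"induced potential k2*V      : {k2*V:.3f} m^2/s^2")

# Transverse displacement l2*grad(V)/g, with the horizontal gradient
# taken with respect to angle (finite difference at psi = 0.7 rad).
dpsi = 1e-6
dV_dpsi = (tide_potential(0.7 + dpsi) - tide_potential(0.7 - dpsi)) / (2*dpsi)
print(f"transverse displacement     : {abs(l2*dV_dpsi/g):.3f} m")  # a few cm
```

With these assumed values the radial body tide comes out near 0.2 m and the horizontal tide near a few centimetres, consistent with the relative sizes of $h$ and $l$.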
According to Cartwright, "An elastic solid spheroid will yield to an external tide potential $U_2$ of spherical harmonic degree 2 by a surface tide $h_2 U_2/g$ and the self-attraction of this tide will increase the external potential by $k_2 U_2$."[5] The magnitudes of the Love numbers depend on the rigidity and mass distribution of the spheroid. Love numbers $h_n$, $k_n$, and $l_n$ can also be calculated for higher orders of spherical harmonics.
For an elastic Earth the Love numbers lie in the ranges $0.616 \leq h_2 \leq 0.624$, $0.304 \leq k_2 \leq 0.312$, and $0.084 \leq l_2 \leq 0.088$.[3]
For Earth's tides one can calculate the tilt factor as $1 + k - h$ and the gravimetric factor as $1 + h - \tfrac{3}{2}k$, where subscript 2 is assumed.[5]
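A quick sketch of these two combinations, with assumed round values $h \approx 0.6$ and $k \approx 0.3$ (illustrative, not the cited ranges):

```python
# Assumed round degree-2 Love numbers (subscript 2 implied, hypothetical values).
h, k = 0.6, 0.3

tilt_factor = 1 + k - h              # ~0.7: deformation reduces the observed tilt
gravimetric_factor = 1 + h - 1.5*k   # ~1.15: deformation amplifies the gravity tide

print(f"tilt factor        : {tilt_factor:.2f}")
print(f"gravimetric factor : {gravimetric_factor:.2f}")
```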
Neutron stars are thought to have high rigidity in the crust, and thus a low Love number, $k_2 \approx 0.1$;[6][7] isolated, nonrotating black holes in vacuum have vanishing Love numbers for all multipoles $\ell \geq 2$.[8][9][10] Measuring the Love numbers of compact objects in binary mergers is a key goal of gravitational-wave astronomy.
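For context, gravitational-wave analyses typically constrain $k_2$ through the dimensionless tidal deformability $\Lambda = \tfrac{2}{3}\,k_2\,C^{-5}$, where $C = Gm/(Rc^2)$ is the star's compactness. The sketch below uses assumed, illustrative neutron-star parameters rather than measured values:

```python
# Assumed, illustrative neutron-star parameters (not measurements).
G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
M_sun = 1.989e30       # kg

k2 = 0.1               # assumed quadrupole Love number
m = 1.4 * M_sun        # assumed mass
R = 12e3               # assumed radius, m

C = G * m / (R * c**2)               # compactness, ~0.17
Lam = (2.0/3.0) * k2 / C**5          # dimensionless tidal deformability

print(f"compactness C = {C:.3f}")
print(f"tidal deformability Lambda ~ {Lam:.0f}")   # a few hundred
```

Because $\Lambda$ scales as $C^{-5}$, even modest Love numbers produce a deformability of several hundred for a typical neutron star, while a black hole's vanishing Love numbers give $\Lambda = 0$.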