The equivalence principle is the hypothesis that the observed equivalence of gravitational and inertial mass is a fundamental feature of nature. The weak form, known for centuries, states that masses of any composition in free fall follow the same trajectories and land at the same time. The extended form due to Albert Einstein requires special relativity to also hold in free fall and requires the weak equivalence to be valid everywhere. This form was a critical input for the development of the theory of general relativity. The strong form requires Einstein's form to hold for self-gravitating bodies such as stars. Highly precise experimental tests of the principle limit possible deviations from equivalence to be very small.
In classical mechanics, Newton's equation of motion in a gravitational field, written out in full, is:

$$ (\text{inertial mass})\,(\text{acceleration}) = (\text{gravitational mass})\,(\text{intensity of the gravitational field}) $$

or, in symbols, $m_i\,\mathbf{a} = m_g\,\mathbf{g}$.
Careful experiments have shown that the inertial mass on the left side and gravitational mass on the right side are numerically equal and independent of the material composing the masses. The equivalence principle is the hypothesis that this numerical equality of inertial and gravitational mass is a consequence of their fundamental identity. [1] : 32
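Written as a short worked step (standard textbook reasoning rather than material from the cited source), this equality is exactly what makes the mass drop out of the motion:

$$ m_i\,\mathbf{a} = m_g\,\mathbf{g} \quad\Longrightarrow\quad \mathbf{a} = \frac{m_g}{m_i}\,\mathbf{g} = \mathbf{g} \quad \text{when } m_g = m_i $$

so every body, whatever its composition, falls with the same acceleration $\mathbf{g}$.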
The equivalence principle can be considered an extension of the principle of relativity, the principle that the laws of physics are invariant under uniform motion. An observer in a windowless room cannot distinguish between being on the surface of the Earth and being in a spaceship in deep space accelerating at 1 g; the laws of physics themselves are unable to distinguish these cases. [1] : 33
By experimenting with the acceleration of different materials, Galileo determined that gravitation is independent of the amount of mass being accelerated. [2]
Newton, just 50 years after Galileo, investigated whether gravitational and inertial mass might be different concepts. He compared the periods of pendulums composed of different materials and found them to be identical. From this, he inferred that gravitational and inertial mass are the same thing. The form of this assertion, where the equivalence principle is taken to follow from empirical consistency, later became known as "weak equivalence". [2]
A version of the equivalence principle consistent with special relativity was introduced by Albert Einstein in 1907, when he noted that identical physical laws hold in two systems, one subject to a constant gravitational field and the other subject to constant acceleration, like a rocket far from any gravitational field. [3] : 152 Since the physical laws are the same, Einstein assumed the gravitational field and the acceleration were "physically equivalent". Einstein stated this hypothesis by saying he would:
...assume the complete physical equivalence of a gravitational field and a corresponding acceleration of the reference system.
— Einstein, 1907 [4]
In 1911 Einstein demonstrated the power of the equivalence principle by using it to predict that clocks run at different rates in a gravitational potential, and light rays bend in a gravitational field. [3] : 153 He connected the equivalence principle to his earlier principle of special relativity:
This assumption of exact physical equivalence makes it impossible for us to speak of the absolute acceleration of the system of reference, just as the usual theory of relativity forbids us to talk of the absolute velocity of a system; and it makes the equal falling of all bodies in a gravitational field seem a matter of course.
— Einstein, 1911 [5]
Soon after completing work on his theory of gravity (known as general relativity) [6] : 111 and then also in later years, Einstein recalled the importance of the equivalence principle to his work:
The breakthrough came suddenly one day. I was sitting on a chair in my patent office in Bern. Suddenly a thought struck me: If a man falls freely, he would not feel his weight. I was taken aback. This simple thought experiment made a deep impression on me. This led me to the theory of gravity.
— Einstein, 1922 [7]
Einstein's development of general relativity necessitated some means of empirically discriminating the theory from other theories of gravity compatible with special relativity. Accordingly, Robert Dicke developed a test program incorporating two new principles, the Einstein equivalence principle and the strong equivalence principle, each of which assumes the weak equivalence principle as a starting point.
Three main forms of the equivalence principle are in current use: weak (Galilean), Einsteinian, and strong. [8] : 6 Some proposals also suggest finer divisions or minor alterations. [9] [10]
The weak equivalence principle, also known as the universality of free fall or the Galilean equivalence principle, can be stated in many ways. The strong equivalence principle, a generalization of the weak equivalence principle, includes astronomic bodies with gravitational self-binding energy. [11] Instead, the weak equivalence principle assumes falling bodies are self-bound by non-gravitational forces only (e.g. a stone). Either way:

The trajectory of a point mass in a uniform gravitational field is independent of its internal structure and composition.

Uniformity of the gravitational field eliminates measurable tidal forces originating from a radially divergent gravitational field (e.g., the Earth) upon finite-sized physical bodies.
What is now called the "Einstein equivalence principle" states that the weak equivalence principle holds, and that:

The outcome of any local, non-gravitational test experiment is independent of the velocity of the freely falling apparatus and of where and when in the gravitational field the experiment is performed.

Here "local" means that the experimental setup must be small compared to variations in the gravitational field, called tidal forces. The test experiment must also be small enough that its own gravitational potential does not alter the result.
The two additional constraints added to the weak principle to obtain the Einstein form, namely (1) independence of the outcome from the relative velocity (local Lorentz invariance) and (2) independence from where and when the experiment is performed (local positional invariance), have far-reaching consequences. With these constraints alone Einstein was able to predict the gravitational redshift. [13] Theories of gravity that obey the Einstein equivalence principle must be "metric theories", meaning that trajectories of freely falling bodies are geodesics of a symmetric metric. [14] : 9
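As an illustration, the equivalence principle alone fixes the fractional frequency shift of a light signal climbing a height $h$ in a uniform field $g$ (a standard weak-field result, stated here for concreteness rather than drawn from the cited sources):

$$ \frac{\Delta\nu}{\nu} \approx -\frac{g h}{c^{2}} = -\frac{\Delta\Phi}{c^{2}} $$

For the 22.5 m tower used in the Pound–Rebka experiment this amounts to only about $2.5\times10^{-15}$, which is why such tests require extraordinarily sensitive techniques.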
Around 1960 Leonard I. Schiff conjectured that any complete and consistent theory of gravity that embodies the weak equivalence principle necessarily implies the Einstein equivalence principle; the conjecture cannot be proved, but several plausibility arguments support it. [14] : 20 Nonetheless, the two principles are tested with very different kinds of experiments.
The Einstein equivalence principle has been criticized as imprecise, because there is no universally accepted way to distinguish gravitational from non-gravitational experiments (see for instance Hadley [15] and Durand [16] ).
The strong equivalence principle applies the same constraints as the Einstein equivalence principle, but allows the freely falling bodies to be massive gravitating objects as well as test particles. [8] Thus this is a version of the equivalence principle that applies to objects that exert a gravitational force on themselves, such as stars, planets, black holes or Cavendish experiments. It requires that the gravitational constant be the same everywhere in the universe [14] : 49 and is incompatible with a fifth force. It is much more restrictive than the Einstein equivalence principle.
Like the Einstein equivalence principle, the strong equivalence principle requires that gravity be geometric in nature, but in addition it forbids any extra fields, so the metric alone determines all of the effects of gravity. If an observer measures a patch of space to be flat, then the strong equivalence principle suggests that it is absolutely equivalent to any other patch of flat space elsewhere in the universe. Einstein's theory of general relativity (including the cosmological constant) is thought to be the only theory of gravity that satisfies the strong equivalence principle. A number of alternative theories, such as Brans–Dicke theory and the Einstein–aether theory, add additional fields. [8]
Some of the tests of the equivalence principle use names for the different ways mass appears in physical formulae. In nonrelativistic physics three kinds of mass can be distinguished: inertial mass, which measures a body's resistance to acceleration; passive gravitational mass, which measures its response to an external gravitational field; and active gravitational mass, which measures the gravitational field the body itself produces. [14]
By definition of active and passive gravitational mass, the force on $m_1$ due to the gravitational field of $m_0$ is

$$ F_1 = \frac{G\, m_0^{\text{act}}\, m_1^{\text{pass}}}{r^2} $$

Likewise the force on a second object of arbitrary mass $m_2$ due to the gravitational field of $m_0$ is

$$ F_2 = \frac{G\, m_0^{\text{act}}\, m_2^{\text{pass}}}{r^2} $$

By definition of inertial mass, $F = m^{\text{inert}}\, a$. If $m_1$ and $m_2$ are the same distance $r$ from $m_0$ then, by the weak equivalence principle, they fall at the same rate, i.e. their accelerations are the same:

$$ a_1 = \frac{F_1}{m_1^{\text{inert}}} = a_2 = \frac{F_2}{m_2^{\text{inert}}} $$

Hence:

$$ \frac{G\, m_0^{\text{act}}\, m_1^{\text{pass}}}{r^2\, m_1^{\text{inert}}} = \frac{G\, m_0^{\text{act}}\, m_2^{\text{pass}}}{r^2\, m_2^{\text{inert}}} $$

Therefore:

$$ \frac{m_1^{\text{pass}}}{m_1^{\text{inert}}} = \frac{m_2^{\text{pass}}}{m_2^{\text{inert}}} $$
In other words, passive gravitational mass must be proportional to inertial mass for all objects, independent of their material composition, if the weak equivalence principle is obeyed.
The dimensionless Eötvös parameter or Eötvös ratio $\eta(A,B)$ is the difference of the ratios of gravitational to inertial mass divided by their average for two sets of test masses "A" and "B":

$$ \eta(A,B) = 2\,\frac{\left(\dfrac{m_g}{m_i}\right)_A - \left(\dfrac{m_g}{m_i}\right)_B}{\left(\dfrac{m_g}{m_i}\right)_A + \left(\dfrac{m_g}{m_i}\right)_B} $$

Values of this parameter are used to compare tests of the equivalence principle. [14] : 10
A similar parameter can be used to compare passive and active mass. By Newton's third law of motion, the force

$$ F_1 = \frac{G\, m_0^{\text{act}}\, m_1^{\text{pass}}}{r^2} $$

must be equal and opposite to

$$ F_0 = \frac{G\, m_1^{\text{act}}\, m_0^{\text{pass}}}{r^2} $$

It follows that:

$$ \frac{m_0^{\text{act}}}{m_0^{\text{pass}}} = \frac{m_1^{\text{act}}}{m_1^{\text{pass}}} $$

In words, passive gravitational mass must be proportional to active gravitational mass for all objects. The difference,

$$ S_{0,1} = \frac{m_0^{\text{act}}}{m_0^{\text{pass}}} - \frac{m_1^{\text{act}}}{m_1^{\text{pass}}} $$

is used to quantify differences between passive and active mass. [17]
Tests of the weak equivalence principle are those that verify the equivalence of gravitational mass and inertial mass. An obvious test is dropping different objects and verifying that they land at the same time. Historically this was the first approach, though probably not via Galileo's Leaning Tower of Pisa experiment [18] : 19–21 but earlier by Simon Stevin, [19] who dropped lead balls of different masses from a church tower in Delft and listened for the sound of them hitting a wooden plank.
Isaac Newton measured the periods of pendulums made with different materials as an alternative test, giving the first precision measurements. [2] Loránd Eötvös's approach in 1908 used a very sensitive torsion balance to reach a precision approaching one part in a billion. Modern experiments have improved this by another factor of a million.
A popular demonstration of this equivalence was given on the Moon by astronaut David Scott in 1971. He dropped a falcon feather and a hammer at the same time, showing on video [20] that they landed at the same time.
Year | Investigator | Sensitivity | Method |
---|---|---|---|
500? | John Philoponus [22] | "small" | Drop tower |
1585 | Simon Stevin [23] [19] | 5×10−2 | Drop tower |
1590? | Galileo Galilei [24] [21] : 91 | 2×10−3 | Pendulum, drop tower |
1686 | Isaac Newton [25] [21] : 91 | 10−3 | Pendulum |
1832 | Friedrich Wilhelm Bessel [26] [21] : 91 | 2×10−5 | Pendulum |
1908 (1922) | Loránd Eötvös [27] [21] : 92 | 2×10−9 | Torsion balance |
1910 | Southerns [28] [21] : 91 | 5×10−6 | Pendulum |
1918 | Zeeman [29] [21] : 91 | 3×10−8 | Torsion balance |
1923 | Potter [30] [21] : 91 | 3×10−6 | Pendulum |
1935 | Renner [31] [21] : 92 | 2×10−9 | Torsion balance |
1964 | Roll, Krotkov, Dicke [32] | 3×10−11 | Torsion balance |
1972 | Braginsky, Panov [33] [21] : 92 | 10−12 | Torsion balance |
1976 | Shapiro, et al. [34] [21] : 92 | 10−12 | Lunar laser ranging |
1979 | Keiser, Faller [35] [21] : 93 | 4×10−11 | Fluid support |
1987 | Niebauer, et al. [36] [21] : 95 | 10−10 | Drop tower |
1989 | Stubbs, et al. [37] [21] : 93 | 10−11 | Torsion balance |
1990 | Adelberger, Eric G.; et al. [38] [21] : 95 | 10−12 | Torsion balance |
1999 | Baessler, et al. [39] [40] | 5×10−14 | Torsion balance |
2008 | Schlamminger, et al. [41] | 10−13 | Torsion balance |
2017 | MICROSCOPE [42] [43] | 10−15 | Earth orbit |
Experiments are still being performed at the University of Washington that have placed limits on the differential acceleration of objects towards the Earth, the Sun, and towards dark matter in the Galactic Center. [44] Future satellite experiments, [45] such as the Satellite Test of the Equivalence Principle [46] and Galileo Galilei, are intended to test the weak equivalence principle in space to much higher accuracy. [47]
With the first successful production of antimatter, in particular anti-hydrogen, a new approach to test the weak equivalence principle has been proposed. Experiments to compare the gravitational behavior of matter and antimatter are currently being developed. [48]
Proposals that may lead to a quantum theory of gravity, such as string theory and loop quantum gravity, predict violations of the weak equivalence principle because they contain many light scalar fields with long Compton wavelengths, which should generate fifth forces and variation of the fundamental constants. Heuristic arguments suggest that the magnitude of these equivalence principle violations could be in the 10−13 to 10−18 range. [49]
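Searches for such fifth forces are commonly parametrized by adding a Yukawa term to the Newtonian potential (a standard convention in the experimental literature, noted here for context rather than taken from the cited sources):

$$ V(r) = -\frac{G\, m_1 m_2}{r}\left(1 + \alpha\, e^{-r/\lambda}\right) $$

where $\alpha$ measures the strength of the new interaction relative to gravity and $\lambda$ is its range, set by the Compton wavelength of the mediating field.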
Currently envisioned tests of the weak equivalence principle are approaching a degree of sensitivity such that non-discovery of a violation would be just as profound a result as discovery of a violation. Non-discovery of equivalence principle violation in this range would suggest that gravity is so fundamentally different from other forces as to require a major reevaluation of current attempts to unify gravity with the other forces of nature. A positive detection, on the other hand, would provide a major guidepost towards unification. [49]
In addition to the tests of the weak equivalence principle, the Einstein equivalence principle requires testing the local Lorentz invariance and local positional invariance conditions.
Testing local Lorentz invariance amounts to testing special relativity, a theory with a vast number of existing tests. [14] : 12 Nevertheless, attempts to look for quantum gravity require even more precise tests. The modern tests include looking for directional variations in the speed of light (called "clock anisotropy tests") and new forms of the Michelson–Morley experiment. The measured anisotropy is less than 10−20. [14] : 14
Testing local positional invariance divides into tests in space and in time. [14] : 17 Space-based tests use measurements of the gravitational redshift; the classic example is the Pound–Rebka experiment of the 1960s. The most precise measurement was made in 1976 by flying a hydrogen maser on a rocket and comparing it to one on the ground. The Global Positioning System requires compensation for this redshift to give accurate position values.
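To give a sense of scale, the following back-of-the-envelope sketch (in Python, with rounded constants; an illustration rather than an account of the actual GPS correction procedure) estimates the daily clock offset that GPS must compensate for, combining the gravitational blueshift with the special-relativistic time dilation of the orbiting clock:

```python
# Rough estimate of the daily clock offset of a GPS satellite relative to a
# ground clock. Combines the gravitational blueshift (higher potential, clock
# runs fast) with special-relativistic time dilation (orbital motion, clock
# runs slow). Constants are approximate; this is an illustration only.

GM_EARTH = 3.986e14        # m^3 s^-2, Earth's gravitational parameter
C = 2.998e8                # m s^-1, speed of light
R_EARTH = 6.371e6          # m, mean Earth radius
R_GPS = 2.657e7            # m, GPS orbital radius (about 20,200 km altitude)
SECONDS_PER_DAY = 86400.0

# Fractional rate change from the difference in gravitational potential.
grav_rate = GM_EARTH * (1.0 / R_EARTH - 1.0 / R_GPS) / C**2

# Fractional rate change from orbital speed (circular orbit: v^2 = GM/r).
velocity_rate = -(GM_EARTH / R_GPS) / (2.0 * C**2)

net_rate = grav_rate + velocity_rate
print(f"gravitational blueshift: {grav_rate * SECONDS_PER_DAY * 1e6:+.1f} microseconds/day")
print(f"velocity time dilation:  {velocity_rate * SECONDS_PER_DAY * 1e6:+.1f} microseconds/day")
print(f"net clock offset:        {net_rate * SECONDS_PER_DAY * 1e6:+.1f} microseconds/day")
```

Running the sketch gives roughly +46 microseconds per day from gravity and −7 microseconds per day from orbital motion, a net offset of about +38 microseconds per day, which matches the rate adjustment built into GPS satellite clocks.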
Time-based tests search for variation of dimensionless constants and mass ratios. [50] For example, Webb et al. [51] reported detection of variation (at the 10−5 level) of the fine-structure constant from measurements of distant quasars. Other researchers dispute these findings. [52]
The present best limits on the variation of the fundamental constants have mainly been set by studying the naturally occurring Oklo natural nuclear fission reactor, where nuclear reactions similar to ones we observe today have been shown to have occurred underground approximately two billion years ago. These reactions are extremely sensitive to the values of the fundamental constants.
Constant | Year | Method | Limit on fractional change per year |
---|---|---|---|
weak interaction constant | 1976 | Oklo | 10−11 |
fine-structure constant | 1976 | Oklo | 10−16 |
electron–proton mass ratio | 2002 | quasars | 10−15 |
The strong equivalence principle can be tested by (1) searching for orbital variations in massive bodies (such as the Sun–Earth–Moon system), (2) looking for variations in the gravitational constant G depending on nearby sources of gravity or on motion, or (3) searching for a variation of Newton's gravitational constant over the life of the universe. [14] : 47
Orbital variations due to gravitational self-energy should cause a "polarization" of Solar System orbits, called the Nordtvedt effect. This effect has been sensitively tested by the Lunar Laser Ranging Experiment. [53] [54] No Nordtvedt effect has been observed at the 10−13 level.
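In the common parametrization used in the literature (stated here for orientation, not drawn from the article's sources), a violation of the strong equivalence principle would make a body's gravitational-to-inertial mass ratio depend on its fractional gravitational self-energy:

$$ \frac{m_g}{m_i} = 1 + \eta_N\, \frac{|E_g|}{m c^{2}} $$

where $\eta_N$ is the Nordtvedt parameter. The self-energy fraction $|E_g|/(m c^{2})$ is only about $5\times10^{-10}$ for the Earth and smaller still for the Moon, which is why lunar laser ranging must track the Earth–Moon distance at the centimetre level to constrain $\eta_N$.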
A tight bound on the effect of nearby gravitational fields on the strong equivalence principle comes from modeling the orbits of binary stars and comparing the results to pulsar timing data. [14] : 49 In 2014, astronomers discovered a stellar triple system containing a millisecond pulsar PSR J0337+1715 and two white dwarfs orbiting it. The system provided them a chance to test the strong equivalence principle in a strong gravitational field with high accuracy. [55] [56] [57] [58]
Most alternative theories of gravity predict a change in the gravity constant over time. Studies of Big Bang nucleosynthesis, analysis of pulsars, and the lunar laser ranging data have shown that G cannot have varied by more than 10% since the creation of the universe. The best data comes from studies of the ephemeris of Mars, based on three successive NASA missions, Mars Global Surveyor, Mars Odyssey, and Mars Reconnaissance Orbiter. [14] : 50
In physics and general relativity, gravitational redshift is the phenomenon that electromagnetic waves or photons travelling out of a gravitational well lose energy. This loss of energy corresponds to a decrease in the wave frequency and increase in the wavelength, known more generally as a redshift. The opposite effect, in which photons gain energy when travelling into a gravitational well, is known as a gravitational blueshift. The effect was first described by Einstein in 1907, eight years before his publication of the full theory of general relativity.
General relativity, also known as the general theory of relativity, and as Einstein's theory of gravity, is the geometric theory of gravitation published by Albert Einstein in 1915 and is the current description of gravitation in modern physics. General relativity generalizes special relativity and refines Newton's law of universal gravitation, providing a unified description of gravity as a geometric property of space and time, or four-dimensional spacetime. In particular, the curvature of spacetime is directly related to the energy and momentum of whatever matter and radiation are present. The relation is specified by the Einstein field equations, a system of second-order partial differential equations.
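Written out explicitly (the standard form of the equations, given here for concreteness), the field equations relate the Einstein tensor $G_{\mu\nu}$, built from the metric and its derivatives, and the cosmological constant $\Lambda$ to the stress–energy tensor $T_{\mu\nu}$ of matter and radiation:

$$ G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu} $$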
Mass is an intrinsic property of a body. It was traditionally believed to be related to the quantity of matter in a body, until the discovery of the atom and particle physics. It was found that different atoms and different elementary particles, theoretically with the same amount of matter, have nonetheless different masses. Mass in modern physics has multiple definitions which are conceptually distinct, but physically equivalent. Mass can be experimentally defined as a measure of the body's inertia, meaning the resistance to acceleration when a net force is applied. The object's mass also determines the strength of its gravitational attraction to other bodies.
In theoretical physics, negative mass is a hypothetical type of exotic matter whose mass is of opposite sign to the mass of normal matter, e.g. −1 kg. Such matter would violate one or more energy conditions and exhibit strange properties such as the oppositely oriented acceleration for an applied force orientation. It is used in certain speculative hypothetical technologies such as time travel to the past and future, construction of traversable artificial wormholes, which may also allow for time travel, Krasnikov tubes, the Alcubierre drive, and potentially other types of faster-than-light warp drives. Currently, the closest known real representative of such exotic matter is a region of negative pressure density produced by the Casimir effect.
The Kerr metric or Kerr geometry describes the geometry of empty spacetime around a rotating uncharged axially symmetric black hole with a quasispherical event horizon. The Kerr metric is an exact solution of the Einstein field equations of general relativity; these equations are highly non-linear, which makes exact solutions very difficult to find.
In theoretical physics, the Einstein–Cartan theory, also known as the Einstein–Cartan–Sciama–Kibble theory, is a classical theory of gravitation, one of several alternatives to general relativity. The theory was first proposed by Élie Cartan in 1922.
In theoretical physics, the hierarchy problem is the problem concerning the large discrepancy between aspects of the weak force and gravity. There is no scientific consensus on why, for example, the weak force is about 24 orders of magnitude stronger than gravity.
In physics, the Brans–Dicke theory of gravitation is a competitor to Einstein's general theory of relativity. It is an example of a scalar–tensor theory, a gravitational theory in which the gravitational interaction is mediated by a scalar field as well as the tensor field of general relativity. The gravitational constant is not presumed to be constant but instead is replaced by a scalar field which can vary from place to place and with time.
In physics, curved spacetime is the mathematical model in which, with Einstein's theory of general relativity, gravity naturally arises, as opposed to being described as a fundamental force in Newton's static Euclidean reference frame. Objects move along geodesics—curved paths determined by the local geometry of spacetime—rather than being influenced directly by distant bodies. This framework led to two fundamental principles: coordinate independence, which asserts that the laws of physics are the same regardless of the coordinate system used, and the equivalence principle, which states that the effects of gravity are indistinguishable from those of acceleration in sufficiently small regions of space. These principles laid the groundwork for a deeper understanding of gravity through the geometry of spacetime, as formalized in Einstein's field equations.
The Shapiro time delay effect, or gravitational time delay effect, is one of the four classic Solar System tests of general relativity. Radar signals passing near a massive object take slightly longer to travel to a target and longer to return than they would if the mass of the object were not present. The time delay is caused by time dilation, which increases the time it takes light to travel a given distance from the perspective of an outside observer. In a 1964 article entitled Fourth Test of General Relativity, Irwin Shapiro wrote:
Because, according to the general theory, the speed of a light wave depends on the strength of the gravitational potential along its path, these time delays should thereby be increased by almost 2×10−4 sec when the radar pulses pass near the sun. Such a change, equivalent to 60 km in distance, could now be measured over the required path length to within about 5 to 10% with presently obtainable equipment.
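For a radar signal grazing the Sun, the extra round-trip delay predicted by general relativity is approximately (a standard textbook expression, quoted here for orientation rather than from Shapiro's paper):

$$ \Delta t \approx \frac{4 G M_\odot}{c^{3}}\, \ln\!\left(\frac{4\, r_E\, r_P}{d^{2}}\right) $$

where $r_E$ and $r_P$ are the distances of the Earth and the target planet from the Sun and $d$ is the ray's closest approach to the Sun. With $4GM_\odot/c^{3} \approx 20\ \mu\mathrm{s}$ and a logarithm of order ten, this reproduces the delay of order $2\times10^{-4}$ s that Shapiro estimated.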
Tests of general relativity serve to establish observational evidence for the theory of general relativity. The first three tests, proposed by Albert Einstein in 1915, concerned the "anomalous" precession of the perihelion of Mercury, the bending of light in gravitational fields, and the gravitational redshift. The precession of Mercury was already known; experiments showing light bending in accordance with the predictions of general relativity were performed in 1919, with increasingly precise measurements made in subsequent tests; and scientists claimed to have measured the gravitational redshift in 1925, although measurements sensitive enough to actually confirm the theory were not made until 1954. A more accurate program starting in 1959 tested general relativity in the weak gravitational field limit, severely limiting possible deviations from the theory.
Scalar theories of gravitation are field theories of gravitation in which the gravitational field is described using a scalar field, which is required to satisfy some field equation.
In theoretical physics, a scalar–tensor theory is a field theory that includes both a scalar field and a tensor field to represent a certain interaction. For example, the Brans–Dicke theory of gravitation uses both a scalar field and a tensor field to mediate the gravitational interaction.
Scalar–tensor–vector gravity (STVG) is a modified theory of gravity developed by John Moffat, a researcher at the Perimeter Institute for Theoretical Physics in Waterloo, Ontario. The theory is also often referred to by the acronym MOG.
Alternatives to general relativity are physical theories that attempt to describe the phenomenon of gravitation in competition with Einstein's theory of general relativity. There have been many different attempts at constructing an ideal theory of gravity.
Newton–Cartan theory is a geometrical re-formulation, as well as a generalization, of Newtonian gravity first introduced by Élie Cartan and Kurt Friedrichs and later developed by G. Dautcourt, W. G. Dixon, P. Havas, H. Künzle, Andrzej Trautman, and others. In this re-formulation, the structural similarities between Newton's theory and Albert Einstein's general theory of relativity are readily seen, and it has been used by Cartan and Friedrichs to give a rigorous formulation of the way in which Newtonian gravity can be seen as a specific limit of general relativity, and by Jürgen Ehlers to extend this correspondence to specific solutions of general relativity.
Gravitoelectromagnetism, abbreviated GEM, refers to a set of formal analogies between the equations for electromagnetism and relativistic gravitation; specifically: between Maxwell's field equations and an approximation, valid under certain conditions, to the Einstein field equations for general relativity. Gravitomagnetism is a widely used term referring specifically to the kinetic effects of gravity, in analogy to the magnetic effects of moving electric charge. The most common version of GEM is valid only far from isolated sources, and for slowly moving test particles.
Entropic gravity, also known as emergent gravity, is a theory in modern physics that describes gravity as an entropic force—a force with macro-scale homogeneity but which is subject to quantum-level disorder—and not a fundamental interaction. The theory, based on string theory, black hole physics, and quantum information theory, describes gravity as an emergent phenomenon that springs from the quantum entanglement of small bits of spacetime information. As such, entropic gravity is said to abide by the second law of thermodynamics under which the entropy of a physical system tends to increase over time.
Frame-dragging is an effect on spacetime, predicted by Albert Einstein's general theory of relativity, that is due to non-static stationary distributions of mass–energy. A stationary field is one that is in a steady state, but the masses causing that field may be non-static — rotating, for instance. More generally, the subject that deals with the effects caused by mass–energy currents is known as gravitoelectromagnetism, which is analogous to the magnetism of classical electromagnetism.
Bimetric gravity or bigravity refers to two different classes of theories. The first class of theories relies on modified mathematical theories of gravity in which two metric tensors are used instead of one. The second metric may be introduced at high energies, with the implication that the speed of light could be energy-dependent, enabling models with a variable speed of light.
We have seen that the various formulations of the equivalence principle form a hierarchy (or rather, a nested sequence of statements narrowing down the type of gravitational theory).