The Millennium Run, or Millennium Simulation (the name refers to its size [1] [2] ), is a computer N-body simulation used to investigate how the distribution of matter in the Universe has evolved over time, in particular how the observed population of galaxies was formed. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.
A basic scientific method for testing theories in cosmology is to evaluate their consequences for the observable parts of the universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, observed today. Because light emitted by more distant matter takes longer to reach Earth, looking at distant objects is like looking further back in time, so the evolution of the matter distribution over cosmic time can also be observed directly.
The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists from Germany, the United Kingdom, Canada, Japan and the United States. It starts at the epoch when the cosmic microwave background was emitted, about 379,000 years after the universe began. The cosmic background radiation has been studied by satellite experiments, and the observed inhomogeneities in it serve as the starting point for following the evolution of the corresponding matter distribution. Using the physical laws expected to hold in the currently known cosmologies, together with simplified representations of the astrophysical processes observed to affect real galaxies, the initial distribution of matter is allowed to evolve, and the simulation's predictions for the formation of galaxies and black holes are recorded.
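At its core, an N-body simulation of this kind repeatedly computes the gravitational forces among its particles and advances their positions and velocities in small time steps. The minimal sketch below shows the idea with direct force summation and a leapfrog integrator; it is an illustration only, omitting everything that makes production codes such as GADGET feasible at scale (tree-based force evaluation, periodic boundaries, comoving coordinates, parallelization). The function names and parameter values are illustrative assumptions, not the Millennium code.

```python
import numpy as np

def accelerations(pos, mass, eps):
    """Direct-summation gravitational accelerations with Plummer softening.

    pos: (N, 3) positions, mass: (N,) masses, eps: softening length.
    Units are chosen so that G = 1.
    """
    diff = pos[None, :, :] - pos[:, None, :]          # r_j - r_i for all pairs
    dist2 = (diff ** 2).sum(-1) + eps ** 2            # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                     # remove self-interaction
    return (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt, n_steps, eps=0.01):
    """Kick-drift-kick leapfrog, the symplectic integrator family N-body codes use."""
    acc = accelerations(pos, mass, eps)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc                         # half kick
        pos += dt * vel                               # drift
        acc = accelerations(pos, mass, eps)
        vel += 0.5 * dt * acc                         # half kick
    return pos, vel

# Toy example: 100 equal-mass particles collapsing from a random cloud.
rng = np.random.default_rng(42)
pos = rng.uniform(-1.0, 1.0, size=(100, 3))
vel = np.zeros_like(pos)
mass = np.full(100, 1.0 / 100)
pos, vel = leapfrog(pos, vel, mass, dt=1e-3, n_steps=1000)
```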
Since the completion of the Millennium Run simulation in 2005, a series of ever more sophisticated and higher-fidelity simulations of the formation of the galaxy population have been built within its stored output and made publicly available over the internet. In addition to improving the treatment of the astrophysics of galaxy formation, recent versions have adjusted the parameters of the underlying cosmological model to reflect changing ideas about their precise values. To date (mid-2018), more than 950 published papers have made use of data from the Millennium Run, making it, at least by this measure, the highest-impact astrophysical simulation of all time. [3]
For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles." These are not particles in the particle physics sense; each "particle" represents approximately a billion solar masses of dark matter. [1] The region of space simulated was a cube about 2 billion light-years on a side. [1] This volume was populated by about 20 million "galaxies". A supercomputer located in Garching, Germany executed the simulation, which used a version of the GADGET code, for more than a month. The output of the simulation needed about 25 terabytes of storage. [4]
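The quoted particle mass follows from the simulated volume and the assumed mean matter density. A back-of-the-envelope check, using the Millennium Run's published cosmological parameters (box side 500 Mpc/h, Ω_m = 0.25, h = 0.73) and the standard critical-density constant:

```python
# Particle mass check: m_p = Omega_m * rho_crit * (L / N^(1/3))^3.
# L = 500 Mpc/h is ~2 billion light-years for h = 0.73; N^(1/3) = 2160.
RHO_CRIT = 2.775e11                      # critical density, h^2 M_sun / Mpc^3
omega_m, L, n_side = 0.25, 500.0, 2160   # Millennium Run values

m_p = omega_m * RHO_CRIT * (L / n_side) ** 3    # in M_sun / h
print(f"particle mass = {m_p:.2e} M_sun/h")     # ~8.6e8 M_sun/h,
print(f"             = {m_p / 0.73:.2e} M_sun") # i.e. ~1.2e9 solar masses
```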
The Sloan Digital Sky Survey had challenged the then-current understanding of cosmology by finding black hole candidates in very bright quasars at large distances, meaning that they were created much earlier than initially expected. By successfully producing quasars at such early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the universe.
In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube (about 400 million light-years on a side), with the same number of particles but with each particle representing 6.9 million solar masses. This is a harder numerical task, since splitting the computational domain evenly between processors becomes more difficult when dense clumps of matter are present. MS-II used 1.4 million CPU hours over 2048 cores (i.e. about a month) on the Power-6 computer at Garching; a simulation was also run with the same initial conditions and fewer particles, to check that features seen in the higher-resolution run were also present at lower resolution.
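Codes in the GADGET family handle this by threading a space-filling curve through the simulation volume and cutting it into segments of roughly equal work, so that dense clumps end up spread across many short segments. The sketch below illustrates the idea using a Morton (Z-order) key in place of the Peano-Hilbert curve GADGET actually uses; the function names, grid resolution, and toy particle distribution are all illustrative assumptions.

```python
import numpy as np

def morton_key(pos, box, bits=10):
    """Interleave the bits of quantized x, y, z into a Z-order (Morton) key.

    pos: (N, 3) positions in [0, box). Production codes use a Peano-Hilbert
    curve, which has better locality, but Morton keys show the same idea.
    """
    q = (pos / box * (1 << bits)).astype(np.uint64)   # quantize onto integer grid
    keys = np.zeros(len(pos), dtype=np.uint64)
    for b in range(bits):
        for d in range(3):
            keys |= ((q[:, d] >> np.uint64(b)) & np.uint64(1)) << np.uint64(3 * b + d)
    return keys

def decompose(pos, box, n_domains):
    """Assign each particle to a domain: sort along the curve, cut equal chunks."""
    order = np.argsort(morton_key(pos, box))
    domain = np.empty(len(pos), dtype=int)
    for i, chunk in enumerate(np.array_split(order, n_domains)):
        domain[chunk] = i
    return domain

# Toy usage: a strongly clustered particle set still splits into equal loads.
rng = np.random.default_rng(1)
clump = rng.normal(50.0, 1.0, size=(9000, 3)) % 100.0   # dense clump
field = rng.uniform(0.0, 100.0, size=(1000, 3))          # diffuse background
domain = decompose(np.vstack([clump, field]), box=100.0, n_domains=8)
print(np.bincount(domain))   # ~1250 particles per domain despite the clustering
```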
In 2010, the 'Millennium XXL' simulation (MXXL) was performed, this time using a much larger cube (over 13 billion light-years on a side) and 6720³ particles, each representing 7 billion times the mass of the Sun. The MXXL spans a cosmological volume 216 and 27,000 times the size of the Millennium and MS-II simulation boxes, respectively. The simulation was run on JUROPA, one of the top 15 supercomputers in the world in 2010. It used more than 12,000 cores for an equivalent of 300 years CPU time, 30 terabytes of RAM and generated more than 100 terabytes of data. [5] Cosmologists use the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales and how the rarest and most massive structures in the universe came about.
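Those volume ratios, and the implied wall-clock time, can be checked directly. The box side lengths of 3000, 500 and 100 Mpc/h used below are the commonly quoted values for MXXL, Millennium and MS-II (the same lengths the text expresses in light-years):

```python
# Volume ratios between the three simulation boxes (side lengths in Mpc/h).
mxxl, millennium, ms2 = 3000.0, 500.0, 100.0
print((mxxl / millennium) ** 3)   # 216.0   -> "216 times the Millennium volume"
print((mxxl / ms2) ** 3)          # 27000.0 -> "27,000 times the MS-II volume"

# 300 CPU-years spread over ~12,000 cores is only about nine days of wall time.
print(300 * 365.25 * 24 / 12000, "hours")   # ~219 hours
```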
In 2012, the Millennium Run Observatory (MRObs) project was launched. The MRObs is a theoretical virtual observatory that integrates detailed predictions for the dark matter (from the Millennium simulations) and for the galaxies (from semi-analytical models) with a virtual telescope to synthesize artificial observations. Astrophysicists use these virtual observations to study how the predictions from the Millennium simulations compare to the real universe, to plan future observational surveys, and to calibrate the techniques used by astronomers to analyze real observations. A first set of virtual observations produced by the MRObs has been released to the astronomical community for analysis through the MRObs Web portal. The virtual universe can also be accessed through a new online tool, the MRObs browser, which allows users to interact with the Millennium Run Relational Database, where the properties of millions of dark matter halos and their galaxies from the Millennium project are stored. Upgrades to the MRObs framework, and its extension to other types of simulations, are currently being planned.
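One basic ingredient of any such virtual telescope is converting intrinsic galaxy properties from the simulation into observable ones. The sketch below shows a single step of that kind, turning an absolute magnitude and a redshift into an apparent magnitude via the flat-ΛCDM luminosity distance; the real MRObs pipeline layers filters, K-corrections, noise and instrument models on top of this. The cosmological parameter defaults are the Millennium Run's published values, and the function names are illustrative.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458   # speed of light, km/s

def luminosity_distance(z, h=0.73, omega_m=0.25):
    """Luminosity distance in Mpc for a flat LCDM universe
    (defaults are the Millennium Run's cosmological parameters)."""
    e = lambda zp: np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))
    d_c, _ = quad(lambda zp: 1.0 / e(zp), 0.0, z)     # comoving distance / (c/H0)
    return (1 + z) * (C_KM_S / (100.0 * h)) * d_c     # Mpc

def apparent_magnitude(abs_mag, z):
    """m = M + 5 log10(d_L / 10 pc); K-corrections are ignored in this sketch."""
    d_pc = luminosity_distance(z) * 1e6               # Mpc -> pc
    return abs_mag + 5.0 * np.log10(d_pc / 10.0)

# A Milky Way-like galaxy (M ~ -21) placed at redshift 0.5:
print(f"{apparent_magnitude(-21.0, 0.5):.1f} mag")    # ~21.2
```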
Physical cosmology is a branch of cosmology concerned with the study of cosmological models. A cosmological model, or simply cosmology, provides a description of the largest-scale structures and dynamics of the universe and allows study of fundamental questions about its origin, structure, evolution, and ultimate fate. Cosmology as a science originated with the Copernican principle, which implies that celestial bodies obey identical physical laws to those on Earth, and Newtonian mechanics, which first allowed those physical laws to be understood.
The study of galaxy formation and evolution is concerned with the processes that formed a heterogeneous universe from a homogeneous beginning, the formation of the first galaxies, the way galaxies change over time, and the processes that have generated the variety of structures observed in nearby galaxies. In structure formation theories, galaxy formation is hypothesized to occur as a result of tiny quantum fluctuations in the aftermath of the Big Bang. The simplest model in general agreement with observed phenomena is the Lambda-CDM model; in it, clustering and merging allow galaxies to accumulate mass, determining both their shape and structure. Hydrodynamic simulations, which follow both baryons and dark matter, are widely used to study galaxy formation and evolution.
Astronomy is a natural science that studies celestial objects and the phenomena that occur in the cosmos. It uses mathematics, physics, and chemistry in order to explain their origin and their overall evolution. Objects of interest include planets, moons, stars, nebulae, galaxies, meteoroids, asteroids, and comets. Relevant phenomena include supernova explosions, gamma ray bursts, quasars, blazars, pulsars, and cosmic microwave background radiation. More generally, astronomy studies everything that originates beyond Earth's atmosphere. Cosmology is a branch of astronomy that studies the universe as a whole.
Plasma cosmology is a non-standard cosmology whose central postulate is that the dynamics of ionized gases and plasmas play important, if not dominant, roles in the physics of the universe at interstellar and intergalactic scales. In contrast, the current observations and models of cosmologists and astrophysicists explain the formation, development, and evolution of large-scale structures as dominated by gravity.
The Lambda-CDM, Lambda cold dark matter, or ΛCDM model is a mathematical model of the Big Bang theory with three major components: a cosmological constant, denoted by Λ and associated with dark energy; the postulated cold dark matter; and ordinary matter.
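In this model the expansion history follows from the Friedmann equation. A minimal sketch evaluating the Hubble parameter H(z) in a flat ΛCDM universe; the parameter values here are rounded modern estimates, not those of any particular simulation above:

```python
import numpy as np

def hubble(z, h0=67.7, omega_m=0.31):
    """H(z) from the flat-LCDM Friedmann equation:
    H(z) = H0 * sqrt(Omega_m (1+z)^3 + Omega_Lambda), Omega_Lambda = 1 - Omega_m.
    Radiation is negligible at the redshifts used here and is omitted."""
    return h0 * np.sqrt(omega_m * (1 + z) ** 3 + (1 - omega_m))

for z in (0.0, 0.5, 1.0, 2.0):
    print(f"z = {z}: H = {hubble(z):.0f} km/s/Mpc")
# Matter dominates at high redshift, so H grows steeply with z.
```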
In modern models of physical cosmology, a dark matter halo is a basic unit of cosmological structure. It is a hypothetical region that has decoupled from cosmic expansion and contains gravitationally bound matter. A single dark matter halo may contain multiple virialized clumps of dark matter bound together by gravity, known as subhalos. Modern cosmological models, such as ΛCDM, propose that dark matter halos and subhalos may contain galaxies. The dark matter halo of a galaxy envelops the galactic disc and extends well beyond the edge of the visible galaxy. Thought to consist of dark matter, halos have not been observed directly. Their existence is inferred through observations of their effects on the motions of stars and gas in galaxies and gravitational lensing. Dark matter halos play a key role in current models of galaxy formation and evolution. Theories that attempt to explain the nature of dark matter halos with varying degrees of success include cold dark matter (CDM), warm dark matter, and massive compact halo objects (MACHOs).
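The paragraph above does not commit to a particular internal structure for halos, but N-body simulations, including the Millennium Run, find halo density profiles that are well fit by the Navarro-Frenk-White (NFW) form. A minimal sketch of that profile, with illustrative parameter values:

```python
import numpy as np

def nfw_density(r, rho_s, r_s):
    """Navarro-Frenk-White profile: rho(r) = rho_s / ((r/r_s) * (1 + r/r_s)^2).

    rho_s is a characteristic density and r_s the scale radius where the
    logarithmic slope passes through -2 (steepening from -1 to -3 outward).
    """
    x = r / r_s
    return rho_s / (x * (1 + x) ** 2)

r = np.logspace(-2, 1, 4)                  # radii of 0.01, 0.1, 1, 10 (r_s = 1)
print(nfw_density(r, rho_s=1.0, r_s=1.0))  # density spans ~5 decades across these radii
```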
In physical cosmology, structure formation is the formation of galaxies, galaxy clusters and larger structures from small early density fluctuations. The universe, as is now known from observations of the cosmic microwave background radiation, began in a hot, dense, nearly uniform state approximately 13.8 billion years ago. However, looking at the night sky today, structures on all scales can be seen, from stars and planets to galaxies. On even larger scales, galaxy clusters and sheet-like structures of galaxies are separated by enormous voids containing few galaxies. Structure formation attempts to model how these structures formed through the gravitational instability of small early density ripples.
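In the linear regime, the amplitude of a small overdensity grows by a factor D(a) of the cosmic scale factor that depends only on the expansion history. A sketch using the standard integral solution for flat ΛCDM; the parameter values are rounded modern estimates, and in a matter-only universe D(a) reduces to being proportional to a:

```python
import numpy as np
from scipy.integrate import quad

def growth_factor(a, omega_m=0.31):
    """Linear growth factor D(a) for flat LCDM, unnormalized:
    D(a) proportional to H(a) * integral_0^a da' / (a' H(a'))^3, with H0 = 1."""
    h = lambda x: np.sqrt(omega_m * x ** -3 + (1 - omega_m))
    integral, _ = quad(lambda x: (x * h(x)) ** -3, 1e-8, a)
    return h(a) * integral

d0 = growth_factor(1.0)
for a in (0.1, 0.5, 1.0):
    print(f"a = {a}: D/D(a=1) = {growth_factor(a) / d0:.3f}")
# Growth slows once dark energy dominates, so D(0.5)/D(1) comes out above 0.5.
```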
The expansion of the universe is the increase in distance between gravitationally unbound parts of the observable universe with time. It is an intrinsic expansion; the universe does not expand "into" anything and does not require space to exist "outside" it. To any observer in the universe, it appears that all but the nearest galaxies recede at speeds that are proportional to their distance from the observer, on average. While objects cannot move faster than light, this limitation only applies with respect to local reference frames and does not limit the recession rates of cosmologically distant objects.
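The proportionality between recession speed and distance is Hubble's law, v = H₀d. A quick illustration of why sufficiently distant objects recede faster than light; the value of H₀ below is a rounded present-day estimate:

```python
C_KM_S = 299792.458          # speed of light, km/s
H0 = 70.0                    # Hubble constant, km/s per Mpc (rounded estimate)

# Hubble's law: recession speed grows linearly with distance.
for d_mpc in (100.0, 1000.0, 5000.0):
    print(f"{d_mpc:6.0f} Mpc -> v = {H0 * d_mpc:9.0f} km/s")

# Beyond the Hubble distance c/H0, recession is superluminal -- allowed because
# it is space itself expanding, not motion through a local reference frame.
print(f"Hubble distance = {C_KM_S / H0:.0f} Mpc (~14 billion light-years)")
```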
Galaxy mergers can occur when two galaxies collide. They are the most violent type of galaxy interaction. The gravitational interactions between galaxies and the friction between the gas and dust have major effects on the galaxies involved. The exact effects of such mergers depend on a wide variety of parameters such as collision angles, speeds, and relative size/composition, and are currently an extremely active area of research. Galaxy mergers are important because the merger rate is a fundamental measurement of galaxy evolution. The merger rate also provides astronomers with clues about how galaxies bulked up over time.
An inhomogeneous cosmology is a physical cosmological theory which, unlike the currently widely accepted cosmological concordance model, assumes that inhomogeneities in the distribution of matter across the universe affect local gravitational forces enough to skew our view of the Universe. When the universe began, matter was distributed homogeneously, but over billions of years, galaxies, clusters of galaxies, and superclusters have coalesced and must, according to Einstein's theory of general relativity, warp the space-time around them. While the concordance model acknowledges this fact, it assumes that such inhomogeneities are not sufficient to affect large-scale averages of gravity in our observations. When two separate studies claimed in 1998–1999 that high-redshift supernovae were further away than our calculations showed they should be, it was suggested that the expansion of the universe is accelerating, and dark energy, a repulsive energy inherent in space, was proposed to explain the acceleration. Dark energy has since become widely accepted, but it remains unexplained. Accordingly, some scientists continue to work on models that might not require dark energy; inhomogeneous cosmology falls into this class.
Simon David Manton White, FRS, is a British astrophysicist. He was one of the directors at the Max Planck Institute for Astrophysics before his retirement in late 2019.
In cosmology, galaxy filaments are the largest known structures in the universe, consisting of walls of galactic superclusters. These massive, thread-like formations can commonly reach 50/h to 80/h megaparsecs, with the largest found to date being the Hercules–Corona Borealis Great Wall at around 3 gigaparsecs (9.8 Gly) in length, and form the boundaries between voids. Due to the accelerating expansion of the universe, the individual clusters of gravitationally bound galaxies that make up galaxy filaments are moving away from each other at an accelerated rate; in the far future they will dissolve.
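The /h notation keeps quoted lengths independent of the measured value of the Hubble constant, H₀ = 100h km/s/Mpc; converting to plain megaparsecs just divides by h. A trivial sketch, with h set to a rounded modern estimate:

```python
# "50/h Mpc" means 50 Mpc divided by the dimensionless Hubble parameter h,
# where H0 = 100 h km/s/Mpc. For h ~ 0.7 (rounded modern estimate):
h = 0.7
for length_h in (50.0, 80.0):
    print(f"{length_h}/h Mpc = {length_h / h:.0f} Mpc")   # ~71 and ~114 Mpc
```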
The Virgo Consortium for Cosmological Supercomputer Simulations was founded in 1994 in response to the UK's High Performance Computing Initiative. Virgo developed rapidly into an international collaboration of about a dozen scientists in the UK, Germany, the Netherlands, Canada, the United States and Japan.
The chronology of the universe describes the history and future of the universe according to Big Bang cosmology.
In cosmology, primordial black holes (PBHs) are hypothetical black holes that formed soon after the Big Bang. In the inflationary era and early radiation-dominated universe, extremely dense pockets of subatomic matter may have been tightly packed to the point of gravitational collapse, creating primordial black holes without the supernova compression needed to make black holes today. Because the creation of primordial black holes would pre-date the first stars, they are not limited to the narrow mass range of stellar black holes.
The Bolshoi simulation, a computer model of the universe run in 2010 on the Pleiades supercomputer at the NASA Ames Research Center, was the most accurate cosmological simulation to that date of the evolution of the large-scale structure of the universe. The Bolshoi simulation used the now-standard ΛCDM (Lambda-CDM) model of the universe and the WMAP five-year and seven-year cosmological parameters from NASA's Wilkinson Microwave Anisotropy Probe team. "The principal purpose of the Bolshoi simulation is to compute and model the evolution of dark matter halos, thereby rendering the invisible visible for astronomers to study, and to predict visible structure that astronomers can seek to observe." “Bolshoi” is a Russian word meaning “big.”
U1.11 is a large quasar group located in the constellations of Leo and Virgo. It is one of the largest LQGs known, with an estimated maximum diameter of 780 Mpc, and contains 38 quasars. It was discovered in 2011 during the course of the Sloan Digital Sky Survey. Until the discovery of the Huge-LQG in November 2012, it was the largest known structure in the universe, surpassing the Clowes–Campusano LQG, which had held the record for 20 years.
The Illustris project is an ongoing series of astrophysical simulations run by an international collaboration of scientists. The aim was to study the processes of galaxy formation and evolution in the universe with a comprehensive physical model. Early results were described in a number of publications following widespread press coverage. The project publicly released all data produced by the simulations in April 2015. Key developers of the Illustris simulation have been Volker Springel and Mark Vogelsberger. The Illustris simulation framework and galaxy formation model have been used for a wide range of spin-off projects, starting with Auriga and IllustrisTNG, followed by Thesan (2021), MillenniumTNG (2022) and TNG-Cluster.
The photon underproduction crisis is a cosmological puzzle concerning a purported deficit between the number of ionizing photons produced by known sources and the number required to explain the observed ionization state of the intergalactic medium.
The UniverseMachine is a project carrying out astrophysical supercomputer simulations of various models of possible universes, created by astronomer Peter Behroozi and his research team at the Steward Observatory and the University of Arizona. Numerous universes with different physical characteristics may be simulated in order to develop insights into the possible beginning and evolution of our universe. A major objective is to better understand the role of dark matter in the development of the universe. According to Behroozi, "On the computer, we can create many different universes and compare them to the actual one, and that lets us infer which rules lead to the one we see."