Millennium Run

The Millennium Run, or Millennium Simulation (the name refers to its size [1] [2] ), is a computer N-body simulation used to investigate how the distribution of matter in the Universe has evolved over time and, in particular, how the observed population of galaxies formed. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.

Overview

A basic scientific method for testing theories in cosmology is to evaluate their consequences for the observable parts of the universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, which are observed today. Light emitted from more distant matter must travel longer in order to reach Earth, meaning looking at distant objects is like looking further back in time. This means the evolution in time of the matter distribution in the universe can also be observed directly.

The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists from Germany, the United Kingdom, Canada, Japan and the United States. It starts at the epoch when the cosmic background radiation was emitted, about 379,000 years after the universe began. The cosmic background radiation has been studied by satellite experiments, and the observed inhomogeneities in the cosmic background serve as the starting point for following the evolution of the corresponding matter distribution. Using the physical laws expected to hold in the currently known cosmologies and simplified representations of the astrophysical processes observed to affect real galaxies, the initial distribution of matter is allowed to evolve, and the simulation's predictions for formation of galaxies and black holes are recorded.
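
The core numerical idea can be illustrated with a toy example. The Python sketch below evolves a small set of particles under their mutual gravity with a simple leapfrog integrator; it is a direct-summation illustration of the N-body principle only, in arbitrary code units, and is not the tree/particle-mesh algorithm of the GADGET code actually used for the Millennium Run.

    import numpy as np

    G = 1.0           # gravitational constant in code units
    SOFTENING = 0.05  # softening length, avoids infinite forces at zero separation

    def accelerations(pos, mass):
        """Pairwise gravitational accelerations by direct summation, O(N^2)."""
        diff = pos[None, :, :] - pos[:, None, :]        # diff[i, j] = pos[j] - pos[i]
        dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2    # softened squared distances
        inv_r3 = dist2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)                   # no self-force
        return G * (diff * inv_r3[..., None] * mass[None, :, None]).sum(axis=1)

    def evolve(pos, vel, mass, dt, steps):
        """Kick-drift-kick leapfrog integration."""
        acc = accelerations(pos, mass)
        for _ in range(steps):
            vel += 0.5 * dt * acc
            pos += dt * vel
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc
        return pos, vel

    # 100 particles with random initial positions collapsing under their own gravity.
    rng = np.random.default_rng(0)
    pos = rng.uniform(-1.0, 1.0, size=(100, 3))
    vel = np.zeros((100, 3))
    mass = np.full(100, 1.0 / 100)
    pos, vel = evolve(pos, vel, mass, dt=0.01, steps=200)

Production cosmological codes replace the O(N^2) force loop with tree or particle-mesh methods and integrate in expanding (comoving) coordinates, but the overall structure, computing gravitational forces and stepping particles forward in time, is the same.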

Since the completion of the Millennium Run simulation in 2005, a series of ever more sophisticated and higher fidelity simulations of the formation of the galaxy population have been built within its stored output and have been made publicly available over the internet. In addition to improving the treatment of the astrophysics of galaxy formation, recent versions have adjusted the parameters of the underlying cosmological model to reflect changing ideas about their precise values. To date (mid-2018) more than 950 published papers have made use of data from the Millennium Run, making it, at least by this measure, the highest impact astrophysical simulation of all time. [3]

Size of the simulation

For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles." These are not particles in the particle physics sense – each "particle" represents approximately a billion solar masses of dark matter. [1] The region of space simulated was a cube about 2 billion light-years on a side. [1] This volume was populated by about 20 million "galaxies". A supercomputer located in Garching, Germany executed the simulation, which used a version of the GADGET code, for more than a month. The output of the simulation required about 25 terabytes of storage. [4]
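
These figures can be checked with a few lines of arithmetic. The snippet below is purely illustrative and uses the rounded values quoted in this section, not the exact simulation parameters:

    n_per_side = 2160                    # particles along each edge of the cube
    n_particles = n_per_side ** 3        # 10,077,696,000 -- "just over 10 billion"
    print(f"{n_particles:,} particles")

    m_particle_msun = 1e9                # roughly a billion solar masses per particle
    total_mass_msun = n_particles * m_particle_msun
    print(f"total simulated dark matter mass ~ {total_mass_msun:.1e} solar masses")

    box_side_ly = 2e9                    # roughly 2 billion light-years per side
    print(f"simulated volume ~ {box_side_ly ** 3:.1e} cubic light-years")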

First results

The Sloan Digital Sky Survey had challenged the then-current understanding of cosmology by finding black hole candidates in very bright quasars at large distances, implying that they formed much earlier than initially expected. By successfully producing quasars at similarly early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the universe.

Millennium II

In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube (about 400 million light-years on a side), with the same number of particles but with each particle representing 6.9 million solar masses. This is a harder numerical task, because splitting the computational domain evenly between processors becomes more difficult when dense clumps of matter are present. MS-II used 1.4 million CPU hours over 2048 cores (i.e. about a month) on the Power-6 computer at Garching; a simulation was also run with the same initial conditions and fewer particles to check that features of the higher-resolution run were also seen at lower resolution.
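
Two of these figures follow from the others, and the short sketch below rechecks them. It uses only the rounded values quoted in this article, so the results are approximate:

    cpu_hours = 1.4e6                        # total CPU time quoted for MS-II
    cores = 2048
    print(f"wall-clock time ~ {cpu_hours / cores / 24:.0f} days")   # ~28 days, "about a month"

    # Same particle count in a box roughly 5 times smaller per side means each
    # particle carries roughly 5**3 = 125 times less mass than in the original run.
    millennium_particle_msun = 1e9           # ~1 billion solar masses (original run)
    side_ratio = 2e9 / 4e8                   # 2 billion vs 400 million light-years per side
    print(f"expected MS-II particle mass ~ {millennium_particle_msun / side_ratio ** 3:.0e} solar masses")
    # ~8e6, consistent with the quoted 6.9 million solar masses given the rounding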

Millennium XXL

In 2010, the 'Millennium XXL' simulation (MXXL) was performed, this time using a much larger cube (over 13 billion light-years on a side) and 6720³ particles, each representing 7 billion times the mass of the Sun. The MXXL spans a cosmological volume 216 and 27,000 times the size of the Millennium and MS-II simulation boxes, respectively. The simulation was run on JUROPA, one of the top 15 supercomputers in the world in 2010. It used more than 12,000 cores for the equivalent of 300 years of CPU time, required 30 terabytes of RAM and generated more than 100 terabytes of data. [5] Cosmologists use the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales and how the rarest and most massive structures in the universe came about.
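
The quoted volume factors correspond to side-length ratios of 6 and 30, since volume scales as the cube of the box side. The check below confirms this, together with the particle count, again using only the rounded figures quoted above:

    # Volume scales as the cube of the box side length.
    print(round(216 ** (1 / 3)))      # 6  -- MXXL box side vs the Millennium box
    print(round(27000 ** (1 / 3)))    # 30 -- MXXL box side vs the MS-II box

    n_particles = 6720 ** 3           # ~3.0e11 particles
    m_particle_msun = 7e9             # ~7 billion solar masses each
    print(f"{n_particles:.2e} particles, total mass ~ {n_particles * m_particle_msun:.1e} solar masses")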

Millennium Run Observatory

In 2012, the Millennium Run Observatory (MRObs) project was launched. The MRObs is a theoretical virtual observatory that integrates detailed predictions for the dark matter (from the Millennium simulations) and for the galaxies (from semi-analytical models) with a virtual telescope to synthesize artificial observations. Astrophysicists use these virtual observations to study how the predictions from the Millennium simulations compare to the real universe, to plan future observational surveys, and to calibrate the techniques used by astronomers to analyze real observations. A first set of virtual observations produced by the MRObs has been released to the astronomical community for analysis through the MRObs Web portal. The virtual universe can also be accessed through a new online tool, the MRObs browser, which allows users to interact with the Millennium Run Relational Database, where the properties of millions of dark matter halos and their galaxies from the Millennium project are stored. Upgrades to the MRObs framework, and its extension to other types of simulations, are currently being planned.
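
Access to such a relational database is typically through SQL queries submitted over the web. The fragment below is only a schematic sketch of what a query from Python might look like; the endpoint URL, table name, and column names are hypothetical placeholders, and the actual schema and access procedure are described in the Millennium database documentation.

    import requests

    # Hypothetical query: the ten most massive dark matter halos (by particle count).
    # The table and column names here are placeholders, not the real schema.
    QUERY = """
    SELECT TOP 10 haloId, np, x, y, z
    FROM halos
    ORDER BY np DESC
    """

    # Placeholder endpoint accepting SQL through an HTTP GET parameter.
    response = requests.get(
        "https://example.org/millennium/query",   # hypothetical URL
        params={"action": "doQuery", "SQL": QUERY},
        timeout=30,
    )
    print(response.text)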

Related Research Articles

Physical cosmology Branch of astronomy

Physical cosmology is a branch of cosmology concerned with the study of cosmological models. A cosmological model, or simply cosmology, provides a description of the largest-scale structures and dynamics of the universe and allows study of fundamental questions about its origin, structure, evolution, and ultimate fate. Cosmology as a science originated with the Copernican principle, which implies that celestial bodies obey identical physical laws to those on Earth, and Newtonian mechanics, which first allowed those physical laws to be understood. Physical cosmology, as it is now understood, began with the development in 1915 of Albert Einstein's general theory of relativity, followed by major observational discoveries in the 1920s: first, Edwin Hubble discovered that the universe contains a huge number of external galaxies beyond the Milky Way; then, work by Vesto Slipher and others showed that the universe is expanding. These advances made it possible to speculate about the origin of the universe, and allowed the establishment of the Big Bang theory, by Georges Lemaître, as the leading cosmological model. A few researchers still advocate a handful of alternative cosmologies; however, most cosmologists agree that the Big Bang theory best explains the observations.

Dark matter Hypothetical form of matter comprising most of the matter in the universe

Dark matter is a hypothetical form of matter thought to account for approximately 85% of the matter in the universe. Its presence is implied in a variety of astrophysical observations, including gravitational effects that cannot be explained by accepted theories of gravity unless more matter is present than can be seen. For this reason, most experts think that dark matter is abundant in the universe and that it has had a strong influence on its structure and evolution. Dark matter is called dark because it does not appear to interact with the electromagnetic field, which means it does not absorb, reflect or emit electromagnetic radiation, and is therefore difficult to detect.

Galaxy formation and evolution Formation of the first galaxies and the way galaxies change over time

The study of galaxy formation and evolution is concerned with the processes that formed a heterogeneous universe from a homogeneous beginning, the formation of the first galaxies, the way galaxies change over time, and the processes that have generated the variety of structures observed in nearby galaxies. In structure formation theories, galaxy formation is hypothesized to result from tiny quantum fluctuations in the aftermath of the Big Bang. The simplest model in general agreement with observed phenomena is the Lambda-CDM model, in which clustering and merging allow galaxies to accumulate mass, determining both their shape and structure.

Cosmological principle Notion that the spatial distribution of matter in the universe is homogeneous and isotropic at large scales

In modern physical cosmology, the cosmological principle is the notion that the spatial distribution of matter in the universe is homogeneous and isotropic when viewed on a large enough scale, since the forces are expected to act uniformly throughout the universe, and should, therefore, produce no observable irregularities in the large-scale structuring over the course of evolution of the matter field that was initially laid down by the Big Bang.

In cosmology and physics, cold dark matter (CDM) is a hypothetical type of dark matter. Observations indicate that approximately 85% of the matter in the universe is dark matter, with only a small fraction being the ordinary baryonic matter that composes stars, planets, and living organisms. Cold refers to the fact that the dark matter moves slowly compared to the speed of light, while dark indicates that it interacts very weakly with ordinary matter and electromagnetic radiation.

Observable universe All matter that can be observed from the Earth at the present

The observable universe is a ball-shaped region of the universe comprising all matter that can be observed from Earth or its space-based telescopes and exploratory probes at the present time, because the electromagnetic radiation from these objects has had time to reach the Solar System and Earth since the beginning of the cosmological expansion. There may be 2 trillion galaxies in the observable universe, although that number has recently been estimated at only several hundred billion based on new data from New Horizons. Assuming the universe is isotropic, the distance to the edge of the observable universe is roughly the same in every direction. That is, the observable universe has a spherical volume centered on the observer. Every location in the universe has its own observable universe, which may or may not overlap with the one centered on Earth.

Plasma cosmology Non-standard model of the universe; emphasizes the role of ionized gases

Plasma cosmology is a non-standard cosmology whose central postulate is that the dynamics of ionized gases and plasmas play important, if not dominant, roles in the physics of the universe beyond the Solar System. In contrast, the current observations and models of cosmologists and astrophysicists explain the formation, development, and evolution of astronomical bodies and large-scale structures in the universe as influenced by gravity and baryonic physics.

Reionization Process that caused matter to reionize early in the history of the Universe

In the fields of Big Bang theory and cosmology, reionization is the process that caused matter in the universe to reionize after the lapse of the "dark ages".

The cuspy halo problem refers to a discrepancy between the inferred dark matter density profiles of low-mass galaxies and the density profiles predicted by cosmological N-body simulations. Nearly all simulations form dark matter halos which have "cuspy" dark matter distributions, with density increasing steeply at small radii, while the rotation curves of most observed dwarf galaxies suggest that they have flat central dark matter density profiles ("cores").

Lambda-CDM model Model of big-bang cosmology

The ΛCDM or Lambda-CDM model is a parameterization of the Big Bang cosmological model in which the universe contains three major components: first, a cosmological constant denoted by Lambda associated with dark energy; second, the postulated cold dark matter; and third, ordinary matter. It is frequently referred to as the standard model of Big Bang cosmology because it is the simplest model that provides a reasonably good account of the observed properties of the cosmos.

Dark matter halo A theoretical component of a galaxy that envelops the galactic disc and extends well beyond the edge of the visible galaxy

According to modern models of physical cosmology, a dark matter halo is a basic unit of cosmological structure. It is a hypothetical region that has decoupled from cosmic expansion and contains gravitationally bound matter. A single dark matter halo may contain multiple virialized clumps of dark matter bound together by gravity, known as subhalos. Modern cosmological models, such as ΛCDM, propose that dark matter halos and subhalos may contain galaxies. The dark matter halo of a galaxy envelops the galactic disc and extends well beyond the edge of the visible galaxy. Thought to consist of dark matter, halos have not been observed directly. Their existence is inferred through observations of their effects on the motions of stars and gas in galaxies and gravitational lensing. Dark matter halos play a key role in current models of galaxy formation and evolution. Theories that attempt to explain the nature of dark matter halos with varying degrees of success include cold dark matter (CDM), warm dark matter, and massive compact halo objects (MACHOs).

Structure formation Formation of galaxies, galaxy clusters and larger structures from small early density fluctuations

In physical cosmology, structure formation is the formation of galaxies, galaxy clusters and larger structures from small early density fluctuations. The universe, as is now known from observations of the cosmic microwave background radiation, began in a hot, dense, nearly uniform state approximately 13.8 billion years ago. However, looking at the night sky today, structures on all scales can be seen, from stars and planets to galaxies. On even larger scales, galaxy clusters and sheet-like structures of galaxies are separated by enormous voids containing few galaxies. Structure formation attempts to model how these structures formed by gravitational instability of small early ripples in spacetime density.

Galaxy merger Merger whereby at least two galaxies collide

Galaxy mergers can occur when two galaxies collide. They are the most violent type of galaxy interaction. The gravitational interactions between galaxies and the friction between the gas and dust have major effects on the galaxies involved. The exact effects of such mergers depend on a wide variety of parameters such as collision angles, speeds, and relative size/composition, and are currently an extremely active area of research. Galaxy mergers are important because the merger rate is a fundamental measurement of galaxy evolution. The merger rate also provides astronomers with clues about how galaxies bulked up over time.

Simon White British astronomer

Simon David Manton White, FRS, is a British astrophysicist. He was one of the directors of the Max Planck Institute for Astrophysics before his retirement in late 2019.

Huge-LQG

The Huge Large Quasar Group is a possible structure or pseudo-structure of 73 quasars, referred to as a large quasar group, that measures about 4 billion light-years across. At its discovery, it was identified as the largest and most massive known structure in the observable universe, though it has since been superseded by the Hercules-Corona Borealis Great Wall at 10 billion light-years. Whether it constitutes a genuine coherent structure is also disputed.

The Bolshoi simulation, a computer model of the universe run in 2010 on the Pleiades supercomputer at the NASA Ames Research Center, was the most accurate cosmological simulation to that date of the evolution of the large-scale structure of the universe. The Bolshoi simulation used the now-standard ΛCDM (Lambda-CDM) model of the universe and the WMAP five-year and seven-year cosmological parameters from NASA's Wilkinson Microwave Anisotropy Probe team. "The principal purpose of the Bolshoi simulation is to compute and model the evolution of dark matter halos, thereby rendering the invisible visible for astronomers to study, and to predict visible structure that astronomers can seek to observe." “Bolshoi” is a Russian word meaning “big.”

U1.11 Large quasar group in the constellation Virgo

U1.11 is a large quasar group located in the constellations of Leo and Virgo. It is one of the largest LQGs known, with an estimated maximum diameter of 780 Mpc, and contains 38 quasars. It was discovered in 2011 during the course of the Sloan Digital Sky Survey. Until the discovery of the Huge-LQG in November 2012, it was the largest known structure in the universe, beating the Clowes–Campusano LQG's 20-year record as the largest known structure at the time of its discovery.

Illustris project Computer-simulated universes

The Illustris project is an ongoing series of astrophysical simulations run by an international collaboration of scientists. The aim is to study the processes of galaxy formation and evolution in the universe with a comprehensive physical model. Early results are described in a number of publications following widespread press coverage. The project publicly released all data produced by the simulations in April 2015. A follow-up to the project, IllustrisTNG, was presented in 2017.

The photon underproduction crisis is a cosmological discussion concerning the purported deficit between observed photons and predicted photons.

UniverseMachine Computer simulated universes

The UniverseMachine is an ongoing series of astrophysical supercomputer simulations of various models of possible universes, created by astronomer Peter Behroozi and his research team at the Steward Observatory and the University of Arizona. Numerous universes with different physical characteristics can be simulated in order to develop insights into the possible beginning, and later evolution, of our current universe. One of the major objectives of the project is to better understand the role of dark matter in the development of the universe. According to Behroozi, "On the computer, we can create many different universes and compare them to the actual one, and that lets us infer which rules lead to the one we see."

References

  1. Springel, Volker; et al. (2005). "Simulations of the formation, evolution, and clustering of galaxies and quasars" (PDF). Nature. 435 (7042): 629–636. arXiv:astro-ph/0504097. Bibcode:2005Natur.435..629S. doi:10.1038/nature03597. hdl:2027.42/62586. PMID 15931216. S2CID 4383030.
  2. "MPA :: Current Research Highlight :: August 2004". Retrieved 2009-05-28.
  3. "The Millennium Simulation public page". Retrieved 2017-02-15.
  4. "Millennium Simulation - The Largest Ever Model of the Universe". Retrieved 2009-05-28.
  5. "The Millennium-XXL Project: Simulating the Galaxy Population in Dark Energy Universes". Retrieved 2013-07-02.

Further reading