The Illustris project is an ongoing series of astrophysical simulations run by an international collaboration of scientists. [1] Its aim is to study the processes of galaxy formation and evolution in the universe with a comprehensive physical model. Early results were described in a number of publications [2] [3] [4] and received widespread press coverage. [5] [6] [7] The project publicly released all data produced by the simulations in April 2015. The key developers of the Illustris simulation have been Volker Springel (Max-Planck-Institut für Astrophysik) and Mark Vogelsberger (Massachusetts Institute of Technology). The Illustris simulation framework and galaxy formation model have been used for a wide range of spin-off projects, starting with Auriga and IllustrisTNG (both 2017), followed by Thesan (2021), MillenniumTNG (2022), and TNG-Cluster (2023).
The original Illustris project was carried out by Mark Vogelsberger [8] and collaborators as the first large-scale galaxy formation application of Volker Springel's novel Arepo code. [9]
The Illustris project included large-scale cosmological simulations of the evolution of the universe, spanning the 13.8 billion years from the initial conditions of the Big Bang to the present day. The models, based on the most precise data and calculations currently available, are compared with observations of the actual universe in order to better understand its nature, including galaxy formation, dark matter, and dark energy. [5] [6] [7]
The simulation included many physical processes which are thought to be critical for galaxy formation. These include the formation of stars and the subsequent "feedback" due to supernova explosions, as well as the formation of supermassive black holes, their consumption of nearby gas, and their multiple modes of energetic feedback. [1] [4] [10]
Images, videos, and other data visualizations for public distribution are available at the project's official media page.
The main Illustris simulation was run on the Curie supercomputer at CEA (France) and the SuperMUC supercomputer at the Leibniz Computing Centre (Germany). [1] [11] A total of 19 million CPU hours was required, using 8,192 CPU cores. [1] The peak memory usage was approximately 25 TB of RAM. [1] A total of 136 snapshots were saved over the course of the simulation, totaling over 230 TB cumulative data volume. [2]
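As a rough sanity check on these figures, dividing the total CPU hours by the core count gives a lower bound on the wall-clock running time. The sketch below assumes, purely for illustration, that all 8,192 cores were busy for the entire run with perfect scaling; real runs also include queue time, restarts, and imperfect scaling.

```python
# Back-of-the-envelope wall-clock estimate for the main Illustris run,
# assuming all 8,192 cores ran concurrently for the whole simulation.
cpu_hours = 19_000_000   # total CPU hours reported above
cores = 8_192            # CPU cores used

wall_clock_hours = cpu_hours / cores
wall_clock_days = wall_clock_hours / 24
print(f"~{wall_clock_hours:,.0f} hours of wall-clock time, roughly {wall_clock_days:.0f} days")
```

Under these idealized assumptions, the run corresponds to roughly three months of continuous computing.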
A code called "Arepo" was used to run the Illustris simulations. It was written by Volker Springel, the author of the GADGET code; the name is derived from the Sator Square. Arepo solves the coupled equations of gravity and hydrodynamics using a discretization of space based on a moving Voronoi tessellation, and it is optimized for running on large, distributed-memory supercomputers using an MPI approach.
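The moving-mesh idea can be illustrated with a toy sketch: the spatial discretization is the Voronoi tessellation of a set of mesh-generating points (each cell is the region closer to its generator than to any other), and the mesh "moves" because those points are advected with the local fluid velocity. Arepo itself is parallel C code with a full 3-D tessellation and flux exchange across cell faces; the pure-Python example below is illustrative only, and all names and parameters in it are invented for the sketch.

```python
import math
import random

def voronoi_cell(p, generators):
    """Index of the generator whose Voronoi cell contains point p:
    by definition, the nearest generator."""
    return min(range(len(generators)),
               key=lambda i: math.dist(p, generators[i]))

random.seed(1)
# Mesh-generating points in a 2-D unit box, with stand-in fluid velocities.
generators = [(random.random(), random.random()) for _ in range(16)]
velocities = [(random.uniform(-0.05, 0.05), random.uniform(-0.05, 0.05))
              for _ in generators]

dt = 0.1
for _ in range(3):  # a few "timesteps"
    # A real solver would exchange hydrodynamic fluxes across the
    # Voronoi cell faces here before moving the mesh.
    generators = [(x + dt * vx, y + dt * vy)
                  for (x, y), (vx, vy) in zip(generators, velocities)]

print(voronoi_cell((0.5, 0.5), generators))  # cell containing the box centre
```

Because the cells follow the flow, the method combines the adaptivity of Lagrangian particle schemes with the accuracy of grid-based hydrodynamics.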
In April 2015 (eleven months after the first papers were published), the project team publicly released all data products from all simulations. [12] All original data files can be directly downloaded through the data release webpage. This includes group catalogs of individual halos and subhalos, merger trees tracking these objects through time, full snapshot particle data at 135 distinct time points, and various supplementary data catalogs. In addition to direct data download, a web-based API allows for many common search and data extraction tasks to be completed without needing access to the full data sets.
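A typical API query can be sketched as follows. The base URL and the "api-key" request header follow the public data release documentation, but the specific path shown here (simulation "Illustris-1", snapshot 135, subhalo 0) is only an illustrative example, and no network call is made in this sketch.

```python
# Sketch of forming a request against the Illustris data API.
BASE = "http://www.illustris-project.org/api"

def build_request(simulation, snapshot, subhalo_id, api_key="YOUR_API_KEY"):
    """Return the (url, headers) pair for one subhalo's catalog entry."""
    url = f"{BASE}/{simulation}/snapshots/{snapshot}/subhalos/{subhalo_id}/"
    headers = {"api-key": api_key}  # per-user key issued on registration
    return url, headers

url, headers = build_request("Illustris-1", 135, 0)
print(url)
# An actual download would then fetch the URL with these headers,
# e.g. with the requests library: requests.get(url, headers=headers).json()
```

The same URL scheme extends to merger trees, snapshot cutouts, and the supplementary catalogs, which is what allows common extraction tasks without downloading full snapshots.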
In December 2018, the Illustris simulation was recognized by Deutsche Post with a stamp in a special series.
The Illustris simulation framework has been used by a wide range of spin-off projects that focus on specific scientific questions.

IllustrisTNG: The IllustrisTNG project, "the next generation" follow-up to the original Illustris simulation, was first presented in July 2017 by a team of scientists from Germany and the U.S. led by Volker Springel. [13] A new physical model was developed which, among other features, included magnetohydrodynamics, and three simulations were planned, covering different volumes at different resolutions. The intermediate simulation (TNG100) was equivalent to the original Illustris simulation. Unlike Illustris, it was run on the Hazel Hen machine at the High Performance Computing Center Stuttgart in Germany, employing up to 25,000 compute cores. In December 2018 the simulation data from IllustrisTNG was publicly released; the data service includes a JupyterLab interface.

Auriga: The Auriga project consists of high-resolution zoom-in simulations of Milky Way-like dark matter halos, aimed at understanding the formation of our own Milky Way galaxy.

Thesan: The Thesan project is a radiative-transfer extension of IllustrisTNG that explores the epoch of reionization.

MillenniumTNG: The MillenniumTNG project employs the IllustrisTNG galaxy formation model in a larger cosmological volume to explore the massive end of the halo mass function for detailed cosmological probe forecasts.

TNG-Cluster: TNG-Cluster is a suite of high-resolution zoom-in simulations of galaxy clusters.
Physical cosmology is a branch of cosmology concerned with the study of cosmological models. A cosmological model, or simply cosmology, provides a description of the largest-scale structures and dynamics of the universe and allows study of fundamental questions about its origin, structure, evolution, and ultimate fate. Cosmology as a science originated with the Copernican principle, which implies that celestial bodies obey identical physical laws to those on Earth, and Newtonian mechanics, which first allowed those physical laws to be understood.
The study of galaxy formation and evolution is concerned with the processes that formed a heterogeneous universe from a homogeneous beginning, the formation of the first galaxies, the way galaxies change over time, and the processes that have generated the variety of structures observed in nearby galaxies. Galaxy formation is hypothesized to occur from structure formation theories, as a result of tiny quantum fluctuations in the aftermath of the Big Bang. The simplest model in general agreement with observed phenomena is the Lambda-CDM model—that is, clustering and merging allows galaxies to accumulate mass, determining both their shape and structure. Hydrodynamics simulation, which simulates both baryons and dark matter, is widely used to study galaxy formation and evolution.
In cosmology and physics, cold dark matter (CDM) is a hypothetical type of dark matter. According to the current standard model of cosmology, the Lambda-CDM model, approximately 27% of the universe is dark matter and 68% is dark energy, with only a small fraction being the ordinary baryonic matter that composes stars, planets, and living organisms. Cold refers to the fact that the dark matter moves slowly compared to the speed of light, giving it a vanishing equation of state. Dark indicates that it interacts very weakly with ordinary matter and electromagnetic radiation. Proposed candidates for CDM include weakly interacting massive particles, primordial black holes, and axions.
The Sunyaev–Zeldovich effect is the spectral distortion of the cosmic microwave background (CMB) through inverse Compton scattering by high-energy electrons in galaxy clusters, in which the low-energy CMB photons receive an average energy boost during collision with the high-energy cluster electrons. Observed distortions of the cosmic microwave background spectrum are used to detect the disturbance of density in the universe. Using the Sunyaev–Zeldovich effect, dense clusters of galaxies have been observed.
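The strength of the thermal Sunyaev–Zeldovich distortion is conventionally quantified by the Compton y-parameter, the line-of-sight integral of the electron pressure. The standard non-relativistic expressions, stated here for reference, are:

```latex
% Thermal SZ effect: the fractional CMB temperature change is set by the
% Compton y-parameter, the line-of-sight integral of electron pressure.
\[
  y \;=\; \frac{\sigma_T}{m_e c^2} \int n_e \, k_B T_e \, \mathrm{d}l ,
  \qquad
  \frac{\Delta T}{T_{\mathrm{CMB}}} \;=\; f(x)\, y ,
  \qquad
  f(x) = x \coth\!\left(\tfrac{x}{2}\right) - 4 ,
  \quad
  x = \frac{h\nu}{k_B T_{\mathrm{CMB}}} .
\]
% In the Rayleigh–Jeans limit (x << 1), f(x) -> -2, so clusters appear
% as temperature decrements at low observing frequencies.
```

Because y depends on the integrated pressure rather than on distance, the SZ decrement of a given cluster is essentially redshift-independent, which is what makes the effect so useful for finding distant clusters.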
The Lambda-CDM, Lambda cold dark matter, or ΛCDM model is a mathematical model of the Big Bang theory with three major components: a cosmological constant, denoted by lambda (Λ) and associated with dark energy; cold dark matter; and ordinary matter.
In modern models of physical cosmology, a dark matter halo is a basic unit of cosmological structure. It is a hypothetical region that has decoupled from cosmic expansion and contains gravitationally bound matter. A single dark matter halo may contain multiple virialized clumps of dark matter bound together by gravity, known as subhalos. Modern cosmological models, such as ΛCDM, propose that dark matter halos and subhalos may contain galaxies. The dark matter halo of a galaxy envelops the galactic disc and extends well beyond the edge of the visible galaxy. Thought to consist of dark matter, halos have not been observed directly. Their existence is inferred through observations of their effects on the motions of stars and gas in galaxies and gravitational lensing. Dark matter halos play a key role in current models of galaxy formation and evolution. Theories that attempt to explain the nature of dark matter halos with varying degrees of success include cold dark matter (CDM), warm dark matter, and massive compact halo objects (MACHOs).
The Millennium Run, or Millennium Simulation, is a computer N-body simulation used to investigate how the distribution of matter in the Universe has evolved over time, in particular, how the observed population of galaxies was formed. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.
GADGET is free software for cosmological N-body/SPH simulations written by Volker Springel at the Max Planck Institute for Astrophysics. The name is an acronym of "GAlaxies with Dark matter and Gas intEracT". It is released under the GNU GPL. It can be used to study, for example, galaxy formation and dark matter.
Lars Hernquist is a theoretical astrophysicist and Mallinckrodt Professor of Astrophysics at the Center for Astrophysics | Harvard & Smithsonian. He is best known for his research on dynamical processes in cosmology and galaxy formation/galaxy evolution.
Simon David Manton White, FRS, is a British-German astrophysicist. He was one of the directors at the Max Planck Institute for Astrophysics before his retirement in late 2019.
The Bolshoi simulation, a computer model of the universe run in 2010 on the Pleiades supercomputer at the NASA Ames Research Center, was the most accurate cosmological simulation to that date of the evolution of the large-scale structure of the universe. The Bolshoi simulation used the now-standard ΛCDM (Lambda-CDM) model of the universe and the WMAP five-year and seven-year cosmological parameters from NASA's Wilkinson Microwave Anisotropy Probe team. "The principal purpose of the Bolshoi simulation is to compute and model the evolution of dark matter halos, thereby rendering the invisible visible for astronomers to study, and to predict visible structure that astronomers can seek to observe." “Bolshoi” is a Russian word meaning “big.”
MACS J0416.1-2403, abbreviated MACS0416, is a cluster of galaxies at a redshift of z = 0.397 with a mass 160 trillion times the mass of the Sun inside 200 kpc (650 kly). Its mass distribution extends out to a radius of 950 kpc (3,100 kly) and was measured as 1.15 × 10¹⁵ solar masses. The system was discovered in images taken by the Hubble Space Telescope during the Massive Cluster Survey, MACS. This cluster causes gravitational lensing of distant galaxies, producing multiple images. Based on the distribution of the multiple image copies, scientists have been able to deduce and map the distribution of dark matter. The images, released in 2014, were used in the Cluster Lensing And Supernova survey with Hubble (CLASH) to help scientists peer back in time at the early Universe and to discover the distribution of dark matter.
NGC 1961 is a spiral galaxy in the constellation Camelopardalis. It was discovered by William Herschel on 3 December 1788. It is at a distance of about 200 million light years from Earth, which, given its apparent dimensions, means that NGC 1961 is more than 220,000 light years across.
The UniverseMachine is a project carrying out astrophysical supercomputer simulations of various models of possible universes, created by astronomer Peter Behroozi and his research team at the Steward Observatory and the University of Arizona. Numerous universes with different physical characteristics may be simulated in order to develop insights into the possible beginning and evolution of our universe. A major objective is to better understand the role of dark matter in the development of the universe. According to Behroozi, "On the computer, we can create many different universes and compare them to the actual one, and that lets us infer which rules lead to the one we see."
Oliver Zahn is a US/German theoretical astrophysicist, data scientist, and entrepreneur, best known for developing algorithms for astrophysical data analysis and widely cited discoveries of phenomena in the history of the Universe. He is also known for his more recent work as founder and CEO of Climax Foods, a California-based biotechnology company modeling dairy and other animal products directly from plant ingredients. Prior to becoming an entrepreneur, Zahn directed UC Berkeley's Center for Cosmological Physics alongside George Smoot and Saul Perlmutter and was Head of Data Science at Google.
Sultan Hassan is a Sudanese computational astrophysicist and NASA Hubble Fellow.
Debora Šijački is a computational cosmologist whose research involves computational methods for simulating the formation and development of structures in the universe, including galaxies, galaxy clusters, and dark matter; her collaborations include the Illustris project. Originally from Serbia, she was educated in Italy and Germany, and works in the UK as a professor at the University of Cambridge and deputy director of the Kavli Institute for Cosmology.
Volker Springel is a German astrophysicist known for his work in the field of galaxy formation and evolution, as well as his development of computational tools for cosmological simulations.