Machine-learned interatomic potentials (MLIPs), also called machine learning potentials (MLPs), are interatomic potentials constructed by machine learning programs. Beginning in the 1990s, researchers have employed such programs to construct interatomic potentials by mapping atomic structures to their potential energies.
Such machine learning potentials promised to fill the gap between density functional theory, a highly accurate but computationally intensive modelling method, and empirically derived or intuitively approximated potentials, which are far lighter computationally but substantially less accurate. Improvements in artificial intelligence technology have increased the accuracy of MLPs while lowering their computational cost, expanding the role of machine learning in fitting potentials. [1] [2]
Machine learning potentials began by using neural networks to tackle low-dimensional systems. While promising, these models could not systematically account for interatomic energy interactions; they could be applied to small molecules in a vacuum or to molecules interacting with frozen surfaces, but not much else, and even in these applications the models often relied on force fields or potentials derived empirically or with simulations. [1] These models thus remained confined to academia.
Modern neural networks construct highly accurate and computationally light potentials because theoretical understanding of materials science has been increasingly built into their architectures and preprocessing. Almost all models are local, accounting for the interactions between an atom and its neighbors up to some cutoff radius. Some nonlocal models exist, but these have remained experimental for almost a decade. For most systems, reasonable cutoff radii enable highly accurate results. [1] [3]
Almost all neural network potentials take atomic coordinates as input and output potential energies. For some, the atomic coordinates are first converted into atom-centered symmetry functions. From this data, a separate atomic neural network is trained for each element; each atomic network is evaluated whenever its element occurs in the given structure, and the results are then pooled together to give the total energy. This process, in particular the atom-centered symmetry functions, which convey translational, rotational, and permutational invariances, has greatly improved machine learning potentials by significantly constraining the neural network search space. Other models use a similar process but emphasize bonds over atoms, using pair symmetry functions and training one network per atom pair. [1] [4]
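A minimal sketch of this per-element atomic-network construction is shown below, assuming symmetry-function descriptors have already been computed for each atom; the element list, descriptor size, and layer widths are illustrative choices, not taken from any particular published model.

```python
# Sketch of a Behler-Parrinello-style atomic neural network potential.
# Assumes precomputed symmetry-function descriptors per atom; sizes are arbitrary.
import torch
import torch.nn as nn

class AtomicNetworkPotential(nn.Module):
    def __init__(self, elements=("H", "O"), n_descriptors=32, hidden=64):
        super().__init__()
        # One small feed-forward network per chemical element.
        self.nets = nn.ModuleDict({
            el: nn.Sequential(
                nn.Linear(n_descriptors, hidden), nn.Tanh(),
                nn.Linear(hidden, hidden), nn.Tanh(),
                nn.Linear(hidden, 1),
            )
            for el in elements
        })

    def forward(self, species, descriptors):
        # species: one element symbol per atom; descriptors: (n_atoms, n_descriptors)
        atomic_energies = [self.nets[el](descriptors[i]) for i, el in enumerate(species)]
        # The total energy is the pooled sum of per-atom contributions.
        return torch.stack(atomic_energies).sum()

model = AtomicNetworkPotential()
energy = model(["O", "H", "H"], torch.randn(3, 32))
```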
Other models learn their own descriptors rather than relying on predetermined symmetry-dictating functions. These models, called message-passing neural networks (MPNNs), are graph neural networks. Treating molecules as three-dimensional graphs (where atoms are nodes and bonds are edges), the model takes feature vectors describing the atoms as input and iteratively updates these vectors as information about neighboring atoms is processed through message functions and convolutions. These feature vectors are then used to predict the final potentials. The flexibility of this method often results in stronger, more generalizable models. In 2017, the first-ever MPNN model (a deep tensor neural network) was used to calculate the properties of small organic molecules. Such technology was commercialized, leading to the development of Matlantis in 2022, which extracts properties through both the forward and backward passes.[citation needed]
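The following is a rough sketch of a single message-passing step, not tied to any specific published MPNN; the layer sizes and the use of a single scalar edge feature (e.g., an interatomic distance) are illustrative assumptions.

```python
# One message-passing step on a molecular graph: messages are built from the
# sending atom's features and an edge feature, summed at the receiving atom,
# and used to update that atom's feature vector.
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    def __init__(self, n_features=64):
        super().__init__()
        self.message = nn.Sequential(nn.Linear(n_features + 1, n_features), nn.SiLU())
        self.update = nn.Sequential(nn.Linear(2 * n_features, n_features), nn.SiLU())

    def forward(self, node_feats, edge_index, edge_feats):
        # node_feats: (n_atoms, n_features); edge_index: (2, n_edges) of atom indices;
        # edge_feats: (n_edges, 1), e.g. interatomic distances.
        src, dst = edge_index
        messages = self.message(torch.cat([node_feats[src], edge_feats], dim=-1))
        aggregated = torch.zeros_like(node_feats).index_add_(0, dst, messages)
        return self.update(torch.cat([node_feats, aggregated], dim=-1))
```

After several such layers, a readout network typically maps the per-atom feature vectors to atomic energies, which are summed to give the total potential energy.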
One popular class of machine-learned interatomic potential is the Gaussian Approximation Potential (GAP), [5] [6] [7] which combines compact descriptors of local atomic environments [8] with Gaussian process regression [9] to machine learn the potential energy surface of a given system. To date, the GAP framework has been used to successfully develop a number of MLIPs for various systems, including elemental systems such as carbon, [10] [11] silicon, [12] phosphorus, [13] and tungsten, [14] as well as multicomponent systems such as Ge2Sb2Te5 [15] and austenitic stainless steel, Fe7Cr2Ni. [16]
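The pairing of environment descriptors with Gaussian process regression can be illustrated with the generic sketch below. It is only a schematic stand-in: actual GAP models use SOAP descriptors and sparse Gaussian processes fitted to reference energies and forces, whereas here random vectors and scikit-learn's generic regressor are used purely for illustration.

```python
# Generic Gaussian process regression on local-environment descriptor vectors.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 16))   # descriptor vectors of known environments
y_train = rng.normal(size=200)         # reference (e.g. DFT-derived) energies

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-4)
gp.fit(X_train, y_train)

X_new = rng.normal(size=(5, 16))       # descriptors of unseen environments
mean, std = gp.predict(X_new, return_std=True)   # prediction with uncertainty
```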
In condensed matter physics, a Bose–Einstein condensate (BEC) is a state of matter that is typically formed when a gas of bosons at very low densities is cooled to temperatures very close to absolute zero, i.e., 0 K. Under such conditions, a large fraction of bosons occupy the lowest quantum state, at which microscopic quantum-mechanical phenomena, particularly wavefunction interference, become apparent macroscopically. More generally, condensation refers to the appearance of macroscopic occupation of one or several states: for example, in BCS theory, a superconductor is a condensate of Cooper pairs. As such, condensation can be associated with phase transition, and the macroscopic occupation of the state is the order parameter.
Molecular dynamics (MD) is a computer simulation method for analyzing the physical movements of atoms and molecules. The atoms and molecules are allowed to interact for a fixed period of time, giving a view of the dynamic "evolution" of the system. In the most common version, the trajectories of atoms and molecules are determined by numerically solving Newton's equations of motion for a system of interacting particles, where forces between the particles and their potential energies are often calculated using interatomic potentials or molecular mechanical force fields. The method is applied mostly in chemical physics, materials science, and biophysics.
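As a minimal illustration of how such trajectories are generated, the sketch below implements the velocity Verlet integrator for Newton's equations; the harmonic force and all parameter values are illustrative stand-ins for the interatomic forces a real MD code would evaluate.

```python
# Velocity Verlet integration of Newton's equations of motion.
import numpy as np

def velocity_verlet(x, v, force, mass=1.0, dt=1e-3, steps=1000):
    a = force(x) / mass
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt**2   # advance positions
        a_new = force(x) / mass            # forces at the new positions
        v = v + 0.5 * (a + a_new) * dt     # advance velocities
        a = a_new
    return x, v

# Example: one particle in a harmonic well, F = -k x with k = 1.
x_final, v_final = velocity_verlet(np.array([1.0]), np.array([0.0]),
                                   force=lambda x: -1.0 * x)
```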
A Rydberg atom is an excited atom with one or more electrons that have a very high principal quantum number, n. The higher the value of n, the farther the electron is from the nucleus, on average. Rydberg atoms have a number of peculiar properties including an exaggerated response to electric and magnetic fields, long decay periods and electron wavefunctions that approximate, under some conditions, classical orbits of electrons about the nuclei. The core electrons shield the outer electron from the electric field of the nucleus such that, from a distance, the electric potential looks identical to that experienced by the electron in a hydrogen atom.
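In this hydrogen-like picture, the binding energy of the outer electron is commonly written with a quantum-defect correction (a standard textbook expression, stated here for context rather than drawn from this article's sources):

E_n = -\frac{\mathrm{Ry}}{(n - \delta_l)^2},

where Ry ≈ 13.6 eV is the Rydberg unit of energy and the quantum defect δ_l accounts for the residual penetration of the outer electron into the core.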
In probability theory and mathematical physics, a random matrix is a matrix-valued random variable—that is, a matrix in which some or all of its entries are sampled randomly from a probability distribution. Random matrix theory (RMT) is the study of properties of random matrices, often as they become large. RMT provides techniques like mean-field theory, diagrammatic methods, the cavity method, or the replica method to compute quantities like traces, spectral densities, or scalar products between eigenvectors. Many physical phenomena, such as the spectrum of nuclei of heavy atoms, the thermal conductivity of a lattice, or the emergence of quantum chaos, can be modeled mathematically as problems concerning large, random matrices.
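As a small illustration of the kind of computation involved (an illustrative sketch only; the ensemble, size, and scaling are arbitrary choices), one can sample a matrix from the Gaussian Orthogonal Ensemble and examine its spectral density, which approaches the Wigner semicircle law for large matrices:

```python
# Sample a GOE-like random symmetric matrix and histogram its eigenvalues.
import numpy as np

n = 1000
a = np.random.default_rng(0).normal(size=(n, n))
h = (a + a.T) / np.sqrt(2 * n)          # symmetric matrix with GOE-type scaling
eigenvalues = np.linalg.eigvalsh(h)      # real spectrum of a symmetric matrix
density, bin_edges = np.histogram(eigenvalues, bins=50, density=True)
```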
An atom interferometer uses the wave-like nature of atoms in order to produce interference. In atom interferometers, the roles of matter and light are reversed compared to laser-based interferometers, i.e. the beam splitter and mirrors are lasers while the source emits matter waves rather than light. Atom interferometers measure the difference in phase between atomic matter waves along different paths. Matter waves are controlled and manipulated using systems of lasers. Atom interferometers have been used in tests of fundamental physics, including measurements of the gravitational constant, the fine-structure constant, and the universality of free fall. Applied uses of atom interferometers include accelerometers, rotation sensors, and gravity gradiometers.
In physics, a Feshbach resonance can occur upon collision of two slow atoms, when they temporarily stick together forming an unstable compound with short lifetime. It is a feature of many-body systems in which a bound state is achieved if the coupling(s) between at least one internal degree of freedom and the reaction coordinates, which lead to dissociation, vanish. The opposite situation, when a bound state is not formed, is a shape resonance. It is named after Herman Feshbach, a physicist at MIT.
The Bose–Hubbard model gives a description of the physics of interacting spinless bosons on a lattice. It is closely related to the Hubbard model that originated in solid-state physics as an approximate description of superconducting systems and the motion of electrons between the atoms of a crystalline solid. The model was introduced by Gersch and Knollman in 1963 in the context of granular superconductors. The model rose to prominence in the 1980s after it was found to capture the essence of the superfluid-insulator transition in a way that was much more mathematically tractable than fermionic metal-insulator models.
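For reference, the Hamiltonian is commonly written in the following standard form (sign conventions and the inclusion of the chemical-potential term vary between texts):

H = -t \sum_{\langle i,j \rangle} \left( \hat{b}_i^\dagger \hat{b}_j + \hat{b}_j^\dagger \hat{b}_i \right) + \frac{U}{2} \sum_i \hat{n}_i (\hat{n}_i - 1) - \mu \sum_i \hat{n}_i,

where t is the hopping amplitude between neighbouring lattice sites, U the on-site interaction strength, μ the chemical potential, and \hat{n}_i = \hat{b}_i^\dagger \hat{b}_i the boson number operator on site i.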
QuantumATK is a commercial software for atomic-scale modeling and simulation of nanosystems. The software was originally developed by Atomistix A/S, and was later acquired by QuantumWise following the Atomistix bankruptcy. QuantumWise was then acquired by Synopsys in 2017.
Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir. After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output. The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed. The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be used to reduce the effective computational cost.
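A minimal echo state network, one common realization of reservoir computing, is sketched below; the reservoir size, spectral radius, ridge regularization, and the sine-wave prediction task are all illustrative choices.

```python
# Echo state network sketch: fixed random reservoir, trained linear readout.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))          # keep spectral radius below 1

def run_reservoir(u):
    # u: (T, n_in) input sequence; the reservoir dynamics are never trained.
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Only the readout is trained (ridge regression): predict the next input sample.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))[:, None]
X, y = run_reservoir(u[:-1]), u[1:, 0]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
prediction = X @ W_out
```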
The Landau–Zener formula is an analytic solution to the equations of motion governing the transition dynamics of a two-state quantum system, with a time-dependent Hamiltonian varying such that the energy separation of the two states is a linear function of time. The formula, giving the probability of a diabatic transition between the two energy states, was published separately by Lev Landau, Clarence Zener, Ernst Stueckelberg, and Ettore Majorana, in 1932.
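In a common notation (a standard statement of the result rather than a quotation of any one of the original papers), the probability of a diabatic transition is

P_D = e^{-2\pi\Gamma}, \qquad \Gamma = \frac{a^2/\hbar}{\left| \frac{\partial}{\partial t}\left( E_2 - E_1 \right) \right|},

where a is the off-diagonal coupling between the two states (half the energy gap at the avoided crossing) and the denominator is the constant rate at which the diabatic energy separation is swept.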
Inelastic electron tunneling spectroscopy (IETS) is an experimental tool for studying the vibrations of molecular adsorbates on metal oxides. It yields vibrational spectra of the adsorbates with high resolution (< 0.5 meV) and high sensitivity (< 10^13 molecules are required to provide a spectrum). An additional advantage is the fact that optically forbidden transitions may be observed as well. Within IETS, an oxide layer with molecules adsorbed on it is put between two metal plates. A bias voltage is applied between the two contacts. An energy diagram of the metal-oxide-metal device under bias is shown in the top figure. The metal contacts are characterized by a constant density of states, filled up to the Fermi energy. The metals are assumed to be equal. The adsorbates are situated on the oxide material. They are represented by a single bridge electronic level, which is the upper dashed line. If the insulator is thin enough, there is a finite probability that the incident electron tunnels through the barrier. Since the energy of the electron is not changed by this process, it is an elastic process. This is shown in the left figure.
In physics, a trojan wave packet is a wave packet that is nonstationary and nonspreading. It is part of an artificially created system that consists of a nucleus and one or more electron wave packets, and that is highly excited under a continuous electromagnetic field. Its discovery, as one of the significant contributions to quantum mechanics, was recognized by the 2022 Wigner Medal awarded to Iwo Bialynicki-Birula.
Interatomic Coulombic decay (ICD) is a general, fundamental property of atoms and molecules that have neighbors. Interatomic (or intermolecular) Coulombic decay is a very efficient relaxation process of an electronically excited atom or molecule embedded in an environment; without the environment the process cannot take place. So far it has mainly been demonstrated for atomic and molecular clusters, whether of van der Waals or hydrogen-bonded type.
Double ionization is the formation of doubly charged ions when neutral atoms or molecules are exposed to laser radiation or bombarded by charged particles such as electrons, positrons, or heavy ions. Double ionization is usually less probable than single-electron ionization. Two types of double ionization are distinguished: sequential and non-sequential.
Quantum machine learning is the integration of quantum algorithms within machine learning programs.
Interatomic potentials are mathematical functions to calculate the potential energy of a system of atoms with given positions in space. Interatomic potentials are widely used as the physical basis of molecular mechanics and molecular dynamics simulations in computational chemistry, computational physics and computational materials science to explain and predict materials properties. Examples of quantitative properties and qualitative phenomena that are explored with interatomic potentials include lattice parameters, surface energies, interfacial energies, adsorption, cohesion, thermal expansion, and elastic and plastic material behavior, as well as chemical reactions.
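A simple and widely taught example is the Lennard-Jones pair potential, sketched below; the parameter values are arbitrary, and real studies fit ε and σ (or use far more elaborate functional forms) to the material of interest.

```python
# Lennard-Jones pair potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6),
# summed over all distinct atom pairs to give a total potential energy.
import numpy as np

def lennard_jones(r, eps=1.0, sigma=1.0):
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6**2 - sr6)

def total_energy(positions, eps=1.0, sigma=1.0):
    energy = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            r = np.linalg.norm(positions[i] - positions[j])
            energy += lennard_jones(r, eps, sigma)
    return energy

# Two atoms near the potential minimum (reduced units).
print(total_energy(np.array([[0.0, 0.0, 0.0], [1.12, 0.0, 0.0]])))
```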
The helium dimer is a van der Waals molecule with formula He2 consisting of two helium atoms. It is the largest diatomic molecule (a molecule consisting of two atoms bonded together). The bond that holds this dimer together is so weak that it breaks if the molecule rotates or vibrates too much, and it can only exist at very low cryogenic temperatures.
Spin squeezing is a quantum process that decreases the variance of one of the angular momentum components in an ensemble of particles with a spin. The quantum states obtained are called spin squeezed states. Such states have been proposed for quantum metrology, to allow a better precision for estimating a rotation angle than classical interferometers. However, a wide body of work contradicts this analysis. In particular, these works show that the estimation precision obtainable for any quantum state can be expressed solely in terms of the state's response to the signal. As squeezing does not increase the state's response to the signal, it cannot fundamentally improve the measurement precision.
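To make the variance statement concrete (a standard definition, stated here for context rather than taken from this article's sources): for N spin-1/2 particles, a coherent spin state has variance N/4 in the spin components perpendicular to the mean spin, and one common squeezing parameter compares the smallest transverse variance to this value,

\xi^2 = \frac{(\Delta J_\perp)^2_{\min}}{N/4},

with ξ² < 1 commonly taken to define a spin-squeezed state.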
Applying machine learning (ML) methods to the study of quantum systems is an emergent area of physics research. A basic example of this is quantum state tomography, where a quantum state is learned from measurement data. Other examples include learning Hamiltonians, learning quantum phase transitions, and automatically generating new quantum experiments. ML is effective at processing large amounts of experimental or calculated data in order to characterize an unknown quantum system, making its application useful in contexts including quantum information theory, quantum technology development, and computational materials design. In this context, for example, it can be used as a tool to interpolate pre-calculated interatomic potentials or to solve the Schrödinger equation directly with a variational method.
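As a toy illustration of the variational idea (the trial family, grid, and units ħ = m = ω = 1 are illustrative choices, unrelated to any specific method discussed in the sources), one can minimize the energy of a Gaussian trial wavefunction for the one-dimensional harmonic oscillator:

```python
# Variational estimate of the harmonic-oscillator ground state energy using a
# Gaussian trial wavefunction exp(-alpha*x**2/2) evaluated on a grid.
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

def variational_energy(alpha):
    psi = np.exp(-alpha * x**2 / 2)
    psi /= np.sqrt(np.sum(psi**2) * dx)             # normalize on the grid
    d2psi = np.gradient(np.gradient(psi, dx), dx)   # crude second derivative
    return np.sum(-0.5 * psi * d2psi + 0.5 * x**2 * psi**2) * dx

alphas = np.linspace(0.2, 3.0, 57)
energies = [variational_energy(a) for a in alphas]
best_alpha = alphas[int(np.argmin(energies))]        # close to alpha = 1, E = 0.5
```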
Giuseppe Carleo is an Italian physicist. He is a professor of computational physics at EPFL and the head of the Laboratory of Computational Quantum Science.