The Lubachevsky–Stillinger (compression) algorithm (LS algorithm, LSA, or LS protocol) is a numerical procedure suggested by F. H. Stillinger and Boris D. Lubachevsky that simulates a physical process of compressing an assembly of hard particles. [1] As the LSA may need thousands of arithmetic operations even for a few particles, it is usually carried out on a computer.
A physical process of compression often involves a contracting hard boundary of the container, such as a piston pressing against the particles. The LSA is able to simulate such a scenario. [2] However, the LSA was originally introduced in a setting without a hard boundary, [1] [3] where the virtual particles were "swelling" or expanding in a fixed, finite virtual volume with periodic boundary conditions. The absolute sizes of the particles were increasing, but particle-to-particle relative sizes remained constant. In general, the LSA can handle an external compression and an internal particle expansion occurring simultaneously, possibly, but not necessarily, combined with a hard boundary. In addition, the boundary can be mobile.
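A minimal sketch of this boundary-free setting, in Python, might look as follows; the box size, swelling rate, and helper names are illustrative assumptions rather than parameters from the original papers:

    import numpy as np

    rng = np.random.default_rng(0)
    n, box = 100, 1.0                  # disk count and periodic box edge (assumed)
    pos = rng.random((n, 2)) * box     # initial positions inside the box
    vel = rng.standard_normal((n, 2))  # initial velocities
    r0 = rng.uniform(0.005, 0.01, n)   # initial radii; their ratios never change
    a = 0.001                          # common swelling rate (assumed)

    def radii(t):
        # All radii grow by the same factor, so relative sizes stay constant.
        return r0 * (1.0 + a * t)

    def free_flight(pos, vel, dt):
        # Straight-line motion between collisions, wrapped by the periodic boundary.
        return (pos + vel * dt) % box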
In a final, compressed, or "jammed" state, some particles are not jammed: they are able to move within "cages" formed by their immobile, jammed neighbors and the hard boundary, if any. These free-to-move particles are not an artifact, a pre-designed feature, or a target feature of the LSA, but a real phenomenon. The simulation revealed this phenomenon, somewhat unexpectedly for the authors of the LSA. Frank H. Stillinger coined the term "rattlers" for the free-to-move particles, because if one physically shakes a compressed bunch of hard particles, the rattlers will rattle.
In the "pre-jammed" mode when the density of the configuration is low and when the particles are mobile, the compression and expansion can be stopped, if so desired. Then the LSA, in effect, would be simulating a granular flow. Various dynamics of the instantaneous collisions can be simulated such as: with or without a full restitution, with or without tangential friction. Differences in masses of the particles can be taken into account. It is also easy and sometimes proves useful to "fluidize" a jammed configuration, by decreasing the sizes of all or some of the particles. Another possible extension of the LSA is replacing the hard collision force potential (zero outside the particle, infinity at or inside) with a piece-wise constant force potential. The LSA thus modified would approximately simulate molecular dynamics with continuous short range particle-particle force interaction. External force fields, such as gravitation, can be also introduced, as long as the inter-collision motion of each particle can be represented by a simple one-step calculation.
Using the LSA for spherical particles of different sizes and/or for jamming in a container of non-commensurate size proved to be a useful technique for generating and studying micro-structures formed under conditions of a crystallographic defect [4] or a geometrical frustration. [5] [6] It should be added that the original LS protocol was designed primarily for spheres of the same or different sizes. [7]
Any deviation from the spherical (or, in two dimensions, circular) shape, even the simplest one, in which spheres are replaced with ellipsoids (or ellipses in two dimensions), [8] causes the thus-modified LSA to slow down substantially. As long as the shape is spherical, however, the LSA is able to handle particle assemblies of tens to hundreds of thousands on today's (2011) standard personal computers. Only very limited experience has been reported [9] in using the LSA in dimensions higher than 3.
The state of particle jamming is achieved via simulating a granular flow. The flow is rendered as a discrete-event simulation, the events being particle-particle or particle-boundary collisions. Ideally, the calculations would be performed with infinite precision; the jamming would then occur ad infinitum. In practice, the precision is finite, as is the available resolution for representing real numbers in computer memory, for example, double-precision resolution. The real calculations are stopped when the inter-collision runs of the non-rattler particles become smaller than an explicitly or implicitly specified small threshold. For example, it is useless to continue the calculations when inter-collision runs are smaller than the roundoff error.
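One such stopping test might be sketched as follows, assuming each particle record stores its speed, its last committed event time, and its next scheduled event time; the attribute names and the threshold value are assumptions:

    EPS = 1e-9   # threshold on inter-collision runs, chosen near the roundoff scale

    def is_jammed(particles, rattler_ids):
        # Stop when every non-rattler travels less than EPS between its
        # committed (old) event and its scheduled (new) event.
        return all(p.speed * (p.new_time - p.old_time) < EPS
                   for p in particles if p.id not in rattler_ids)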
The LSA is efficient in the sense that the events are processed essentially in an event-driven fashion rather than in a time-driven fashion. This means almost no calculation is wasted on computing or maintaining the positions and velocities of the particles between collisions. Among the event-driven algorithms intended for the same task of simulating granular flow, for example, the algorithm of D.C. Rapaport, [10] the LSA is distinguished by a simpler data structure and simpler data handling.
For any particle at any stage of the calculations, the LSA keeps a record of only two events: an old, already processed, committed event, which comprises the committed event's time stamp, the particle state (including position and velocity), and, perhaps, the "partner", i.e., the identification of another particle or a boundary with which the particle collided in the past; and a new event proposed for future processing, with a similar set of parameters. The new event is not committed. The maximum of the committed old event times must never exceed the minimum of the non-committed new event times.
The next particle to be examined by the algorithm is the one with the current minimum of new event times. On examining the chosen particle, what was previously the new event is declared to be the old one and is committed, whereas the next new event is scheduled, with its new time stamp, new state, and new partner, if any. As the next new event for a particle is set, some of the neighboring particles may update their non-committed new events to better account for the new information.
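This bookkeeping can be sketched as follows; the class layout, the heap-based scheduler, and the helper predict(i), which is assumed to compute particle i's next collision, are illustrative choices rather than the authors' exact data structure:

    import heapq
    from dataclasses import dataclass

    @dataclass
    class Event:
        time: float      # event time stamp
        state: tuple     # particle position and velocity at the event
        partner: object  # colliding partner: particle id, boundary id, or None

    @dataclass
    class Particle:
        old: Event       # committed, already processed event
        new: Event       # proposed, not yet committed event

    def step(particles, heap, predict):
        # Commit the globally earliest new event, then schedule the next one.
        t, i = heapq.heappop(heap)       # particle with the minimum new event time
        p = particles[i]
        if t != p.new.time:              # stale heap entry: skip it
            return
        p.old = p.new                    # the new event becomes old and committed
        p.new = Event(*predict(i))       # schedule the next tentative event
        heapq.heappush(heap, (p.new.time, i))
        # Neighbors whose tentative events involve particle i may now update
        # their own non-committed new events to reflect the new information.

Because the scheduler always pops the minimum new event time, the invariant stated above, that no committed time exceeds any non-committed time, is preserved automatically.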
As the calculations of the LSA progress, the collision rates of particles may, and usually do, increase. Still, the LSA successfully approaches the jamming state as long as those rates remain comparable among all the particles, except for the rattlers. (Rattlers experience consistently low collision rates; this property allows one to detect them.) However, it is possible for a few particles, or even a single particle, to experience a very high collision rate on the approach to a certain simulated time, with the rate increasing without bound relative to the collision rates in the rest of the particle ensemble. If this happens, the simulation will be stuck in time and will not be able to progress toward the state of jamming.
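One hedged way to exploit the low collision rates of rattlers: count collisions per particle over a window of simulated time and flag particles whose rate falls far below the ensemble median; the window length and cutoff below are tuning assumptions, not part of the published algorithm:

    import numpy as np

    def find_rattlers(collision_counts, window, cutoff=0.1):
        # Flag particles whose collision rate over the last `window` of
        # simulated time is below `cutoff` times the median rate.
        rates = np.asarray(collision_counts, dtype=float) / window
        return np.nonzero(rates < cutoff * np.median(rates))[0]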
The stuck-in-time failure can also occur when simulating a granular flow without particle compression or expansion. This failure mode was recognized by practitioners of granular flow simulations as "inelastic collapse" [11] because it often occurs in such simulations when the restitution coefficient in collisions is low (i.e., inelastic). The failure is not specific to the LSA. Techniques to avoid it have been proposed. [12]
The LSA was a by-product of an attempt to find a fair measure of speedup in parallel simulations. The Time Warp parallel simulation algorithm by David Jefferson was advanced as a method to simulate asynchronous spatial interactions of fighting units in combat models on a parallel computer. [13] Colliding-particle models [14] offered similar simulation tasks with spatial interactions of particles, but free of the details that are non-essential for exposing the simulation technique. The speedup was presented as the ratio of the execution time on a uniprocessor to that on a multiprocessor, when executing the same parallel Time Warp algorithm. Boris D. Lubachevsky noticed that such a speedup assessment might be faulty, because executing a parallel algorithm for a task on a uniprocessor is not necessarily the fastest way to perform the task on such a machine. The LSA was created in an attempt to produce a faster uniprocessor simulation and hence to provide a fairer assessment of the parallel speedup. Later on, a parallel simulation algorithm, different from Time Warp, was also proposed that, when run on a uniprocessor, reduces to the LSA. [15]
Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. The name comes from the Monte Carlo Casino in Monaco, where the primary developer of the method, physicist Stanislaw Ulam, was inspired by his uncle's gambling habits.
A discrete element method (DEM), also called a distinct element method, is any of a family of numerical methods for computing the motion and effect of a large number of small particles. Though DEM is very closely related to molecular dynamics, the method is generally distinguished by its inclusion of rotational degrees-of-freedom as well as stateful contact and often complicated geometries. With advances in computing power and numerical algorithms for nearest neighbor sorting, it has become possible to numerically simulate millions of particles on a single processor. Today DEM is becoming widely accepted as an effective method of addressing engineering problems in granular and discontinuous materials, especially in granular flows, powder mechanics, and rock mechanics. DEM has been extended into the Extended Discrete Element Method taking heat transfer, chemical reaction and coupling to CFD and FEM into account.
Molecular dynamics (MD) is a computer simulation method for analyzing the physical movements of atoms and molecules. The atoms and molecules are allowed to interact for a fixed period of time, giving a view of the dynamic "evolution" of the system. In the most common version, the trajectories of atoms and molecules are determined by numerically solving Newton's equations of motion for a system of interacting particles, where forces between the particles and their potential energies are often calculated using interatomic potentials or molecular mechanical force fields. The method is applied mostly in chemical physics, materials science, and biophysics.
In plasma physics, the particle-in-cell (PIC) method refers to a technique used to solve a certain class of partial differential equations. In this method, individual particles in a Lagrangian frame are tracked in continuous phase space, whereas moments of the distribution such as densities and currents are computed simultaneously on Eulerian (stationary) mesh points.
A granular material is a conglomeration of discrete solid, macroscopic particles characterized by a loss of energy whenever the particles interact. The constituents that compose granular material are large enough such that they are not subject to thermal motion fluctuations. Thus, the lower size limit for grains in granular material is about 1 μm. On the upper size limit, the physics of granular materials may be applied to ice floes where the individual grains are icebergs and to asteroid belts of the Solar System with individual grains being asteroids.
A dynamical billiard is a dynamical system in which a particle alternates between free motion and specular reflections from a boundary. When the particle hits the boundary it reflects from it without loss of speed. Billiards are Hamiltonian idealizations of the game of billiards, but where the region contained by the boundary can have shapes other than rectangular and even be multidimensional. Dynamical billiards may also be studied on non-Euclidean geometries; indeed, the first studies of billiards established their ergodic motion on surfaces of constant negative curvature. The study of billiards which are kept out of a region, rather than being kept in a region, is known as outer billiard theory.
The kinetic Monte Carlo (KMC) method is a Monte Carlo method computer simulation intended to simulate the time evolution of some processes occurring in nature. Typically these are processes that occur with known transition rates among states. It is important to understand that these rates are inputs to the KMC algorithm; the method itself cannot predict them.
Random close packing (RCP) of spheres is an empirical parameter used to characterize the maximum volume fraction of solid objects obtained when they are packed randomly. For example, when a solid container is filled with grain, shaking the container will reduce the volume taken up by the objects, thus allowing more grain to be added to the container. In other words, shaking increases the density of packed objects. But shaking cannot increase the density indefinitely; a limit is reached, and if this is reached without obvious packing into an ordered structure, such as a regular crystal lattice, this is the empirical random close-packed density for this particular procedure of packing. Random close packing is the highest possible volume fraction out of all possible random packing procedures.
The material point method (MPM) is a numerical technique used to simulate the behavior of solids, liquids, gases, and any other continuum material. In particular, it is a robust spatial discretization method for simulating multi-phase (solid-fluid-gas) interactions. In the MPM, a continuum body is described by a number of small Lagrangian elements referred to as 'material points'. These material points are surrounded by a background mesh/grid that is used to calculate terms such as the deformation gradient. Unlike other mesh-based methods like the finite element method, finite volume method or finite difference method, the MPM is not a mesh-based method and is instead categorized as a meshless/meshfree or continuum-based particle method, examples of which are smoothed particle hydrodynamics and peridynamics. Despite the presence of a background mesh, the MPM does not encounter the drawbacks of mesh-based methods, which makes it a promising and powerful tool in computational mechanics.
Contact dynamics deals with the motion of multibody systems subjected to unilateral contacts and friction. Such systems are omnipresent in many multibody dynamics applications.
Jamming is the physical process by which the viscosity of some mesoscopic materials, such as granular materials, glasses, foams, polymers, emulsions, and other complex fluids, increases with increasing particle density. The jamming transition has been proposed as a new type of phase transition, with similarities to a glass transition but very different from the formation of crystalline solids.
The CFD-DEM model, or Computational Fluid Dynamics / Discrete Element Method model, is a process used to model or simulate systems combining fluids with solids or particles. In CFD-DEM, the motion of the discrete solid or particle phase is obtained by the Discrete Element Method (DEM), which applies Newton's laws of motion to every particle, while the flow of the continuum fluid is described by the local averaged Navier–Stokes equations that can be solved using the traditional Computational Fluid Dynamics (CFD) approach. The interactions between the fluid phase and the solid phase are modeled by use of Newton's third law.
The moving particle semi-implicit (MPS) method is a computational method for the simulation of incompressible free surface flows. It is a macroscopic, deterministic particle method developed by Koshizuka and Oka (1996).
The distance of closest approach of two objects is the distance between their centers when they are externally tangent. The objects may be geometric shapes or physical particles with well-defined boundaries. The distance of closest approach is sometimes referred to as the contact distance.
Salvatore Torquato is an American theoretical scientist born in Falerna, Italy. His research work has impacted a variety of fields, including physics, chemistry, applied and pure mathematics, materials science, engineering, and biological physics. He is the Lewis Bernard Professor of Natural Sciences in the department of chemistry and Princeton Institute for the Science and Technology of Materials at Princeton University. He has been a senior faculty fellow in the Princeton Center for Theoretical Science, an enterprise dedicated to exploring frontiers across the theoretical natural sciences. He is also an associated faculty member in three departments or programs at Princeton University: physics, applied and computational mathematics, and mechanical and aerospace engineering. On multiple occasions, he was a member of the schools of mathematics and natural sciences at the Institute for Advanced Study, Princeton, New Jersey.
Frank H. Stillinger is an American theoretical chemist and a namesake of the Lubachevsky–Stillinger algorithm. He has recently collaborated with research groups as a senior scientist at Princeton University.
In the study of the physics of granular materials, a force chain consists of a set of particles within a compressed granular material that are held together and jammed into place by a network of mutual compressive forces.
Hyperuniform materials are characterized by an anomalous suppression of density fluctuations at large scales. More precisely, the vanishing of density fluctuations in the long-wavelength limit distinguishes hyperuniform systems from typical gases, liquids, or amorphous solids. Examples of hyperuniformity include all perfect crystals, perfect quasicrystals, and exotic amorphous states of matter.
Random sequential adsorption (RSA) refers to a process where particles are randomly introduced in a system, and if they do not overlap any previously adsorbed particle, they adsorb and remain fixed for the rest of the process. RSA can be carried out in computer simulation, in a mathematical analysis, or in experiments. It was first studied by one-dimensional models: the attachment of pendant groups in a polymer chain by Paul Flory, and the car-parking problem by Alfréd Rényi. Other early works include those of Benjamin Widom. In two and higher dimensions, many systems have been studied by computer simulation, including, in 2D: disks, randomly oriented squares and rectangles, aligned squares and rectangles, and various other shapes.