| Peter Coveney FREng, MAE, FRSC, FInstP | |
|---|---|
| Born | Peter V. Coveney, Ealing, England |
| Education | University of Oxford |
| Fields | Condensed matter physics and chemistry, materials science, life and medical sciences, high performance computing |
| Institutions | University College London, University of Amsterdam, Yale University |
| Thesis | Semiclassical methods in scattering and spectroscopy (1985) |
| Doctoral advisor | Mark Child [1] |
| Website | www |
Peter V. Coveney is a British chemist who is Professor of Physical Chemistry, Honorary Professor of Computer Science, Director of the Centre for Computational Science (CCS) [2] and Associate Director of the Advanced Research Computing Centre at University College London (UCL). He is also Professor of Applied High Performance Computing at the University of Amsterdam (UvA) and Professor Adjunct at the Yale School of Medicine, Yale University. He is a Fellow of the Royal Academy of Engineering and a Member of Academia Europaea. [3] Coveney is active in a broad area of interdisciplinary research, including condensed matter physics and chemistry, materials science, and the life and medical sciences, in all of which high performance computing plays a major role. The citation on his election as an FREng states that Coveney "has made outstanding contributions across a wide range of scientific and engineering fields, including physics, chemistry, chemical engineering, materials, computer science, high performance computing and biomedicine, much of it harnessing the power of supercomputing to conduct original research at unprecedented space and time scales. He has shown outstanding leadership across these fields, manifested through running multiple initiatives and multi-partner interdisciplinary grants, in the UK, Europe and the US. His achievements at national and international level in advocacy and enablement are exceptional". [4]
Coveney was awarded a Doctor of Philosophy degree from the University of Oxford in 1985 for his work on Semiclassical methods in scattering and spectroscopy. [1]
Coveney has held positions at the University of Oxford, Princeton University, Schlumberger and QMUL, and currently holds positions at UCL, [5] UvA [6] and Yale, as well as serving as a member of several academic councils in the UK [7] [8] and the EU.
Coveney worked with Ilya Prigogine at the Free University of Brussels (1985–87) and went on to publish work with the mathematician Oliver Penrose on rigorous foundations of irreversibility and the derivation of kinetic equations based on chaotic dynamical systems. [9] [10] [11] [12] He collaborated with Jonathan Wattis on extensions and generalisations of the Becker–Döring and Smoluchowski equations for the kinetics of aggregation-fragmentation processes, which they applied to a wide range of phenomena, from self-reproducing micelles and vesicles to a scenario for the origin of the RNA world in which they showed that self-reproducing sequences of RNA can arise spontaneously from an aqueous mixture of the RNA nucleotide bases. [13] [14] [15] [16]
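The Becker–Döring equations describe clusters that grow or shrink one monomer at a time, with a flux J_k = a_k c_1 c_k − b_{k+1} c_{k+1} between adjacent sizes. The sketch below integrates them numerically; the truncation at size K and the constant rate coefficients are illustrative assumptions, not values from the Coveney–Wattis papers.

```python
# Minimal sketch of the Becker-Doring cluster kinetics, truncated at size K.
# Attachment/detachment rates a_k, b_k are illustrative constants.
import numpy as np
from scipy.integrate import solve_ivp

K = 50
a = np.ones(K)          # attachment rates a_k
b = 0.5 * np.ones(K)    # detachment rates b_k

def becker_doring(t, c):
    # fluxes J_k = a_k c_1 c_k - b_{k+1} c_{k+1} for k = 1..K-1
    J = a[:-1] * c[0] * c[:-1] - b[1:] * c[1:]
    dc = np.empty_like(c)
    dc[1:-1] = J[:-1] - J[1:]          # dc_k/dt = J_{k-1} - J_k
    dc[-1] = J[-1]                     # largest retained cluster size
    dc[0] = -2.0 * J[0] - J[1:].sum()  # monomers are consumed by every flux
    return dc

c0 = np.zeros(K)
c0[0] = 1.0                            # start from monomers only
sol = solve_ivp(becker_doring, (0.0, 50.0), c0, method="LSODA")
sizes = np.arange(1, K + 1)
print("mean cluster size:", (sizes * sol.y[:, -1]).sum() / sol.y[:, -1].sum())
```

With this flux form, the total mass Σ k·c_k is conserved exactly by the truncated system, which is a useful check on any implementation.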
At Schlumberger Cambridge Research (SCR), Coveney initiated new lines of research in which advanced computational methods played a central role. Part of this work, developing highly scalable lattice-gas and, later, lattice-Boltzmann models of complex fluids, was done in collaboration with Bruce M. Boghosian, following Schlumberger's acquisition of a CM-5 Connection Machine from its manufacturer, Thinking Machines Corporation.[citation needed]
In a forerunner of many contemporary applications of machine learning, Coveney showed that a combination of infrared spectroscopy and artificial neural networks can predict the setting properties of cement, without needing to resolve the contested chemistry of cementitious materials and of the concrete that forms as they harden. [17] [18] At the same time, using methods from nonlinear dynamics, he identified the rate-determining processes, enabling the design by molecular modelling of new compounds that inhibit the crystallisation of the mineral ettringite. [19]
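As a rough illustration of the approach (spectra in, setting property out), the sketch below trains a small neural network on synthetic stand-ins for infrared spectra and setting times; it is not the published model, and all data here are fabricated for demonstration.

```python
# Illustrative only: synthetic "IR spectra" and "setting times"; a small
# multilayer perceptron learns the spectra -> setting-time mapping.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
spectra = rng.random((200, 400))      # 200 samples, 400 wavenumber channels
setting = spectra[:, 100:120].mean(axis=1) * 10 + rng.normal(0, 0.1, 200)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, setting, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
mlp.fit(X_tr, y_tr)
print("held-out R^2:", round(mlp.score(X_te, y_te), 3))
```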
From 2006, Coveney moved away from studying oilfield fluids to investigate blood flow in the human body, including the brain. Working with a PhD student, Marco Mazzeo, he developed a new code, named HemeLB, which simulates blood flow in the complex geometries of the human vasculature, as derived from a variety of medical imaging modalities. [20] [21] [22] The algorithm, based on indirect addressing, scales to very large core counts on CPU-based supercomputers. Most recently, he and his team have developed a GPU-accelerated version of the code, which scales to around 20,000 GPUs on the Summit supercomputer and will soon[when?] be deployed on the world's first exascale machine, Frontier. [23]
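The indirect-addressing idea is that only fluid sites inside the sparse vessel geometry are stored, with streaming driven by a precomputed neighbour table rather than a scan over a full 3D array. A toy 2D sketch follows; the geometry, the four-velocity lattice and the periodic wrap are simplifying assumptions, not HemeLB's actual data layout.

```python
# Toy 2D geometry (a straight "vessel"), 4 lattice velocities, periodic wrap.
import numpy as np

mask = np.zeros((64, 64), dtype=bool)
mask[30:34, :] = True                       # fluid sites only inside the vessel

fluid_ij = np.argwhere(mask)                # compact list of fluid coordinates
index_of = -np.ones(mask.shape, dtype=int)  # dense grid -> compact index
index_of[mask] = np.arange(len(fluid_ij))

velocities = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
neighbour = np.full((len(fluid_ij), len(velocities)), -1)
for q, c in enumerate(velocities):
    nbr = (fluid_ij - c) % mask.shape       # upstream site for a pull-style stream
    hits = mask[nbr[:, 0], nbr[:, 1]]       # which upstream sites are fluid
    neighbour[hits, q] = index_of[nbr[hits, 0], nbr[hits, 1]]

# Streaming (pull scheme) now touches only fluid sites; -1 marks a wall link,
# where a real code would apply bounce-back instead of keeping f unchanged.
f = np.random.default_rng(3).random((len(fluid_ij), len(velocities)))
pulled = f[neighbour.clip(min=0), np.arange(len(velocities))]
f_new = np.where(neighbour >= 0, pulled, f)
print("fluid sites stored:", len(fluid_ij), "of", mask.size)
```

Because memory and work scale with the number of fluid sites rather than the bounding box, this layout is what lets such codes handle sparse vascular geometries at very large core counts.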
Coveney works in the domain of multiscale modelling and simulation. Working initially with Eirik Flekkøy on the foundations of the dissipative particle dynamics method, and then with Rafael Delgado-Buscalioni, he was among the first to develop theoretical schemes that couple molecular dynamics and continuum fluid dynamics representations of fluids in a single simulation.[citation needed] His work covers numerous applications of these methods in advanced materials and biomedical domains. [24] [25] [26] [27] [28] [29]
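A schematic toy of such hybrid coupling, not Coveney's actual scheme: a 1D continuum velocity field is advanced by finite differences while a particle-based region at one end exchanges boundary information with it every step. All parameters below are arbitrary illustrative choices.

```python
# Schematic MD-continuum coupling via two-way boundary exchange (toy model).
import numpy as np

rng = np.random.default_rng(1)
u = np.zeros(20)                   # continuum velocity on 20 cells
u[-1] = 1.0                        # moving wall at the far end
v = rng.normal(0.0, 0.1, 500)      # "particle" velocities near cell 0
nu_dt = 0.2                        # diffusion number (explicit scheme, < 0.5)

for step in range(2000):
    # continuum step: explicit diffusion with fixed values at both ends
    u[1:-1] += nu_dt * (u[2:] - 2 * u[1:-1] + u[:-2])
    # particles -> continuum: particle mean sets the lower boundary value
    u[0] = v.mean()
    # continuum -> particles: relax particle mean toward the overlap cell,
    # with small noise standing in for thermal fluctuations
    v += 0.05 * (u[1] - v.mean()) + rng.normal(0.0, 0.01, v.size)

print("relaxed velocity profile:", np.round(u, 2))
```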
Coveney’s recent work is on the rapid, accurate, precise and reliable prediction of the free energies of binding of ligands to proteins, [30] a major topic in drug discovery. Coveney has noted that classical molecular dynamics is chaotic, so making robust predictions from it requires the use of ensembles at all times. [31] This is a practical manifestation of his earlier work on simpler dynamical systems, for which a thermodynamic description is possible using a probabilistic formulation. [32] Such ensemble-based calculation has only become feasible in the era of petascale computing, when supercomputers grew large enough to make the computation of ensemble averages practical.
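The practical consequence is easy to demonstrate: because single chaotic trajectories scatter, one averages over many replica simulations and reports a confidence interval. The sketch below uses synthetic per-replica values in place of real MD output.

```python
# Ensemble averaging with a bootstrap confidence interval; the per-replica
# binding free energies here are fabricated stand-ins for MD results.
import numpy as np

rng = np.random.default_rng(42)
replicas = rng.normal(-8.3, 1.2, size=25)   # fake per-replica dG, kcal/mol

boot = rng.choice(replicas, size=(10_000, replicas.size), replace=True).mean(axis=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ensemble dG = {replicas.mean():.2f} kcal/mol, 95% CI [{lo:.2f}, {hi:.2f}]")
```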
Working with Bruce M. Boghosian and Hongyan Wang, Coveney showed that a variety of problems arise when simulating even the simplest of all dynamical systems, the generalised Bernoulli map, on a computer. [33] IEEE floating-point numbers can produce errors that are extremely large, as well as others of more modest scale, but all of the computed orbits are wrong when compared with the known exact mathematical description of the dynamics. [33]
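The effect is simple to reproduce. For the Bernoulli map x → βx (mod 1) with β = 2, doubling shifts the binary representation of x left one place each step, so every IEEE double-precision orbit collapses to 0 within about 53 iterations, while the exact orbit of the same initial condition, computed in rational arithmetic, cycles forever:

```python
# Float orbit of the doubling map dies; the exact rational orbit is periodic.
from fractions import Fraction

x_float = 1 / 3                      # nearest double to 1/3
x_exact = Fraction(1, 3)             # exact 1/3
for n in range(60):
    x_float = (2 * x_float) % 1.0
    x_exact = (2 * x_exact) % 1
print("float orbit after 60 steps:", x_float)   # 0.0 -- the orbit has died
print("exact orbit after 60 steps:", x_exact)   # 1/3 -- still period two
```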
In recent years, Coveney has been a leading player in the development and application of verification, validation and uncertainty quantification (VVUQ) for computer simulation codes across a wide range of domains. The VECMA Toolkit [34] [35] and the later SEAVEA Toolkit [36] provide a set of open-source, open-development software components that can be used to instrument any code so as to study its VVUQ characteristics. The methods his team has developed [37] are aimed at the analysis of real-world codes of substantial complexity that run on high performance computers.
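A generic, non-intrusive VVUQ pattern of the kind such toolkits automate is sketched below: sample the uncertain inputs, run the unmodified code on each sample, and report output statistics with the sampling error. The run_simulation function is a hypothetical stand-in for launching a real solver, not part of any of these toolkits.

```python
# Non-intrusive Monte Carlo uncertainty quantification around a black-box code.
import numpy as np

rng = np.random.default_rng(7)

def run_simulation(viscosity, inlet_velocity):
    # hypothetical stand-in for launching a solver and parsing its output
    return inlet_velocity**2 / viscosity

samples = [run_simulation(rng.uniform(0.8, 1.2), rng.normal(1.0, 0.05))
           for _ in range(500)]
mean = np.mean(samples)
sem = np.std(samples, ddof=1) / np.sqrt(len(samples))
print(f"QoI = {mean:.3f} +/- {sem:.3f} (standard error over 500 samples)")
```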
Coveney has become active in quantum computing, where he is specifically concerned with assessing the feasibility of realising quantum advantage in the solution of molecular electronic structure problems. He and his team are currently working on noise reduction and on implementing error mitigation as extensively as possible on a range of quantum device architectures. [38] [39] [40] [41]
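One widely used error-mitigation technique, given here only as an example and not necessarily the specific method of Coveney's group, is zero-noise extrapolation: measure an observable at deliberately amplified noise levels and extrapolate the fit back to the zero-noise limit. Synthetic measurements stand in for hardware data below.

```python
# Zero-noise extrapolation sketch with a toy exponential noise model.
import numpy as np

scales = np.array([1.0, 2.0, 3.0])            # noise amplification factors

def noisy_expectation(scale, exact=-1.0, decay=0.15):
    return exact * np.exp(-decay * scale)     # toy stand-in for hardware runs

measured = noisy_expectation(scales)
coeffs = np.polyfit(scales, measured, deg=2)  # quadratic fit in the scale
mitigated = np.polyval(coeffs, 0.0)           # extrapolate to zero noise
print(f"raw: {measured[0]:.4f}, mitigated: {mitigated:.4f}, exact: -1.0")
```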
Coveney led the EPSRC RealityGrid e-Science Pilot Project [42] and its extension project, and the EU FP7 Virtual Physiological Human (VPH) Network of Excellence. [43] He is the Principal Investigator on the EU Horizon 2020 projects Verified Exascale Computing for Multiscale Applications ("VECMA") [44] and the Centre of Excellence in Computational Biomedicine ("CompBioMed2"). [45] The original CompBioMed initiative [46] was launched after Coveney and his team successfully challenged the EU [47] following a rejected grant proposal.
Coveney has received supercomputing awards from the US NSF and DoE, and from the European DEISA and PRACE [48] programmes.
Coveney has chaired the UK Collaborative Computational Projects Steering Panel [49] and served on the programme committee of the 2002 Nobel Symposium on self-organization. [50] He is a founding member of the UK Government's e-Infrastructure Leadership Council and was a Medical Academy Nominated Expert to the UK Prime Minister's Council for Science and Technology [51] on Data, Algorithms and Modelling, work which led to the creation of the London-based Alan Turing Institute.
Coveney has co-authored three popular science books with his long-term friend and collaborator, Roger Highfield: The Arrow of Time (1990), Frontiers of Complexity (1995) and Virtual You (2023).