Lawrence S. Schulman | |
---|---|
Born | 1941 (age 83) |
Nationality | American |
Citizenship | United States |
Known for | Boltzmann brain, Measurement problem, Arrow of time |
Scientific career | |
Fields | Physics |
Institutions | Yeshiva University, Princeton University, Indiana University (Bloomington), Technion – Israel Institute of Technology, Clarkson University, Georgia Institute of Technology |
Thesis | A path integral for spin (1967) |
Doctoral advisor | Arthur Wightman |
Lawrence S. Schulman (born 1941) is an American-Israeli physicist known for his work on path integrals, quantum measurement theory and statistical mechanics. He introduced topology into path integrals on multiply connected spaces and has contributed to diverse areas from galactic morphology to the arrow of time.
He was born to Anna and Louis Schulman in Newark, New Jersey. He first attended the local public school but switched to more Jewish-oriented institutions, graduating from Yeshiva University in 1963. While still in college he married Claire Frangles Sherman. From Yeshiva he went to Princeton University, where he received a Ph.D. in physics for his thesis, A path integral for spin, written under Arthur Wightman.
After completing his thesis he took a position as an assistant professor at Indiana University (Bloomington), but in 1970 went to the Technion – Israel Institute of Technology in Haifa on a NATO postdoctoral fellowship.
At the Technion he accepted a position as associate professor, though he resigned from Indiana only several years later, by then having been promoted to professor there. In 1985, he returned to the United States as chair of the physics department at Clarkson University, and in 1988 he also resigned from the Technion, where he had become a full professor. In 1991, he stepped down as chair and has since remained at Clarkson as professor of physics.
In 2013, he spent part of a sabbatical at the Georgia Institute of Technology and has since been an adjunct professor at that institution.
Together with Phil Seiden of IBM, he began the first studies of randomized cellular automata, [1] an area that morphed into a theory of star formation in galaxies once they were joined by Humberto Gerola, an astrophysicist at IBM, who realized that star-forming regions, like epidemics, could be modeled as random cellular automata. [2] Besides providing an explanation for spiral arms, this work ultimately solved the mystery of why dwarf galaxies can vary in their luminosity by large factors. [3]
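The flavor of such models can be conveyed by a minimal sketch. This is not the Gerola–Seiden–Schulman model itself (which used a differentially rotating polar grid); the grid geometry, parameter values, and update rule below are illustrative assumptions: star formation spreads stochastically to neighboring cells and then switches off, much like an epidemic.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # grid size (illustrative; the original model used a rotating polar grid)
P_SPREAD = 0.18  # probability that an active neighbor triggers star formation
P_SPONT = 1e-4   # small spontaneous star-formation rate

active = rng.random((N, N)) < P_SPONT  # cells currently forming stars

def step(active):
    # Count active neighbors (von Neumann neighborhood, non-periodic edges).
    nbrs = np.zeros(active.shape, dtype=int)
    nbrs[1:, :] += active[:-1, :]
    nbrs[:-1, :] += active[1:, :]
    nbrs[:, 1:] += active[:, :-1]
    nbrs[:, :-1] += active[:, 1:]
    # A quiescent cell ignites with probability 1-(1-p)^k given k active
    # neighbors, plus a small spontaneous rate; active cells switch off
    # after one step (a simple refractory rule).
    p_ignite = 1.0 - (1.0 - P_SPREAD) ** nbrs
    new = (rng.random((N, N)) < p_ignite) | (rng.random((N, N)) < P_SPONT)
    return new & ~active

for t in range(200):
    active = step(active)

print("fraction of cells forming stars:", active.mean())
```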
In 1981, Schulman published Techniques and Applications of Path Integration, [4] from which many physicists learned about Feynman's path integral and its many applications. The book went on to become a Wiley classic and in 2005 came out in a Dover edition (with a supplement).
Once Schulman proved that there is no infinite cluster in one-dimensional long-range percolation when the connection probability is sufficiently small but non-zero, [5] it became of interest whether an infinite cluster appears once the connection probability is sufficiently large. Together with Charles Newman, then of the University of Arizona, he used real-space renormalization methods to prove that it does. [6]
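As an illustration only (the rigorous results rest on renormalization arguments, not simulation), one can sample finite pieces of a one-dimensional long-range percolation model and watch the largest cluster grow with the bond strength. The specific connection probability p(i,j) = min(1, β/|i−j|^s) and the parameter values below are assumptions made for this sketch.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

def find(parent, x):
    # Union-find with path halving.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def largest_cluster(n, beta, s):
    """Sample long-range percolation on {0,...,n-1} with bond probability
    p(i,j) = min(1, beta / |i-j|**s) and return the largest cluster size."""
    parent = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < min(1.0, beta / (j - i) ** s):
                ri, rj = find(parent, i), find(parent, j)
                if ri != rj:
                    parent[ri] = rj
    sizes = Counter(find(parent, i) for i in range(n))
    return max(sizes.values())

# Largest cluster in a 400-site sample as the bond strength beta increases.
for beta in (0.2, 0.6, 1.2, 2.0):
    print(beta, largest_cluster(400, beta, s=2))
```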
Schulman lowered his Erdős number to two by collaborating with Mark Kac and others on Feynman's checkerboard path integral, [7] [8] realizing that a particle acquires mass only through scattering that reverses its speed-of-light propagation. Later the path to Erdős was reinforced by another collaboration, with his son Leonard, whose Erdős number is also one. [9] [10]
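The statement that mass enters only through reversals can be made concrete with the usual form of the checkerboard sum; the following is a schematic version (conventions and normalizations vary between treatments):

```latex
% Each checkerboard path from a to b consists of light-like segments, and a
% path with R direction reversals carries a weight (i \epsilon m c^2/\hbar)^R,
% where \epsilon is the time step. The propagator is then
K(b,a) \;\propto\; \sum_{R} N(R)\,
        \left(\frac{i\,\epsilon\, m c^{2}}{\hbar}\right)^{\!R},
% with N(R) the number of paths having exactly R reversals. Setting m = 0
% removes every reversing path, so a massless particle propagates at the
% speed of light, while a massive one zigzags.
```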
Quantum measurement had always seemed an oxymoron, and in the 1980s Schulman conceived of a way to retain unitary time evolution while still having a single "world" (in the sense of the many-worlds interpretation), so that measurements in quantum mechanics could yield definite results. The mechanism for achieving definite outcomes was the use of "special states," for which pure unitary evolution leads to only a single outcome, whereas without such special initial conditions many outcomes would be conceivable. The need for those states at all times led to an examination of the arrow of time and of determinism (achieved here, but in a way that might have surprised Einstein, at least according to his collaborator, and Schulman's Technion colleague, Nathan Rosen). [11]
These ideas have not been accepted in the mainstream of physics, and Schulman himself has expressed doubts about them; his claim, though, is that other ideas on the quantum measurement process are even less believable. [12] As of 1997, the work was summarized in a book, Time's Arrows and Quantum Measurement. [13] Despite the apparent finality of book publication, more than a decade later practical experimental tests of these ideas were conceived and published. [14] [15]
The arrow of time, of significance in the measurement problem, became a topic in and of itself. This goes back to Schulman's attempt to understand the Wheeler–Feynman absorber theory. [16] Using similar tools he was able to demonstrate that two systems with opposite arrows of time could coexist, even with mild contact between them. [17] He also examined other ideas on the arrow, including Thomas Gold's contribution (relating the thermodynamic arrow to the expansion of the universe) [18] and a critique of Boltzmann's ideas (now known as the Boltzmann brain) as a form of solipsism. [19] [20] See Schulman's critique on page 154 of ref. [21].
Schulman was also interested in the quantum Zeno effect, which rests on the deviation from exponential decay at short times. He predicted that the slowdown in decay produced by pulsed observation and that resulting from continuous measurement would differ by a factor of 4. [22] This was verified on Bose–Einstein condensates by a group at MIT. [23]
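The mechanism behind the slowdown is the quadratic short-time behavior of the survival probability; the following is a schematic version of the standard argument, not the factor-of-4 calculation itself:

```latex
% For an initial state |\psi\rangle evolving under H, the survival probability
% at short times is quadratic rather than exponential,
P(t) = \bigl|\langle\psi| e^{-iHt/\hbar} |\psi\rangle\bigr|^{2}
     \approx 1 - \left(\frac{t}{\tau_Z}\right)^{2},
\qquad \tau_Z = \frac{\hbar}{\Delta H},
% so N projective measurements at intervals \delta t = t/N give
P(t) \approx \left[1 - \left(\frac{\delta t}{\tau_Z}\right)^{2}\right]^{N}
     \approx \exp\!\left(-\frac{t\,\delta t}{\tau_Z^{2}}\right),
% which tends to 1 as \delta t \to 0: frequent observation slows the decay.
% The factor-of-4 prediction compares the effective slowdown in this pulsed
% protocol with that obtained from a continuous-measurement model.
```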
Schulman has also contributed to practical matters through his collaboration with a group in Prague interested in luminescence and scintillators. This began with a study of anomalous decay caused by KAM tori in phase space (and the associated data fits) [24] and more recently has led to studies of quantum tunneling. [25] When funds were available, undergraduate students from Clarkson were sent to Prague to work in the optical materials laboratories.
Together with Bernard Gaveau (University of Paris VI), Schulman developed an embedding of stochastic dynamical systems in low-dimensional Euclidean space, known as the "observable representation." This has proved useful in numerous areas, from spin glasses to ecology. [26] [27] [28]
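A minimal sketch of the idea follows, under the assumption that the embedding coordinates are taken from the slow subdominant eigenvectors of the transition matrix; details such as normalization and the left-versus-right eigenvector conventions of the published construction are simplified here.

```python
import numpy as np

def observable_coordinates(T, k=2):
    """Embed the states of a Markov chain with row-stochastic transition
    matrix T into R^k using its k slowest subdominant left eigenvectors.
    This follows the spirit of the Gaveau-Schulman construction; scaling
    and eigenvector conventions are simplified for illustration."""
    evals, evecs = np.linalg.eig(T.T)       # left eigenvectors of T
    order = np.argsort(-np.abs(evals))      # sort by |eigenvalue|, descending
    # Skip the stationary eigenvector (eigenvalue 1), keep the next k.
    coords = np.real(evecs[:, order[1:k + 1]])
    return coords                           # shape: (n_states, k)

# Example: a chain with two weakly coupled blocks of states; the embedding
# separates the blocks along the first coordinate.
T = np.array([[0.89, 0.10, 0.01, 0.00],
              [0.10, 0.89, 0.00, 0.01],
              [0.01, 0.00, 0.89, 0.10],
              [0.00, 0.01, 0.10, 0.89]])
print(observable_coordinates(T, k=2))
```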
In 2005, he was awarded a Gutzwiller Fellowship by the Max Planck Institute for the Physics of Complex Systems in Dresden. [29]
He is the father of Leonard Schulman, a computer science professor at the California Institute of Technology; Linda Parmet, a Hebrew and creative design teacher at The Weber School; [30] and David Schulman, an intellectual property attorney at Greenberg Traurig, one of the nation's largest law firms.