K. Birgitta Whaley | |
---|---|
Born | 1956, Barnehurst, Kent, England |
Alma mater | Oxford University, University of Chicago |
Scientific career | |
Fields | Chemistry, Physics, Quantum Information |
Institutions | University of California, Berkeley, Lawrence Berkeley National Laboratory |
Doctoral advisor | John C. Light |
Katherine Birgitta Whaley (born 1956) [1] is a Professor of Chemistry at the University of California, Berkeley, and a senior faculty scientist in the Division of Chemical Sciences at Lawrence Berkeley National Laboratory. [2] At UC Berkeley, Whaley is the Director of the Berkeley Quantum Information and Computation Center, a member of the executive board for the Center for Quantum Coherent Science, and a member of the Kavli Energy Nanosciences Institute. [3] At Lawrence Berkeley National Laboratory, Whaley is a member of the Quantum Algorithms Team for Chemical Sciences in the research area of resource-efficient algorithms. [4]
Whaley's research team explores topics in the areas of quantum information, quantum computation, macroscopic quantum systems, and quantum control/simulation. [5] [1]
Whaley received her B.A. from Oxford University (1978), where she had been a Nuffield Scholar at St Hilda's since 1974. [6] Whaley has stated that, during her undergraduate studies, she found it hard to decide whether to study chemistry or physics. [7] After graduating, Whaley was a Kennedy Fellow at Harvard University (1978–1979) and went on to receive her M.Sc. (1982) and Ph.D. (1984) from the University of Chicago, where she was a student of John C. Light. [8] Her thesis was titled "Topics in molecule-surface scattering and multiphoton excitation dynamics". [9] She was then a Golda Meir Fellow at the Hebrew University, Jerusalem (1984–1985) and a post-doctoral fellow at Tel Aviv University (1985–1986), where she studied with Abraham Nitzan and Robert Gerber. Whaley joined the Chemistry faculty at UC Berkeley in 1986. Her interest in quantum biology was spurred by a series of low-temperature experiments on photosynthetic bacteria performed by Graham Fleming and his team in 2007. [7]
Whaley has been recognized with many awards for her scientific contributions over the course of her career. In 2002 she was elected a Fellow of the American Physical Society, nominated by the Division of Computational Physics, "for her contributions to theoretical understanding of quantum nanoscale phenomena, especially in superfluid helium droplets, and to control of decoherence in quantum information processing". [10]
Whaley served on the Advisory Board of the Kavli Institute for Theoretical Physics (KITP) at the University of California, Santa Barbara from 2014 to 2017 and was Chairperson from 2016 to 2017. [11] From 2017 to 2018, Whaley was a Simons Distinguished Visiting Scholar/Scientist at KITP. [12] At the Perimeter Institute for Theoretical Physics, Whaley served on the Scientific Advisory Committee from 2010 to 2013. [13] In the American Physical Society, Whaley served as Vice Chair, Chair Elect and then Chair of the Division of Chemical Physics from 2009 to 2011. In 2016, Whaley also served as Chair of the 2015 Fellowship Committee of the American Physical Society Division of Quantum Information. [14]
Whaley has also held editorial roles: she served on the editorial boards of the Journal of Chemical Physics (2010–2012) and the Journal of Physical Chemistry (1998–2003), and since 1996 she has served on the editorial board of Chemical Physics. She is currently a member of the editorial boards of Quantum Information Processing (2005–), European Physical Journal (EPJ) Quantum Technology (2013–), and Advances in Physics X (2014–). [5]
In October 2019, Whaley was appointed to the President's Council of Advisors on Science and Technology. [15]
A quantum computer is a computer that exploits quantum mechanical phenomena. At small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. In particular, a large-scale quantum computer could break widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is still largely experimental and impractical.
Quantum Darwinism is a theory meant to explain the emergence of the classical world from the quantum world as the result of a process of Darwinian natural selection induced by the environment interacting with the quantum system: the many possible quantum states are selected against in favor of a stable pointer state. It was proposed in 2003 by Wojciech Zurek and a group of collaborators including Ollivier, Poulin, Paz and Blume-Kohout. The theory developed from the integration of a number of research topics Zurek pursued over the course of twenty-five years, including pointer states, einselection and decoherence.
Artur Konrad Ekert is a British-Polish professor of quantum physics at the Mathematical Institute, University of Oxford, professorial fellow in quantum physics and cryptography at Merton College, Oxford, Lee Kong Chian Centennial Professor at the National University of Singapore and the founding director of the Centre for Quantum Technologies (CQT). His research interests extend over most aspects of information processing in quantum-mechanical systems, with a focus on quantum communication and quantum computation. He is best known as one of the pioneers of quantum cryptography.
Leonard Mlodinow is an American theoretical physicist and mathematician, screenwriter and author. In physics, he is known for his work on the large N expansion, a method of approximating the spectrum of atoms based on the consideration of an infinite-dimensional version of the problem, and for his work on the quantum theory of light inside dielectrics.
D-Wave Systems Inc. is a Canadian quantum computing company, based in Burnaby, British Columbia, Canada. D-Wave was the world's first company to sell computers to exploit quantum effects in their operation. D-Wave's early customers include Lockheed Martin, University of Southern California, Google/NASA and Los Alamos National Lab.
Adiabatic quantum computation (AQC) is a form of quantum computing which relies on the adiabatic theorem to do calculations and is closely related to quantum annealing.
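The adiabatic idea can be sketched numerically. The following toy model (illustrative only; the single-qubit Hamiltonians, total time, and step count are choices made for this example, not taken from the article) slowly interpolates from a beginning Hamiltonian, whose ground state is easy to prepare, to a problem Hamiltonian, and checks that the system ends near the problem ground state:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

H_B = -X   # beginning Hamiltonian; ground state is |+> = (|0> + |1>)/sqrt(2)
H_P = -Z   # problem Hamiltonian; ground state is |0>

T, steps = 50.0, 2000   # total evolution time and number of time steps
dt = T / steps

# Start in the ground state of H_B
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

for k in range(steps):
    s = (k + 0.5) / steps                 # interpolation parameter in [0, 1]
    H = (1 - s) * H_B + s * H_P
    # One short step of Schrodinger evolution, exp(-i H dt), via diagonalization
    evals, evecs = np.linalg.eigh(H)
    U = evecs @ np.diag(np.exp(-1j * evals * dt)) @ evecs.conj().T
    psi = U @ psi

# Probability of ending in |0>, the ground state of the problem Hamiltonian
p0 = abs(psi[0]) ** 2
print(round(p0, 3))
```

Because T is large compared with the inverse square of the minimum spectral gap of H(s), the adiabatic theorem predicts the final overlap with the problem ground state is close to 1; shrinking T makes the evolution diabatic and degrades p0.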
Daniel Amihud Lidar is the holder of the Viterbi Professorship of Engineering at the University of Southern California, where he is a Professor of Electrical Engineering, Chemistry, Physics & Astronomy. He is the Director and co-founder of the USC Center for Quantum Information Science & Technology (CQIST) as well as Scientific Director of the USC-Lockheed Martin Quantum Computing Center, notable for his research on control of quantum systems and quantum information processing.
The framework of noiseless subsystems has been developed as a tool to preserve fragile quantum information against decoherence. In brief, when a quantum register is subjected to decoherence due to an interaction with an external and uncontrollable environment, information stored in the register is, in general, degraded. It has been shown that when the source of decoherence exhibits some symmetries, certain subsystems of the quantum register are unaffected by the interactions with the environment and are thus noiseless. These noiseless subsystems are therefore very natural and robust tools that can be used for processing quantum information.
Dynamical decoupling (DD) is an open-loop quantum control technique employed in quantum computing to suppress decoherence by taking advantage of rapid, time-dependent control modulation. In its simplest form, DD is implemented by periodic sequences of instantaneous control pulses, whose net effect is to approximately average the unwanted system-environment coupling to zero. Different schemes exist for designing DD protocols that use realistic bounded-strength control pulses, as well as for achieving high-order error suppression, and for making DD compatible with quantum gates. In spin systems in particular, commonly used protocols for dynamical decoupling include the Carr-Purcell and the Carr-Purcell-Meiboom-Gill schemes. They are based on the Hahn spin echo technique of applying periodic pulses to enable refocusing and hence extend the coherence times of qubits.
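The refocusing mechanism behind the Hahn echo mentioned above can be illustrated with a small NumPy sketch (a toy model, not from the article: quasi-static dephasing is modeled as a random, fixed frequency detuning per qubit, and the pi pulse is treated as instantaneous):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of qubits, each with a random static frequency detuning
# (a simple model of quasi-static dephasing noise)
detunings = rng.normal(0.0, 1.0, size=10000)
T = 5.0   # total evolution time

# Free induction decay: each qubit accumulates phase detuning * T, and the
# ensemble coherence is the magnitude of the average phasor
fid = abs(np.mean(np.exp(1j * detunings * T)))

# Hahn echo: an instantaneous pi pulse at T/2 flips the sign of the phase
# accumulated so far, so the second half cancels the first exactly for
# static noise
echo_phase = detunings * (T / 2) - detunings * (T / 2)
echo = abs(np.mean(np.exp(1j * echo_phase)))

print(round(fid, 3), round(echo, 3))   # coherence without vs. with the echo
```

The free-induction coherence decays toward zero while the echo coherence stays at 1, because the pulse exactly refocuses static detunings; Carr-Purcell and CPMG sequences extend this with repeated pulses so that slowly varying (not just static) noise is also averaged away.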
The USC-Lockheed Martin Quantum Computing Center (QCC) is a joint scientific research effort between Lockheed Martin Corporation and the University of Southern California (USC). The QCC is housed at the Information Sciences Institute (ISI), a computer science and engineering research unit of the USC Viterbi School of Engineering, and is jointly operated by ISI and Lockheed Martin.
Andrew MacGregor Childs is an American computer scientist and physicist known for his work on quantum computing. He is currently a Professor in the Department of Computer Science and Institute for Advanced Computer Studies at the University of Maryland. He also co-directs the Joint Center for Quantum Information and Computer Science, a partnership between the University of Maryland and the National Institute of Standards and Technology.
Irfan Siddiqi is an American physicist and currently a professor of physics at the University of California, Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory (LBNL). He currently is the director of the Quantum Nanoelectronics Laboratory at UC Berkeley and the Advanced Quantum Testbed at LBNL. Siddiqi is known for groundbreaking contributions to the fields of superconducting quantum circuits, including dispersive single-shot readout of superconducting quantum bits, quantum feedback, observation of single quantum trajectories, and near-quantum limited microwave frequency amplification. In addition to other honors, for his pioneering work in superconducting devices, he was awarded with the American Physical Society George E. Valley, Jr. Prize in 2006, "for the development of the Josephson bifurcation amplifier for ultra-sensitive measurements at the quantum limit." Siddiqi is a fellow of the American Physical Society and a recipient of the UC Berkeley Distinguished Teaching Award in 2016.
In quantum computing, quantum supremacy, quantum primacy or quantum advantage is the goal of demonstrating that a programmable quantum device can solve a problem that no classical computer can solve in any feasible amount of time. Conceptually, quantum supremacy involves both the engineering task of building a powerful quantum computer and the computational-complexity-theoretic task of finding a problem that can be solved by that quantum computer and has a superpolynomial speedup over the best known or possible classical algorithm for that task. The term was coined by John Preskill in 2012, but the concept of a qualitative quantum computational advantage, specifically for simulating quantum systems, dates back to Yuri Manin's (1980) and Richard Feynman's (1981) proposals of quantum computing. Examples of proposals to demonstrate quantum supremacy include the boson sampling proposal of Aaronson and Arkhipov, D-Wave's specialized frustrated cluster loop problems, and sampling the output of random quantum circuits.
Julia Kempe is a French, German, and Israeli researcher in quantum computing. She is currently the Director of the Center for Data Science at NYU and Professor at the Courant Institute.
Continuous-variable (CV) quantum information is the area of quantum information science that makes use of physical observables, like the strength of an electromagnetic field, whose numerical values belong to continuous intervals. One primary application is quantum computing. In a sense, continuous-variable quantum computation is "analog", while quantum computation using qubits is "digital." In more technical terms, the former makes use of Hilbert spaces that are infinite-dimensional, while the Hilbert spaces for systems comprising collections of qubits are finite-dimensional. One motivation for studying continuous-variable quantum computation is to understand what resources are necessary to make quantum computers more powerful than classical ones.
In quantum computing, a qubit is a unit of information analogous to a bit in classical computing, but governed by quantum mechanical properties such as superposition and entanglement, which make qubits more powerful than classical bits for certain tasks. Qubits are used in quantum circuits and quantum algorithms composed of quantum logic gates to solve computational problems, where they carry the input, output, and intermediate state of the computation.
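The state-vector picture of a qubit can be made concrete with a few lines of NumPy (an illustrative sketch, not from the article; the gates and shot count are standard textbook choices):

```python
import numpy as np

# A qubit state is a normalized complex 2-vector in the {|0>, |1>} basis
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into the equal superposition (|0> + |1>)/sqrt(2)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Born rule: measurement probabilities are the squared amplitudes
probs = np.abs(psi) ** 2

# Entanglement: a CNOT on the superposed qubit and a fresh |0> qubit
# produces the Bell state (|00> + |11>)/sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(psi, ket0)

# Sample 1000 simulated single-qubit measurements
rng = np.random.default_rng(1)
outcomes = rng.choice([0, 1], size=1000, p=np.real(probs))
```

Measuring the superposed qubit yields 0 and 1 each with probability 1/2, while the Bell state's outcome probabilities are concentrated on the correlated results 00 and 11.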
Applying classical methods of machine learning to the study of quantum systems is the focus of an emergent area of physics research. A basic example of this is quantum state tomography, where a quantum state is learned from measurement. Other examples include learning Hamiltonians, learning quantum phase transitions, and automatically generating new quantum experiments. Classical machine learning is effective at processing large amounts of experimental or calculated data in order to characterize an unknown quantum system, making its application useful in contexts including quantum information theory, quantum technologies development, and computational materials design. In this context, it can be used for example as a tool to interpolate pre-calculated interatomic potentials or directly solving the Schrödinger equation with a variational method.
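The quantum state tomography example above can be sketched in its simplest form, linear inversion for a single qubit (an illustrative toy, not from the article; the unknown state, shot count, and seed are arbitrary choices for this demonstration):

```python
import numpy as np

rng = np.random.default_rng(2)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# "Unknown" single-qubit state to be learned (here |+> as an example)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_true = np.outer(psi, psi.conj())

def estimate(pauli, shots=20000):
    """Simulate finite-shot estimation of a Pauli expectation value."""
    p_plus = (1 + np.real(np.trace(rho_true @ pauli))) / 2
    samples = rng.choice([1, -1], size=shots, p=[p_plus, 1 - p_plus])
    return samples.mean()

x, y, z = estimate(X), estimate(Y), estimate(Z)

# Linear-inversion tomography: rebuild rho from the estimated Bloch vector
rho_est = (I2 + x * X + y * Y + z * Z) / 2

# Fidelity of the reconstruction with the true (pure) state
fidelity = np.real(psi.conj() @ rho_est @ psi)
print(round(fidelity, 3))
```

With enough shots the reconstructed density matrix converges to the true state; for many qubits this naive approach scales exponentially, which is one motivation for the machine-learning-based tomography methods described above.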
The Eastin–Knill theorem is a no-go theorem that states: "No quantum error correcting code can have a continuous symmetry which acts transversely on physical qubits". In other words, no quantum error correcting code can transversally implement a universal gate set. Since quantum computers are inherently noisy, quantum error correcting codes are used to correct errors caused by decoherence. Decoding error-corrected data in order to perform gates on it, however, exposes the data to errors; fault-tolerant quantum computation avoids this by performing gates directly on encoded data. Transversal gates perform a gate between two "logical" qubits, each encoded in N "physical" qubits, by pairing up the physical qubits of the two encoded qubits and performing independent gates on each pair. Because each physical qubit in a code block is acted on by at most a single physical gate, and each code block is corrected independently when an error occurs, transversal gates guarantee that errors do not spread uncontrollably through the computation; they thus enable fault-tolerant, but not universal, quantum computation. By the Eastin–Knill theorem, a universal gate set such as {H, S, CNOT, T} cannot be implemented transversally; for example, the T gate cannot be implemented transversally in the Steane code. This calls for ways of circumventing Eastin–Knill in order to perform fault-tolerant universal quantum computation. Beyond fault tolerance, the Eastin–Knill theorem is also useful for studying quantum gravity via the AdS/CFT correspondence, and condensed matter physics via quantum reference frames and many-body theory.
This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.