Viv Kendon

Vivien M. Kendon
[Photo: Kendon in 2021]
Alma mater: University of Edinburgh; University of Oxford
Scientific career
Institutions: University of Strathclyde; Imperial College London
Thesis: Finite Reynolds number effects in fluid mixtures: an investigation using numerical simulation methods (1999)

Vivien Mary Kendon is a British physicist who is Professor of Quantum Technology at the University of Strathclyde. Her research considers quantum computation and the properties of quantum walks. She is the director of the Collaborative Computational Project: Quantum Computing (CCP-QC), which looks to develop useful applications of quantum computers.


Early life and education

Kendon lived on Station Road in Carlton. [1] In 1974 she gained nine O-levels at Carlton le Willows School, which had been a grammar school until 1973. [2]

Kendon was an undergraduate student at the University of Oxford. [3] She moved to the University of Edinburgh for doctoral research, where she used numerical simulations to understand fluid dynamics. [4] She then moved to the University of Strathclyde, where she shifted from soft condensed matter to quantum information theory. [3] She joined Imperial College London in 2002, where she studied quantum walks on discrete lattices. Classical random walks underpin the design of classical algorithms, and it is believed that their quantum counterparts will support the implementation of efficient quantum algorithms. [5] In 2004, she moved into industry to work in global electronic networking. [3]
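For illustration, here is a minimal numpy sketch of a discrete-time quantum walk on a cycle with a Hadamard coin, the standard textbook construction rather than any specific model from Kendon's work: unlike a classical random walk, whose spread grows as the square root of the number of steps, the quantum walker's spread grows linearly in the number of steps.

```python
import numpy as np

# Discrete-time quantum walk on a cycle of N sites with a Hadamard coin.
# State is a (2, N) complex array: axis 0 is the coin (0 = left, 1 = right),
# axis 1 is the walker's position.
N, steps = 64, 30
psi = np.zeros((2, N), dtype=complex)
psi[:, N // 2] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # symmetric initial coin

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard coin flip

for _ in range(steps):
    psi = H @ psi                 # flip the coin at every site
    psi[0] = np.roll(psi[0], -1)  # coin state 0 steps left
    psi[1] = np.roll(psi[1], +1)  # coin state 1 steps right

pdist = (np.abs(psi) ** 2).sum(axis=0)  # position probability distribution
positions = np.arange(N) - N // 2
print(round(pdist.sum(), 6))                      # total probability stays 1
print(round(np.sqrt(pdist @ positions ** 2), 2))  # spread ~ 0.5 * steps, not sqrt(steps)
```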

Research and career

In 2014, Kendon joined Durham University as a member of the Quantum Light and Matter (QLM) research section and the Joint Quantum Centre. She held an EPSRC Established Career Fellowship, "Hybrid quantum and classical computation", from 2014 to 2019. [6]

In 2020, Kendon launched the Collaborative Computational Project: Quantum Computing (CCP-QC), which looks to develop the first useful applications of quantum computers. [7] She moved to the University of Strathclyde in 2021. [8]


Related Research Articles

<span class="mw-page-title-main">Quantum computing</span> Technology that uses quantum mechanics

A quantum computer is a computer that takes advantage of quantum mechanical phenomena.

In logic circuits, the Toffoli gate, invented by Tommaso Toffoli, is a universal reversible logic gate, meaning that any classical reversible circuit can be constructed from Toffoli gates. It is also known as the "controlled-controlled-not" gate, which describes its action: it takes three bits as input and output, and if the first two bits are both 1, it inverts the third bit; otherwise all bits stay the same.
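The gate's classical action can be spelled out directly; a small Python sketch of its truth table:

```python
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Controlled-controlled-NOT: flip c only when a and b are both 1."""
    return a, b, c ^ (a & b)

# Full truth table: only the inputs (1, 1, c) change the third bit.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print((a, b, c), "->", toffoli(a, b, c))
```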

In quantum computing, a quantum algorithm is an algorithm which runs on a realistic model of quantum computation, most commonly the quantum circuit model. A classical algorithm is a finite sequence of instructions, or a step-by-step procedure for solving a problem, where each step can be performed on a classical computer. Similarly, a quantum algorithm is a step-by-step procedure where each step can be performed on a quantum computer. Although all classical algorithms can also be performed on a quantum computer, the term quantum algorithm is usually reserved for algorithms which seem inherently quantum, or which use some essential feature of quantum computation such as quantum superposition or quantum entanglement.
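As a concrete example, here is a minimal numpy sketch of Deutsch's algorithm, one of the simplest inherently quantum algorithms: with a single oracle query it decides whether a one-bit function f is constant or balanced, which classically requires two queries.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def oracle(f):
    """U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    psi = np.kron([1, 0], [0, 1])      # start in |0>|1>
    psi = np.kron(H, H) @ psi          # superpose; second qubit enables phase kickback
    psi = oracle(f) @ psi              # one query to f
    psi = np.kron(H, np.eye(2)) @ psi  # interfere the branches
    p1 = psi[2] ** 2 + psi[3] ** 2     # probability that the first qubit measures 1
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))  # constant -> "constant"
print(deutsch(lambda x: x))  # balanced -> "balanced"
```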

In quantum mechanics, einselection, short for "environment-induced superselection", is a name coined by Wojciech H. Zurek for a process claimed to explain the appearance of wavefunction collapse and the emergence of classical descriptions of reality from quantum descriptions. In this approach, classicality is an emergent property induced in open quantum systems by their environments. Through entangling interactions with the environment, which in effect monitors selected observables of the system, the vast majority of states in the Hilbert space of an open quantum system become highly unstable. After a decoherence time, which for macroscopic objects is typically many orders of magnitude shorter than any other dynamical timescale, a generic quantum state decays into a mixture of simple pointer states. In this way the environment induces effective superselection rules: einselection precludes the stable existence of pure superpositions of pointer states. The pointer states themselves remain stable despite environmental interaction; the einselected states lack coherence, and therefore do not exhibit the quantum behaviours of entanglement and superposition.
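A toy model makes this concrete: under pure dephasing (the decay rate gamma below is an assumed illustrative value), the environment in effect monitors the z basis, so the off-diagonal coherences of a qubit's density matrix decay while the pointer-state populations survive.

```python
import numpy as np

# Start in the pure superposition (|0> + |1>)/sqrt(2).
plus = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(plus, plus)

gamma, dt, steps = 0.5, 0.1, 20   # assumed dephasing rate and time step
for _ in range(steps):
    rho[0, 1] *= np.exp(-gamma * dt)   # coherences decay ...
    rho[1, 0] *= np.exp(-gamma * dt)
    # ... while the diagonal pointer-state populations are untouched.

print(rho.round(3))   # diagonal stays 0.5; off-diagonal ~ 0.5 * exp(-gamma * t)
```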

In quantum computing, the Gottesman–Knill theorem is a theoretical result by Daniel Gottesman and Emanuel Knill stating that stabilizer circuits, circuits consisting only of gates from the normalizer of the qubit Pauli group (also called the Clifford group), can be perfectly simulated in polynomial time on a probabilistic classical computer. The Clifford group can be generated using only the CNOT, Hadamard, and phase gate S, so stabilizer circuits can be constructed from these gates alone.
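The defining Clifford property, that conjugation maps Pauli operators to Pauli operators up to a phase, can be checked numerically; a numpy sketch for the single-qubit generators H and S (CNOT acts analogously on two-qubit Paulis):

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
S = np.array([[1, 0], [0, 1j]])                # phase gate

paulis = {"I": I, "X": X, "Y": Y, "Z": Z}

def conjugate(U, P):
    """Identify U P U^dagger as a phase times a Pauli operator."""
    Q = U @ P @ U.conj().T
    for name, R in paulis.items():
        for phase in (1, -1, 1j, -1j):
            if np.allclose(Q, phase * R):
                return phase, name

# Every Pauli is mapped back into the Pauli group, the Clifford property
# that Gottesman-Knill exploits for efficient classical simulation.
for name, P in paulis.items():
    print(f"H {name} H+ -> {conjugate(H, P)},  S {name} S+ -> {conjugate(S, P)}")
```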

<span class="mw-page-title-main">Nuclear magnetic resonance quantum computer</span> Proposed spin-based quantum computer implementation

Nuclear magnetic resonance quantum computing (NMRQC) is one of several proposed approaches to constructing a quantum computer; it uses the spin states of nuclei within molecules as qubits. The quantum states are probed through nuclear magnetic resonance, allowing the system to be implemented as a variation of nuclear magnetic resonance spectroscopy. NMR differs from other implementations of quantum computers in that it uses an ensemble of systems, in this case molecules, rather than a single pure state.

In quantum computing, the threshold theorem states that a quantum computer with a physical error rate below a certain threshold can, through application of quantum error correction schemes, suppress the logical error rate to arbitrarily low levels. This shows that quantum computers can be made fault-tolerant, as an analogue to von Neumann's threshold theorem for classical computation. This result was proven independently by the groups of Dorit Aharonov and Michael Ben-Or; of Emanuel Knill, Raymond Laflamme, and Wojciech Zurek; and of Alexei Kitaev. These results built on a paper of Peter Shor, which proved a weaker version of the threshold theorem.
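A rough numerical illustration, using the commonly quoted toy scaling p_L ~ (p/p_th)^((d+1)/2) for a code of distance d (the actual thresholds and prefactors depend on the code and the noise model): once the physical rate p sits below the threshold p_th, increasing the distance suppresses the logical rate exponentially.

```python
# Toy scaling only; real thresholds and constants are code- and noise-dependent.
p_th = 1e-2   # assumed threshold error rate
p = 1e-3      # assumed physical error rate, one order of magnitude below threshold
for d in (3, 5, 7, 9):
    p_logical = (p / p_th) ** ((d + 1) // 2)
    print(f"distance {d}: logical error rate ~ {p_logical:.0e}")
```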

Daniel Amihud Lidar is the holder of the Viterbi Professorship of Engineering at the University of Southern California, where he is a professor of electrical engineering, chemistry, physics and astronomy. He is the director and co-founder of the USC Center for Quantum Information Science & Technology (CQIST) as well as scientific director of the USC-Lockheed Martin Quantum Computing Center. He is notable for his research on the control of quantum systems and quantum information processing.

Andrew James Fisher is Professor of Physics in the Department of Physics and Astronomy at University College London. His team is part of the Condensed Matter and Materials Physics group, and based in the London Centre for Nanotechnology.

<span class="mw-page-title-main">Ancilla bit</span> Extra bits required in reversible and quantum computation, as bits cannot be modified arbitrarily

In reversible computing, ancilla bits are extra bits used to implement otherwise irreversible logical operations. In classical computation, any memory bit can be turned on or off at will, requiring no prior knowledge or extra complexity. This is not the case in quantum computing or classical reversible computing: in these models, all operations on computer memory must be reversible, and toggling a bit on or off would lose the information about its initial value. For this reason, in a quantum algorithm there is no way to deterministically put bits into a specific prescribed state unless one is given access to bits whose original state is known in advance. Such bits, whose values are known a priori, are called ancilla bits in a quantum or reversible computing task.
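A classical sketch of the idea: the irreversible AND operation becomes reversible when a Toffoli gate writes the result into an ancilla prepared in the known state 0, so the inputs remain recoverable.

```python
def reversible_and(a: int, b: int, ancilla: int = 0) -> tuple[int, int, int]:
    """Toffoli with a zeroed ancilla: (a, b, 0) -> (a, b, a AND b)."""
    assert ancilla == 0, "the ancilla's value must be known in advance"
    return a, b, ancilla ^ (a & b)

print(reversible_and(1, 1))  # (1, 1, 1)
print(reversible_and(1, 0))  # (1, 0, 0): inputs are preserved, nothing is erased
```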

The framework of noiseless subsystems has been developed as a tool to preserve fragile quantum information against decoherence. In brief, when a quantum register is subjected to decoherence due to an interaction with an external and uncontrollable environment, information stored in the register is, in general, degraded. It has been shown that when the source of decoherence exhibits some symmetries, certain subsystems of the quantum register are unaffected by the interactions with the environment and are thus noiseless. These noiseless subsystems are therefore very natural and robust tools that can be used for processing quantum information.

<span class="mw-page-title-main">Quantum machine learning</span> Interdisciplinary research area at the intersection of quantum physics and machine learning

Quantum machine learning is the integration of quantum algorithms within machine learning programs.

The USC-Lockheed Martin Quantum Computing Center (QCC) is a joint scientific research effort between Lockheed Martin Corporation and the University of Southern California (USC). The QCC is housed at the Information Sciences Institute (ISI), a computer science and engineering research unit of the USC Viterbi School of Engineering, and is jointly operated by ISI and Lockheed Martin.

The DiVincenzo criteria are conditions necessary for constructing a quantum computer, proposed in 2000 by the theoretical physicist David P. DiVincenzo. Such a computer, first proposed by the mathematician Yuri Manin in 1980 and the physicist Richard Feynman in 1982, was conceived as a means to efficiently simulate quantum systems, such as in solving the quantum many-body problem.

In quantum computing, quantum supremacy or quantum advantage is the goal of demonstrating that a programmable quantum computer can solve a problem that no classical computer can solve in any feasible amount of time, irrespective of the usefulness of the problem. The term was coined by John Preskill in 2012, but the concept dates back to Yuri Manin's 1980 and Richard Feynman's 1981 proposals of quantum computing.

<span class="mw-page-title-main">Julia Kempe</span> French, German, and Israeli researcher in quantum computing

Julia Kempe is a French, German, and Israeli researcher in quantum computing. She is currently the Director of the Center for Data Science at NYU and Professor at the Courant Institute.

Continuous-variable (CV) quantum information is the area of quantum information science that makes use of physical observables, like the strength of an electromagnetic field, whose numerical values belong to continuous intervals. One primary application is quantum computing. In a sense, continuous-variable quantum computation is "analog", while quantum computation using qubits is "digital." In more technical terms, the former makes use of Hilbert spaces that are infinite-dimensional, while the Hilbert spaces for systems comprising collections of qubits are finite-dimensional. One motivation for studying continuous-variable quantum computation is to understand what resources are necessary to make quantum computers more powerful than classical ones.
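A small numpy sketch of the infinite-dimensional structure in practice: truncating the Fock space of a single mode and building the quadratures x and p from the annihilation operator, the canonical commutator [x, p] = i holds everywhere except at the truncation edge, a finite-dimensional artefact with no analogue for qubit systems.

```python
import numpy as np

dim = 20                                        # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)    # annihilation operator
x = (a + a.conj().T) / np.sqrt(2)               # position quadrature
p = -1j * (a - a.conj().T) / np.sqrt(2)         # momentum quadrature

comm = x @ p - p @ x
print(np.allclose(np.diag(comm)[:-1], 1j))      # True: [x, p] = i below the edge
print(np.diag(comm)[-1])                        # truncation artefact at the edge
```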

In quantum computing, a qubit is a unit of information analogous to a bit in classical computing, but it is affected by quantum mechanical properties such as superposition and entanglement, which make qubits more powerful than classical bits for some tasks. Qubits are used in quantum circuits and quantum algorithms composed of quantum logic gates to solve computational problems, where they are used for input, output, and intermediate computations.
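A minimal numpy sketch of the two properties with no classical-bit analogue, superposition and relative phase: a Hadamard gate puts |0> into an equal superposition, and the sign between the branches decides how they interfere.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

plus = H @ ket0            # (|0> + |1>)/sqrt(2)
minus = H @ ket1           # (|0> - |1>)/sqrt(2): same probabilities, opposite phase
print(np.abs(plus) ** 2)   # [0.5, 0.5] measurement probabilities (Born rule)

# The phase is invisible to a single measurement but controls interference:
print(np.abs(H @ plus) ** 2)    # [1, 0]: branches recombine into |0>
print(np.abs(H @ minus) ** 2)   # [0, 1]: branches recombine into |1>
```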

The current state of quantum computing is referred to as the noisy intermediate-scale quantum (NISQ) era, characterized by quantum processors containing up to roughly 1,000 qubits that are not yet advanced enough for fault tolerance or large enough to achieve quantum advantage. These processors, which are sensitive to their environment (noisy) and prone to quantum decoherence, are not yet capable of continuous quantum error correction. The intermediate scale is characterized by the quantum volume, which takes into account the moderate number of qubits and the gate fidelity. The term NISQ was coined by John Preskill in 2018.

This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.

References

  1. Nottingham Evening Post, Wednesday 21 January 1976, p. 12.
  2. Nottingham Evening Post, Tuesday 27 August 1974, p. 12.
  3. "Dr Viv Kendon - Durham University". Durham University. www.durham.ac.uk. Retrieved 2023-04-29.
  4. "Finite Reynolds number effects in fluid mixtures: an investigation using numerical simulation methods". WorldCat.org. Retrieved 2023-04-26.
  5. Kendon, Viv; Tregenna, Ben (2003-04-22). "Decoherence can be useful in quantum walks". Physical Review A. 67 (4): 042315. arXiv:quant-ph/0209005. doi:10.1103/PhysRevA.67.042315.
  6. "Hybrid quantum and classical computation: exploiting the best of both paradigms".
  7. "CCP-QC". CCP-QC. Retrieved 2023-04-26.
  8. "IES - hybrid event - Quantum Computing - how to build a REALLY cool computer". IES. Retrieved 2023-04-27.