Decoding Reality

Decoding Reality: The Universe as Quantum Information

Author: Vlatko Vedral
Country: United Kingdom
Language: English
Subject: Science
Genre: Non-fiction
Publisher: Oxford University Press
Publication date: 2010
Media type: Print (Hardback)
Pages: x, 256 pp.
ISBN: 0-19-923769-7
LC Class: Q360 .V43 2010

Decoding Reality: The Universe as Quantum Information is a popular science book by Vlatko Vedral, published by Oxford University Press in 2010. Vedral examines information theory and proposes information as the most fundamental building block of reality, arguing that it provides a useful framework for viewing all natural and physical phenomena. In building out this framework the book touches upon the origin of information, the idea of entropy, the roots of this thinking in thermodynamics, the replication of DNA, the development of social networks, quantum behaviour at the micro and macro level, and the role of indeterminism in the universe. The book finishes by considering an answer to the ultimate question: where did all of the information in the Universe come from? The ideas address concepts related to the nature of particles, time, determinism, and reality itself.

Contents

"Creation Ex Nihilo: Something from Nothing"

Vedral holds to the principle that information is physical. Creation ex nihilo comes from Catholic dogma: the idea that God created the universe out of nothing. Vedral argues that invoking a supernatural being does not explain creation, because the supernatural being's own existence would then need explaining, presumably arising from nothing or from an infinite regression of supernatural beings; if something can come from nothing, reality itself can do so without a supernatural being, and Occam's razor favours this simpler explanation. Vedral takes information to be the fundamental building block of reality because it occurs at the macro level (economics, human behaviour, and so on) as well as at the subatomic level. He argues that information is the only candidate for such a building block that can explain its own existence, since information generates additional information that must be compressed, in turn generating more information. 'Annihilation of everything' is a more fitting term than creation ex nihilo, Vedral states, as new information is created by compressing away possibilities.

"Information for all Seasons"

Vedral uses an Italo Calvino philosophical story about a tarot-like card game as the kernel of his metaphor for conscious life arriving in medias res into a pre-existing contextual reality. In this game the individual observers/players (Vedral suggests: quantum physics, thermodynamics, biology, sociology, economics, philosophy) lay down cards with ambiguous meanings, attempting to communicate messages and to deduce meaning from the other players' interactions. The results (information) of previous rounds establish contextual rules for observers/players in subsequent rounds. The point of the game is not settled until the last card has been played, since later cards can change the meaning of earlier events, as when the quantum explanation of the photoelectric effect abruptly overturned classical physics. Vedral points out that in our reality there is no last card.

"Back to Basics: Bits and Pieces"

Shannon entropy, or information content, measures the surprise value of a particular event: it is proportional to the logarithm of the reciprocal of the event's probability, i = log(1/p), so rarer events carry more information. Claude Shannon's information theory arose from research at Bell Labs, building upon George Boole's digital logic. As information theory predicts, common and easily predicted words tend to become shorter for optimal communication-channel efficiency, while less common words tend to be longer, providing redundancy and error correction. Vedral compares the process of life to John von Neumann's self-replicating automata: enduring information carriers that survive the wear and tear of the individual by producing copies that can in turn go on to produce more copies.
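The surprise formula and its average, the Shannon entropy, can be checked in a few lines of Python (the function names here are illustrative, not from the book):

```python
import math

def surprise(p):
    """Information content, in bits, of an event with probability p: i = log2(1/p)."""
    return math.log2(1.0 / p)

def shannon_entropy(probs):
    """Average surprise over a whole distribution: H = sum of p * log2(1/p)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of surprise per outcome.
print(surprise(0.5))                 # 1.0
# A rare event (p = 1/1024) is far more surprising: 10 bits.
print(surprise(1 / 1024))            # 10.0
# The entropy of a fair coin is 1 bit; a heavily biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
```

The biased-coin case illustrates the word-length observation above: outcomes that are almost certain contribute little surprise, so an efficient code gives them short descriptions.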

"Digital Romance: Life is a Four-Letter Word"

The genetic code as an efficient digital information store, containing built-in codon redundancy for error correction in transcription.

"Murphy’s Law: I Knew this Would Happen to Me"

Examines the second law of thermodynamics and the link between information and increasing entropy. Maxwell's demon was thought to be a way around this inevitability; however, such a demon would run out of information storage space and have to delete unwanted data, and that deletion itself costs work, increasing entropy.
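The cost of the demon's deletions can be put in numbers via Landauer's principle, the standard quantification of this erasure cost (the principle is not named in the chapter summary above, so this is a supplementary sketch):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
T = 300.0            # room temperature, kelvin

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2 of energy.
energy_per_bit = k_B * T * math.log(2)
print(f"{energy_per_bit:.2e} J per erased bit")  # about 2.87e-21 J
```

Tiny as that is, it is strictly positive, which is why the demon's bookkeeping cannot beat the second law.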

"Place Your Bets: In It to Win It"

Blackjack as controlled risk-taking using the probability formulas of Shannon's information theory. The casino acts as a 'cool' financial reservoir and the gambler as a 'hot' one; once again the second law of thermodynamics means that, in the long run, the flow is almost always from hot to cold. To manage risk, spread bets widely across high-risk, high-reward investments with known probabilities; this is the log-optimal portfolio approach.
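For a single repeated bet, the log-optimal rule is the Kelly criterion. A minimal sketch (the function name is illustrative), assuming a bet paying b-to-1 that is won with known probability p:

```python
def kelly_fraction(p, b):
    """Fraction of bankroll to stake on a bet won with probability p and
    paying b-to-1, chosen to maximise the expected logarithm of wealth:
    f* = (b*p - (1 - p)) / b. A non-positive result means: don't bet."""
    f = (b * p - (1.0 - p)) / b
    return max(f, 0.0)

# A 60% chance of winning an even-money (1-to-1) bet: stake 20% of bankroll.
print(round(kelly_fraction(0.6, 1.0), 10))  # 0.2
# A 50% chance at even money carries no edge: stake nothing.
print(kelly_fraction(0.5, 1.0))             # 0.0
```

Maximising log-wealth rather than expected wealth is what keeps the gambler from ruin in the long run, which is the sense in which the portfolio is "log optimal".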

"Social Informatics: Get Connected or Die Tryin’"

Six degrees of separation means well-connected people tend to be more successful, as their social networks expose them to more opportunities to make the choices they want. Schelling's precommitment as a strategy in social interaction and self-control, for example burning your bridges by buying a gym membership to help the motivated self win over the lazy self. Mutual information produces phase transitions in social and political demography as well as in physical systems, like water freezing into ice at a particular critical temperature or magnetic fields spontaneously aligning in certain atoms when cooling from a molten state.

"Quantum Schmuntum: Lights, Camera, Action!"

Vedral examines the basis of quantum information, the qubit, and examines quantum cryptography based on the one-time pad as the most secure form of encryption, since no amount of computation can break it. Quantum entanglement demonstrates the importance of mutual information in defining outcomes in a reality.
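The one-time pad itself is purely classical; quantum cryptography's role is to distribute the shared random key securely. A minimal sketch of the pad (helper names illustrative), assuming the key is truly random, as long as the message, and never reused:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each message byte with a key byte. The key must be as long as
    the message and used only once; then the ciphertext reveals nothing."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh random key of equal length
ciphertext = otp_encrypt(message, key)

# XOR is its own inverse, so decryption reuses the same function.
assert otp_encrypt(ciphertext, key) == message
```

Because every possible plaintext is equally consistent with a given ciphertext, security does not depend on any computational assumption, only on keeping the key secret and single-use.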

"Surfing the Waves: Hyper-Fast Computers"

Quantum computers offer a search advantage over classical computers by probing many database elements at once, as a result of quantum superposition. A sufficiently advanced quantum computer would break current encryption methods by factorizing large numbers several orders of magnitude faster than any existing classical computer. Any computable problem may be expressed as a general quantum search, although classical computers may retain an advantage where more efficient tailored classical algorithms exist. One issue with quantum computers is that a measurement must be made to determine whether the problem is solved, and measurement collapses the superposition. Vedral points out that unintentional interaction with the environment can be mitigated with redundancy, which would be necessary to scale up current quantum computers: for example, encoding each of 10 qubits in a 100-atom system, so that if one atom decoheres, the other nine atoms carrying that qubit still hold a consensus on its state.
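The redundancy idea can be sketched with a classical repetition code. This is a simplified classical analogue of what Vedral describes; real quantum error correction is subtler, since the no-cloning theorem forbids simply copying an unknown quantum state:

```python
from collections import Counter

def encode(bit, copies=10):
    """Store one logical bit in `copies` physical carriers."""
    return [bit] * copies

def decode(carriers):
    """Majority vote recovers the logical bit as long as fewer than half
    the carriers have been corrupted."""
    return Counter(carriers).most_common(1)[0][0]

carriers = encode(1, copies=10)
carriers[3] = 0            # one carrier flips ("decoheres")
print(decode(carriers))    # 1 -- the remaining nine outvote the error
```

Quantum codes achieve the same resilience by spreading a logical qubit's state across entangled physical qubits and measuring only error syndromes, never the encoded state itself.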

"Children of the Aimless Chance: Randomness versus Determinism"

Randomness is key to generating new sources of surprise in a reality. Compression of these new sources to discard unimportant information is the deterministic element and organising principle.

"Sand Reckoning: Whose Information is It, Anyway?"

The information content of the universe as measured in bits or qubits. Vedral recounts the early effort of Archimedes of Syracuse to calculate how many grains of sand could theoretically fit inside the universe, and compares it to a modern-day attempt to calculate the bit content of the universe. Vedral, however, sees this content as ultimately limitless: maximum entropy is possibly never reached, because the compression of complexity is an open-ended process and random events will continue to occur. Since Vedral sees information as the ultimate building block of physical reality, he speculates that information originating at any scale can constrain outcomes at all other scales wherever mutual information is shared. For example, a macro-level scientific test performed by a human in search of a behaviour in a quantum particle could set parameters for that type of particle when subjected to a similar test in the future.

"Destruction ab Toto: Nothing from Something"

The information basis for creation ex nihilo. Following John von Neumann's construction of the numbers, an infinite sequence can bootstrap its way out of the empty set. The number 1 is created by observing the empty set within itself, which provides a first basis for distinguishability; the number 2 is the set containing the empty set and the number 1; and so on. Vedral sees this not as creation but as data compression, as every event in a reality breaks the symmetry of the pre-existing formlessness. Science is the process of describing a large body of observed phenomena in a compressed, programmatic way so as to predict future outcomes, and in this process of data compression science creates new information by eliminating all the contrary possibilities for explaining those phenomena.
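The von Neumann construction can be mimicked directly in Python, with frozensets standing in for sets (an illustrative sketch, not from the book): each number is simply the set of all the numbers before it, starting from nothing.

```python
def von_neumann(n):
    """Build the first n natural numbers from nothing but the empty set:
    0 = {}, 1 = {0}, 2 = {0, 1}, ... each number is the set of all
    numbers that precede it."""
    numbers = []
    for _ in range(n):
        numbers.append(frozenset(numbers))
    return numbers

ordinals = von_neumann(4)
# Each set's size equals the number it encodes.
print([len(s) for s in ordinals])  # [0, 1, 2, 3]
```

Frozensets are used because ordinary Python sets are unhashable and so cannot be nested; the hierarchy here contains no primitive objects at all, only set membership, which is the point of the construction.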

Synopsis

The book explains the world as being made up of information. The Universe and its workings are the ebb and flow of information. We are all transient patterns of information, passing on the recipe for our basic forms to future generations using a four-letter digital code called DNA. In this engaging and mind-stretching account, Vlatko Vedral considers some of the deepest questions about the Universe and the implications of interpreting it in terms of information. He explains the nature of information, the idea of entropy, and the roots of this thinking in thermodynamics. He describes the bizarre effects of quantum behaviour, such as 'entanglement', which Albert Einstein called 'spooky action at a distance'; explores cutting-edge work on harnessing quantum effects in hyperfast quantum computers; and discusses recent evidence suggesting that the weirdness of the quantum world, once thought limited to the tiniest scales, may reach into the macro world. Vedral finishes by considering an answer to the ultimate question: where did all of the information in the Universe come from? The answers he considers are exhilarating, drawing upon the work of the distinguished physicist John Wheeler and his concept of "it from bit". The ideas challenge our concepts of the nature of particles, of time, of determinism, and of reality itself.
