Author | Vlatko Vedral |
---|---|
Language | English |
Subject | Science |
Genre | Non-fiction |
Publisher | Oxford University Press |
Publication date | 2010 |
Publication place | United Kingdom |
Media type | Print (Hardback) |
Pages | x, 256 pp. |
ISBN | 0-19-923769-7 |
LC Class | Q360 .V43 2010 |
Decoding Reality: The Universe as Quantum Information is a popular science book by Vlatko Vedral published by Oxford University Press in 2010. Vedral examines information theory and proposes information as the most fundamental building block of reality, arguing that it provides a useful framework for viewing all natural and physical phenomena. In building out this framework the book touches upon the origin of information, the idea of entropy, the roots of this thinking in thermodynamics, the replication of DNA, the development of social networks, quantum behaviour at the micro and macro level, and the role of indeterminism in the universe. The book finishes by considering the answer to the ultimate question: where did all of the information in the Universe come from? The ideas address the nature of particles, time, determinism, and reality itself.
Vedral believes in the principle that information is physical. Creation ex nihilo is the idea, drawn from Catholic dogma, that God created the universe out of nothing. Vedral says that invoking a supernatural being as an explanation for creation does not explain reality, because the supernatural being would itself have had to come into existence somehow, presumably from nothing (or else from an infinite regression of supernatural beings); it follows that reality can come from nothing without a supernatural being, and Occam's razor favours this simpler explanation. Vedral believes information is the fundamental building block of reality because it occurs at the macro level (economics, human behaviour, and so on) as well as at the subatomic level. He argues that information is the only candidate for such a building block that can explain its own existence: information generates additional information that needs to be compressed, thereby generating yet more information. Vedral states that 'annihilation of everything' is a more fitting term than creation ex nihilo, since new information is created by compressing away possibilities.
Vedral uses an Italo Calvino philosophical story about a tarot-like card game as the kernel for his metaphor of conscious life arriving in medias res to a pre-existing contextual reality. In this game the individual observers/players (Vedral suggests: quantum physics, thermodynamics, biology, sociology, economics, philosophy) lay down cards with ambiguous meanings in an attempt to communicate messages and to deduce meaning from the other players' interactions. The results (information) of previous rounds establish contextual rules for observers/players in subsequent rounds. The point of the game is not established until the last card has been played, as later cards can change the meaning of previous events, much as the quantum explanation of the photoelectric effect overturned classical physics. Vedral points out that in our reality there is no last card.
Shannon entropy, or information content, measures the surprise value of a particular event: it is the logarithm of the reciprocal of the event's probability, i = log(1/p), so rarer events carry more information. Claude Shannon's information theory arose from research at Bell Labs, building upon George Boole's digital logic. As information theory predicts, common and easily predicted words tend to become shorter for optimal communication channel efficiency, while less common words tend to be longer, providing redundancy and error correction. Vedral compares the process of life to John von Neumann's self-replicating automata: enduring information carriers that survive the wear and tear of the individual by producing copies that can in turn go on to produce more copies.
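A minimal sketch of these definitions (assuming base-2 logarithms, so the units are bits; the example probabilities are illustrative and not taken from the book):

```python
import math

def surprisal(p: float) -> float:
    """Information content of an event with probability p, in bits: i = log2(1/p)."""
    return math.log2(1.0 / p)

def shannon_entropy(probs: list[float]) -> float:
    """Average surprisal over a distribution: H = sum(p * log2(1/p))."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A rare event is more surprising (carries more information) than a common one.
print(surprisal(0.5))   # 1.0 bit
print(surprisal(0.01))  # about 6.64 bits

# An illustrative four-symbol source: frequent symbols deserve short codewords.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol
```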
Vedral presents the genetic code as an efficient digital information store, containing built-in codon redundancy for error correction in transcription.
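A minimal illustration of this redundancy, using a few entries from the standard RNA codon table (the selection of amino acids shown is the editor's, not the book's): several distinct codons map to the same amino acid, so some single-base errors leave the encoded protein unchanged.

```python
# A small excerpt of the standard RNA codon table: the many-to-one mapping
# from codons to amino acids provides built-in redundancy.
CODON_TABLE = {
    "UUA": "Leu", "UUG": "Leu", "CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "AUG": "Met",
}

# A change in the third base of a glycine codon is often "silent":
print(CODON_TABLE["GGU"], CODON_TABLE["GGC"])  # Gly Gly -- same amino acid
```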
Vedral examines the second law of thermodynamics and the inexorable increase of entropy. Maxwell's demon was thought to be a way around this inevitability; however, such a demon would run out of information storage space and have to delete unwanted data, and the work required to erase that data increases entropy.
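The quantitative version of this erasure cost is Landauer's principle, the standard formalisation of the Maxwell's demon argument (the chapter summary above does not state the formula): erasing one bit dissipates at least k_B T ln 2 of energy as heat. A minimal sketch of that bound at room temperature:

```python
import math

BOLTZMANN = 1.380649e-23      # Boltzmann constant, J/K (exact SI value)

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy (joules) dissipated when erasing one bit: k_B * T * ln 2."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

print(landauer_limit(300))    # about 2.9e-21 J per erased bit at room temperature
```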
Vedral presents blackjack as controlled risk-taking using the probability formulas of Shannon's information theory. The casino acts as a 'cool' financial reservoir and the gambler as a 'hot' financial source; once again the second law of thermodynamics means the flow is almost always from hot to cold in the long run. To manage risk, spread bets widely across high-risk, high-reward investments (assuming the probabilities are known); this is the log-optimal portfolio approach.
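A minimal sketch of the log-optimal (Kelly) idea referred to above, not a formula quoted from the book: with a known win probability p and payout odds of b to 1, the bet fraction that maximises the expected logarithm of wealth is f* = p − (1 − p)/b.

```python
import math

def kelly_fraction(p: float, b: float) -> float:
    """Fraction of bankroll to wager on a bet won with probability p at b-to-1 odds."""
    return p - (1.0 - p) / b

def expected_log_growth(f: float, p: float, b: float) -> float:
    """Expected log-growth rate of wealth when betting fraction f each round."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

p, b = 0.55, 1.0           # illustrative numbers: a slightly favourable even-money bet
f_star = kelly_fraction(p, b)
print(f_star)                              # 0.10 -> bet 10% of the bankroll
print(expected_log_growth(f_star, p, b))   # positive long-run growth rate
print(expected_log_growth(0.5, p, b))      # over-betting: negative growth despite an edge
```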
Six degrees of separation means that well-connected people tend to be more successful, as their social networks expose them to more of the opportunities they want. Schelling's precommitment is a strategy for social interaction and self-control: for example, burning your bridges by buying a gym membership helps the motivated self win over the lazy self. Mutual information produces phase transitions in social and political demography as well as in physical systems, like water freezing into ice at a critical temperature or magnetic moments in certain materials spontaneously aligning as they cool from a molten state.
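A minimal sketch of the kind of spontaneous-alignment transition mentioned above, using the standard two-dimensional Ising model with Metropolis updates (a common textbook stand-in for a ferromagnet; the model and parameters are the editor's illustration, not the book's): below a critical temperature most spins end up pointing the same way, above it they stay disordered.

```python
import math
import random

def ising_magnetisation(n=20, temperature=1.5, sweeps=1000):
    """Average |magnetisation| per spin of an n x n Ising lattice (J = 1, periodic boundaries)."""
    spins = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps * n * n):
        i, j = random.randrange(n), random.randrange(n)
        # Sum of the four nearest neighbours, with periodic boundary conditions.
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        delta_e = 2 * spins[i][j] * nb                      # energy cost of flipping this spin
        if delta_e <= 0 or random.random() < math.exp(-delta_e / temperature):
            spins[i][j] *= -1                               # Metropolis acceptance rule
    magnetisation = sum(sum(row) for row in spins) / (n * n)
    return abs(magnetisation)

print(ising_magnetisation(temperature=1.5))  # well below T_c (about 2.27): ordered, typically near 1
print(ising_magnetisation(temperature=3.5))  # above T_c: disordered, near 0
```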
Vedral examines the basis of quantum information, the qubit, and presents quantum cryptography based on the one-time pad as the most secure form of encryption, since it cannot be broken by any amount of computation. Quantum entanglement demonstrates the importance of mutual information in defining outcomes within a reality.
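A minimal classical sketch of the one-time pad itself (in quantum schemes such as quantum key distribution, the quantum part is concerned with delivering the shared random key securely; that step is not modelled here): with a truly random key as long as the message, used only once, the ciphertext reveals nothing about the plaintext.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"it from bit"
key = secrets.token_bytes(len(message))   # random key, same length as the message, used once

ciphertext = xor_bytes(message, key)      # encryption
recovered = xor_bytes(ciphertext, key)    # decryption is the same operation

print(ciphertext.hex())
print(recovered)                          # b'it from bit'
```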
Quantum computers offer a search advantage over classical computers by probing many database elements at once as a result of quantum superposition. A sufficiently advanced quantum computer would break current encryption methods by factorizing large numbers several orders of magnitude faster than any existing classical computer. Any computable problem may be expressed as a general quantum search, although classical computers may retain an advantage where more efficient tailored classical algorithms exist. The difficulty with quantum computers is that a measurement must be made to determine whether the problem is solved, and this collapses the superposition. Vedral points out that unintentional interaction with the environment can be mitigated with redundancy, and this would be necessary to scale up current quantum computers to greater utility: for example, to use 10 logical qubits one might employ a 100-atom system, so that if one atom decoheres, the other nine atoms encoding the same qubit still hold a consensus on its state.
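A minimal numerical sketch of the quantum search idea (Grover's algorithm simulated classically on a state vector with NumPy; the database size and marked item are illustrative, and none of this is code from the book): roughly √N amplitude-amplification steps concentrate almost all of the probability on the marked element, compared with the roughly N/2 look-ups an average classical search needs.

```python
import math
import numpy as np

def grover_search(n_qubits: int, marked: int) -> int:
    """Simulate Grover's search over N = 2**n_qubits items and return the measured index."""
    n = 2 ** n_qubits
    state = np.full(n, 1 / math.sqrt(n))          # uniform superposition over all items

    iterations = int(math.floor(math.pi / 4 * math.sqrt(n)))
    for _ in range(iterations):
        state[marked] *= -1                       # oracle: flip the sign of the marked item
        state = 2 * state.mean() - state          # diffusion: inversion about the mean

    probabilities = state ** 2                    # measurement statistics of the final state
    return int(np.random.choice(n, p=probabilities / probabilities.sum()))

print(grover_search(n_qubits=10, marked=711))     # returns 711 with probability close to 1
```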
Randomness is the key to generating new sources of surprise in a reality; compressing these new sources, discarding unimportant information, is the deterministic element and organising principle.
The information content of the universe can be measured in bits or qubits. Vedral recalls Archimedes of Syracuse's early effort to calculate how many grains of sand could theoretically fit inside the universe and compares it to a modern-day attempt to calculate the bit content of the universe. Vedral, however, sees this content as ultimately limitless: maximum entropy may never be reached, because the compression of complexity is an open-ended process and random events will continue to occur. Since Vedral sees information as the ultimate building block of physical reality, he speculates that information originating at any scale can force outcomes at all other scales where mutual information is shared. For example, a macro-level scientific test performed by a human in search of a particular behaviour in a quantum particle could set parameters for that type of particle in the future when it is subjected to a similar test.
Vedral also considers the information basis for creation ex nihilo. Following John von Neumann's construction, an infinite sequence of numbers can bootstrap its way out of the empty set: the empty set gives the number 1 by observing an empty set within itself, which is enough of a basis for distinguishability; the number 2 arises by observing both the empty set and the number 1; and so on. Vedral sees this not as creation but as data compression, as every event of a reality breaks the symmetry of the pre-existing formlessness. Science is the process of describing a large body of observed phenomena in a compressed, programmatic way in order to predict future outcomes, and in this process of data compression science creates new information by eliminating all the contrary possibilities for explaining those phenomena.
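A minimal sketch of the von Neumann construction referenced above (the representation as nested frozensets is the editor's illustration): each natural number is the set of all the numbers built before it, so an unbounded sequence is bootstrapped out of nothing but the empty set.

```python
def von_neumann_numbers(count: int) -> list[frozenset]:
    """Build the first `count` von Neumann ordinals: 0 = {}, 1 = {0}, 2 = {0, 1}, ..."""
    numbers = []
    for _ in range(count):
        numbers.append(frozenset(numbers))   # each new number is the set of all previous ones
    return numbers

ordinals = von_neumann_numbers(4)
print([len(n) for n in ordinals])                # [0, 1, 2, 3] -- the "value" is the set's size
print(ordinals[1] == frozenset([ordinals[0]]))   # True: 1 = { empty set }
```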
The book explains the world as being made up of information. The Universe and its workings are the ebb and flow of information. We are all transient patterns of information, passing on the recipe for our basic forms to future generations using a four-letter digital code called DNA. In this engaging and mind-stretching account, Vlatko Vedral considers some of the deepest questions about the Universe and the implications of interpreting it in terms of information. He explains the nature of information, the idea of entropy, and the roots of this thinking in thermodynamics. He describes the bizarre effects of quantum behaviour, such as 'entanglement', which Albert Einstein called 'spooky action at a distance', explores cutting-edge work on harnessing quantum effects in hyperfast quantum computers, and discusses how recent evidence suggests that the weirdness of the quantum world, once thought limited to the tiniest scales, may reach into the macro world. Vedral finishes by considering the answer to the ultimate question: where did all of the information in the Universe come from? The answers he considers are exhilarating, drawing upon the work of the distinguished physicist John Wheeler and his concept of "it from bit". The ideas challenge our concept of the nature of particles, of time, of determinism, and of reality itself.
The holographic principle is a property of string theories and a supposed property of quantum gravity that states that the description of a volume of space can be thought of as encoded on a lower-dimensional boundary to the region – such as a light-like boundary like a gravitational horizon. First proposed by Gerard 't Hooft, it was given a precise string-theoretic interpretation by Leonard Susskind, who combined his ideas with previous ones of 't Hooft and Charles Thorn. Susskind said, "The three-dimensional world of ordinary experience—the universe filled with galaxies, stars, planets, houses, boulders, and people—is a hologram, an image of reality coded on a distant two-dimensional surface." As pointed out by Raphael Bousso, Thorn observed in 1978 that string theory admits a lower-dimensional description from which gravity emerges in what would now be called a holographic way. The prime example of holography is the AdS/CFT correspondence.
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and put on a firm footing by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It lies at the intersection of mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.
Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of von Neumann entropy and the general computational term.
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in the fields of physics, biology, chemistry, neuroscience, computer science, information theory and sociology. Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.
The arrow of time, also called time's arrow, is the concept positing the "one-way direction" or "asymmetry" of time. It was developed in 1927 by the British astrophysicist Arthur Eddington, and is an unsolved general physics question. This direction, according to Eddington, could be determined by studying the organization of atoms, molecules, and bodies, and might be drawn upon a four-dimensional relativistic map of the world.
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy by Claude Shannon and Ralph Hartley, developed in the 1940s.
Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one goes "forward" in time, the second law of thermodynamics says, the entropy of an isolated system can increase, but not decrease. Thus, entropy measurement is a way of distinguishing the past from the future. In thermodynamic systems that are not isolated, local entropy can decrease over time, accompanied by a compensating entropy increase in the surroundings; examples include objects undergoing cooling, living systems, and the formation of typical crystals.
Programming the Universe: A Quantum Computer Scientist Takes On the Cosmos is a 2006 popular science book by Seth Lloyd, professor of mechanical engineering at the Massachusetts Institute of Technology. The book proposes that the Universe is a quantum computer (supercomputer), and advances in the understanding of physics may come from viewing entropy as a phenomenon of information, rather than simply thermodynamics. Lloyd also postulates that the Universe can be fully simulated using a quantum computer; however, in the absence of a theory of quantum gravity, such a simulation is not yet possible. "Particles not only collide, they compute."
The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.
Physics is a scientific discipline that seeks to construct and experimentally test theories of the physical universe. These theories vary in their scope and can be organized into several distinct branches.
Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.
Vlatko Vedral is a Serbian-born British physicist. He is best known for his contributions to quantum information theory, quantum mechanics, and quantum entanglement. He earned his Bachelor of Science and Doctor of Philosophy degrees from Imperial College London, completing his PhD in 1998.
The theoretical study of time travel generally follows the laws of general relativity. Quantum mechanics requires physicists to solve equations describing how probabilities behave along closed timelike curves (CTCs), which are theoretical loops in spacetime that might make it possible to travel through time.
The Information: A History, a Theory, a Flood is a book by science history writer James Gleick, published in March 2011, which covers the genesis of the current Information Age. It was on The New York Times best-seller list for three weeks following its debut.
In quantum information theory, quantum discord is a measure of nonclassical correlations between two subsystems of a quantum system. It includes correlations that are due to quantum physical effects but do not necessarily involve quantum entanglement.
Algorithmic cooling is an algorithmic method for transferring heat from some qubits to others or outside the system and into the environment, which results in a cooling effect. This method uses regular quantum operations on ensembles of qubits, and it can be shown to succeed beyond Shannon's bound on data compression. The phenomenon is a result of the connection between thermodynamics and information theory.
Decoding the Universe: How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes is the third non-fiction book by American author and journalist Charles Seife. The book was initially published on January 30, 2007 by Viking.
Stochastic thermodynamics is an emergent field of research in statistical mechanics that uses stochastic variables to better understand the non-equilibrium dynamics present in many microscopic systems such as colloidal particles, biopolymers, enzymes, and molecular motors.