The Pattern on the Stone

The Pattern on the Stone: The Simple Ideas that Make Computers Work is a book by W. Daniel Hillis, published in 1998 by Basic Books (ISBN 0-465-02595-1). The book attempts to explain concepts from computer science in layman's terms through metaphor and analogy. It aims to demystify the field by demonstrating that complex processes can be broken down into simple, repeated patterns, and it emphasizes the underlying simplicity and elegance of computer science, encouraging readers to see the beauty in the patterns that power the technology permeating our lives.

Summary

The book is composed of eight chapters plus two extra topics:

  1. Boolean algebra: The book starts with the fundamental building block of all digital computers: Boolean logic. It explains how simple switches ("on" or "off") can be combined using operators like AND, OR, and NOT to represent information and perform calculations. Think of it as the basic alphabet of computer language (a code sketch follows this list).
  2. Information theory: This section delves into the science of measuring, storing, and transmitting information. It covers concepts like data compression, which encodes information efficiently for storage or transmission, and error-correcting codes, which preserve information integrity despite glitches (a parity-bit sketch appears after the list).
  3. Turing Machines: Turing machines are theoretical models of universal computers. The book dives into their workings, highlighting how simple components can be combined to achieve remarkable computational abilities.
  4. Universal computing: Imagine a machine that can perform any computation another machine can. This is the concept of a universal computer, explored in the book through the fascinating example of the Turing machine. It demonstrates the fundamental limits and capabilities of computational power.
  5. Algorithms: Step-by-step instructions that computers follow to solve problems are known as algorithms. This section tackles the essential role of algorithms in controlling computers and explores how programming languages allow humans to communicate these instructions effectively.
  6. Cryptography: Keeping information safe is crucial in the digital age. The book dives into the world of cryptography, explaining how codes and encryption techniques are used to scramble information and protect it from unauthorized access (see the XOR-cipher sketch below).
  7. Heuristics: Finding the optimal solution to every problem isn't always feasible. Heuristics offer a practical alternative, using "rules of thumb" to find good-enough solutions quickly, even if they are not perfect. The book explains this trade-off between speed and accuracy in problem-solving (see the greedy-route sketch below).
  8. Parallel computing: Imagine solving a problem by dividing it into smaller tasks and working on them simultaneously. This is the core idea of parallel computing. The book explores how using multiple processors can significantly speed up computations, paving the way for future advances (see the multiprocessing sketch below).
  9. Quantum computing: This field harnesses the counterintuitive principles of quantum mechanics to perform calculations that are intractable for classical computers. The book explains the basics of quantum bits (qubits), superposition, and entanglement, and how these properties could speed up problems in areas such as drug discovery, materials science, and financial modeling. It also discusses the challenges of building and controlling quantum computers, noting that the technology is still in its early stages (see the toy qubit simulation below).
  10. Emergence: Emergent systems are complex systems in which the whole exhibits properties not present in its individual parts. Imagine an anthill, where millions of ants working independently create a collective intelligence far greater than that of any individual ant. The book uses examples like ant colonies, the brain, and even the economy to illustrate how simple rules can lead to complex and unpredictable behavior. Emergent systems offer inspiration for designing new computational models and for understanding complex phenomena in many fields (see the cellular-automaton sketch below).
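
As a rough illustration of the first topic, here is a minimal Python sketch (ours, not taken from the book) showing how the AND, OR, and NOT operators alone can compute: a one-bit half adder, the seed of binary arithmetic. The function name is purely illustrative.

    def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
        """Add two one-bit values using only AND, OR, NOT; return (sum, carry)."""
        carry = a and b
        total = (a or b) and not (a and b)  # XOR built from AND/OR/NOT
        return total, carry

    # Exhaustively check all four input combinations.
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")

Chaining such adders (with carry inputs) yields circuits that add numbers of any width, which is exactly the escalation the book traces.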
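For the information-theory chapter's error-correcting codes, a single even-parity bit is about the simplest possible example; this sketch (again ours, not the book's) detects any one flipped bit:

    def add_parity(bits: list[int]) -> list[int]:
        """Append an even-parity bit so the total count of 1s is even."""
        return bits + [sum(bits) % 2]

    def looks_intact(bits: list[int]) -> bool:
        """True if no single-bit error is detected."""
        return sum(bits) % 2 == 0

    message = add_parity([1, 0, 1, 1])
    print(looks_intact(message))   # True: arrived intact
    message[2] ^= 1                # simulate a transmission glitch
    print(looks_intact(message))   # False: error detected

A parity bit only detects errors; full error-correcting codes such as Hamming codes add enough redundancy to locate and repair the flipped bit.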
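For the cryptography chapter, the XOR cipher is the classic toy example of scrambling (our sketch, not the book's): with a truly random, single-use key it is the provably secure one-time pad, though reusing the key breaks it completely.

    import secrets

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        """XOR each byte with the key; applying it twice restores the data."""
        return bytes(d ^ k for d, k in zip(data, key))

    plaintext = b"meet at noon"
    key = secrets.token_bytes(len(plaintext))  # one random key byte per data byte
    ciphertext = xor_cipher(plaintext, key)
    print(ciphertext)                   # scrambled, unreadable bytes
    print(xor_cipher(ciphertext, key))  # b'meet at noon'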
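The heuristics chapter's speed-versus-accuracy trade-off shows up clearly in the nearest-neighbor rule for route planning, sketched here on our own toy data: always visit the closest unvisited point. It runs fast and gives a reasonable tour, but rarely the best one.

    import math

    def nearest_neighbor_tour(points):
        """Greedy heuristic: repeatedly hop to the closest unvisited point."""
        tour, remaining = [points[0]], list(points[1:])
        while remaining:
            nxt = min(remaining, key=lambda p: math.dist(tour[-1], p))
            tour.append(nxt)
            remaining.remove(nxt)
        return tour

    cities = [(0, 0), (5, 1), (1, 4), (6, 5), (2, 2)]
    print(nearest_neighbor_tour(cities))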
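For the parallel-computing chapter, this sketch (our example) splits a sum across four worker processes with Python's multiprocessing module; each worker handles one slice, and the partial results are combined at the end.

    from multiprocessing import Pool

    def partial_sum(chunk):
        """Each worker sums the squares in its own slice of the data."""
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]           # four interleaved slices
        with Pool(processes=4) as pool:
            partials = pool.map(partial_sum, chunks)      # run simultaneously
        print(sum(partials) == sum(x * x for x in data))  # True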
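Superposition and entanglement from the quantum-computing chapter can be simulated (slowly) on a classical machine. This NumPy sketch, our illustration only, applies a Hadamard gate to put one qubit in an equal superposition, then a CNOT gate to entangle two qubits into a Bell state.

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)               # the |0> state
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    qubit = H @ ket0                                     # equal superposition
    print(np.abs(qubit) ** 2)                            # [0.5 0.5]

    # CNOT flips the second qubit when the first is 1, producing the
    # entangled Bell state (|00> + |11>)/sqrt(2).
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)
    bell = CNOT @ np.kron(qubit, ket0)
    print(np.abs(bell) ** 2)                             # [0.5 0. 0. 0.5]

The final probabilities show the qubits are correlated: a measurement yields 00 or 11, never 01 or 10.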
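Finally, a one-dimensional cellular automaton is perhaps the smallest demonstration of emergence: in this sketch (ours, using Wolfram's Rule 110), every cell follows the same three-neighbor lookup rule, yet the row as a whole develops structure that no single cell encodes.

    RULE = 110  # an 8-bit lookup table: one output bit per 3-cell neighborhood

    def step(cells):
        n = len(cells)
        return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                          + cells[(i + 1) % n])) & 1 for i in range(n)]

    row = [0] * 40 + [1] + [0] * 40   # start from a single live cell
    for _ in range(20):
        print("".join("#" if c else "." for c in row))
        row = step(row)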

Reviews

He begins by imparting Boolean logic through a demonstration of a machine that plays tic-tac-toe... Hillis's gift is his ability to convey the logical processes of computers that begin with switches and circuitry and escalate to self-organizing learning ability relevant to parallel computing systems.
Step by step from computer logic to programming to memory and compression. The final two chapters show how computers are truly close to being thinking machines.
A delightful all-in-one introduction to computer science.
There’s nothing special about silicon, Hillis wants the reader to know. The universal building blocks of computation (simple, logical functions such as and, or, and not) can be implemented using rods and springs, water pipes and hydraulic valves, and many other physical systems.

Reception

The book has been covered by various media outlets, including scientific journals, newspapers, and online publications. It garnered several awards in the year it was first published.

The book has aged well and has been positively received by critics and readers alike, earning praise for its clear, accessible writing style, its ability to make complex topics understandable to a general audience, and its overall message of empowerment and fascination with the world of computers. However, it has also been criticized for lacking depth and specificity.

The book has been translated into many languages and has sold over a million copies.

Related Research Articles

<span class="mw-page-title-main">Algorithm</span> Sequence of operations for a task

In mathematics and computer science, an algorithm is a finite sequence of mathematically rigorous instructions, typically used to solve a specific class of problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms use conditionals to divert execution along different routes and to draw valid inferences, eventually achieving automation. Using human characteristics as metaphorical descriptors of machines was already practiced by Alan Turing, with terms such as "memory", "search", and "stimulus".
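
As a concrete instance of such a finite sequence of instructions, here is a standard binary search in Python (our illustration; any textbook algorithm would do). The conditional in the loop is exactly the kind of branch-on-comparison the paragraph above describes.

    def binary_search(items, target):
        """Return the index of target in sorted items, or -1 if absent.
        Each comparison halves the remaining range: O(log n) steps."""
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            if items[mid] < target:   # branch right...
                lo = mid + 1
            else:                     # ...or left
                hi = mid - 1
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4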

In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage and on relating these classes to each other. A computational problem is a task solved by a computer, one solvable by the mechanical application of mathematical steps, such as an algorithm.

<span class="mw-page-title-main">Quantum computing</span> Technology that uses quantum mechanics

A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. In particular, a large-scale quantum computer could break widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.

In theoretical computer science and mathematics, the theory of computation is the branch that deals with which problems can be solved on a model of computation using an algorithm, and how efficiently or to what degree they can be solved. The field is divided into three major branches: automata theory and formal languages, computability theory, and computational complexity theory, which are linked by the question: "What are the fundamental capabilities and limitations of computers?"

<span class="mw-page-title-main">Turing machine</span> Computation model defining an abstract machine

A Turing machine is a mathematical model of computation describing an abstract machine that manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, it is capable of implementing any computer algorithm.
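
The "table of rules" is the whole machine, which a few lines of Python can make concrete. This sketch (our own toy example) increments a binary number: the table maps (state, symbol) to (symbol to write, head movement, next state).

    # Rules for adding 1 to a binary number, head starting on the last digit.
    RULES = {
        ("carry", "1"): ("0", -1, "carry"),  # 1 + carry = 0, carry moves left
        ("carry", "0"): ("1", 0, "halt"),    # absorb the carry and stop
        ("carry", "_"): ("1", 0, "halt"),    # ran off the left edge: new digit
    }

    def run(symbols, head, state="carry"):
        tape = dict(enumerate(symbols))      # sparse tape; blank cells are "_"
        while state != "halt":
            write, move, state = RULES[(state, tape.get(head, "_"))]
            tape[head] = write
            head += move
        return "".join(tape.get(i, "_") for i in range(min(tape), max(tape) + 1))

    print(run(list("1011"), head=3))         # 1100  (11 + 1 = 12)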

Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.

Hypercomputation or super-Turing computation is a set of hypothetical models of computation that can provide outputs that are not Turing-computable. For example, a machine that could solve the halting problem would be a hypercomputer; so too would one that could correctly evaluate every statement in Peano arithmetic.

<span class="mw-page-title-main">Evolutionary computation</span> Trial and error problem solvers with a metaheuristic or stochastic optimization character

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.
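
A minimal evolutionary loop, sketched here as our own toy example on the "one-max" problem (maximize the number of 1 bits), shows the population-based trial and error the paragraph describes: keep the fitter half, refill by copying survivors with random mutations.

    import random

    def evolve(fitness, length=20, pop_size=50, generations=100):
        pop = [[random.randint(0, 1) for _ in range(length)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)       # selection pressure
            survivors = pop[: pop_size // 2]
            children = []
            for parent in survivors:
                child = parent[:]
                child[random.randrange(length)] ^= 1  # one random bit flip
                children.append(child)
            pop = survivors + children
        return max(pop, key=fitness)

    best = evolve(fitness=sum)   # one-max: fitness is just the count of 1s
    print(best, sum(best))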

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

<span class="mw-page-title-main">Complexity class</span> Set of problems in computational complexity theory

In computational complexity theory, a complexity class is a set of computational problems of related resource-based complexity. The two most commonly analyzed resources are time and memory.

In computational complexity theory, the Cook–Levin theorem, also known as Cook's theorem, states that the Boolean satisfiability problem is NP-complete. That is, it is in NP, and any problem in NP can be reduced in polynomial time by a deterministic Turing machine to the Boolean satisfiability problem.

<span class="mw-page-title-main">History of computer science</span>

The history of computer science began long before the modern discipline, with roots in fields such as mathematics and physics. Developments in previous centuries alluded to the discipline that we now know as computer science. This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of a major academic field, massive technological advancement across the Western world, and the basis of worldwide trade and culture.

Unconventional computing is computing by any of a wide range of new or unusual methods.

<span class="mw-page-title-main">Boolean circuit</span> Model of computation

In computational complexity theory and circuit complexity, a Boolean circuit is a mathematical model for combinational digital logic circuits. A formal language can be decided by a family of Boolean circuits, one circuit for each possible input length.

Lateral computing is a lateral-thinking approach to solving computing problems. Lateral thinking, popularized by Edward de Bono, is a technique for generating creative ideas and solving problems. Similarly, applying lateral-computing techniques to a problem can make it much easier to arrive at a computationally inexpensive, easy-to-implement, efficient, innovative, or unconventional solution.

In computability theory, the halting problem is the problem of determining, from a description of an arbitrary computer program and an input, whether the program will finish running, or continue to run forever. The halting problem is undecidable, meaning that no general algorithm exists that solves the halting problem for all possible program–input pairs. The problem comes up often in discussions of computability since it demonstrates that some functions are mathematically definable but not computable.
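
The standard proof of undecidability fits in a few lines of Python. This sketch (our paraphrase of the classic diagonal argument) assumes a hypothetical halts() oracle; the stub keeps the file importable, and the contradiction lives in the comments.

    def halts(f, x):
        """Hypothetical oracle: report whether f(x) would finish.
        No correct, always-terminating version of this can exist."""
        raise NotImplementedError

    def trouble(f):
        if halts(f, f):      # if f would halt when run on itself...
            while True:      # ...then loop forever
                pass
        return "done"        # otherwise halt immediately

    # Does trouble(trouble) halt? If halts() answers yes, trouble loops
    # forever; if it answers no, trouble halts at once. Either answer is
    # wrong, so no such halts() function can exist.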

Natural computing, also called natural computation, is an umbrella term for three classes of methods: 1) those that take inspiration from nature to develop novel problem-solving techniques; 2) those that use computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research composing these three branches include artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing.

Random-access Turing machines (RATMs) are a computational model in theoretical computer science, used especially in the study of tractability in big-data computing. Unlike conventional Turing machines, which can only reach memory sequentially, RATMs can access any memory position directly, in constant time. This brings the model much closer to the memory-access patterns of real computing systems, making RATMs a more realistic framework for analyzing algorithms that handle large-scale data.

This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence, its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.