# Thomson's lamp

Thomson's lamp is a philosophical puzzle about infinity. It was devised in 1954 by British philosopher James F. Thomson, who used it to analyze the possibility of a supertask: the completion of an infinite number of tasks.


James F. Thomson (1921–1984) was a British philosopher who devised the puzzle of Thomson's lamp, to argue against the possibility of supertasks.

In philosophy, a supertask is a countably infinite sequence of operations that occur sequentially within a finite interval of time. Supertasks are called "hypertasks" when the number of operations becomes uncountably infinite. A hypertask that includes one operation for each ordinal number is called an "ultratask". The term supertask was coined by the philosopher James F. Thomson, who devised Thomson's lamp. The term hypertask derives from Clark and Read in their paper of that name.

| Time (min) | State |
| --- | --- |
| 0.000 | On |
| 1.000 | Off |
| 1.500 | On |
| 1.750 | Off |
| 1.875 | On |
| ... | ... |
| 2.000 | ? |

Consider a lamp with a toggle switch. Flicking the switch once turns the lamp on. Another flick will turn the lamp off. Now suppose that there is a being who is able to perform the following task: starting a timer, he turns the lamp on. At the end of one minute, he turns it off. At the end of another half minute, he turns it on again. At the end of another quarter of a minute, he turns it off. At the next eighth of a minute, he turns it on again, and he continues thus, flicking the switch each time after waiting exactly one-half the time he waited before flicking it previously. [1] The sum of this infinite series of time intervals is exactly two minutes. [2]
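The switching schedule described above can be sketched in a few lines of Python (a minimal illustration, not from Thomson's paper; the function name is my own). Each wait is half the previous one, so the flip times approach, but never reach, the two-minute mark:

```python
# Sketch: simulate the first n_flips switch flips of Thomson's lamp.
# The first wait is one minute; each subsequent wait is half the previous.
def lamp_schedule(n_flips):
    """Return (time, state_after_flip) pairs for the first n_flips flips."""
    t = 0.0
    wait = 1.0          # first interval: one minute
    state = True        # the flip at t = 0 turns the lamp on
    events = [(t, state)]
    for _ in range(n_flips - 1):
        t += wait       # wait, then flip
        wait /= 2
        state = not state
        events.append((t, state))
    return events

events = lamp_schedule(20)
print(events[:3])       # [(0.0, True), (1.0, False), (1.5, True)]
print(events[-1][0])    # just below 2.0; the times never reach two minutes
```

Because every wait is an exact power of two, the floating-point accumulation here is exact: the k-th flip (for k ≥ 1) lands at time 2 − 2^(1−k).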

In mathematics, a series is, roughly speaking, a description of the operation of adding infinitely many quantities, one after the other, to a given starting quantity. The study of series is a major part of calculus and its generalization, mathematical analysis. Series are used in most areas of mathematics, even for studying finite structures, through generating functions. In addition to their ubiquity in mathematics, infinite series are also widely used in other quantitative disciplines such as physics, computer science, statistics and finance.

The following question is then considered: Is the lamp on or off at two minutes? [1] Thomson reasoned that this supertask creates a contradiction:

> It seems impossible to answer this question. It cannot be on, because I did not ever turn it on without at once turning it off. It cannot be off, because I did in the first place turn it on, and thereafter I never turned it off without at once turning it on. But the lamp must be either on or off. This is a contradiction. [1]

## Mathematical series analogy

The question is related to the behavior of Grandi's series, the divergent infinite series

• S = 1 − 1 + 1 − 1 + 1 − 1 + · · ·

Truncating this series after its first n + 1 terms gives the partial sum Sₙ = 1 − 1 + 1 − · · · + (−1)ⁿ. For even values of n, this finite sum equals 1; for odd values, it equals 0. In other words, as n takes the values of each of the non-negative integers 0, 1, 2, 3, ... in turn, the partial sums generate the sequence {1, 0, 1, 0, ...}, representing the changing state of the lamp. [3] This sequence does not converge as n tends to infinity, so neither does the infinite series.
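The oscillation of the partial sums is easy to verify directly (a small sketch for illustration; the function name is my own):

```python
# Sketch: partial sums of Grandi's series 1 - 1 + 1 - 1 + ... oscillate
# between 1 and 0, mirroring the on/off state of the lamp, and so do not
# converge.
def grandi_partial_sums(n_terms):
    """Partial sums S_0, S_1, ..., S_{n_terms-1} of Grandi's series."""
    sums, s = [], 0
    for i in range(n_terms):
        s += (-1) ** i      # terms are +1, -1, +1, -1, ...
        sums.append(s)
    return sums

print(grandi_partial_sums(8))   # [1, 0, 1, 0, 1, 0, 1, 0]
```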

An integer is a number that can be written without a fractional component. For example, 21, 4, 0, and −2048 are integers, while 9.75, 5 1/2, and √2 are not.

In mathematics, a sequence is an enumerated collection of objects in which repetitions are allowed. Like a set, it contains members. The number of elements is called the length of the sequence. Unlike a set, the same elements can appear multiple times at different positions in a sequence, and order matters. Formally, a sequence can be defined as a function whose domain is either the set of the natural numbers or the set of the first n natural numbers.

In mathematics, the limit of a sequence is the value that the terms of a sequence "tend to". If such a limit exists, the sequence is called convergent. A sequence that does not converge is said to be divergent. The limit of a sequence is said to be the fundamental notion on which the whole of analysis ultimately rests.

Another way of illustrating this problem is to rearrange the series:

• S = 1 − (1 − 1 + 1 − 1 + 1 − 1 + · · ·)

The unending series in the brackets is exactly the same as the original series S. This means S = 1 − S, which implies S = ½. In fact, this manipulation can be rigorously justified: there are generalized definitions for the sums of series that do assign Grandi's series the value ½.
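One such generalized definition is Cesàro summation, which takes the limit of the running averages of the partial sums rather than of the partial sums themselves. A minimal numerical sketch (for illustration only; the function name is my own) shows those averages settling at ½ for Grandi's series:

```python
# Sketch: Cesaro means of Grandi's series. The partial sums oscillate
# 1, 0, 1, 0, ..., but their running averages tend to 1/2.
def cesaro_means(n_terms):
    """Running averages of the partial sums of Grandi's series."""
    means, s, total = [], 0, 0
    for i in range(n_terms):
        s += (-1) ** i       # partial sum S_i
        total += s           # running total of the partial sums
        means.append(total / (i + 1))
    return means

print(cesaro_means(1000)[-1])    # 0.5
```

The divergent partial sums never settle, but the averages do, which is exactly why the generalized sum ½ says nothing about the lamp's final state.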

One of Thomson's objectives in his original 1954 paper is to differentiate supertasks from their series analogies. He writes of the lamp and Grandi's series,

> Then the question whether the lamp is on or off… is the question: What is the sum of the infinite divergent sequence
>
> +1, −1, +1, ...?
>
> Now mathematicians do say that this sequence has a sum; they say that its sum is ½. And this answer does not help us, since we attach no sense here to saying that the lamp is half-on. I take this to mean that there is no established method for deciding what is done when a super-task is done. … We cannot be expected to pick up this idea, just because we have the idea of a task or tasks having been performed and because we are acquainted with transfinite numbers. [4]

Later, he claims that even the divergence of a series does not provide information about its supertask: "The impossibility of a super-task does not depend at all on whether some vaguely-felt-to-be-associated arithmetical sequence is convergent or divergent." [5]

## Notes

1. Thomson 1954, p. 5.
2. Thomson 1954, p. 9.
3. Thomson 1954, p. 6.
4. Thomson 1954, p. 6. For the mathematics and its history he cites Hardy's and Waismann's books, for which see History of Grandi's series.
5. Thomson 1954, p. 7.


## References

• Allen, Benjamin William (2008). Zeno, Aristotle, the Racetrack and the Achilles: A Historical and Philosophical Investigation. New Brunswick, NJ: Rutgers, The State University of New Jersey. pp. 209–210. ISBN 9781109058437.
• Benacerraf, Paul (1962). "Tasks, Super-Tasks, and the Modern Eleatics". The Journal of Philosophy. 59 (24): 765–784. JSTOR 2023500.
• Huggett, Nick (2010). Everywhere and Everywhen: Adventures in Physics and Philosophy. Oxford University Press. pp. 22–23. ISBN 9780199702114.
• Thomson, James F. (October 1954). "Tasks and Super-Tasks". Analysis. 15 (1): 1–13. doi:10.2307/3326643. JSTOR 3326643.
• Earman, John; Norton, John (1996). "Infinite Pains: The Trouble with Supertasks". In Benacerraf and his Critics, Adam Morton and Stephen P. Stich (eds.), pp. 231–261.