The Wedgwood scale (°W) is an obsolete temperature scale that was used to measure temperatures above the boiling point of mercury, 356 °C (673 °F). The scale and the associated measurement technique were proposed by the English potter Josiah Wedgwood in the 18th century. The measurement was based on the shrinkage of clay when heated above red heat, and the shrinkage was evaluated by comparing heated and unheated clay cylinders. It was the first standardised pyrometric device. The scale began with 0 °W equivalent to 1,077.5 °F (580.8 °C) and had 240 steps of 130 °F (72 °C) each. Both the origin and the size of the steps were later found to be inaccurate.
The boiling point of mercury limits the mercury-in-glass thermometer to temperatures below 356 °C, which is too low for many industrial applications such as pottery, glass making and metallurgy.
To solve this problem, Wedgwood created a graduated pyrometric device in 1782, publishing the details in the Philosophical Transactions of the Royal Society of London the same year (Vol. LXXII, part 2). This work led to his election as a fellow of the Royal Society.[1][2][3][4][5][6][7]
A 0.5-inch-diameter cylinder made from pipe clay was dried at the temperature of boiling water to prepare it for heating in the oven whose temperature was to be measured. During the annealing, sintering (merging) of the fine particles caused the clay to contract. After cooling, the temperature was evaluated from the difference in diameter before and after heating, assuming that the contraction was linear with temperature.[8]
To facilitate the temperature calculation, Wedgwood built a gauge from which the temperature could be read directly. Two metal bars carrying scales were fixed one above the other on a metal plate and inclined at a small angle, so that the gap between them was 0.5 inch at one end and 0.3 inch at the other. The scale was divided into 240 equidistant parts. The unheated piece of clay fitted the 0.5-inch gap, giving the zero reading. After annealing, the shrunken clay cylinder fitted somewhere between the two ends of the gauge, and the temperature was read from the scale at that point.[9][10]
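As a concrete illustration of how the gauge works, the following sketch (a modern reconstruction in Python, not Wedgwood's own procedure; the constant and function names are invented here) converts a measured cylinder diameter into a reading in °W by linear interpolation between the 0.5-inch and 0.3-inch ends of the 240-division scale.

```python
# Illustrative reconstruction of reading the Wedgwood gauge.
# The gap tapers from 0.5 in at the 0 °W end to 0.3 in at the 240 °W end,
# so a shrunken cylinder's diameter maps linearly onto the 240 divisions.

WIDE_END_IN = 0.5    # gap width at the 0 °W end, in inches
NARROW_END_IN = 0.3  # gap width at the far end, in inches
DIVISIONS = 240      # equidistant parts marked on the bars

def gauge_reading(diameter_in: float) -> float:
    """Return the Wedgwood reading (°W) for a cooled clay cylinder of the given diameter."""
    if not NARROW_END_IN <= diameter_in <= WIDE_END_IN:
        raise ValueError("diameter lies outside the gauge")
    return DIVISIONS * (WIDE_END_IN - diameter_in) / (WIDE_END_IN - NARROW_END_IN)

# Example: a cylinder that shrank from 0.500 in to 0.478 in reads about 26.4 °W.
print(round(gauge_reading(0.478), 1))  # 26.4
```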
The origin on the Wedgwood scale (0 °W) was set at the onset temperature of red heat, 1,077.5 °F (580.8 °C). The scale had 240 steps of 130 °F (72 °C) and extended up to 32,277 °F (17,914 °C).[8][11] Wedgwood tried to compare his scale with other scales by measuring the expansion of silver as a function of temperature. He also determined the melting points of three metals, namely copper (27 °W or 4,587 °F (2,531 °C)), silver (28 °W or 4,717 °F (2,603 °C)) and gold (32 °W or 5,237 °F (2,892 °C)). All these values are at least 2,500 °F (1,400 °C) too high.[12]
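Under Wedgwood's calibration quoted above (0 °W = 1,077.5 °F, one degree Wedgwood = 130 °F), a reading converts to Fahrenheit and Celsius as in the short sketch below; the copper, silver and gold figures follow directly from it. The function names are illustrative, not taken from any historical source.

```python
# Wedgwood's original calibration: 0 °W = 1,077.5 °F and one degree Wedgwood = 130 °F.
# These are the values later shown to be far too high.

def wedgwood_to_fahrenheit(w: float) -> float:
    return 1077.5 + 130.0 * w

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32.0) / 1.8

for metal, w in [("copper", 27), ("silver", 28), ("gold", 32)]:
    f = wedgwood_to_fahrenheit(w)
    print(f"{metal}: {w} °W = {f:.1f} °F = {fahrenheit_to_celsius(f):.1f} °C")
# copper: 27 °W = 4587.5 °F = 2530.8 °C
# silver: 28 °W = 4717.5 °F = 2603.1 °C
# gold: 32 °W = 5237.5 °F = 2891.9 °C
```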
Louis-Bernard Guyton de Morveau used his own pyrometer to evaluate Wedgwood's temperature scale and concluded that the starting point should be significantly lower, at 517 °F (269 °C) instead of 1,077.5 °F (580.8 °C), and that the steps should be nearly halved, from 130 °F (72 °C) to no more than 62.5 °F (34.7 °C). Even after this revision, however, the Wedgwood measurements overestimated the melting points of the elements.[10]
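To see how much the revision matters, the sketch below applies both calibrations to the same reading; the revised step is taken here as 62.5 °F, the upper bound quoted above, so the revised figures are themselves only indicative.

```python
# Comparison of Wedgwood's original calibration with Guyton de Morveau's revision
# (origin 517 °F; step assumed here to be 62.5 °F, the quoted upper bound).

def original_to_fahrenheit(w: float) -> float:
    return 1077.5 + 130.0 * w

def revised_to_fahrenheit(w: float) -> float:
    return 517.0 + 62.5 * w

w = 27  # Wedgwood's reading for melting copper
print(f"original: {original_to_fahrenheit(w):.1f} °F, revised: {revised_to_fahrenheit(w):.1f} °F")
# original: 4587.5 °F, revised: 2204.5 °F -- both above copper's true melting point of about 1984 °F
```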
Daniel Gabriel Fahrenheit FRS was a physicist, inventor, and scientific instrument maker, born in Poland to a family of German extraction. Fahrenheit invented thermometers accurate and consistent enough to allow the comparison of temperature measurements between different observers using different instruments. He is also credited with inventing mercury-in-glass thermometers that were more accurate than the spirit-filled thermometers of the time. The popularity of his thermometers led to the widespread adoption of the Fahrenheit scale attached to his instruments.
The Fahrenheit scale is a temperature scale based on one proposed in 1724 by the European physicist Daniel Gabriel Fahrenheit (1686–1736). It uses the degree Fahrenheit as the unit. Several accounts of how he originally defined his scale exist, but the original paper suggests the lower defining point, 0 °F, was established as the freezing temperature of a solution of brine made from a mixture of water, ice, and ammonium chloride. The other limit established was his best estimate of the average human body temperature, originally set at 90 °F, then 96 °F.
A thermometer is a device that measures temperature or temperature gradient. A thermometer has two important elements: (1) a temperature sensor in which some change occurs with a change in temperature; and (2) some means of converting this change into a numerical value. Thermometers are widely used in technology and industry to monitor processes, in meteorology, in medicine, and in scientific research.
The melting point of a substance is the temperature at which it changes state from solid to liquid. At the melting point the solid and liquid phase exist in equilibrium. The melting point of a substance depends on pressure and is usually specified at a standard pressure such as 1 atmosphere or 100 kPa.
Thermodynamic temperature is a quantity defined in thermodynamics as distinct from kinetic theory or statistical mechanics.
This is a timeline of temperature and pressure measurement technology or the history of temperature measurement and pressure measurement technology.
A pyrometer, or radiation thermometer, is a type of remote sensing thermometer used to measure the temperature of distant objects. Various forms of pyrometers have historically existed. In the modern usage, it is a device that from a distance determines the temperature of a surface from the amount of the thermal radiation it emits, a process known as pyrometry, a type of radiometry.
The mercury-in-glass or mercury thermometer is a thermometer that uses the thermal expansion and contraction of liquid mercury to indicate the temperature.
The Réaumur scale, also known as the "octogesimal division", is a temperature scale for which the melting and boiling points of water are defined as 0 and 80 degrees respectively. The scale is named for René Antoine Ferchault de Réaumur, who first proposed a similar scale in 1730.
The Delisle scale is a temperature scale invented in 1732 by the French astronomer Joseph-Nicolas Delisle (1688–1768). Delisle was the author of Mémoires pour servir à l'histoire et aux progrès de l'Astronomie, de la Géographie et de la Physique (1738). The Delisle scale is notable as one of the few temperature scales that are inverted from the amount of thermal energy they measure; unlike most other temperature scales, higher measurements in degrees Delisle are colder, while lower measurements are warmer.
The Newton scale is a temperature scale devised by Isaac Newton in 1701. He called his device a "thermometer", but he did not use the term "temperature", speaking of "degrees of heat" instead. Newton's publication represents the first attempt to introduce an objective way of measuring temperature. Newton likely developed his scale for practical use rather than for a theoretical interest in thermodynamics; he had been appointed Warden of the Mint in 1695 and Master of the Mint in 1699, and his interest in the melting points of metals was likely inspired by his duties in connection with the Royal Mint.
Pyrometric cones are pyrometric devices that are used to gauge heatwork during the firing of ceramic materials in a kiln. The cones, often used in sets of three, are positioned in a kiln with the wares to be fired and, because the individual cones in a set soften and fall over at different temperatures, they provide a visual indication of when the wares have reached a required state of maturity, a combination of time and temperature.
Temperature measurement describes the process of measuring a current temperature for immediate or later evaluation. Datasets consisting of repeated standardized measurements can be used to assess temperature trends.
An infrared thermometer is a thermometer which infers temperature from a portion of the thermal radiation sometimes called black-body radiation emitted by the object being measured. They are sometimes called laser thermometers as a laser is used to help aim the thermometer, or non-contact thermometers or temperature guns, to describe the device's ability to measure temperature from a distance. By knowing the amount of infrared energy emitted by the object and its emissivity, the object's temperature can often be determined within a certain range of its actual temperature. Infrared thermometers are a subset of devices known as "thermal radiation thermometers".
Pyrometric devices gauge heatwork when firing materials inside a kiln. Pyrometric devices do not measure temperature, but can report temperature equivalents. In principle, a pyrometric device relates the amount of heat work on ware to a measurable shrinkage or deformation of a regular shape.
A medical thermometer or clinical thermometer is a device used for measuring the body temperature of a human or other animal. The tip of the thermometer is inserted into the mouth under the tongue, under the armpit, into the rectum via the anus, into the ear, or on the forehead.
Hugh Longbourne Callendar was a British physicist known for his contributions to the areas of thermometry and thermodynamics.
The degree Celsius is the unit of temperature on the Celsius temperature scale, one of two temperature scales used in the International System of Units (SI), the other being the closely related Kelvin scale. The degree Celsius can refer to a specific point on the Celsius temperature scale or to a difference or range between two temperatures. It is named after the Swedish astronomer Anders Celsius (1701–1744), who proposed the first version of it in 1742. The unit was called centigrade in several languages for many years. In 1948, the International Committee for Weights and Measures renamed it to honor Celsius and also to remove confusion with the term for one hundredth of a gradian in some languages. Most countries use this scale.
Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.
Scale of temperature is a methodology of calibrating the physical quantity temperature in metrology. Empirical scales measure temperature in relation to convenient and stable parameters or reference points, such as the freezing and boiling point of water. Absolute temperature is based on thermodynamic principles: using the lowest possible temperature as the zero point, and selecting a convenient incremental unit.