Least count

[Image: Scale least count.jpg] The dieter's problem: this scale can only resolve weight changes of 0.2 lbs, even though the digital display looks as if it could show 0.1 lbs.

In the science of measurement, the least count of a measuring instrument is the smallest change in the measured quantity that can be resolved on the instrument's scale. [1] The least count is related to the precision of the instrument: an instrument that can resolve smaller changes in a value than another has a smaller least count and is therefore more precise. Any measurement made by the instrument can be considered repeatable to no better than its least count; the smaller the least count, the more precise the instrument.

For example, a sundial might only have scale marks representing hours, not minutes; its least count is therefore one hour. A stopwatch used to time a race might resolve down to a hundredth of a second, its least count. The stopwatch is more precise at measuring time intervals than the sundial because it has more "counts" (scale intervals) in each hour of elapsed time. Knowing the least count of an instrument such as a vernier caliper or a screw gauge is therefore essential for taking accurate readings in experiments.
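
As a rough illustration of this idea, the minimal Python sketch below (not from the article; the elapsed time is made up) rounds a "true" value to each instrument's least count:

```python
# Minimal sketch: an instrument's reading is the true value rounded to the
# instrument's least count. The elapsed time below is a made-up example.

def quantize(value, least_count):
    """Return the reading an instrument with this least count would display."""
    return round(value / least_count) * least_count

true_elapsed_s = 4521.37                   # roughly 1 hour 15 minutes, in seconds

print(quantize(true_elapsed_s, 3600.0))    # sundial, least count 1 h  -> 3600.0 s
print(quantize(true_elapsed_s, 0.01))      # stopwatch, least count 0.01 s -> ~4521.37 s
```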

Least count uncertainty is one of the sources of experimental error in measurements.

Least count error

The smallest value that can be measured by a measuring instrument is called its least count. Measured values are reliable only down to this resolution. The least count error is the error associated with the resolution of the instrument.

A metre ruler may have graduations at 1 mm intervals, giving a least count of 1 mm. A vernier scale on a caliper may have a least count of 0.1 mm, while a micrometer may have a least count of 0.01 mm (10 microns).
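
One common convention in metrology treats a single reading as uniformly distributed within one least count, which gives a standard uncertainty of the least count divided by √12. The sketch below applies that convention to the instruments just mentioned (the convention is the only assumption added here):

```python
# Sketch of the common "uniform within one least count" convention: the
# standard uncertainty contributed by resolution alone is least_count / sqrt(12).
import math

least_counts_mm = {
    "metre ruler": 1.0,
    "vernier caliper": 0.1,
    "micrometer": 0.01,
}

for name, lc in least_counts_mm.items():
    u = lc / math.sqrt(12)   # resolution (least count) uncertainty
    print(f"{name}: least count {lc} mm, resolution uncertainty ~ {u:.4f} mm")
```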

Least count error contributes to both the systematic and the random error of a measurement. Instruments of higher precision (smaller least count) reduce the least count error. Repeating an observation several times and taking the arithmetic mean of the results yields a mean value that is much closer to the true value of the measured quantity, because the random component of the error tends to cancel out.
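
A minimal simulation of this point, under the assumption of purely random, zero-mean noise (all values are illustrative):

```python
# Repeated readings with random noise, each quantized to the least count:
# the arithmetic mean approaches the true value, since the random part cancels.
import random

random.seed(0)
true_value = 12.3456     # hypothetical true length, mm
least_count = 0.1        # hypothetical instrument resolution, mm
noise_sd = 0.05          # hypothetical random error per reading, mm

def one_reading():
    raw = random.gauss(true_value, noise_sd)
    return round(raw / least_count) * least_count   # what the display shows

readings = [one_reading() for _ in range(1000)]
mean = sum(readings) / len(readings)
print(f"single reading: {readings[0]}, mean of 1000 readings: {mean:.4f}")
# The mean lands close to 12.3456; a systematic offset would not average out.
```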

Related Research Articles

Measurement: Process of assigning numbers to objects or events

Measurement is the quantification of attributes of an object or event, which can be used to compare with other objects or events. In other words, measurement is a process of determining how large or small a physical quantity is as compared to a basic reference quantity of the same kind. The scope and application of measurement are dependent on the context and discipline. In natural sciences and engineering, measurements do not apply to nominal properties of objects or events, which is consistent with the guidelines of the International vocabulary of metrology published by the International Bureau of Weights and Measures. However, in other fields such as statistics as well as the social and behavioural sciences, measurements can have multiple levels, which would include nominal, ordinal, interval and ratio scales.

Observational error is the difference between a measured value of a quantity and its true value. In statistics, an error is not necessarily a "mistake". Variability is an inherent part of the results of measurements and of the measurement process.

Multimeter: Electronic measuring instrument that combines several measurement functions in one unit

A multimeter is a measuring instrument that can measure multiple electrical properties. A typical multimeter can measure voltage, resistance, and current, in which case it can be used as a voltmeter, ammeter, and ohmmeter. Some feature the measurement of additional properties such as temperature and capacitance.

Accuracy and precision are two measures of observational error. Accuracy is how close a given set of measurements are to their true value, while precision is how close the measurements are to each other.

Micrometer (device): Tool for the precise measurement of a component's length, width, and/or depth

A micrometer, sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw widely used for accurate measurement of components in mechanical engineering and machining as well as most mechanical trades, along with other metrological instruments such as dial, vernier, and digital calipers. Micrometers are usually, but not always, in the form of calipers. The spindle is a very accurately machined screw and the object to be measured is placed between the spindle and the anvil. The spindle is moved by turning the ratchet knob or thimble until the object to be measured is lightly touched by both the spindle and the anvil.
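
As a rough sketch of how such a reading is composed, assuming a typical metric micrometer with a 0.5 mm screw pitch and 50 thimble divisions (values chosen for illustration, not taken from this article):

```python
# Illustrative composition of a metric micrometer reading.
screw_pitch_mm = 0.5        # spindle advance per full turn of the thimble
thimble_divisions = 50      # graduations around the thimble

least_count_mm = screw_pitch_mm / thimble_divisions   # 0.01 mm

main_scale_mm = 7.5         # last fully uncovered main-scale mark (example)
thimble_reading = 28        # thimble division aligned with the reference line

measurement_mm = main_scale_mm + thimble_reading * least_count_mm
print(f"least count: {least_count_mm} mm, reading: {measurement_mm} mm")   # -> 7.78 mm
```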

In measurement technology and metrology, calibration is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy. Such a standard could be another measurement device of known accuracy, a device generating the quantity to be measured such as a voltage, a sound tone, or a physical artifact, such as a meter ruler.

Vernier scale: Auxiliary scale of a measurement device, used to increase precision

A vernier scale, named after Pierre Vernier, is a visual aid to take an accurate measurement reading between two graduation markings on a linear scale by using mechanical interpolation, thereby increasing resolution and reducing measurement uncertainty by using vernier acuity to reduce human estimation error. It may be found on many types of instrument measuring linear or angular quantities, but in particular on a vernier caliper, which measures internal and external diameters.
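
A minimal sketch of how a vernier caliper reading is composed, assuming typical 1 mm main-scale divisions and a 10-division vernier, which gives a least count of 0.1 mm (these values are illustrative):

```python
# Illustrative composition of a vernier caliper reading.
main_division_mm = 1.0      # smallest main-scale division
vernier_divisions = 10      # number of divisions on the vernier scale

least_count_mm = main_division_mm / vernier_divisions   # 0.1 mm

main_scale_mm = 24.0        # main-scale mark just before the vernier's zero
coinciding_line = 7         # vernier line that lines up with a main-scale line

reading_mm = main_scale_mm + coinciding_line * least_count_mm
print(f"least count: {least_count_mm} mm, reading: {reading_mm} mm")   # -> 24.7 mm
```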

Significant figures, also referred to as significant digits or sig figs, are specific digits within a number written in positional notation that carry both reliability and necessity in conveying a particular quantity. When presenting the outcome of a measurement, if the number of digits exceeds what the measurement instrument can resolve, only the number of digits within the resolution's capability are dependable and therefore considered significant.
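
As a small illustration, the hypothetical helper below uses Python's decimal module to report a value with no more digits than the instrument's resolution supports:

```python
# Quantize a value to the instrument's least count before reporting it, so
# that no digits beyond the instrument's resolution are presented as significant.
from decimal import Decimal

def report(value, least_count):
    return Decimal(str(value)).quantize(Decimal(str(least_count)))

print(report(3.14159, 0.01))    # 3.14   (hundredths is the last reliable digit)
print(report(3.14159, 0.001))   # 3.142
```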

In electronic instrumentation and signal processing, a time-to-digital converter (TDC) is a device for recognizing events and providing a digital representation of the time they occurred. For example, a TDC might output the time of arrival for each incoming pulse. Some applications wish to measure the time interval between two events rather than some notion of an absolute time.
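
A hypothetical sketch of that idea: each timestamp is quantized to the TDC's bin width (its least count), and an interval is the difference of two such timestamps. The 25 ps bin and the event times are illustrative assumptions:

```python
# Event times digitized to the nearest TDC bin; the interval is their difference.
tdc_bin_s = 25e-12                        # assumed time resolution (25 ps)

def tdc_timestamp(true_time_s):
    """Digitize an event time to the nearest TDC bin."""
    return round(true_time_s / tdc_bin_s) * tdc_bin_s

t_start = tdc_timestamp(40e-12)           # true event at 40 ps  -> reported 50 ps
t_stop  = tdc_timestamp(163e-12)          # true event at 163 ps -> reported 175 ps

print(f"interval: {t_stop - t_start:.3e} s")   # 1.250e-10 s (true interval is 123 ps)
```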

Level of measurement or scale of measure is a classification that describes the nature of information within the values assigned to variables. Psychologist Stanley Smith Stevens developed the best-known classification with four levels, or scales, of measurement: nominal, ordinal, interval, and ratio. This framework of distinguishing levels of measurement originated in psychology and has since had a complex history, being adopted and extended in some disciplines and by some scholars, and criticized or rejected by others. Other classifications include those by Mosteller and Tukey, and by Chrisman.

A spectroradiometer is a light measurement tool that is able to measure both the wavelength and amplitude of the light emitted from a light source. Spectrometers discriminate the wavelength based on the position where the light hits the detector array, allowing the full spectrum to be obtained with a single acquisition. Most spectrometers have a base measurement of counts, which is the un-calibrated reading and is thus affected by the sensitivity of the detector to each wavelength. By applying a calibration, the spectrometer is then able to provide measurements of spectral irradiance, spectral radiance and/or spectral flux. This data is then used with built-in or PC software and numerous algorithms to provide readings of irradiance (W/cm2), illuminance, radiance (W/sr), luminance (cd), flux, chromaticity, color temperature, and peak and dominant wavelength. Some more complex spectrometer software packages also allow calculation of PAR (μmol/m2/s), metamerism, and candela based on distance, and include features like 2- and 20-degree observer, baseline overlay comparisons, transmission and reflectance.
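
A heavily simplified, hypothetical sketch of the counts-to-irradiance step; the linear model, wavelengths, dark counts, and calibration factors below are illustrative assumptions, not any real instrument's calibration:

```python
# Convert raw detector counts to spectral irradiance with per-wavelength factors.
wavelengths_nm = [400, 500, 600, 700]
raw_counts     = [1200, 3400, 2900, 800]            # un-calibrated readings
dark_counts    = [100, 110, 105, 95]                # detector background
cal_factors    = [2.0e-6, 1.5e-6, 1.8e-6, 3.2e-6]   # W/(cm^2 nm) per count, assumed

spectral_irradiance = [
    (raw - dark) * k
    for raw, dark, k in zip(raw_counts, dark_counts, cal_factors)
]
for wl, e in zip(wavelengths_nm, spectral_irradiance):
    print(f"{wl} nm: {e:.2e} W/(cm^2 nm)")
```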

Calipers: Tool used to measure dimensions of an object

Calipers or callipers are an instrument used to measure the dimensions of an object, for example the diameter or depth of a hole. The least count of a typical vernier caliper is 0.1 mm.

A bore gauge is a collective term for the tools that are unique to the process of accurately measuring holes.

Surface metrology is the measurement of small-scale features on surfaces, and is a branch of metrology. Surface primary form, surface fractality, and surface finish are the parameters most commonly associated with the field. It is important to many disciplines and is mostly known for the machining of precision parts and assemblies which contain mating surfaces or which must operate with high internal pressures.

In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.

Standard atomic weight: Relative atomic mass as defined by IUPAC (CIAAW)

The standard atomic weight of a chemical element (symbol Ar°(E) for element "E") is the weighted arithmetic mean of the relative isotopic masses of all isotopes of that element, weighted by each isotope's abundance on Earth. For example, the isotope 63Cu (Ar = 62.929) constitutes 69% of the copper on Earth, the rest being 65Cu (Ar = 64.927), so Ar°(Cu) = 0.69 × 62.929 + 0.31 × 64.927 ≈ 63.55.
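
A short check of that arithmetic, using the values quoted above:

```python
# Abundance-weighted mean of the relative isotopic masses of copper.
abundance_63, mass_63 = 0.69, 62.929
abundance_65, mass_65 = 0.31, 64.927

ar_cu = abundance_63 * mass_63 + abundance_65 * mass_65
print(round(ar_cu, 2))   # 63.55
```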

Diameter tape

A diameter tape (D-tape) is a measuring tape used to estimate the diameter of a cylindrical object, typically the stem of a tree or a pipe. Its graduations, in either metric or imperial units, are scaled down by a factor of π, so that wrapping the tape around the circumference of the object gives a direct reading of its diameter. This assumes that the cross-section of the object is a perfect circle, so the tape provides only an approximation of the diameter. It is most commonly used in dendrometry.
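
The arithmetic the tape embodies is simply division by π, as in this minimal sketch (the circumference value is made up):

```python
# Diameter from circumference, assuming a circular cross-section.
import math

circumference_cm = 94.2
diameter_cm = circumference_cm / math.pi
print(f"{diameter_cm:.1f} cm")   # ~30.0 cm
```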

Celsius: Scale and unit of measurement for temperature

The degree Celsius is the unit of temperature on the Celsius scale, one of two temperature scales used in the International System of Units (SI), the other being the Kelvin scale. The degree Celsius can refer to a specific temperature on the Celsius scale or a unit to indicate a difference or range between two temperatures. It is named after the Swedish astronomer Anders Celsius (1701–1744), who developed a variant of it in 1742. The unit was called centigrade in several languages for many years. In 1948, the International Committee for Weights and Measures renamed it to honor Celsius and also to remove confusion with the term for one hundredth of a gradian in some languages. Most countries use this scale; the other major scale, Fahrenheit, is still used in the United States, some island territories, and Liberia. The Kelvin scale is of use in the sciences, with 0 K (−273.15 °C) representing absolute zero.

Statistical dispersion: Statistical property quantifying how much a collection of data is spread out

In statistics, dispersion is the extent to which a distribution is stretched or squeezed. Common examples of measures of statistical dispersion are the variance, standard deviation, and interquartile range. For instance, when the variance of data in a set is large, the data is widely scattered. On the other hand, when the variance is small, the data in the set is clustered.

References

  1. William Woolsey Johnson, The Theory of Errors and Method of Least Squares, Press of I. Friedenwald, 1890, p. 1.