McLeod gauge

A glass McLeod gauge, drained of mercury

A McLeod gauge is a scientific instrument used to measure very low pressures, down to 10⁻⁶ Torr. It was invented in 1874 by Herbert McLeod (1841–1923).[1] McLeod gauges were once commonly found attached to equipment that operates under vacuum, such as a lyophilizer. Today, however, these gauges have largely been replaced by electronic vacuum gauges.

Pressure: force distributed continuously over an area

Pressure is the force applied perpendicular to the surface of an object per unit area over which that force is distributed. Gauge pressure is the pressure relative to the ambient pressure.
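
Expressed symbolically (notation introduced here for illustration, not part of the original text):

    p = F / A,        p_gauge = p_absolute − p_ambient

where F is the magnitude of the force acting perpendicular to the surface and A is the area over which it is distributed.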

The torr is a unit of pressure based on an absolute scale, now defined as exactly 1/760 of a standard atmosphere. Thus one torr is exactly 101325/760 pascals (≈ 133.32 Pa).

Herbert McLeod: British chemist

Herbert McLeod, FRS, was a British chemist, noted for the invention of the McLeod gauge and of a sunshine recorder.

The design of a McLeod gauge is somewhat similar to that of a mercury-column manometer. It is typically filled with mercury; if used incorrectly, this mercury can escape and contaminate the vacuum system attached to the gauge.

Mercury (element): chemical element with atomic number 80

Mercury is a chemical element with symbol Hg and atomic number 80. It is commonly known as quicksilver and was formerly named hydrargyrum. A heavy, silvery d-block element, mercury is the only metallic element that is liquid at standard conditions for temperature and pressure; the only other element that is liquid under these conditions is the halogen bromine, though metals such as caesium, gallium, and rubidium melt just above room temperature.

McLeod manometer symbol according to ISO 3753-1977 (E)

McLeod gauges operate by taking in a sample volume of gas from a vacuum chamber, then compressing it by tilting and infilling with mercury. The pressure in this smaller volume is then measured by a mercury manometer; since the compression ratio (the ratio of the initial and final volumes) is known, the pressure of the original vacuum can be determined by applying Boyle's law.

Boyle's law: relationship between pressure and volume in a gas at constant temperature

Boyle's law, also referred to as the Boyle–Mariotte law or Mariotte's law, is an experimental gas law that describes how the pressure of a gas tends to increase as the volume of the container decreases. A modern statement of Boyle's law is:

The absolute pressure exerted by a given mass of an ideal gas is inversely proportional to the volume it occupies if the temperature and amount of gas remain unchanged within a closed system.
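
To make the calculation concrete, here is a minimal sketch in Python (not taken from the article; the function name, units, and numbers are illustrative assumptions). It applies Boyle's law to a quadratic-scale reading, where the gas trapped in the bulb is compressed into the closed capillary and the mercury column difference h gives both the compressed volume (capillary cross-section × h) and, in torr, the excess pressure of the compressed gas over the original pressure:

    def mcleod_pressure_torr(bulb_volume_cm3, capillary_area_cm2, column_difference_mm):
        # Volume of the gas sample after compression into the closed capillary.
        compressed_volume_cm3 = capillary_area_cm2 * (column_difference_mm / 10.0)
        # Boyle's law with p_compressed = p + h (h in mm Hg, i.e. torr):
        #   p * V = (p + h) * V_c   =>   p = h * V_c / (V - V_c)
        h_torr = column_difference_mm
        return h_torr * compressed_volume_cm3 / (bulb_volume_cm3 - compressed_volume_cm3)

    # Example with assumed values: a 100 cm3 bulb, a 0.002 cm2 capillary and a
    # 5 mm column difference correspond to roughly 5 x 10^-5 torr.
    print(mcleod_pressure_torr(100.0, 0.002, 5.0))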

This method is fairly accurate for non-condensable gases, such as oxygen and nitrogen. However, condensable gases, such as water vapour, ammonia, carbon dioxide, and pump-oil vapours, may be in gaseous form at the low pressure of the vacuum chamber but condense when compressed by the McLeod gauge. The result is an erroneous reading, showing a pressure much lower than is actually present. A cold trap may be used in conjunction with a McLeod gauge to condense these vapours before they enter the gauge.

Cold trap

In vacuum applications, a cold trap is a device that condenses all vapors except the permanent gases into a liquid or solid. The most common objective is to prevent vapors being evacuated from an experiment from entering a vacuum pump where they would condense and contaminate it. Particularly large cold traps are necessary when removing large amounts of liquid as in freeze drying.

The McLeod gauge has the advantage that it is simple to use and that its calibration is nearly the same for all non-condensable gases. The device can be operated manually and the scale read visually, or the process can be automated in various ways. For example, a small electric motor can periodically rotate the assembly to collect a gas sample, and a fine platinum wire in the capillary tube can indicate the height of the mercury column around it through its electrical resistance, since the portion of the wire immersed in mercury is effectively short-circuited.
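
As an illustration of such an automated readout (a hypothetical sketch; the article does not describe any particular circuit, and all names and values below are assumptions), the mercury height can be inferred from the measured wire resistance, which is proportional to the exposed length of the wire:

    # Hypothetical readout for a platinum wire running the full length of the capillary.
    WIRE_LENGTH_MM = 100.0       # assumed wire length inside the capillary
    WIRE_RESISTANCE_OHM = 50.0   # assumed resistance of the fully exposed wire

    def mercury_height_mm(measured_resistance_ohm):
        # The immersed part of the wire is short-circuited by the mercury,
        # so the measured resistance scales with the exposed length.
        exposed_length_mm = WIRE_LENGTH_MM * measured_resistance_ohm / WIRE_RESISTANCE_OHM
        return WIRE_LENGTH_MM - exposed_length_mm

    # A reading of 40 ohms would correspond to 20 mm of mercury in the capillary.
    print(mercury_height_mm(40.0))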

Modern electronic vacuum gauges are simpler to use, less fragile, and do not present a mercury hazard, but their reading is highly dependent on the chemical nature of the gas being measured, and their calibration is unstable. For this reason, McLeod gauges continue to be used as a calibration standard for electronic gauges.

Related Research Articles

Diffusion pump

Diffusion pumps use a high-speed jet of vapor to direct gas molecules in the pump throat down into the bottom of the pump and out the exhaust. Invented in 1915 by Wolfgang Gaede using mercury vapor, and improved by Irving Langmuir and W. Crawford, they were the first type of high vacuum pump operating in the regime of free molecular flow, where the movement of the gas molecules is better understood in terms of diffusion than of conventional fluid dynamics. Gaede used the name diffusion pump since his design was based on the finding that gas cannot diffuse against the vapor stream, but is carried with it to the exhaust. However, the principle of operation might be more precisely described as that of a gas-jet pump, since diffusion also plays a role in other high vacuum pumps. In modern textbooks, the diffusion pump is categorized as a momentum transfer pump.

Distillation: method of separating mixtures based on differences in volatility of components in a boiling liquid mixture

Distillation is the process of separating the components or substances from a liquid mixture by using selective boiling and condensation. Distillation may result in essentially complete separation, or it may be a partial separation that increases the concentration of selected components in the mixture. In either case, the process exploits differences in the volatility of the mixture's components. In industrial chemistry, distillation is a unit operation of practically universal importance, but it is a physical separation process, not a chemical reaction.

Pressure measurement: technique to measure pressure

Pressure measurement is the analysis of an applied force by a fluid on a surface. Pressure is typically measured in units of force per unit of surface area. Many techniques have been developed for the measurement of pressure and vacuum. Instruments used to measure and display pressure in an integral unit are called pressure gauges or vacuum gauges. A manometer is a good example, as it uses a column of liquid to both measure and indicate pressure. Likewise, the widely used Bourdon gauge is a mechanical device that both measures and indicates pressure, and is probably the best-known type of gauge.

Vacuum pump

A vacuum pump is a device that removes gas molecules from a sealed volume in order to leave behind a partial vacuum. The first vacuum pump was invented in 1650 by Otto von Guericke, and was preceded by the suction pump, which dates to antiquity.

Vacuum: space that is empty of matter

Vacuum is space devoid of matter. The word stems from the Latin adjective vacuus for "vacant" or "void". An approximation to such vacuum is a region with a gaseous pressure much less than atmospheric pressure. Physicists often discuss ideal test results that would occur in a perfect vacuum, which they sometimes simply call "vacuum" or free space, and use the term partial vacuum to refer to an actual imperfect vacuum as one might have in a laboratory or in space. In engineering and applied physics on the other hand, vacuum refers to any space in which the pressure is lower than atmospheric pressure. The Latin term in vacuo is used to describe an object that is surrounded by a vacuum.

Calibration in measurement technology and metrology is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy. Such a standard could be another measurement device of known accuracy, a device generating the quantity to be measured, such as a voltage or a sound tone, or a physical artefact such as a metre ruler.

A cryopump or "cryogenic pump" is a vacuum pump that traps gases and vapours by condensing them on a cold surface, but it is only effective on some gases. The effectiveness depends on the freezing and boiling points of the gas relative to the cryopump's temperature. Cryopumps are sometimes used to block particular contaminants, for example in front of a diffusion pump to trap backstreaming oil, or in front of a McLeod gauge to keep out water. In this function, they are called a cryotrap, water pump, or cold trap, even though the physical mechanism is the same as for a cryopump.

Fractional distillation is the separation of a mixture into its component parts, or fractions. Chemical compounds are separated by heating them to a temperature at which one or more fractions of the mixture will vaporize. It uses distillation to fractionate. Generally the component parts have boiling points that differ by less than 25 °C (45 °F) from each other under a pressure of one atmosphere. If the difference in boiling points is greater than 25 °C, a simple distillation is typically used.

A fractionating column is an essential item used in distillation of liquid mixtures so as to separate the mixture into its component parts, or fractions, based on the differences in volatilities. Fractionating columns are used in small scale laboratory distillations as well as for large scale industrial distillations.

Sphygmomanometer: instrument for measuring blood pressure

A sphygmomanometer, also known as a blood pressure meter, blood pressure monitor, or blood pressure gauge, is a device used to measure blood pressure, composed of an inflatable cuff to collapse and then release the artery under the cuff in a controlled manner, and a mercury or mechanical manometer to measure the pressure. It is always used in conjunction with a means to determine at what pressure blood flow is just starting, and at what pressure it is unimpeded. Manual sphygmomanometers are used in conjunction with a stethoscope.

Gas chromatography: common type of chromatography

Gas chromatography (GC) is a common type of chromatography used in analytical chemistry for separating and analyzing compounds that can be vaporized without decomposition. Typical uses of GC include testing the purity of a particular substance, or separating the different components of a mixture. In some situations, GC may help in identifying a compound. In preparative chromatography, GC can be used to prepare pure compounds from a mixture.

Ultra-high vacuum (UHV) is the vacuum regime characterised by pressures lower than about 10⁻⁷ pascal (100 nanopascals). UHV conditions are created by pumping the gas out of a UHV chamber. At these low pressures the mean free path of a gas molecule is greater than approximately 40 km, so the gas is in free molecular flow, and gas molecules will collide with the chamber walls many times before colliding with each other. Almost all molecular interactions therefore take place on various surfaces in the chamber.
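
For orientation, the mean free path follows from the standard kinetic-theory relation (not stated in the original text; the symbols are introduced here for illustration):

    λ = k_B · T / (√2 · π · d² · p)

where k_B is the Boltzmann constant, T the temperature, d the molecular diameter, and p the pressure. For nitrogen (d ≈ 0.37 nm) at room temperature and p = 10⁻⁷ Pa this gives a mean free path of several tens of kilometres, consistent with the figure quoted above.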

Hot-filament ionization gauge

The hot-filament ionization gauge, sometimes called a hot-filament gauge or hot-cathode gauge, is the most widely used low-pressure (vacuum) measuring device for the region from 10⁻³ to 10⁻¹⁰ Torr. It is a triode, with the filament being the cathode.

Vacuum engineering deals with technological processes and equipment that use vacuum to achieve better results than those run under atmospheric pressure.

Pirani gauge

The Pirani gauge is a robust thermal conductivity gauge used for the measurement of the pressures in vacuum systems. It was invented in 1906 by Marcello Pirani.

A dead weight tester apparatus uses known traceable weights to apply pressure to a fluid for checking the accuracy of readings from a pressure gauge. A dead weight tester (DWT) is a calibration standard in which a load placed on a piston–cylinder assembly balances the pressure applied underneath the piston. Dead weight testers are so-called primary standards, which means that the pressure they measure is defined through other quantities: length, mass, and time. Typically dead weight testers are used in calibration laboratories to calibrate pressure transfer standards such as electronic pressure measuring devices.
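
In essence (symbols introduced here for illustration, not part of the original text), the pressure generated under the piston is

    p = m · g / A

where m is the total mass of the piston and the loaded weights, g the local gravitational acceleration, and A the effective area of the piston; in practice, corrections such as air buoyancy and temperature are also applied.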

References

  1. McLeod, Herbert (1874). "Apparatus for measurement of low pressures of gas". Philosophical Magazine. xlviii: 110–113.