Minimum resolvable temperature difference

Minimum resolvable temperature difference (MRTD) is a measure used to assess the performance of infrared cameras and is inversely proportional to the modulation transfer function.

Typically, an operator is asked to assess the minimum temperature difference at which a 4-bar target can be resolved. This minimum difference will change with the spatial frequency of the bar target used. A curve of MRTD against spatial frequency is obtained which characterises the performance of the imaging system.

Modern infrared imaging systems can have low spatial frequency MRTDs of tens of millikelvins.

Manual test

A manual, subjective test is used to determine the MRTD. An operator views a series of 4-bar targets of different spatial frequencies. For each target, the operator adjusts the temperature of the blackbody (the source of infrared radiation) up and down until the pattern is "just resolvable." The positive and negative temperature differences are stored in a two-dimensional array, and the corresponding spatial frequencies used in each test are stored in another array. The MRTD curve is a plot of these arrays (just-resolvable temperature difference versus target spatial frequency). A general polynomial best fit is calculated from the experimental MRTD data, and the result is the MRTD curve, which gives direct insight into the quality of the image, i.e. the infrared camera's ability to resolve detail, in this case temperature. [1]

Calculations

MRTD(f) = the MRTD curve, where
ΔT = the array of just-resolvable temperature differences
f = the array of spatial frequencies
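
As a concrete illustration of the curve-fitting step, the sketch below fits a polynomial to MRTD data with NumPy. The measurement values and the cubic degree are assumptions for illustration, not data from the source.

```python
import numpy as np

# Assumed example data: just-resolvable temperature differences (K)
# measured with 4-bar targets at the corresponding spatial frequencies.
delta_T = np.array([0.03, 0.05, 0.09, 0.18, 0.40, 0.95])  # kelvin
freqs = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])          # cycles/mrad

# General polynomial best fit (degree 3 is an arbitrary choice here).
mrtd_curve = np.poly1d(np.polyfit(freqs, delta_T, deg=3))

# Evaluate the fitted curve on a fine grid, e.g. for plotting.
f_grid = np.linspace(freqs.min(), freqs.max(), 200)
print(mrtd_curve(f_grid[:3]))  # fitted MRTD at the lowest frequencies
```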

Minimum detectable temperature difference

Minimum detectable temperature difference (MDTD), also called minimum detectable temperature (MDT), is not the same quantity as MRTD, though the two differ only subtly. Like MRTD, it is a measure of the performance of infrared cameras; however, MDTD is a measure of visibility, not resolvability.

Manual test

The manual, subjective test for MDTD is similar to the one for MRTD. A trained operator views a series of pinhole targets of different sizes (and hence spatial frequencies). For each pinhole target the operator ramps the blackbody (the source of IR radiation) temperature up and down until the target is "just visible". The just-detectable temperature differences are stored in an array and plotted against spatial frequency, and a curve is fitted to the data. The MDTD curve is thus the just-detectable temperature difference versus spatial frequency.

Here the relevant spatial frequency is f = 1/W, where W is the angular subtense of the target.
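
A minimal sketch of that relation, with assumed pinhole subtenses:

```python
import numpy as np

# Assumed angular subtenses W of the pinhole targets, in milliradians.
W = np.array([2.0, 1.0, 0.5, 0.25])

# Spatial frequency associated with each target: f = 1/W (cycles/mrad).
f = 1.0 / W
print(f)  # [0.5 1.  2.  4. ]
```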

Related Research Articles

Johnson–Nyquist noise

Johnson–Nyquist noise is the electronic noise generated by the thermal agitation of the charge carriers inside an electrical conductor at equilibrium, which happens regardless of any applied voltage. Thermal noise is present in all electrical circuits, and in sensitive electronic equipment such as radio receivers can drown out weak signals, and can be the limiting factor on sensitivity of an electrical measuring instrument. Thermal noise increases with temperature. Some sensitive electronic equipment such as radio telescope receivers are cooled to cryogenic temperatures to reduce thermal noise in their circuits. The generic, statistical physical derivation of this noise is called the fluctuation-dissipation theorem, where generalized impedance or generalized susceptibility is used to characterize the medium.
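
For a sense of scale, the RMS thermal-noise voltage across a resistor is v_n = sqrt(4·k_B·T·R·Δf). The sketch below evaluates it for assumed example values (a 1 kΩ resistor at room temperature over a 10 kHz bandwidth):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def thermal_noise_voltage(T_kelvin, R_ohms, bandwidth_hz):
    """RMS Johnson-Nyquist noise voltage: v_n = sqrt(4 k_B T R df)."""
    return math.sqrt(4.0 * k_B * T_kelvin * R_ohms * bandwidth_hz)

# Assumed example: 1 kOhm resistor at 300 K over a 10 kHz bandwidth.
print(thermal_noise_voltage(300.0, 1e3, 10e3))  # ~4.1e-7 V (about 0.4 uV)
```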

Angular resolution describes the ability of any image-forming device such as an optical or radio telescope, a microscope, a camera, or an eye, to distinguish small details of an object, thereby making it a major determinant of image resolution. The closely related term spatial resolution refers to the precision of a measurement with respect to space, which is directly connected to angular resolution in imaging instruments.
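
As a worked example, the sketch below applies the Rayleigh criterion θ ≈ 1.22 λ/D (one common choice of resolution criterion; the wavelength and aperture are assumed values):

```python
import math

def rayleigh_limit_rad(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution under the Rayleigh criterion."""
    return 1.22 * wavelength_m / aperture_m

# Assumed example: 10 um (long-wave IR) light through a 100 mm aperture.
theta = rayleigh_limit_rad(10e-6, 0.1)
print(theta)                        # ~1.22e-4 rad
print(math.degrees(theta) * 3600)   # ~25 arcseconds
```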

In physics, two wave sources are perfectly coherent if they have a constant phase difference, the same frequency, and the same waveform. Coherence is an ideal property of waves that enables stationary interference. It encompasses several distinct concepts, which are limiting cases that never quite occur in reality but allow an understanding of the physics of waves, and it has become a very important concept in quantum physics. More generally, coherence describes all properties of the correlation between physical quantities of a single wave, or between several waves or wave packets.

Thermographic camera

A thermographic camera is a device that creates an image using infrared radiation, similar to a common camera that forms an image using visible light. Instead of the 400–700 nanometre range of the visible light camera, infrared cameras are sensitive to wavelengths from about 1,000 nm (1 μm) to about 14,000 nm (14 μm). The art of capturing and analyzing the data they provide is called thermography.

Michelson interferometer

The Michelson interferometer is a common configuration for optical interferometry and was invented by Albert Abraham Michelson. Using a beam splitter, a light source is split into two arms. Each of those light beams is reflected back toward the beamsplitter which then combines their amplitudes using the superposition principle. The resulting interference pattern that is not directed back toward the source is typically directed to some type of photoelectric detector or camera. For different applications of the interferometer, the two light paths can be with different lengths or incorporate optical elements or even materials under test.
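
A minimal sketch of the detector-port intensity of an idealized, lossless Michelson interferometer, I = (I0/2)(1 + cos(2πΔ/λ)) for path-length difference Δ; the wavelength and scan values are assumed examples:

```python
import numpy as np

def michelson_intensity(path_diff_m, wavelength_m, I0=1.0):
    """Detector-port intensity of an ideal, lossless Michelson interferometer."""
    delta = 2.0 * np.pi * path_diff_m / wavelength_m  # phase difference
    return 0.5 * I0 * (1.0 + np.cos(delta))

# Assumed example: 633 nm HeNe light; scan one mirror over one wavelength.
# The round trip doubles the mirror displacement: path_diff = 2 * x.
x = np.linspace(0.0, 633e-9, 8)   # mirror positions
print(michelson_intensity(2 * x, 633e-9))
```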

Zone plate

A zone plate is a device used to focus light or other waves. Unlike lenses or curved mirrors, however, zone plates use diffraction instead of refraction or reflection. Based on analysis by Augustin-Jean Fresnel, they are sometimes called Fresnel zone plates in his honor. The zone plate's focusing ability is an extension of the Arago spot phenomenon caused by diffraction from an opaque disc.

Sensor array

A sensor array is a group of sensors, usually deployed in a certain geometric pattern, used for collecting and processing electromagnetic or acoustic signals. The advantage of using a sensor array over a single sensor is that an array adds new dimensions to the observation, helping to estimate more parameters and improve estimation performance. For example, an array of radio antenna elements used for beamforming can increase antenna gain in the direction of the signal while decreasing the gain in other directions, i.e., increasing the signal-to-noise ratio (SNR) by amplifying the signal coherently. Another example of a sensor array application is estimating the direction of arrival of impinging electromagnetic waves. The related processing method is called array signal processing. Application examples of array signal processing include radar/sonar, wireless communications, seismology, machine condition monitoring, astronomical observations, fault diagnosis, etc.
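
A minimal sketch of the coherent-summation idea behind delay-and-sum beamforming on a uniform linear array; the element count, spacing, and arrival angle are assumptions:

```python
import numpy as np

# Assumed uniform linear array: 8 elements at half-wavelength spacing.
n_elems, wavelength = 8, 1.0
positions = np.arange(n_elems) * (wavelength / 2.0)

def steering_vector(angle_rad):
    """Relative phases of a plane wave across the array elements."""
    return np.exp(2j * np.pi * positions * np.sin(angle_rad) / wavelength)

signal_angle = np.deg2rad(20.0)     # assumed direction of arrival
x = steering_vector(signal_angle)   # noise-free array snapshot

# Delay-and-sum: steer to each look angle and sum coherently.
for look_deg in (0.0, 20.0, 40.0):
    w = steering_vector(np.deg2rad(look_deg))
    gain = abs(np.vdot(w, x)) / n_elems  # 1.0 when steered at the source
    print(look_deg, round(gain, 3))
```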

Continuous-wave radar

Continuous-wave radar is a type of radar system in which continuous-wave radio energy of a known, stable frequency is transmitted and then received from any reflecting objects. Individual objects are detected using the Doppler effect, which causes the received signal to have a different frequency from the transmission, allowing it to be detected by filtering out the transmitted frequency.
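
A short sketch of the two-way Doppler relation a CW radar exploits, f_d = 2·v·f_c/c for radial speed v; the carrier frequency and target speed are assumed example values:

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(radial_speed_ms, carrier_hz):
    """Two-way Doppler shift seen by a CW radar: f_d = 2 v f_c / c."""
    return 2.0 * radial_speed_ms * carrier_hz / C

# Assumed example: target at 30 m/s (~108 km/h), 24 GHz radar.
print(doppler_shift_hz(30.0, 24e9))  # ~4.8 kHz
```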

Acousto-optic modulator

An acousto-optic modulator (AOM), also called a Bragg cell, uses the acousto-optic effect to diffract and shift the frequency of light using sound waves. AOMs are used in lasers for Q-switching, in telecommunications for signal modulation, and in spectroscopy for frequency control. A piezoelectric transducer is attached to a material such as glass. An oscillating electric signal drives the transducer to vibrate, which creates sound waves in the material. These can be thought of as moving periodic planes of expansion and compression that change the index of refraction. Incoming light scatters off the resulting periodic index modulation, and interference occurs similar to Bragg diffraction. The interaction can be thought of as a three-wave mixing process resulting in sum-frequency generation or difference-frequency generation between phonons and photons.
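
A minimal sketch of the Bragg condition, sin θ_B = λ/(2Λ) with acoustic wavelength Λ = v/f (treating refraction at the cell boundary loosely; all numbers are assumptions):

```python
import math

def bragg_angle_rad(optical_wavelength_m, acoustic_speed_ms, drive_freq_hz):
    """Bragg angle: sin(theta) = lambda / (2 * Lambda), with Lambda = v / f."""
    acoustic_wavelength = acoustic_speed_ms / drive_freq_hz
    return math.asin(optical_wavelength_m / (2.0 * acoustic_wavelength))

# Assumed example: 633 nm light, 80 MHz drive, fused silica (v ~ 5960 m/s).
theta = bragg_angle_rad(633e-9, 5960.0, 80e6)
print(math.degrees(theta))  # ~0.24 degrees
# First-order diffracted light is frequency-shifted by the drive frequency,
# i.e. by +/- 80 MHz in this example.
```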

Optical resolution describes the ability of an imaging system to resolve detail in the object that is being imaged.

Optical transfer function

The optical transfer function (OTF) of an optical system such as a camera, microscope, human eye, or projector specifies how different spatial frequencies are handled by the system. It is used by optical engineers to describe how the optics project light from the object or scene onto a photographic film, detector array, retina, screen, or simply the next item in the optical transmission chain. A variant, the modulation transfer function (MTF), neglects phase effects, but is equivalent to the OTF in many situations.
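
As an illustration, the sketch below evaluates the classic diffraction-limited MTF of an aberration-free circular aperture in incoherent light; the cutoff frequency is an assumed example value, and real systems fall below this curve:

```python
import numpy as np

def diffraction_limited_mtf(f, f_cutoff):
    """MTF of an aberration-free circular aperture (incoherent light)."""
    nu = np.clip(np.asarray(f, dtype=float) / f_cutoff, 0.0, 1.0)
    return (2.0 / np.pi) * (np.arccos(nu) - nu * np.sqrt(1.0 - nu**2))

# Assumed cutoff of 50 cycles/mm; MTF falls from 1 at DC to 0 at cutoff.
f = np.array([0.0, 10.0, 25.0, 40.0, 50.0])
print(diffraction_limited_mtf(f, 50.0))
```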

In optics, spatial cutoff frequency is a precise way to quantify the smallest object resolvable by an optical system. Due to diffraction at the image plane, all optical systems act as low pass filters with a finite ability to resolve detail. If it were not for the effects of diffraction, a 2" aperture telescope could theoretically be used to read newspapers on a planet circling Alpha Centauri, over four light-years distant. Unfortunately, the wave nature of light will never permit this to happen.
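
A short sketch putting numbers on that claim: a diffraction-limited aperture of diameter D at wavelength λ resolves angles no finer than about λ/D, so the smallest feature distinguishable at range R spans roughly R·λ/D (the wavelength and distance are assumed example values):

```python
# Assumed example: 2-inch (50.8 mm) aperture, 550 nm light, Alpha Centauri.
D = 0.0508            # aperture diameter, m
lam = 550e-9          # wavelength, m
R = 4.37 * 9.461e15   # ~4.37 light-years, in metres

smallest_feature = R * lam / D  # size of the finest resolvable detail
print(f"{smallest_feature / 1000:.0f} km")  # ~4.5e8 km: no newspapers
```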

Contrast (vision)

Contrast is the difference in luminance or colour that makes an object distinguishable. In visual perception of the real world, contrast is determined by the difference in the color and brightness of the object and other objects within the same field of view. The human visual system is more sensitive to contrast than absolute luminance; we can perceive the world similarly regardless of the huge changes in illumination over the day or from place to place. The maximum contrast of an image is the contrast ratio or dynamic range.
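
Two common quantitative definitions, sketched with assumed luminance values: Michelson contrast for periodic patterns and Weber contrast for a small object on a uniform background.

```python
def michelson_contrast(l_max, l_min):
    """Contrast of a periodic pattern, in [0, 1]."""
    return (l_max - l_min) / (l_max + l_min)

def weber_contrast(l_object, l_background):
    """Contrast of a small object against a uniform background."""
    return (l_object - l_background) / l_background

# Assumed example luminances in cd/m^2.
print(michelson_contrast(120.0, 80.0))  # 0.2
print(weber_contrast(150.0, 100.0))     # 0.5
```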

Pulse compression is a signal processing technique commonly used by radar, sonar and echography to increase the range resolution as well as the signal-to-noise ratio. This is achieved by modulating the transmitted pulse and then correlating the received signal with the transmitted pulse.
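
A minimal sketch of the idea with a linear-frequency-modulated (chirp) pulse and a matched filter; the waveform parameters and echo delay are assumptions:

```python
import numpy as np

# Assumed chirp: 10 us pulse sweeping 1 MHz, sampled at 10 MHz.
fs, T, B = 10e6, 10e-6, 1e6
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # linear FM pulse

# Echo: the chirp delayed inside a longer, otherwise empty receive window.
rx = np.zeros(400, dtype=complex)
delay = 150
rx[delay:delay + chirp.size] = chirp

# Matched filter = correlation of the received signal with the pulse.
compressed = np.correlate(rx, chirp, mode="valid")
print(int(np.argmax(np.abs(compressed))))  # 150: the peak marks the delay
```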

In astronomy, color–color diagrams are a means of comparing the apparent magnitudes of stars at different wavelengths. Astronomers typically observe at narrow bands around certain wavelengths, and objects observed will have different brightnesses in each band. The difference in brightness between two bands is referred to as color. On color–color diagrams, the color defined by two wavelength bands is plotted on the horizontal axis, and then the color defined by another brightness difference will be plotted on the vertical axis.
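
A minimal sketch of forming colors from band magnitudes, using assumed values in the common B, V, and R bands:

```python
# Assumed apparent magnitudes of one star in three bands.
m_B, m_V, m_R = 6.10, 5.55, 5.20

# A "color" is the magnitude difference between two bands.
color_BV = m_B - m_V   # x-axis of a typical color-color diagram
color_VR = m_V - m_R   # y-axis
print(color_BV, color_VR)  # ~0.55, ~0.35
```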

Radar engineering details are technical details pertaining to the components of a radar and their ability to detect the return energy from moving scatterers, and thereby determine an object's position or obstruction in the environment. This includes field of view in terms of solid angle and maximum unambiguous range and velocity, as well as angular, range and velocity resolution. Radar sensors are classified by application, architecture, radar mode, platform, and propagation window.

Minimum resolvable contrast (MRC) is a subjective measure of a visible-spectrum sensor's or camera's sensitivity and ability to resolve data. A snapshot image of a series of three-bar targets of selected spatial frequencies and various contrast coatings, captured by the unit under test (UUT), is used to determine the MRC of the UUT, i.e. the visible-spectrum camera or sensor. A trained observer selects the smallest target resolvable at each contrast level. Typically, specialized computer software collects the observer's responses and provides a graph of contrast versus spatial frequency at a given luminance level. A first-order polynomial is fitted to the data, and an MRC curve of spatial frequency versus contrast is generated.

The signal transfer function (SiTF) is a measure of the signal output versus the signal input of a system such as an infrared system or sensor. There are many general applications of the SiTF. Specifically, in the field of image analysis, it gives a measure of the noise of an imaging system, and thus yields one assessment of its performance.

Super-resolution photoacoustic imaging is a set of techniques used to enhance spatial resolution in photoacoustic imaging; in particular, these techniques break the optical diffraction limit of the photoacoustic imaging system. Super-resolution can be achieved through a variety of mechanisms, such as blind structured illumination, multi-speckle illumination, or photo-imprint photoacoustic microscopy.

References

  1. Electro Optical Industries, Inc. (2005). EO TestLab Methodology. In Education/Ref. "Archived copy". Archived from the original on 2008-08-28. Retrieved 2008-05-22.