Multispectral imaging captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or detected with instruments sensitive to particular wavelengths, including light from frequencies beyond the visible range (i.e. infrared and ultraviolet). It can allow extraction of additional information that the human eye fails to capture with its visible receptors for red, green and blue. It was originally developed for military target identification and reconnaissance. Early space-based imaging platforms incorporated multispectral imaging technology [1] to map details of the Earth related to coastal boundaries, vegetation, and landforms. [2] Multispectral imaging has also found use in document and painting analysis. [3] [4]
Multispectral imaging measures light in a small number (typically 3 to 15) of spectral bands. Hyperspectral imaging is a special case of spectral imaging where often hundreds of contiguous spectral bands are available. [5]
For different purposes, different combinations of spectral bands can be used. They are usually represented with red, green, and blue channels. Mapping of bands to colors depends on the purpose of the image and the personal preferences of the analysts. Thermal infrared is often omitted from consideration due to poor spatial resolution, except for special purposes.
Many other combinations are in use. NIR is often shown as red, causing vegetation-covered areas to appear red.
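A minimal sketch of such a false-color composite, in which the NIR band is mapped to the red channel so that vegetation appears red, is shown below; the band names, percentile stretch, and synthetic data are illustrative assumptions and not taken from any particular sensor.

```python
import numpy as np

def false_color_composite(nir, red, green):
    """Map NIR -> red channel, red -> green, green -> blue; returns an 8-bit RGB array."""
    def stretch(band):
        # Simple 2nd-98th percentile contrast stretch to the 0..255 display range.
        lo, hi = np.percentile(band, (2, 98))
        return np.clip((band - lo) / (hi - lo + 1e-6) * 255, 0, 255).astype(np.uint8)
    return np.dstack([stretch(nir), stretch(red), stretch(green)])

# Example with synthetic, co-registered 100x100 bands:
rng = np.random.default_rng(0)
nir, red, green = (rng.random((100, 100)) for _ in range(3))
rgb = false_color_composite(nir, red, green)   # shape (100, 100, 3)
```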
The wavelengths are approximate; exact values depend on the particular instruments (e.g. the characteristics of a satellite's sensors for Earth observation, or of the illumination and sensors for document analysis).
Unlike other aerial photographic and satellite image interpretation work, multispectral images do not make it easy to identify feature types directly by visual inspection. Hence the remote sensing data must first be classified, and then processed with various data enhancement techniques to help the user understand the features present in the image.
Such classification is a complex task which involves rigorous validation of the training samples depending on the classification algorithm used. The techniques can be grouped mainly into two types.
Supervised classification makes use of training samples. Training samples are areas on the ground for which ground truth is available, that is, what is present is known. The spectral signatures of the training areas are used to search for similar signatures in the remaining pixels of the image, which are then classified accordingly. This use of training samples for classification is called supervised classification. Expert knowledge is very important in this method, since a biased selection of training samples can badly affect the accuracy of classification. Popular techniques include the maximum likelihood principle and convolutional neural networks. The maximum likelihood principle calculates the probability of a pixel belonging to a class (i.e. feature) and allots the pixel to its most probable class. Newer convolutional neural network based methods [6] account for both spatial proximity and the entire spectrum to determine the most likely class.
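As a hedged illustration of the maximum likelihood principle described above (the class names, band count, and synthetic data are assumptions for the example, not taken from the cited sources), each class can be modelled by a multivariate Gaussian fitted to its training pixels, and every pixel assigned to the class with the highest log-likelihood:

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_classes(training_pixels):
    """training_pixels: dict mapping class name -> (n_samples, n_bands) array of ground-truth spectra."""
    return {name: multivariate_normal(samples.mean(axis=0),
                                      np.cov(samples, rowvar=False))
            for name, samples in training_pixels.items()}

def classify(image, models):
    """image: (rows, cols, n_bands) array; returns a (rows, cols) array of class indices."""
    pixels = image.reshape(-1, image.shape[-1])
    # Log-likelihood of each pixel under each class model (equal priors assumed).
    log_likelihood = np.stack([m.logpdf(pixels) for m in models.values()], axis=-1)
    return log_likelihood.argmax(axis=-1).reshape(image.shape[:2])

# Synthetic example: two classes in a 4-band image.
rng = np.random.default_rng(1)
training = {"water":      rng.normal(0.10, 0.02, size=(50, 4)),
            "vegetation": rng.normal(0.50, 0.05, size=(50, 4))}
models = fit_classes(training)
scene = rng.normal(0.30, 0.20, size=(64, 64, 4))
labels = classify(scene, models)   # 0 = water, 1 = vegetation
```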
In unsupervised classification, no prior knowledge is required for classifying the features of the image. The natural clustering or grouping of the pixel values (i.e. the gray levels of the pixels) is observed. A threshold is then defined to set the number of classes in the image: the finer the threshold, the more classes there will be. Beyond a certain limit, however, a single ground-cover class will be split across several clusters, because within-class variation starts to be represented as separate classes. After the clusters are formed, ground truth validation is done to identify the class to which each image pixel belongs. Thus in unsupervised classification a priori information about the classes is not required. One of the popular methods in unsupervised classification is k-means clustering.
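A minimal sketch of unsupervised classification with k-means (using scikit-learn; the number of clusters and the synthetic band stack are arbitrary assumptions for the example):

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_classify(image, n_classes=5, seed=0):
    """image: (rows, cols, n_bands) array; returns a (rows, cols) array of cluster labels."""
    pixels = image.reshape(-1, image.shape[-1])          # one spectral vector per pixel
    labels = KMeans(n_clusters=n_classes, n_init=10,
                    random_state=seed).fit_predict(pixels)
    return labels.reshape(image.shape[:2])

# Example with a synthetic 4-band scene; the cluster-to-class mapping would then
# be established by comparison with ground truth.
rng = np.random.default_rng(2)
cluster_map = kmeans_classify(rng.random((64, 64, 4)), n_classes=5)
```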
Multispectral imaging measures light emission and is often used in detecting or tracking military targets. In 2003, researchers at the United States Army Research Laboratory and the Federal Laboratory Collaborative Technology Alliance reported a dual band multispectral imaging focal plane array (FPA). This FPA allowed researchers to look at two infrared (IR) planes at the same time. [9] Because mid-wave infrared (MWIR) and long wave infrared (LWIR) technologies measure radiation inherent to the object and require no external light source, they also are referred to as thermal imaging methods.
The brightness of the image produced by a thermal imager depends on the object's emissivity and temperature. [10] Every material has an infrared signature that aids in the identification of the object. [11] These signatures are less pronounced in hyperspectral systems (which image in many more bands than multispectral systems) and when exposed to wind and, more dramatically, to rain. [11] Sometimes the surface of the target reflects infrared energy, and this reflection can distort the reading of the object's inherent radiation. [12] Imaging systems that use MWIR technology function better with solar reflections on the target's surface and produce more definitive images of hot objects, such as engines, compared to LWIR technology. [13] However, LWIR operates better in hazy environments such as smoke or fog because less scattering occurs at the longer wavelengths. [10] Researchers claim that dual-band technologies combine these advantages to provide more information from an image, particularly in the realm of target tracking. [9]
For nighttime target detection, thermal imaging outperformed single-band multispectral imaging. Dual-band MWIR and LWIR technology resulted in better visualization during the nighttime than MWIR alone. The US Army reports that its dual-band LWIR/MWIR FPA demonstrated better visualization of tactical vehicles than MWIR alone after tracking them through both day and night.[citation needed]
By analyzing the emissivity of ground surfaces, multispectral imaging can detect the presence of underground missiles. Surface and sub-surface soil possess different physical and chemical properties that appear in spectral analysis. [11] Disturbed soil has increased emissivity in the wavelength range of 8.5 to 9.5 micrometers while demonstrating no change in wavelengths greater than 10 micrometers. [9] The US Army Research Laboratory's dual MWIR/LWIR FPA used "red" and "blue" detectors to search for areas with enhanced emissivity. The red detector acts as a backdrop, verifying realms of undisturbed soil areas, as it is sensitive to the 10.4 micrometer wavelength. The blue detector is sensitive to wavelengths of 9.3 micrometers. If the intensity of the blue image changes when scanning, that region is likely disturbed. The scientists reported that fusing these two images increased detection capabilities. [9]
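The fusion idea described above can be sketched, in heavily simplified form, as a two-band ratio test; the threshold, normalization, and function name are illustrative assumptions and not the Army Research Laboratory's actual processing chain:

```python
import numpy as np

def flag_disturbed_soil(band_9p3um, band_10p4um, z_threshold=2.0):
    """Both inputs: (rows, cols) co-registered radiance images; returns a boolean mask."""
    # Normalize the ~9.3 um ("blue") response by the ~10.4 um ("red") baseline band,
    # which changes little whether or not the soil has been disturbed.
    ratio = band_9p3um / np.maximum(band_10p4um, 1e-6)
    # Flag pixels whose ratio is anomalously high relative to the scene statistics,
    # consistent with the enhanced 8.5-9.5 um emissivity of disturbed soil.
    z_score = (ratio - ratio.mean()) / ratio.std()
    return z_score > z_threshold
```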
Intercepting an intercontinental ballistic missile (ICBM) in its boost phase requires imaging of the hard body as well as the rocket plumes. MWIR presents a strong signal from highly heated objects, including rocket plumes, while LWIR captures emissions from the missile's body material. The US Army Research Laboratory reported that its dual-band MWIR/LWIR technology, when tracking Atlas 5 Evolved Expendable Launch Vehicles (similar in design to ICBMs), picked up both the missile body and the plumes. [9]
Most radiometers for remote sensing (RS) acquire multispectral images. Dividing the spectrum into many bands, multispectral is the opposite of panchromatic, which records only the total intensity of radiation falling on each pixel. [14] Usually, Earth observation satellites have three or more radiometers. Each acquires one digital image (in remote sensing, called a 'scene') in a small spectral band. The bands are grouped into wavelength regions based on the origin of the light and the interests of the researchers.
Modern weather satellites produce imagery in a variety of spectra. [15]
Multispectral imaging combines two to five spectral imaging bands of relatively large bandwidth into a single optical system. A multispectral system usually provides a combination of visible (0.4 to 0.7 µm), near infrared (NIR; 0.7 to 1 µm), short-wave infrared (SWIR; 1 to 1.7 µm), mid-wave infrared (MWIR; 3.5 to 5 µm) or long-wave infrared (LWIR; 8 to 12 µm) bands into a single system. — Valerie C. Coffey [16]
In the case of Landsat satellites, several different band designations have been used, with as many as 11 bands (Landsat 8) comprising a multispectral image. [17] [18] [19] Spectral imaging with a greater number of bands (hundreds or thousands), finer spectral resolution (narrower bands), or wider spectral coverage may be called hyperspectral or ultraspectral. [19]
Multispectral imaging can be employed for investigation of paintings and other works of art. [3] The painting is irradiated by ultraviolet, visible and infrared rays and the reflected radiation is recorded in a camera sensitive in this region of the spectrum. The image can also be registered using the transmitted instead of reflected radiation. In special cases the painting can be irradiated by UV, VIS or IR rays and the fluorescence of pigments or varnishes can be registered. [20]
Multispectral analysis has assisted in the interpretation of ancient papyri, such as those found at Herculaneum, by imaging the fragments in the infrared range (1000 nm). Often, the text on the documents appears to the naked eye as black ink on blackened papyrus. At 1000 nm, the difference in how the papyrus and the ink reflect infrared light makes the text clearly readable. It has also been used to image the Archimedes palimpsest by imaging the parchment leaves in bandwidths from 365–870 nm, and then using advanced digital image processing techniques to reveal the undertext containing Archimedes' work. [21] Multispectral imaging has been used in a Mellon Foundation project at Yale University to compare inks in medieval English manuscripts. [4]
Multispectral imaging has also been used to examine discolorations and stains on old books and manuscripts. Comparing the "spectral fingerprint" of a stain to the characteristics of known chemical substances can make it possible to identify the stain. This technique has been used to examine medical and alchemical texts, seeking hints about the activities of early chemists and the possible chemical substances they may have used in their experiments. Like a cook spilling flour or vinegar on a cookbook, an early chemist might have left tangible evidence on the pages of the ingredients used to make medicines. [22]
Infrared is electromagnetic radiation (EMR) with wavelengths longer than that of visible light but shorter than microwaves. The infrared spectral band begins with waves that are just longer than those of red light, so IR is invisible to the human eye. IR is generally understood to include wavelengths from around 750 nm (400 THz) to 1 mm (300 GHz). IR is commonly divided between longer-wavelength thermal IR, emitted from terrestrial sources, and shorter-wavelength IR or near-IR, part of the solar spectrum. Longer IR wavelengths (30–100 μm) are sometimes included as part of the terahertz radiation band. Almost all black-body radiation from objects near room temperature is in the IR band. As a form of electromagnetic radiation, IR carries energy and momentum, exerts radiation pressure, and has properties corresponding to both those of a wave and of a particle, the photon.
Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object, in contrast to in situ or on-site observation. The term is applied especially to acquiring information about Earth and other planets. Remote sensing is used in numerous fields, including geophysics, geography, land surveying and most Earth science disciplines. It also has military, intelligence, commercial, economic, planning, and humanitarian applications, among others.
Infrared thermography (IRT), also known as thermal video or thermal imaging, is a process in which a thermal camera captures and creates an image of an object using the infrared radiation emitted from the object; it is an example of infrared imaging science. Thermographic cameras usually detect radiation in the long-infrared range of the electromagnetic spectrum and produce images of that radiation, called thermograms. Since infrared radiation is emitted by all objects with a temperature above absolute zero according to the black body radiation law, thermography makes it possible to see one's environment with or without visible illumination. The amount of radiation emitted by an object increases with temperature; therefore, thermography allows one to see variations in temperature. When viewed through a thermal imaging camera, warm objects stand out well against cooler backgrounds; humans and other warm-blooded animals become easily visible against the environment, day or night. As a result, thermography is particularly useful to the military and other users of surveillance cameras.
Aerial archaeology is the study of archaeological sites from the air. It is a method of archaeological investigation that uses aerial photography, remote sensing, and other techniques to identify, record, and interpret archaeological features and sites. Aerial archaeology has been used to discover and map a wide range of archaeological sites, from prehistoric settlements and ancient roads to medieval castles and World War II battlefields.
Satellite images are images of Earth collected by imaging satellites operated by governments and businesses around the world. Satellite imaging companies sell images by licensing them to governments and businesses such as Apple Maps and Google Maps.
Spectral signature is the variation of reflectance or emittance of a material with respect to wavelength. The spectral signature of stars indicates the composition of the stellar atmosphere. The spectral signature of an object is a function of the incident EM wavelength and the material's interaction with that section of the electromagnetic spectrum.
Spectral imaging is imaging that uses multiple bands across the electromagnetic spectrum. While an ordinary camera captures light across three wavelength bands in the visible spectrum, red, green, and blue (RGB), spectral imaging encompasses a wide variety of techniques that go beyond RGB. Spectral imaging may use the infrared, the visible spectrum, the ultraviolet, x-rays, or some combination of the above. It may include the acquisition of image data in visible and non-visible bands simultaneously, illumination from outside the visible range, or the use of optical filters to capture a specific spectral range. It is also possible to capture hundreds of wavelength bands for each pixel in an image.
The normalized difference vegetation index (NDVI) is a widely-used metric for quantifying the health and density of vegetation using sensor data. It is calculated from spectrometric data at two specific bands: red and near-infrared. The spectrometric data is usually sourced from remote sensors, such as satellites.
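The index itself is a simple per-pixel calculation, NDVI = (NIR - Red) / (NIR + Red); a minimal sketch, assuming co-registered reflectance arrays for the two bands, follows:

```python
import numpy as np

def ndvi(nir, red):
    """nir, red: co-registered reflectance arrays for the near-infrared and red bands."""
    return (nir - red) / np.maximum(nir + red, 1e-6)   # guard against division by zero

# Dense, healthy vegetation yields values approaching +1; bare soil is near zero
# and open water is typically negative.
```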
Hyperspectral imaging collects and processes information from across the electromagnetic spectrum. The goal of hyperspectral imaging is to obtain the spectrum for each pixel in the image of a scene, with the purpose of finding objects, identifying materials, or detecting processes. There are three general types of spectral imagers: push broom scanners and the related whisk broom scanners, which read images over time; band sequential scanners, which acquire images of an area at different wavelengths; and snapshot hyperspectral imagers, which use a staring array to generate an image in an instant.
An imaging spectrometer is an instrument used in hyperspectral imaging and imaging spectroscopy to acquire a spectrally-resolved image of an object or scene, usually to support analysis of the composition of the object being imaged. The spectral data produced for a pixel is often referred to as a datacube due to the three-dimensional representation of the data. Two axes of the image correspond to vertical and horizontal distance and the third to wavelength. The principle of operation is the same as that of a simple spectrometer, but special care is taken to avoid optical aberrations for better image quality.
Chemical imaging is the analytical capability to create a visual image of components distribution from simultaneous measurement of spectra and spatial, time information. Hyperspectral imaging measures contiguous spectral bands, as opposed to multispectral imaging which measures spaced spectral bands.
Airborne Real-time Cueing Hyperspectral Enhanced Reconnaissance, also known by the acronym ARCHER, is an aerial imaging system that produces ground images far more detailed than plain sight or ordinary aerial photography can. It is the most sophisticated unclassified hyperspectral imaging system available, according to U.S. Government officials. ARCHER can automatically scan detailed imaging for a given signature of the object being sought, for abnormalities in the surrounding area, or for changes from previous recorded spectral signatures.
Electro-optical MASINT is a subdiscipline of Measurement and Signature Intelligence (MASINT) and refers to intelligence gathering activities that bring together disparate elements that do not fit within the definitions of Signals Intelligence (SIGINT), Imagery Intelligence (IMINT), or Human Intelligence (HUMINT).
The visible and near-infrared (VNIR) portion of the electromagnetic spectrum has wavelengths between approximately 400 and 1100 nanometers (nm). It combines the full visible spectrum with an adjacent portion of the infrared spectrum up to the water absorption band between 1400 and 1500 nm. Some definitions also include the short-wavelength infrared band from 1400 nm up to the water absorption band at 2500 nm. VNIR multispectral image cameras have wide applications in remote sensing and imaging spectroscopy. The Hyperspectral Imaging Satellite carried two payloads, one of which operated in the VNIR spectral range.
Infrared vision is the capability of biological or artificial systems to detect infrared radiation. The terms thermal vision and thermal imaging are also commonly used in this context, since infrared emissions from a body are directly related to its temperature: hotter objects emit more energy in the infrared spectrum than colder ones.
Multispectral remote sensing is the collection and analysis of reflected, emitted, or back-scattered energy from an object or an area of interest in multiple bands or regions of the electromagnetic spectrum. Subcategories of multispectral remote sensing include hyperspectral, in which hundreds of bands are collected and analyzed, and ultraspectral remote sensing, where many hundreds of bands are used. The main purpose of multispectral imaging is the potential to classify the image using multispectral classification. This is a much faster method of image analysis than is possible by human interpretation.
Gaofen is a series of Chinese high-resolution Earth imaging satellites launched as part of the China High-resolution Earth Observation System (CHEOS) program. CHEOS is a state-sponsored, civilian Earth-observation program used for agricultural, disaster, resource, and environmental monitoring. Proposed in 2006 and approved in 2010, the CHEOS program consists of the Gaofen series of space-based satellites, near-space and airborne systems such as airships and UAVs, ground systems that conduct data receipt, processing, calibration, and taskings, and a system of applications that fuse observation data with other sources to produce usable information and knowledge.
Remote sensing is used in the geological sciences as a data acquisition method complementary to field observation, because it allows mapping of geological characteristics of regions without physical contact with the areas being explored. About one-fourth of the Earth's total surface area is exposed land from which information is ready to be extracted by detailed Earth observation via remote sensing. Remote sensing is conducted via detection of electromagnetic radiation by sensors. The radiation can be naturally sourced, or produced by machines and reflected off the Earth's surface. The electromagnetic radiation acts as an information carrier for two main variables. First, the intensities of reflectance at different wavelengths are detected and plotted on a spectral reflectance curve. This spectral fingerprint is governed by the physicochemical properties of the surface of the target object and therefore helps mineral identification and hence geological mapping, for example by hyperspectral imaging. Second, the two-way travel time of radiation from and back to the sensor can be used to calculate the distance in active remote sensing systems, for example, Interferometric synthetic-aperture radar. This helps geomorphological studies of ground motion, and thus can illuminate deformations associated with landslides, earthquakes, etc.
Land cover maps are tools that provide vital information about the Earth's land use and cover patterns. They aid policy development, urban planning, and forest and agricultural monitoring.
Spectroradiometry is a technique in Earth and planetary remote sensing, which makes use of light behaviour, specifically how light energy is reflected, emitted, and scattered by substances, to explore their properties in the electromagnetic (light) spectrum and identify or differentiate between them. The interaction between light radiation and the surface of a given material determines the manner in which the radiation reflects back to a detector, i.e., a spectroradiometer. Combining the elements of spectroscopy and radiometry, spectroradiometry carries out precise measurements of electromagnetic radiation and associated parameters within different wavelength ranges. This technique forms the basis of multi- and hyperspectral imaging and reflectance spectroscopy, commonly applied across numerous geoscience disciplines, which evaluates the spectral properties exhibited by various materials found on Earth and planetary bodies.