[Image: Land around the Chesapeake Bay, shown in two Landsat 7 images. The true-color image shows the area in actual colors, e.g. the vegetation appears in green; it covers the full visible spectrum using the red, green and blue spectral bands of the satellite mapped to the RGB color space of the image. The false-color image of the same area uses the near-infrared, red and green spectral bands mapped to RGB; it shows vegetation in a red tone, as vegetation reflects most light in the near infrared.]
Absolute true-color rendering is impossible. [3] There are three major sources of color error (metameric failure).
A metameric failure results, for example, in an image of a green tree that shows a different shade of green than the tree itself, a different shade of red for a red apple, a different shade of blue for the blue sky, and so on. Color management (e.g. with ICC profiles) can be used to mitigate this problem within the physical constraints.
Approximate true-color images gathered by spacecraft are an example where images have a certain amount of metameric failure, as the spectral bands of a spacecraft's camera are chosen to gather information on the physical properties of the object under investigation, not to capture true-color images. [3]
In contrast to a true-color image, a false-color image sacrifices natural color rendition in order to ease the detection of features that are not readily discernible otherwise – for example the use of near infrared for the detection of vegetation in satellite images. [1] While a false-color image can be created using solely the visual spectrum (e.g. to accentuate color differences), typically some or all data used is from electromagnetic radiation (EM) outside the visual spectrum (e.g. infrared, ultraviolet or X-ray). The choice of spectral bands is governed by the physical properties of the object under investigation.
As the human eye uses three spectral bands (see trichromacy for details), three spectral bands are commonly combined into a false-color image. At least two spectral bands are needed for a false-color encoding, [4] and it is possible to combine more bands into the three visual RGB bands – with the eye's ability to discern three channels being the limiting factor. [5] In contrast, a "color" image made from one spectral band, or an image made from data consisting of non-EM data (e.g. elevation, temperature, tissue type) is a pseudocolor image (see below).
For true color, the RGB channels (red "R", green "G" and blue "B") from the camera are mapped to the corresponding RGB channels of the image, yielding an "RGB→RGB" mapping. For false color this relationship is changed. The simplest false-color encoding is to take an RGB image in the visible spectrum, but map it differently, e.g. "GBR→RGB". For traditional false-color satellite images of Earth an "NRG→RGB" mapping is used, with "N" being the near-infrared spectral band (and the blue spectral band being unused) – this yields the typical "vegetation in red" false-color images. [1] [6]
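The channel remappings above amount to reordering band arrays; a minimal NumPy sketch follows, where the scene, its band order, and its values are hypothetical placeholders for real sensor data:

```python
import numpy as np

# Hypothetical 4-band scene (H x W x 4), band order assumed: blue, green, red, near-infrared.
rng = np.random.default_rng(0)
scene = rng.random((64, 64, 4))
blue, green, red, nir = (scene[..., i] for i in range(4))

# True color: camera R, G, B bands map to image R, G, B channels ("RGB -> RGB").
true_color = np.stack([red, green, blue], axis=-1)

# Traditional false color ("NRG -> RGB"): near infrared to red, red to green,
# green to blue; the blue band is unused, and vegetation appears red.
false_color = np.stack([nir, red, green], axis=-1)

print(true_color.shape, false_color.shape)  # (64, 64, 3) (64, 64, 3)
```

Any other mapping, such as "GBR→RGB", is just a different order of arguments to `np.stack`.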
False color is used, among other applications, for satellite and space images: examples are remote sensing satellites (e.g. Landsat, see example above), space telescopes (e.g. the Hubble Space Telescope) and space probes (e.g. Cassini-Huygens). Some spacecraft, with rovers (e.g. the Mars Science Laboratory Curiosity) being the most prominent examples, can capture approximate true-color images as well. [3] Weather satellites, in contrast to the spacecraft mentioned previously, produce grayscale images from the visible or infrared spectrum.
False color has a range of scientific applications. Spacecraft often employ false-color methods to help understand the composition of structures in the universe such as nebulae and galaxies. [7] The frequencies of light emitted by different ions in space are assigned contrasting colors, allowing the chemical composition of complex structures to be better separated and visualised. The image of the Eagle Nebula above is a typical example of this: the hydrogen and oxygen ions have been assigned green and blue respectively, so the large amounts of green and blue in the image show that the nebula contains large amounts of hydrogen and oxygen.
On 26 October 2004, the NASA/ESA Cassini-Huygens spacecraft captured a false-color image of Titan, Saturn's largest moon. [8] The image was captured in ultraviolet and infrared wavelengths, both invisible to the human eye. [9] To provide a visual representation, false-color techniques were used: the infrared data was mapped to red and green, and the ultraviolet data to blue. [10]
A pseudocolor image (sometimes styled pseudo-color or pseudo color) is derived from a grayscale image by mapping each intensity value to a color according to a table or function. [11] Pseudo color is typically used when a single channel of data is available (e.g. temperature, elevation, soil composition, tissue type, and so on), in contrast to false color which is commonly used to display three channels of data. [4]
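Applying such a table to a single-channel image is a one-line array lookup in NumPy. In the sketch below the grayscale image is random placeholder data and the black-to-orange-to-white ramp is an illustrative palette, not a standard one:

```python
import numpy as np

# Hypothetical 8-bit grayscale image (e.g. temperatures scaled to 0-255).
rng = np.random.default_rng(1)
gray = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)

# 256-entry lookup table mapping each intensity to an RGB color:
# a black -> orange -> white ramp with monotonically increasing lightness.
levels = np.arange(256)
lut = np.stack([
    np.clip(levels * 2, 0, 255),        # red rises first
    np.clip(levels * 2 - 128, 0, 255),  # green follows
    np.clip(levels * 2 - 255, 0, 255),  # blue last
], axis=-1).astype(np.uint8)

pseudo = lut[gray]  # apply the table: (32, 32) -> (32, 32, 3)
print(pseudo.shape)  # (32, 32, 3)
```

The same mechanism works for any single-channel data source; only the table changes.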
Pseudocoloring can make some details more visible, as the perceived difference in color space is bigger than between successive gray levels alone. On the other hand, the color mapping function should be chosen so that the lightness of the color remains monotonic; otherwise the uneven change makes levels hard to interpret, for both normal and colorblind viewers. One offender is the commonly used "rainbow" palette, with its back-and-forth change in lightness. (See also Choropleth map § Color progression.) [12]
A typical example for the use of pseudo color is thermography (thermal imaging), where infrared cameras feature only one spectral band and show their grayscale images in pseudo color.
Another familiar example of pseudo color is the encoding of elevation using hypsometric tints in physical relief maps, where negative values (below sea level) are usually represented by shades of blue, and positive values by greens and browns.
Depending on the table or function used and the choice of data sources, pseudocoloring may increase the information content of the original image, for example by adding geographic information or combining information obtained from infrared or ultraviolet light, or from other sources like MRI scans. [13]
A further application of pseudocoloring is to present the results of image processing; that is, colors are changed to make an image easier to understand. [14]
Density slicing, a variation of pseudo color, divides an image into a few colored bands and is used (among others) in the analysis of remote sensing images. [15] For density slicing the range of grayscale levels is divided into intervals, with each interval assigned to one of a few discrete colors – this is in contrast to pseudo color, which uses a continuous color scale. [16] For example, in a grayscale thermal image the temperature values can be split into bands of 2 °C, each represented by one color – as a result, the temperature at a spot in the thermograph can be read off more easily, because the discernible differences between the discrete colors are greater than those of images with continuous grayscale or continuous pseudo color.
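The 2 °C banding described above can be sketched with NumPy's `digitize`; the thermal image, temperature range, and band colors below are all hypothetical:

```python
import numpy as np

# Hypothetical thermal image in degrees Celsius (values are illustrative).
rng = np.random.default_rng(2)
temps = rng.uniform(18.0, 26.0, size=(16, 16))

# Density slicing: split the 18-26 degC range into 2 degC bands and give
# each band one discrete color instead of a continuous scale.
edges = np.arange(18.0, 28.0, 2.0)                  # 18, 20, 22, 24, 26
band = np.clip(np.digitize(temps, edges) - 1, 0, 3)  # band index 0..3 per pixel
colors = np.array([                                  # one RGB color per band
    [0, 0, 255],      # 18-20 degC: blue
    [0, 255, 0],      # 20-22 degC: green
    [255, 255, 0],    # 22-24 degC: yellow
    [255, 0, 0],      # 24-26 degC: red
], dtype=np.uint8)
sliced = colors[band]  # (16, 16) -> (16, 16, 3)
print(sliced.shape)  # (16, 16, 3)
```

Continuous pseudo color differs only in using a table with many entries rather than four.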
A choropleth is an image or map in which areas are colored or patterned proportionally to the category or value of one or more variables being represented. The variables are mapped to a few colors; each area contributes one data point and receives one color from these selected colors. Basically, it is density slicing applied to a pseudocolor overlay. A choropleth map of a geographic area is thus an extreme form of false color.
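Per-area classification of this kind needs no image at all; a plain-Python sketch follows, where the region names, values, class breaks, and hex colors are all hypothetical:

```python
# Hypothetical per-region values (e.g. population density per km^2).
values = {"North": 12.0, "South": 87.5, "East": 45.3, "West": 240.0}

# A few class breaks and one color per class, as in density slicing.
breaks = [50.0, 100.0, 250.0]                            # class upper bounds
palette = ["#ffffcc", "#a1dab4", "#41b6c4", "#225ea8"]   # light -> dark

def classify(value):
    """Return the color of the first class whose upper bound exceeds value."""
    for i, upper in enumerate(breaks):
        if value < upper:
            return palette[i]
    return palette[-1]  # values beyond the last break get the darkest color

choropleth = {region: classify(v) for region, v in values.items()}
print(choropleth)
```

Each region then renders in its assigned color, giving the familiar shaded-area map.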
While artistic rendition lends itself to subjective expression of color, Andy Warhol (1928–1987) became a culturally significant figure of the modern art movement by creating false-color paintings with screen printing techniques. Some of Warhol's most recognizable prints include a replication of Marilyn Monroe, her image based on a film frame from the movie Niagara. The subject was a sex symbol and film noir starlet whose death in 1962 influenced the artist. A series of prints was made with endearment, yet exposes her persona as an illusion through his assembly-line style of art production, rendering the images non-erotic and slightly grotesque. [17] Using various ink color palettes, Warhol immersed himself in a process of repetition that serves to compare personas and everyday objects to the qualities of mass production and consumerism. [18] The ink colors were selected through aesthetic experimentation and do not correlate to the false-color rendering of the electromagnetic spectrum employed in remote sensing image processing. For years the artist continued screen printing false-color images of Marilyn Monroe; perhaps his most referenced work is Turquoise Marilyn, [19] which was bought in May 2007 by a private collector for 80 million US dollars. [20]