False color

A mosaic constructed from a series of 53 images taken through three spectral filters by Galileo’s imaging system as it flew over the northern regions of the Moon in December 1992

False color (or pseudo color) refers to a group of color rendering methods used to display images in color which were recorded in the visible or non-visible parts of the electromagnetic spectrum. A false-color image is an image that depicts an object in colors that differ from those a photograph (a true-color image) would show.

In addition, variants of false color such as pseudocolor, density slicing, and choropleths are used for information visualization of either data gathered by a single grayscale channel or data not depicting parts of the electromagnetic spectrum (e.g. elevation in relief maps or tissue types in magnetic resonance imaging).

Types of color renderings

True color

The concept behind true color can help in understanding false color. An image is called a true-color image when it offers a natural color rendition, or when it comes close to it. This means that the colors of an object in an image appear to a human observer the same way as if this observer were to directly view the object: A green tree appears green in the image, a red apple red, a blue sky blue, and so on. [1] When applied to black-and-white images, true-color means that the perceived lightness of a subject is preserved in its depiction.

Two Landsat satellite images showing the same region:
Chesapeake Bay and the city of Baltimore [2]
This true-color image shows the area in actual colors, e.g., the vegetation appears in green. It covers the full visible spectrum using the red, green and blue / green spectral bands of the satellite mapped to the RGB color space of the image.
The same area as a false-color image using the near infrared, red and green spectral bands mapped to RGB – this image shows vegetation in a red tone, as vegetation reflects most light in the near infrared.
Burns Cliff inside Endurance crater on Mars. The color is approximate true color because infrared was used instead of the red spectral band. The result is a metameric failure in the color of the sky, which is slightly green in the image – a human observer present at the scene would have perceived the actual sky color as slightly more orange. The Opportunity rover, which captured this image, does have a red filter, but it is often not used, due to the higher scientific value of images captured using the infrared band and the constraints of data transmission.

Absolute true-color rendering is impossible. [3] The major sources of color error (metameric failure) are the spectral sensitivity of the camera, which differs from that of the human eye, and the color reproduction of output devices such as displays and printers.

In colorimetry, metamerism is a perceived matching of colors with different (nonmatching) spectral power distributions. Colors that match this way are called metamers.

The result of a metameric failure would be, for example, an image of a green tree that shows a different shade of green than the tree itself, a different shade of red for a red apple, a different shade of blue for the blue sky, and so on. Color management (e.g. with ICC profiles) can be used to mitigate this problem within the physical constraints.

Approximate true-color images gathered by spacecraft are an example of images with a certain amount of metameric failure, as the spectral bands of a spacecraft's camera are chosen to gather information on the physical properties of the object under investigation, not to capture true-color images. [3]

This approximate true-color panorama shows the impact crater Endurance on Mars. It was taken by the panoramic camera on the Opportunity rover and is a composite of a total of 258 images taken in the 480, 530 and 750 nanometer spectral bands (blue / green, green and near infrared).

False color

A traditional false-color satellite image of Las Vegas. Grass-covered land (e.g. a golf course) appears in red.

In contrast to a true-color image, a false-color image sacrifices natural color rendition in order to ease the detection of features that are not readily discernible otherwise – for example the use of near infrared for the detection of vegetation in satellite images. [1] While a false-color image can be created using solely the visual spectrum (e.g. to accentuate color differences), typically some or all data used is from electromagnetic radiation (EM) outside the visual spectrum (e.g. infrared, ultraviolet or X-ray). The choice of spectral bands is governed by the physical properties of the object under investigation.

As the human eye uses three spectral bands (see trichromacy for details), three spectral bands are commonly combined into a false-color image. At least two spectral bands are needed for a false-color encoding, [4] and it is possible to combine more bands into the three visual RGB bands – with the eye's ability to discern three channels being the limiting factor. [5] In contrast, a "color" image made from one spectral band, or an image made from non-EM data (e.g. elevation, temperature, tissue type), is a pseudocolor image (see below).

For true color, the RGB channels (red "R", green "G" and blue "B") from the camera are mapped to the corresponding RGB channels of the image, yielding an "RGB→RGB" mapping. For false color this relationship is changed. The simplest false-color encoding is to take an RGB image in the visible spectrum, but map it differently, e.g. "GBR→RGB". For traditional false-color satellite images of Earth an "NRG→RGB" mapping is used, with "N" being the near-infrared spectral band (and the blue spectral band being unused) – this yields the typical "vegetation in red" false-color images. [1] [6]
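The channel mappings described above can be sketched with NumPy (a minimal illustration using synthetic band data; real satellite bands would additionally need calibration and scaling):

```python
import numpy as np

# Synthetic spectral bands for a tiny 4x4 scene, scaled to 0..255
# (placeholder data standing in for calibrated satellite measurements).
rng = np.random.default_rng(0)
R, G, B, N = (rng.integers(0, 256, (4, 4), dtype=np.uint8) for _ in range(4))

# True color: "RGB -> RGB" - camera channels map straight to image channels.
true_color = np.dstack([R, G, B])

# Traditional false color: "NRG -> RGB" - near infrared drives the red
# channel, so vegetation (bright in near infrared) appears red; the
# blue band is unused.
false_color = np.dstack([N, R, G])
```

Any other permutation or substitution of bands (e.g. "GBR→RGB") is just a different choice of arrays in the final stack.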

False color is used (among others) for satellite and space images: examples are remote sensing satellites (e.g. Landsat, see example above), space telescopes (e.g. the Hubble Space Telescope) and space probes (e.g. Cassini-Huygens). Some spacecraft, with rovers (e.g. the Mars Science Laboratory Curiosity) being the most prominent examples, can capture approximate true-color images as well. [3] In contrast to the spacecraft mentioned previously, weather satellites produce grayscale images from the visible or infrared spectrum.

Examples for the application of false color:
These three false-color images demonstrate the application of remote sensing in precision agriculture: the left image shows vegetation density, and the middle image the presence of water (green and blue for wet soil, red for dry soil). The right image shows where crops are under stress, as is particularly the case in fields 120 and 119 (indicated by red and yellow pixels). These fields were due to be irrigated the following day.
This false-color composite image of the spiral galaxy Messier 66 combines four infrared spectral bands from 3.6 to 8.0 micrometers. The contribution from starlight (measured at 3.6 micrometers) has been subtracted from the 5.8 and 8.0 micrometer bands to enhance the visibility of the polycyclic aromatic hydrocarbon emissions.
This iconic picture of the Eagle Nebula is false color, as can be inferred from the pink stars. Three pictures were taken by the Hubble Space Telescope: the first picking up light at the emission frequency of sulfur ions (arbitrarily assigned to the color red), the second hydrogen (green), the third oxygen ions (blue). The actual color of the nebula is unknown, but viewed from a distance at which the 1-light-year-long "pillars" were similarly visible, it would probably appear a nearly uniform brownish grey to human eyes.


Pseudo color

A pseudocolor image (sometimes styled pseudo-color or pseudo color) is derived from a grayscale image by mapping each intensity value to a color according to a table or function. [7] Pseudo color is typically used when a single channel of data is available (e.g. temperature, elevation, soil composition, tissue type, and so on), in contrast to false color, which is commonly used to display three channels of data. [4]

Pseudocoloring can make some details more visible, as the perceived difference in color space is bigger than between successive gray levels alone. On the other hand, the color mapping function should be chosen so that the lightness of the colors remains monotonic; otherwise the uneven change makes levels hard to interpret, for both normal and colorblind viewers. One offender is the commonly used "rainbow" palette, with its back-and-forth changes in lightness. (See also Choropleth map § Color progression.) [8]
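The table-or-function mapping described above can be sketched as a lookup table indexed by pixel intensity (a minimal illustration; the black-to-blue-to-white palette is a hypothetical example chosen so that lightness increases monotonically):

```python
import numpy as np

# Synthetic 8-bit grayscale image (e.g. one channel of sensor data).
gray = np.array([[0, 64], [128, 255]], dtype=np.uint8)

# Hypothetical 256-entry lookup table: a black -> blue -> white ramp
# whose lightness increases monotonically with intensity.
levels = np.arange(256)
lut = np.stack([
    np.clip(2 * levels - 255, 0, 255),  # red rises in the upper half
    np.clip(2 * levels - 255, 0, 255),  # green rises in the upper half
    np.clip(2 * levels, 0, 255),        # blue rises from the start
], axis=1).astype(np.uint8)

# Pseudocoloring is just indexing the table with each pixel's intensity.
pseudo = lut[gray]  # shape (2, 2, 3): an RGB image
```

Swapping in a non-monotonic table (such as a rainbow palette) changes nothing in the mechanism, only the readability of the result.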

A typical example for the use of pseudo color is thermography (thermal imaging), where infrared cameras feature only one spectral band and show their grayscale images in pseudo color.

Examples of encoding temperature with pseudo color:
Thermogram of a passive house in the foreground and a traditional building in the background. Note the color-to-temperature key on the right.
Thermal image of a steam locomotive using pseudocolor encoding – yellow/white indicates hot and red/violet indicates cool.
This pseudocolor image shows the results of a computer simulation of temperatures during Space Shuttle reentry. Areas reaching 3,000 °F (1,650 °C) can be seen in yellow.

Another familiar example of pseudo color is the encoding of elevation using hypsometric tints in physical relief maps, where negative values (below sea level) are usually represented by shades of blue, and positive values by greens and browns.
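Hypsometric tinting can be sketched as a sign- and range-based color assignment (a minimal illustration; the thresholds and RGB tints below are arbitrary choices, not a standard palette):

```python
import numpy as np

# Hypothetical elevations in meters (negative values are below sea level).
elev = np.array([[-4200.0, -120.0], [350.0, 2600.0]])

# Hypothetical hypsometric tints: blue below sea level, green for low
# land, brown for high land (thresholds and colors are arbitrary).
tint = np.empty(elev.shape + (3,), dtype=np.uint8)
tint[elev < 0] = (70, 130, 180)                     # blue
tint[(elev >= 0) & (elev < 1000)] = (110, 170, 90)  # green
tint[elev >= 1000] = (140, 100, 60)                 # brown
```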

Examples of encoding elevation with pseudo color:
An elevation map of the Pacific Ocean, showing ocean floor in shades of blue and land in greens and browns.
This color-coded elevation relief map shows the results of flooding on Mars. Note the color-to-elevation key at the bottom.
The Moon with hypsometric tints of red for the highest points and purple for the lowest.

Depending on the table or function used and the choice of data sources, pseudocoloring may increase the information content of the original image, for example by adding geographic information or by combining information obtained from infrared or ultraviolet light or other sources such as MRI scans. [9]

Examples of overlaying additional information with pseudo color:
This image shows compositional variations of the Moon overlaid as pseudo color.
A grayscale MRI of a knee – different gray levels indicate different tissue types, requiring a trained eye.
A pseudocolor MRI of a knee created using three different grayscale scans – tissue types are easier to discern through pseudo color.

A further application of pseudocoloring is to store the results of image processing; that is, changing the colors in order to make an image easier to understand. [10]

Density slicing

An image of Tasmania and surrounding waters using density slicing to show phytoplankton concentration. The ocean color as captured by the satellite image is mapped to seven colors: Yellow, orange and red indicate more phytoplankton, while light green, dark green, light blue and dark blue indicate less phytoplankton; land and clouds are depicted in different colors.

Density slicing, a variation of pseudo color, divides an image into a few colored bands and is used (among others) in the analysis of remote sensing images. [11] For density slicing the range of grayscale levels is divided into intervals, with each interval assigned to one of a few discrete colors – this is in contrast to pseudo color, which uses a continuous color scale. [12] For example, in a grayscale thermal image the temperature values can be split into bands of 2 °C, with each band represented by one color – as a result the temperature of a spot in the thermograph can be read off more easily, because the discernible differences between the discrete colors are greater than those of images with continuous grayscale or continuous pseudo color.
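Density slicing reduces to binning the values and indexing a small palette (a minimal illustration; the 2 °C band edges follow the thermal example above, while the colors are arbitrary):

```python
import numpy as np

# Synthetic grayscale temperatures in degrees Celsius (hypothetical data).
temps = np.array([[18.2, 19.1], [21.7, 24.9]])

# Slice the 18-26 degree range into 2-degree bands...
edges = np.arange(18.0, 26.0, 2.0)      # 18, 20, 22, 24
band = np.digitize(temps, edges) - 1    # 0-based band index per pixel

# ...and assign one discrete color per band (an arbitrary example palette).
palette = np.array([
    [0, 0, 255],     # 18-20 degrees: blue
    [0, 255, 0],     # 20-22 degrees: green
    [255, 255, 0],   # 22-24 degrees: yellow
    [255, 0, 0],     # 24-26 degrees: red
], dtype=np.uint8)

sliced = palette[band]  # shape (2, 2, 3): a density-sliced color image
```

The only difference from continuous pseudo color is the size of the table: a handful of discrete entries instead of one entry per gray level.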


The US Presidential Election of 2004, visualised using a choropleth map.

Choropleth

A choropleth is an image or map in which areas are colored or patterned proportionally to the category or value of one or more variables being represented. The variables are mapped to a few colors; each area contributes one data point and receives one color from these selected colors. In essence, it is density slicing applied to a pseudocolor overlay. A choropleth map of a geographic area is thus an extreme form of false color.

False color in the arts

While artistic rendition lends itself to the subjective expression of color, Andy Warhol (1928–1987) became a culturally significant figure of the modern art movement by creating false-color paintings with screen printing techniques. Some of Warhol's most recognizable prints include a replication of Marilyn Monroe, her image based on a film frame from the movie Niagara. The subject was a sex symbol and film noir starlet whose death in 1962 influenced the artist. A series of prints was made with endearment, yet his assembly-line style of art production exposes her persona as an illusion, rendering the images non-erotic and slightly grotesque. [13] Using various ink color palettes, Warhol immersed himself in a process of repetition that serves to compare personas and everyday objects to the qualities of mass production and consumerism. [14] The colors of ink were selected through aesthetic experimentation and do not correlate to the false-color rendering of the electromagnetic spectrum employed in remote sensing image processing. For years the artist continued screen printing false-color images of Marilyn Monroe, perhaps his most referenced work being Turquoise Marilyn, [15] which was bought in May 2007 by a private collector for 80 million US dollars. [16]


References

  1. "Principles of Remote Sensing – Centre for Remote Imaging, Sensing and Processing, CRISP". www.crisp.nus.edu.sg. Retrieved 2012-09-01.
  2. "The Landsat 7 Compositor". landsat.gsfc.nasa.gov. 2011-03-21. Retrieved 2012-09-01.
  3. Nancy Atkinson (2007-10-01). "True or False (Color): The Art of Extraterrestrial Photography". www.universetoday.com. Retrieved 2012-09-01.
  4. "NGC 3627 (M66) - NASA Spitzer Space Telescope Collection". www.nasaimages.org. 2005-09-15. Archived from the original on 2011-09-01. Retrieved 2012-09-01.
  5. GDSC, Nationaal Lucht- en Ruimtevaartlaboratorium (National Laboratory of Air and Space Transport), Netherlands. "Band combinations". Archived from the original on 2012-08-17.
  6. "Pseudocolor Filter for VirtualDub". Neuron2.net. Archived from the original on 2010-06-11. Retrieved 2012-09-01.
  7. Stauffer, Reto. "Somewhere over the Rainbow". HCL Wizard. Retrieved 14 August 2019.
  8. Leonid I. Dimitrov (1995). "Pseudo-colored visualization of EEG-activities on the human cortex using MRI-based volume rendering and Delaunay interpolation". Medical Imaging 1995: Image Display. 2431: 460. Bibcode:1995SPIE.2431..460D. doi:10.1117/12.207641. Archived from the original on 2011-07-06. Retrieved 2009-03-18.
  9. C J Setchell; N W Campbell (July 1999). "Using Color Gabor Texture Features for Scene Understanding". 7th. International Conference on Image Processing and its Applications. University of Bristol. Retrieved 2009-03-18.
  10. John Alan Richards; Xiuping Jia (2006). Remote Sensing Digital Image Analysis: An Introduction (4th ed.). Birkhäuser. pp. 102–104. ISBN 9783540251286. Retrieved 2015-07-26.
  11. J. B. Campbell (2002). Introduction to Remote Sensing (3rd ed.). Taylor & Francis. p. 153.
  12. Wood, Paul (2004). Varieties of Modernism. London, United Kingdom: Yale University Press. pp. 339–341, 354. ISBN 978-0-300-10296-3.
  13. "Gold Marilyn Monroe". www.MoMa.org. Retrieved 9 June 2014.
  14. Fallon, Michael (2011). How to Analyze the Works of Andy Warhol. North Mankato, Minnesota: ABDO Publishing Company. pp. 44–46. ISBN 978-1-61613-534-8.
  15. Vogel, Carol (2007-05-25). "Inside Art". The New York Times. Retrieved 9 June 2014.