False color

Moon Crescent - False Color Mosaic.jpg
A mosaic constructed from a series of 53 images taken through three spectral filters by Galileo's imaging system as it flew over the northern regions of the Moon in December 1992.
MSU-MR-Meteor-M2-2.png
A false-color image from the Meteor M2-2 satellite's imager MSU-MR. The image was received by an amateur radio station and is derived from the HRPT data.

False color (or pseudo color) refers to a group of color rendering methods used to display images in color which were recorded in the visible or non-visible parts of the electromagnetic spectrum. A false-color image is an image that depicts an object in colors that differ from those a photograph (a true-color image) would show. In such an image, colors may be assigned to wavelengths that human eyes cannot normally see.

In addition, variants of false color such as pseudocolor, density slicing, and choropleths are used for information visualization of either data gathered by a single grayscale channel or data not depicting parts of the electromagnetic spectrum (e.g. elevation in relief maps or tissue types in magnetic resonance imaging).

Types of color renderings

True color

The concept behind true color can help in understanding false color. An image is called a true-color image when it offers a natural color rendition, or when it comes close to it. This means that the colors of an object in an image appear to a human observer the same way as if this same observer were to directly view the object: A green tree appears green in the image, a red apple red, a blue sky blue, and so on. [1]

Two Landsat satellite images showing the same region:
Chesapeake Bay and the city of Baltimore [2]
True-color-image.png
This true-color image shows the area in actual colors, e.g., the vegetation appears in green. It covers the full visible spectrum using the red, green and blue-green spectral bands of the satellite mapped to the RGB color space of the image.
False-color-image.png
The same area as a false-color image using the near infrared, red and green spectral bands mapped to RGB – this image shows vegetation in a red tone, as vegetation reflects most light in the near infrared.
Burns cliff.jpg
Burns Cliff inside of Endurance crater on Mars. The color is approximate true color because, instead of the red spectral band, infrared was used. The result is a metameric failure in the color of the sky, which is slightly green in the image – had a human observer been present, that person would have perceived the actual sky color as slightly more orange. The Opportunity rover that captured this image does have a red filter, but it is often not used, due to the higher scientific value of images captured using the infrared band and the constraints of data transmission.

Absolute true-color rendering is impossible. [3] There are three major sources of color error (metameric failure).

The result of a metameric failure would be, for example, an image of a green tree which shows a different shade of green than the tree itself, a different shade of red for a red apple, a different shade of blue for the blue sky, and so on. Color management (e.g. with ICC profiles) can be used to mitigate this problem within the physical constraints.

Approximate true-color images gathered by spacecraft are an example where images have a certain amount of metameric failure, as the spectral bands of a spacecraft's camera are chosen to gather information on the physical properties of the object under investigation, and are not chosen to capture true-color images. [3]

MarsEndurance.jpg
This approximate true-color panorama shows the impact crater Endurance on Mars. It was taken by the panoramic camera on the Opportunity rover and is a composite of a total of 258 images taken in the 480, 530 and 750 nanometer spectral bands (blue-green, green and near infrared).

False color

Lasvegas.terra.1500pix.jpg
A traditional false-color satellite image of Las Vegas. Grass-covered land (e.g. a golf course) appears in red.

In contrast to a true-color image, a false-color image sacrifices natural color rendition in order to ease the detection of features that are not readily discernible otherwise – for example the use of near infrared for the detection of vegetation in satellite images. [1] While a false-color image can be created using solely the visual spectrum (e.g. to accentuate color differences), typically some or all data used is from electromagnetic radiation (EM) outside the visual spectrum (e.g. infrared, ultraviolet or X-ray). The choice of spectral bands is governed by the physical properties of the object under investigation.

As the human eye uses three spectral bands (see trichromacy for details), three spectral bands are commonly combined into a false-color image. At least two spectral bands are needed for a false-color encoding, [4] and it is possible to combine more bands into the three visual RGB bands – with the eye's ability to discern three channels being the limiting factor. [5] In contrast, a "color" image made from one spectral band, or an image made from data consisting of non-EM data (e.g. elevation, temperature, tissue type) is a pseudocolor image (see below).

For true color, the RGB channels (red "R", green "G" and blue "B") from the camera are mapped to the corresponding RGB channels of the image, yielding an "RGB→RGB" mapping. For false color this relationship is changed. The simplest false-color encoding is to take an RGB image in the visible spectrum, but map it differently, e.g. "GBR→RGB". For traditional false-color satellite images of Earth a "NRG→RGB" mapping is used, with "N" being the near-infrared spectral band (and the blue spectral band being unused) – this yields the typical "vegetation in red" false-color images. [1] [6]
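The "NRG→RGB" mapping described above can be sketched in a few lines of NumPy. This is an illustrative example, not actual satellite processing: the tiny band arrays and the simple per-band linear stretch are assumptions chosen to show the channel assignment.

```python
import numpy as np

def false_color(nir, red, green):
    """Build a traditional "vegetation in red" false-color composite.

    Maps near infrared -> R, red -> G, green -> B ("NRG -> RGB").
    Inputs are 2-D arrays of reflectance values; the blue band is unused.
    """
    def stretch(band):
        # Normalize each band to 0..1 for display (simple linear stretch).
        band = band.astype(float)
        lo, hi = band.min(), band.max()
        return (band - lo) / (hi - lo) if hi > lo else np.zeros_like(band)

    return np.dstack([stretch(nir), stretch(red), stretch(green)])

# Hypothetical 2x2 reflectance samples: vegetation reflects strongly in
# near infrared, so it dominates the red channel of the composite.
nir = np.array([[0.6, 0.1], [0.5, 0.1]])
red = np.array([[0.1, 0.3], [0.1, 0.3]])
green = np.array([[0.2, 0.3], [0.2, 0.3]])
rgb = false_color(nir, red, green)   # shape (2, 2, 3)
```

A "GBR→RGB" encoding would differ only in which arrays are passed to np.dstack, which is why the mapping rather than the sensor defines a false-color image.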

False color is used, among other applications, for satellite and space images: examples are remote sensing satellites (e.g. Landsat, see example above), space telescopes (e.g. the Hubble Space Telescope) and space probes (e.g. Cassini-Huygens). Some spacecraft, with rovers (e.g. the Mars Science Laboratory Curiosity) being the most prominent examples, have the ability to capture approximate true-color images as well. [3] Weather satellites, in contrast to the spacecraft mentioned previously, produce grayscale images from the visible or infrared spectrum.

Examples for the application of false color:
Daedelus comparison, remote sensing in precision farming (rotated).jpg
These three false-color images demonstrate the application of remote sensing in precision agriculture: The left image shows vegetation density and the middle image presence of water (greens / blue for wet soil and red for dry soil). The right image shows where crops are under stress, as is particularly the case in fields 120 and 119 (indicated by red and yellow pixels). These fields were due to be irrigated the following day.
Sig05-016.jpg
This false-color composite image of the spiral galaxy Messier 66 combines four infrared spectral bands from 3.6 to 8.0 micrometers. The contribution from starlight (measured at 3.6 micrometers) has been subtracted from the 5.8 and 8 micrometer bands to enhance the visibility of the polycyclic aromatic hydrocarbon emissions.
Eagle Nebula - GPN-2000-000987.jpg
This iconic picture of the Eagle Nebula is false-color, as can be inferred from the pink stars. Three pictures were taken by the Hubble Space Telescope: the first picking up light emitted by sulfur ions (arbitrarily assigned to the color red), the second by hydrogen (green), and the third by oxygen ions (blue). The actual color of the nebula is unknown, but if one viewed it from a distance at which the 1-light-year-long "pillars" were similarly visible, it would probably appear a nearly uniform brownish grey to human eyes.

False color has a range of scientific applications. Spacecraft often employ false-color methods to help understand the composition of structures in the universe such as nebulae and galaxies. [7] The frequencies of light emitted by different ions in space are assigned contrasting colors, allowing the chemical composition of complex structures to be better separated and visualised. The image of the Eagle Nebula above is a typical example of this: the hydrogen and oxygen ions have been assigned green and blue respectively, and the large amounts of green and blue in the image show that the nebula contains large amounts of hydrogen and oxygen.

On 26 October 2004, the NASA/ESA Cassini-Huygens spacecraft captured a false-color image of Titan, Saturn's largest moon. [8] The image was captured in ultraviolet and infrared wavelengths, both invisible to the human eye. [9] To provide a visual representation, false-color techniques were used: the infrared data was mapped to the red and green channels, and the ultraviolet data to blue. [10]

Pseudocolor

A pseudocolor image (sometimes styled pseudo-color or pseudo color) is derived from a grayscale image by mapping each intensity value to a color according to a table or function. [11] Pseudo color is typically used when a single channel of data is available (e.g. temperature, elevation, soil composition, tissue type, and so on), in contrast to false color which is commonly used to display three channels of data. [4]

Pseudocoloring can make some details more visible, as the perceived difference in color space is bigger than between successive gray levels alone. On the other hand, the color mapping function should be chosen so that the lightness of the colors remains monotonic; otherwise the uneven change makes it hard to interpret levels, for both normal and colorblind viewers. One offender is the commonly used "rainbow" palette, with its back-and-forth changes in lightness. (See also Choropleth map § Color progression.) [12]
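A minimal sketch of such a mapping, assuming a hypothetical three-anchor palette whose lightness increases monotonically (dark blue to magenta to pale yellow); real applications would normally use an established perceptually uniform colormap rather than hand-picked anchors:

```python
import numpy as np

# Illustrative anchor colors with monotonically increasing lightness
# (dark blue -> magenta -> pale yellow), avoiding the "rainbow" problem.
ANCHORS = np.array([
    [0.05, 0.05, 0.30],   # dark blue   (low values)
    [0.70, 0.20, 0.55],   # magenta     (middle values)
    [1.00, 0.95, 0.70],   # pale yellow (high values)
])

def pseudocolor(gray):
    """Map a grayscale image (values 0..1) to RGB by linear
    interpolation between the anchor colors."""
    gray = np.clip(gray, 0.0, 1.0)
    positions = np.linspace(0.0, 1.0, len(ANCHORS))
    channels = [np.interp(gray, positions, ANCHORS[:, c]) for c in range(3)]
    return np.dstack(channels)

# A tiny single-channel image (e.g. normalized temperatures).
img = np.array([[0.0, 0.5], [0.75, 1.0]])
rgb = pseudocolor(img)   # shape (2, 2, 3)
```

Because the anchors' lightness rises monotonically, higher data values always map to lighter colors, so levels remain interpretable even for colorblind viewers or in a grayscale printout.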

A typical example for the use of pseudo color is thermography (thermal imaging), where infrared cameras feature only one spectral band and show their grayscale images in pseudo color.

Examples of encoding temperature with pseudo color:
Passivhaus thermogram gedaemmt ungedaemmt.png
Thermogram of a passive house in the foreground and a traditional building in the background. There is a color to temperature key on the right.
ParowozIR.jpg
Thermal image of a steam locomotive using pseudocolor encoding – yellow/white indicates hot and red/violet indicates cool.
Stsheat.jpg
This pseudocolor image shows the results of a computer simulation of temperatures during Space Shuttle reentry. Areas reaching 3,000 °F (1,650 °C) can be seen in yellow.

Another familiar example of pseudo color is the encoding of elevation using hypsometric tints in physical relief maps, where negative values (below sea level) are usually represented by shades of blue, and positive values by greens and browns.

Examples of encoding elevation with pseudo color:
Pacific elevation.jpg
An elevation map of the Pacific Ocean, showing ocean floor in shades of blue and land in greens and browns.
Kasei Valles topo.jpg
This color-coded elevation relief map indicates the result of floods on Mars. There is a color to elevation key on the bottom.
Moon worldwind.jpg
The Moon with hypsometric tints of red for the highest points and purple for the lowest.

Depending on the table or function used and the choice of data sources, pseudocoloring may increase the information content of the original image, for example adding geographic information, combining information obtained from infrared or ultraviolet light, or other sources like MRI scans. [13]

Examples of overlaying additional information with pseudo color:
Moon Crescent - False Color Mosaic.jpg
This image shows compositional variations of the Moon overlaid as pseudo color.
MR Knee.jpg
A grayscale MRI of a knee – different gray levels indicate different tissue types, requiring a trained eye.
Knee MRI 113035 rgbcb.png
A pseudocolor MRI of a knee created using three different grayscale scans – tissue types are easier to discern through pseudo color.

A further application of pseudocoloring is to store the results of image elaboration, that is, to change the colors in order to make an image easier to understand. [14]

Density slicing

Tasmania 27nov81.png
An image of Tasmania and surrounding waters using density slicing to show phytoplankton concentration. The ocean color as captured by the satellite image is mapped to seven colors: yellow, orange and red indicate more phytoplankton, while light green, dark green, light blue and dark blue indicate less phytoplankton; land and clouds are depicted in different colors.

Density slicing, a variation of pseudo color, divides an image into a few colored bands and is used, among other applications, in the analysis of remote sensing images. [15] For density slicing the range of grayscale levels is divided into intervals, with each interval assigned to one of a few discrete colors – this is in contrast to pseudo color, which uses a continuous color scale. [16] For example, in a grayscale thermal image the temperature values can be split into bands of 2 °C, each represented by one color – as a result the temperature of a spot in the thermograph can be read more easily, because the discernible differences between the discrete colors are greater than those of images with continuous grayscale or continuous pseudo color.
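The 2 °C banding described above can be sketched as follows; the temperature range, step size and sample values are illustrative assumptions:

```python
import numpy as np

def density_slice(temps, t_min, t_max, step=2.0):
    """Divide a grayscale temperature image into discrete bands of
    `step` degrees and return one band index per pixel.

    A display routine would look each index up in a small table of
    clearly distinguishable colors.
    """
    edges = np.arange(t_min + step, t_max, step)   # interior band edges
    return np.digitize(temps, edges)

# Hypothetical 2x2 thermogram in deg C, sliced into 2 deg C bands
# over the range 18..26 deg C.
thermo = np.array([[18.3, 19.9], [21.4, 25.0]])
bands = density_slice(thermo, t_min=18.0, t_max=26.0)
```

Here 18.3 °C and 19.9 °C fall into the same band and so would share one color, while 21.4 °C and 25.0 °C land in separate bands; this is the discrete counterpart of the continuous interpolation used for ordinary pseudo color.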

Choropleth

2004US election map.svg
The 2004 United States presidential election, visualized using a choropleth map. Support for the Republican and Democratic candidates is shown in shades of the parties' respective traditional colors, red and blue.

A choropleth is an image or map in which areas are colored or patterned proportionally to the category or value of one or more variables being represented. The variables are mapped to a few colors; each area contributes one data point and receives one color from the selected set. In essence, it is density slicing applied to a pseudocolor overlay; a choropleth map of a geographic area is thus an extreme form of false color.

False color in the arts

While artistic rendition lends itself to subjective expression of color, Andy Warhol (1928–1987) became a culturally significant figure of the modern art movement by creating false-color paintings with screen printing techniques. Some of Warhol's most recognizable prints include a replication of Marilyn Monroe, her image based on a film frame from the movie Niagara. The subject was a sex symbol and film noir starlet whose death in 1962 influenced the artist. The series of prints was made with endearment, yet its assembly-line style of production, non-erotic and slightly grotesque, exposes her persona as an illusion. [17] Using various ink color palettes, Warhol immersed himself in a process of repetition that serves to compare personas and everyday objects to the qualities of mass production and consumerism. [18] The ink colors were selected through aesthetic experimentation and do not correlate to the false-color rendering of the electromagnetic spectrum employed in remote sensing image processing. For years the artist continued screen printing false-color images of Marilyn Monroe, perhaps his most referenced work being Turquoise Marilyn, [19] which was bought in May 2007 by a private collector for 80 million US dollars. [20]


References

  1. "Principles of Remote Sensing - Centre for Remote Imaging, Sensing and Processing, CRISP". www.crisp.nus.edu.sg. Retrieved 1 September 2012.
  2. "The Landsat 7 Compositor". landsat.gsfc.nasa.gov. 21 March 2011. Archived from the original on 21 September 2013. Retrieved 1 September 2012.
  3. Nancy Atkinson (1 October 2007). "True or False (Color): The Art of Extraterrestrial Photography". www.universetoday.com. Retrieved 1 September 2012.
  4. "NGC 3627 (M66) - NASA Spitzer Space Telescope Collection". www.nasaimages.org. 15 September 2005. Archived from the original on 1 September 2011. Retrieved 1 September 2012.
  5. GDSC, Nationaal Lucht- en Ruimtevaartlaboratorium (National Laboratory of Air and Space Transport), Netherlands. "Band combinations". Archived from the original on 17 August 2012.
  6. "The Truth About Hubble, JWST, and False Color". NASA Blueshift. Retrieved 9 March 2022.
  7. JPL, Carolina Martinez. "NASA - First Close Encounter of Saturn's Hazy Moon Titan". www.nasa.gov. Retrieved 9 March 2022.
  8. Hadhazy, Adam. "What are the limits of human vision?". www.bbc.com. Retrieved 9 March 2022.
  9. "NASA - Titan in False Color". www.nasa.gov. Retrieved 9 March 2022.
  10. "Pseudocolor Filter for VirtualDub". Neuron2.net. Archived from the original on 11 June 2010. Retrieved 1 September 2012.
  11. Stauffer, Reto. "Somewhere over the Rainbow". HCL Wizard. Retrieved 14 August 2019.
  12. Leonid I. Dimitrov (1995). Kim, Yongmin (ed.). "Pseudo-colored visualization of EEG-activities on the human cortex using MRI-based volume rendering and Delaunay interpolation". Medical Imaging 1995: Image Display. 2431: 460–469. Bibcode:1995SPIE.2431..460D. CiteSeerX 10.1.1.57.308. doi:10.1117/12.207641. S2CID 13315449. Archived from the original on 6 July 2011. Retrieved 18 March 2009.
  13. Setchell, C. J.; Campbell, N. W. (July 1999). "Using colour Gabor texture features for scene understanding" (PDF). 7th International Conference on Image Processing and its Applications. Vol. 1999. pp. 372–376. doi:10.1049/cp:19990346. ISBN 0-85296-717-9. S2CID 15972743.
  14. John Alan Richards; Xiuping Jia (2006). Remote Sensing Digital Image Analysis: An Introduction (4th ed.). Birkhäuser. pp. 102–104. ISBN 9783540251286. Retrieved 26 July 2015.
  15. J. B. Campbell, "Introduction to Remote Sensing", 3rd ed., Taylor & Francis, p. 153.
  16. Wood, Paul (2004). Varieties of Modernism. London, United Kingdom: Yale University Press. pp. 339–341, 354. ISBN 978-0-300-10296-3.
  17. "Gold Marilyn Monroe". Museum of Modern Art. Archived from the original on 13 June 2014. Retrieved 9 June 2014.
  18. Fallon, Michael (2011). How to Analyze the Works of Andy Warhol. North Mankato, Minnesota, United States of America: ABDO Publishing Company. pp. 44–46. ISBN 978-1-61613-534-8.
  19. Vogel, Carol (25 May 2007). "Inside Art". The New York Times. Retrieved 9 June 2014.