Pansharpening is a process of merging high-resolution panchromatic and lower-resolution multispectral imagery to create a single high-resolution color image. [1] Google Maps and nearly every map-making company use this technique to improve image quality. Pansharpening produces a high-resolution color image from three, four, or more low-resolution multispectral satellite bands plus a corresponding high-resolution panchromatic band:
Low-res color bands + High-res grayscale band = Hi-res color image
Such band combinations are commonly bundled in satellite data sets. Landsat 7, for example, includes six 30 m resolution multispectral bands, a 60 m resolution thermal infrared band, and a 15 m resolution panchromatic band. SPOT, GeoEye, and Maxar commercial data packages also commonly include both lower-resolution multispectral bands and a single panchromatic band. A principal reason for configuring satellite sensors this way is to keep satellite weight, cost, bandwidth, and complexity down. Pansharpening uses the spatial information in the high-resolution grayscale band and the color information in the multispectral bands to create a single high-resolution color image, essentially increasing the resolution of the color information in the data set to match that of the panchromatic band.
One common class of algorithms for pansharpening is called "component substitution", [2] which usually involves the following steps:
1. Upsampling the multispectral bands to the resolution of the panchromatic band
2. Forward transforming the upsampled image into a color space that separates an intensity component from the color components
3. Matching the panchromatic band to the intensity component, typically by histogram matching
4. Substituting the matched panchromatic band for the intensity component
5. Inverse transforming back to the original color space
Common color-space transformations used for pansharpening are HSI (hue-saturation-intensity) and YCbCr. The same steps can also be performed using wavelet decomposition or PCA, replacing the first component with the pan band.
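The substitution scheme above can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not a production implementation: it assumes the multispectral image has already been upsampled and co-registered to the panchromatic grid, uses the per-pixel band mean as a simple stand-in for the intensity component of an HSI transform, and approximates histogram matching by mean/variance normalization. The function name `pansharpen` is hypothetical.

```python
import numpy as np

def pansharpen(ms, pan):
    """Component-substitution pansharpening sketch.

    ms  -- (H, W, B) multispectral image, floats in [0, 1],
           already upsampled to the panchromatic resolution.
    pan -- (H, W) panchromatic band, floats in [0, 1].
    """
    # Intensity component: simple per-pixel mean over the bands
    # (a stand-in for the I channel of a full HSI transform).
    intensity = ms.mean(axis=2)
    # Crude histogram match: give the pan band the intensity
    # component's mean and spread.
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12)
    pan_matched = pan_matched * intensity.std() + intensity.mean()
    # Substitute the matched pan band for the intensity component by
    # rescaling every band with the ratio of new to old intensity.
    ratio = pan_matched / (intensity + 1e-12)
    return np.clip(ms * ratio[..., None], 0.0, 1.0)
```

In practice the intensity component comes from an explicit color-space transform, and the output inherits whatever spectral mismatch exists between the pan band and the estimated intensity, which is the source of the distortion discussed next.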
Pansharpening can introduce spectral distortion because of the nature of the panchromatic band; the Landsat panchromatic band, for example, is not sensitive to blue light. As a result, the spectral characteristics of the raw pansharpened color image may not exactly match those of the corresponding low-resolution RGB image, producing altered color tones. This has led to the development of many algorithms that attempt to reduce this spectral distortion and produce visually pleasing images.