Airborne Real-time Cueing Hyperspectral Enhanced Reconnaissance, also known by the acronym ARCHER, is an aerial imaging system that produces ground images far more detailed than the naked eye or ordinary aerial photography can provide. [1] According to U.S. Government officials, it is the most sophisticated unclassified hyperspectral imaging system available. [2] ARCHER can automatically scan detailed imagery for a given spectral signature of the object being sought (such as a missing aircraft), [3] for anomalies in the surrounding area, or for changes from previously recorded spectral signatures. [4]
It has direct applications for search and rescue, counterdrug, disaster relief and impact assessment, and homeland security, and has been deployed by the Civil Air Patrol (CAP) in the US on the Australian-built Gippsland GA8 Airvan fixed-wing aircraft. [2] CAP, the civilian auxiliary of the United States Air Force, is a volunteer education and public-service non-profit organization that conducts aircraft search and rescue in the US.
ARCHER is a daytime non-invasive technology, which works by analyzing an object's reflected light. It cannot detect objects at night, underwater, under dense cover, underground, under snow or inside buildings. [5] The system uses a special camera facing down through a quartz glass portal in the belly of the aircraft, which is typically flown at a standard mission altitude of 2,500 feet (760 meters) and 100 knots (50 meters/second) ground speed. [6]
The system software was developed by Space Computer Corporation of Los Angeles and the system hardware is supplied by NovaSol Corp. of Honolulu, Hawaii specifically for CAP. [5] [7] The ARCHER system is based on hyperspectral technology research and testing previously undertaken by the United States Naval Research Laboratory (NRL) and Air Force Research Laboratory (AFRL). [7] CAP developed ARCHER in cooperation with the NRL, AFRL and the United States Coast Guard Research & Development Center in the largest interagency project CAP has undertaken in its 74-year history. [8]
Since 2003, almost US$5 million authorized under the 2002 Defense Appropriations Act has been spent on development and deployment. [5] As of January 2007, CAP reported completing the initial deployment of 16 aircraft throughout the U.S. and training over 100 operators, but had used the system on only a few search and rescue missions and had not credited it with being the first to find any wreckage. [9] In searches in Georgia and Maryland during 2007, ARCHER located the aircraft wreckage, but both accidents had no survivors, according to Col. Drew Alexa, director of advanced technology and ARCHER program manager at CAP. [1] An ARCHER-equipped aircraft from the Utah Wing of the Civil Air Patrol was used in the search for adventurer Steve Fossett in September 2007. [3] [10] ARCHER did not locate Mr. Fossett, but it was instrumental in uncovering eight previously uncharted crash sites in the high desert of Nevada, [11] [12] some decades old. [13] [14]
Col. Alexa described the system to the press in 2007: "The human eye sees basically three bands of light. The ARCHER sensor sees 50. It can see things that are anomalous in the vegetation such as metal or something from an airplane wreckage." [1] Major Cynthia Ryan of the Nevada Civil Air Patrol, also describing the system to the press in 2007, stated, "ARCHER is essentially something used by the geosciences. It's pretty sophisticated stuff … beyond what the human eye can generally see." [15] She elaborated further: "It might see boulders, it might see trees, it might see mountains, sagebrush, whatever, but it goes 'not that' or 'yes, that'. The amazing part of this is that it can see as little as 10 per cent of the target, and extrapolate from there." [16]
In addition to the primary search and rescue mission, CAP has tested additional uses for ARCHER. [17] For example, an ARCHER-equipped CAP GA8 was used in a pilot project in Missouri in August 2005 to assess the suitability of the system for tracking hazardous material releases into the environment, [18] and one was deployed to track oil spills in the aftermath of Hurricane Rita in Texas during September 2005. [19]
The system proved its usefulness again in October 2006, when it found the wreckage of a flight that had originated in Missouri near Antlers, Oklahoma. [20] The National Transportation Safety Board was extremely pleased with the data ARCHER provided, which was later used to locate aircraft debris spread over miles of rough, wooded terrain. In July 2007, the ARCHER system identified a flood-borne oil spill originating at a Kansas oil refinery that had extended downstream and invaded previously unsuspected reservoir areas. [21] The client agencies (EPA, Coast Guard, and other federal and state agencies) found the data essential to quick remediation. In September 2008, a Civil Air Patrol GA-8 from the Texas Wing searched for an aircraft missing from Arkansas. The wreckage was found in Oklahoma, identified simultaneously by ground searchers and the overflying ARCHER system. Rather than a direct find, this was a validation of the system's accuracy and efficacy: in the subsequent recovery, ARCHER was found to have plotted the debris area with great accuracy. [citation needed]
The major ARCHER subsystem components include: [6]
The passive hyperspectral imaging spectroscopy remote sensor observes a target in multi-spectral bands. The HSI camera separates the image spectra into 52 "bins" from 500 nanometers (nm) wavelength at the blue end of the visible spectrum to 1100 nm in the infrared, giving the camera a spectral resolution of 11.5 nm. [22] Although ARCHER records data in all 52 bands, the computational algorithms only use the first 40 bands, from 500 nm to 960 nm because the bands above 960 nm are too noisy to be useful. [23] For comparison, the normal human eye will respond to wavelengths from approximately 400 to 700 nm, [24] and is trichromatic, meaning the eye's cone cells only sense light in three spectral bands.
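The stated figures imply the band layout directly. As a quick illustrative check (the arithmetic is ours, not from the cited sources), the bin width and the number of usable bands below the 960 nm noise cutoff can be derived as follows:

```python
# Sketch (not from ARCHER documentation): band layout implied by the
# figures above -- 52 bins spanning 500 nm to 1100 nm.
NUM_BANDS = 52
LO_NM, HI_NM = 500.0, 1100.0

width = (HI_NM - LO_NM) / NUM_BANDS          # spectral resolution per bin
centers = [LO_NM + (i + 0.5) * width for i in range(NUM_BANDS)]

print(round(width, 1))                       # ~11.5 nm, the stated resolution
usable = [c for c in centers if c < 960.0]   # bands below the noisy region
print(len(usable))                           # 40 bands used by the algorithms
```

The count of bands with centers below 960 nm reproduces the 40 usable bands mentioned in the text.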
As the ARCHER aircraft flies over a search area, reflected sunlight is collected by the HSI camera lens. The collected light passes through a set of lenses that focus it to form an image of the ground. The imaging system uses a pushbroom approach to image acquisition: a narrow slit at the focal plane reduces the image height to the equivalent of one vertical pixel, creating a horizontal line image.
The horizontal line image is then projected onto a diffraction grating, which is a very finely etched reflecting surface that disperses light into its spectra. The diffraction grating is specially constructed and positioned to create a two-dimensional (2D) spectrum image from the horizontal line image. The spectra are projected vertically, i.e., perpendicular to the line image, by the design and arrangement of the diffraction grating. [citation needed]
The 2D spectrum image projects onto a charge-coupled device (CCD) two-dimensional image sensor, which is aligned so that its horizontal pixels are parallel to the image's horizontal axis. As a result, the vertical pixels are coincident with the spectra produced by the diffraction grating. Each column of pixels receives the spectrum of one horizontal pixel from the original image. The arrangement of vertical pixel sensors in the CCD divides the spectrum into distinct, non-overlapping intervals. The CCD output consists of electrical signals for 52 spectral bands for each of 504 horizontal image pixels. [citation needed]
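The column-per-pixel layout described above can be sketched as follows. The shape (52 spectral bands by 504 spatial pixels) comes from the text; the helper function name and the synthetic frame data are ours, purely for illustration:

```python
# Illustrative sketch (names are ours, not NovaSol's): one CCD frame holds
# 52 spectral rows by 504 spatial columns; each column is the spectrum of
# one ground pixel along the scan line.
BANDS, PIXELS = 52, 504

# Synthetic frame: each value encodes (band, pixel) so the indexing is checkable.
frame = [[band * 1000 + pix for pix in range(PIXELS)] for band in range(BANDS)]

def spectrum_of(frame, pix):
    """Return the 52-band spectrum recorded for one ground pixel."""
    return [frame[band][pix] for band in range(BANDS)]

spec = spectrum_of(frame, 100)
print(len(spec))      # 52 samples, one per spectral band
```

Stacking successive frames along the flight direction yields the three-dimensional datacube (along-track position, cross-track position, wavelength) that hyperspectral processing operates on.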
The on-board computer records the CCD output signal at a frame rate of sixty times each second. At an aircraft altitude of 2,500 ft AGL and a speed of 100 knots, a 60 Hz frame rate equates to a ground image resolution of approximately one square meter per pixel. Thus, every frame captured from the CCD contains the spectral data for a ground swath that is approximately one meter long and 500 meters wide. [23]
A high-resolution imaging (HRI) black-and-white, or panchromatic, camera is mounted adjacent to the HSI camera to enable both cameras to capture the same reflected light. The HRI camera uses a pushbroom approach just like the HSI camera with a similar lens and slit arrangement to limit the incoming light to a thin, wide beam. However, the HRI camera does not have a diffraction grating to disperse the incoming reflected light. Instead, the light is directed to a wider CCD to capture more image data. Because it captures a single line of the ground image per frame, it is called a line scan camera. The HRI CCD is 6,144 pixels wide and one pixel high. It operates at a frame rate of 720 Hz. At ARCHER search speed and altitude (100 knots over the ground at 2,500 ft AGL) each pixel in the black-and-white image represents a 3 inch by 3 inch area of the ground. This high resolution adds the capability to identify some objects. [23]
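The two ground resolutions quoted above (roughly one meter per HSI pixel at 60 Hz, and about 3 inches per HRI pixel at 720 Hz) follow from the stated speed and frame rates. A back-of-envelope check (the arithmetic is ours):

```python
# Along-track ground distance covered per frame at 100 knots ground speed.
KNOT_MS = 0.5144          # metres per second in one knot
speed = 100 * KNOT_MS     # ~51.4 m/s

hsi_gsd = speed / 60      # HSI camera at 60 frames per second
hri_gsd = speed / 720     # HRI line-scan camera at 720 frames per second

print(round(hsi_gsd, 2))         # ~0.86 m, i.e. roughly one metre
print(round(hri_gsd * 100, 1))   # ~7.1 cm, close to the quoted 3 inches (7.6 cm)
```

The small discrepancies against the round figures in the text are consistent with the sources quoting approximate values.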
A monitor in the cockpit displays detailed images in real time, and the system also logs the image and Global Positioning System data at a rate of 30 gigabytes (GB) per hour for later analysis. [1] The on-board data processing system performs numerous real-time processing functions including data acquisition and recording, raw data correction, target detection, cueing and chipping, precision image geo-registration, and display and dissemination of image products and target cue information. [25]
ARCHER has three methods for locating targets:
In change detection, scene changes are identified, and new, moved, or departed targets are highlighted for evaluation. [2] In anomaly detection, the system builds a statistical model of all the pixels in a scene and flags pixels that are unlikely to fit it. In spectral signature matching, the system can be programmed with the parameters of a missing aircraft, such as paint colors, to alert the operators to possible wreckage. [3] It can also be used to look for specific materials, such as petroleum products or other chemicals released into the environment, [18] or even ordinary items like commonly available blue polyethylene tarpaulins. In an impact assessment role, information on the location of blue tarps used to temporarily repair storm-damaged buildings can help direct disaster relief efforts; in a counterdrug role, a blue tarp located in a remote area could be associated with illegal activity. [27]
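CAP has not published ARCHER's matching algorithms, but spectral signature matching of this kind is commonly implemented with measures such as the spectral angle, sketched below. The function and all reflectance values here are hypothetical illustrations, not ARCHER's actual method:

```python
import math

# Hedged sketch of one common signature-matching measure, the spectral
# angle: a small angle means a pixel's spectrum points in the same
# "direction" as the target signature, regardless of overall brightness.
def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

signature = [0.2, 0.5, 0.9, 0.4]   # hypothetical target reflectances
bright    = [0.4, 1.0, 1.8, 0.8]   # same material, twice as brightly lit
other     = [0.9, 0.5, 0.2, 0.1]   # a different material

print(spectral_angle(signature, bright) < spectral_angle(signature, other))  # True
```

Because the angle ignores vector length, the same paint registers as a match whether it sits in full sun or partial shade, which is one reason angle-based measures are popular in hyperspectral search.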
A charge-coupled device (CCD) is an integrated circuit containing an array of linked, or coupled, capacitors. Under the control of an external circuit, each capacitor can transfer its electric charge to a neighboring capacitor. CCD sensors are a major technology used in digital imaging.
A photodiode is a semiconductor diode sensitive to photon radiation, such as visible light, infrared or ultraviolet radiation, X-rays and gamma rays. It produces an electrical current when it absorbs photons. This can be used for detection and measurement applications, or for the generation of electrical power in solar cells. Photodiodes are used in a wide range of applications throughout the electromagnetic spectrum from visible light photocells to gamma ray spectrometers.
A digital camera, also called a digicam, is a camera that captures photographs in digital memory. Most cameras produced today are digital, largely replacing those that capture images on photographic film or film stock. Digital cameras are now widely incorporated into mobile devices like smartphones, with capabilities and features matching or exceeding those of dedicated cameras. High-end, high-definition dedicated cameras are still commonly used by professionals and those who desire to take higher-quality photographs.
Forward-looking infrared (FLIR) cameras, typically used on military and civilian aircraft, use a thermographic camera that senses infrared radiation.
Measurement and signature intelligence (MASINT) is a technical branch of intelligence gathering, which serves to detect, track, identify or describe the distinctive characteristics (signatures) of fixed or dynamic target sources. This often includes radar intelligence, acoustic intelligence, nuclear intelligence, and chemical and biological intelligence. MASINT is defined as scientific and technical intelligence derived from the analysis of data obtained from sensing instruments for the purpose of identifying any distinctive features associated with the source, emitter or sender, to facilitate the latter's measurement and identification.
A Bayer filter mosaic is a color filter array (CFA) for arranging RGB color filters on a square grid of photosensors. Its particular arrangement of color filters is used in most single-chip digital image sensors in digital cameras and camcorders to create a color image. The filter pattern is half green, one quarter red, and one quarter blue; depending on the starting corner, it is written BGGR, RGBG, GRBG, or RGGB.
Satellite images are images of Earth collected by imaging satellites operated by governments and businesses around the world. Satellite imaging companies sell images by licensing them to governments and businesses such as Apple Maps and Google Maps.
Multispectral imaging captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or detected with the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range. It can allow extraction of additional information the human eye fails to capture with its visible receptors for red, green and blue. It was originally developed for military target identification and reconnaissance. Early space-based imaging platforms incorporated multispectral imaging technology to map details of the Earth related to coastal boundaries, vegetation, and landforms. Multispectral imaging has also found use in document and painting analysis.
Spectral imaging is imaging that uses multiple bands across the electromagnetic spectrum. While an ordinary camera captures light across three wavelength bands in the visible spectrum, red, green, and blue (RGB), spectral imaging encompasses a wide variety of techniques that go beyond RGB. Spectral imaging may use the infrared, the visible spectrum, the ultraviolet, x-rays, or some combination of the above. It may include the acquisition of image data in visible and non-visible bands simultaneously, illumination from outside the visible range, or the use of optical filters to capture a specific spectral range. It is also possible to capture hundreds of wavelength bands for each pixel in an image.
An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.
Hyperspectral imaging collects and processes information from across the electromagnetic spectrum. The goal of hyperspectral imaging is to obtain the spectrum for each pixel in the image of a scene, with the purpose of finding objects, identifying materials, or detecting processes. There are three general types of spectral imagers: push broom scanners and the related whisk broom scanners, which read images over time; band sequential scanners, which acquire images of an area at different wavelengths; and snapshot hyperspectral imagers, which use a staring array to generate an image in an instant.
In digital imaging, a color filter array (CFA), or color filter mosaic (CFM), is a mosaic of tiny color filters placed over the pixel sensors of an image sensor to capture color information.
An imaging spectrometer is an instrument used in hyperspectral imaging and imaging spectroscopy to acquire a spectrally resolved image of an object or scene, usually to support analysis of the composition of the object being imaged. The spectral data produced for a pixel is often referred to as a datacube due to the three-dimensional representation of the data: two axes of the image correspond to vertical and horizontal distance, and the third to wavelength. The principle of operation is the same as that of the simple spectrometer, but special care is taken to avoid optical aberrations for better image quality.
Image fusion is the process of gathering the important information from multiple images and combining it into fewer images, usually a single one, that is more informative and accurate than any individual source image. The purpose of image fusion is not only to reduce the amount of data but also to construct images that are more appropriate and understandable for human and machine perception. In computer vision, multisensor image fusion is the process of combining relevant information from two or more images into a single image.
Electro-optical MASINT is a subdiscipline of Measurement and Signature Intelligence (MASINT) and refers to intelligence gathering activities which bring together disparate elements that do not fit within the definitions of Signals Intelligence (SIGINT), Imagery Intelligence (IMINT), or Human Intelligence (HUMINT).
Integral field spectrographs (IFS) combine spectrographic and imaging capabilities in the optical or infrared wavelength domains (0.32 μm – 24 μm) to obtain, from a single exposure, spatially resolved spectra over a two-dimensional region. The name originates from the fact that the measurements result from integrating the light over multiple sub-regions of the field. Developed initially for the study of astronomical objects, the technique is now also used in many other fields, such as biomedical science and Earth remote sensing. Integral field spectrography is part of the broader category of snapshot hyperspectral imaging techniques, itself a part of hyperspectral imaging.
PRISMA is an Italian Space Agency pre-operational and technology demonstrator mission focused on the development and delivery of hyperspectral products and the qualification of the hyperspectral payload in space.
Gaofen is a series of Chinese high-resolution Earth imaging satellites launched as part of the China High-resolution Earth Observation System (CHEOS) program. CHEOS is a state-sponsored, civilian Earth-observation program used for agricultural, disaster, resource, and environmental monitoring. Proposed in 2006 and approved in 2010, the CHEOS program consists of the Gaofen series of space-based satellites, near-space and airborne systems such as airships and UAVs, ground systems that conduct data receipt, processing, calibration, and taskings, and a system of applications that fuse observation data with other sources to produce usable information and knowledge.
Resurs-P is a series of Russian commercial Earth observation satellites capable of acquiring high-resolution hyperspectral (HSI), wide-field multispectral (MSI), and panchromatic imagery. These spacecraft cost over 5 billion rubles and are operated by Roscosmos, replacing the Resurs-DK No. 1 satellite.
Ralph is a science instrument aboard the robotic New Horizons spacecraft, which was launched in 2006. Ralph is a visible and infrared imager and spectrometer used to map the spacecraft's astronomical targets. It has two major subinstruments, LEISA and MVIC. MVIC stands for Multispectral Visible Imaging Camera and is a color imaging device, while LEISA originally stood for Linear Etalon Imaging Spectral Array and is an infrared imaging spectrometer for spaceflight. LEISA observes 250 discrete wavelengths of infrared light from 1.25 to 2.5 micrometers. MVIC is a pushbroom scanner type of design with seven channels, including red, blue, near-infrared (NIR), and methane.
ARCHER … is capable of panchromatic aerial imaging far more detailed than plain sight or ordinary photography can gather.
The most sophisticated unclassified HyperSpectral imaging system available.
According to CAP, a set of parameters describing the intended target, including its color and shape, is programmed into the ARCHER system.
HSI—Also referred to as 'ARCHER'. HSI is a passive sensor system that observes a target in multi-spectral bands.
HSI is a daytime non-invasive technology, which works by analyzing an object's reflected light.
ARCHER contains an advanced hyperspectral imaging (HSI) system and a panchromatic high-resolution imaging (HRI) camera.
Each ARCHER system consists of a NovaSol-designed, pushbroom, visible/near-infrared (VNIR) hyperspectral imaging (HSI) sensor, a co-boresighted visible panchromatic high-resolution imaging (HRI) sensor, and a CMIGITS-III GPS/INS unit in an integrated sensor assembly mounted inside the GA-8 cabin. ARCHER incorporates an on-board data processing system developed by Space Computer Corporation (SCC) to perform numerous real-time processing functions including data acquisition and recording, raw data correction, target detection, cueing and chipping, precision image geo-registration, and display and dissemination of image products and target cue information.
Dr. Paul Schuda (CAP) reported that the Civil Air Patrol has completed the initial deployment of sixteen Airborne Real-time Cueing Hyperspectral Enhanced Recon…
…a plane from the Utah Wing of the Civil Air Patrol arrived Wednesday with equipment capable of collecting hyperspectral and panchromatic images…
While the Archer has not located Mr. Fossett, it has been instrumental in the discovery of at least eight previously unnoticed plane wrecks in the rugged Sierra Nevada region.
...search parties have spotted wreckage of eight other airplanes that had been lost for years in and around the rugged mountains of western Nevada.
…another downed plane Friday that was spotted on a hillside about 45 miles southeast of Reno … turned out to be an old crash, a plane last registered in Oregon in 1975
…before the advent of high-tech search gadgets like the ARCHER imaging system, which can identify targets as small as a motorcycle from 2,500 feet away. One of the crash sites that the Fossett team discovered might date to 1964…
'ARCHER is essentially something used by the geosciences,' Ryan said. 'It's pretty sophisticated stuff … beyond what the human eye can generally see.'
'It might see boulders, it might see trees, it might see mountains, sagebrush, whatever, but it goes "not that" or "yes, that",' she said.
CAP has found many additional uses for ARCHER, including missions for homeland security, disaster assessment, and drug interdiction.
Missouri Department of Natural Resources ... is pursuing a Pilot Project to evaluate environmental applications of ARCHER ...
A Gippsland Aeronautics GA-8 'Airvan' … came to West Houston Airport to support the assessment of Hurricane Rita.
ARCHER incorporates an on-board data processing system developed by Space Computer Corporation (SCC) to perform numerous real-time processing functions…
…signature matching (matching reflected light to spectral signatures), anomaly detection (calculates a statistical model of all the pixels in the image to see if there is a probability that a pixel does not fit).
Tarp in a remote wooded area—useful for counter-drug applications.