CLidar

The CLidar is a scientific instrument used for measuring particulates (aerosols) in the lower atmosphere. CLidar stands for camera lidar; "lidar" is in turn a portmanteau of "light" and "radar". It is a form of remote sensing used in atmospheric physics.

Description

In this technique, a very wide-angle lens images light scattered from a laser beam onto a CCD (charge-coupled device) camera. The camera is positioned hundreds of meters away from the (usually vertically pointed) laser beam. The geometry of the CLidar is shown in the figure. It is important for the analysis that the optics, here the wide-angle lens, accurately map equal angles onto equal numbers of pixels throughout the 100-degree field of view.

[Figure: diagram of the CLidar geometry (CLidarDiagram2008.jpg)]
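The angle-to-altitude mapping implied by this geometry can be sketched as follows. This is a minimal illustration: the field of view and pixel count are assumptions, not specifications of a real instrument (the 122 m baseline matches the example below).

```python
import math

# Illustrative CLidar geometry; the field of view and pixel count are
# assumed values, not specifications of a real instrument.
D = 122.0           # horizontal camera-to-beam distance (m)
FOV_DEG = 100.0     # lens field of view (degrees)
N_PIXELS = 1000     # pixels spanning the field of view

# An equal-angle lens maps every pixel to the same angular step.
STEP = math.radians(FOV_DEG) / N_PIXELS

def altitude(pixel):
    """Altitude on the beam imaged by the pixel at elevation angle pixel*STEP."""
    return D * math.tan(pixel * STEP)
```

Because altitude goes as the tangent of the elevation angle, equal angular steps near the horizon correspond to small altitude steps, while steps near zenith span large altitude ranges.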

Example

In the second figure, an image from the CCD camera is shown; it is analyzed by summing the individual pixels at each altitude. The camera was 122 meters from the vertically pointed, circularly polarized laser beam. The beam is brighter near the ground due to near-ground aerosols. A bright spot due to a cloud can be seen near the top of the beam. The beam was positioned diagonally on the CCD array to use the space more effectively. A lighthouse and a power pole can also be seen in the image.

[Figure: example CCD image from the CLidar (CLidarExampleImage2008.jpg)]
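The summation step can be sketched with a synthetic frame. The array size, noise level, and beam brightness below are assumptions for illustration, not values from the actual instrument.

```python
import numpy as np

# Synthetic sketch of the analysis step: a diagonal "beam" on a noisy CCD
# frame, summed a few pixels wide at each point along its length.
rng = np.random.default_rng(0)
frame = rng.poisson(5, size=(512, 512)).astype(float)  # background counts
rows = np.arange(512)
cols = rows                                            # beam laid diagonally
frame[rows, cols] += 200.0                             # beam signal

# Sum across the beam at each position along it to build a
# signal-versus-altitude profile (one value per altitude bin).
profile = np.array([frame[r, max(c - 2, 0):c + 3].sum()
                    for r, c in zip(rows, cols)])
```

Each entry of `profile` is the summed signal at one position along the beam, which the geometry above converts to an altitude.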

Differences between Lidar and CLidar

The CLidar technique has the advantage over conventional lidar of being able to measure all the way to the ground. This difference derives from their geometric configurations: a lidar's receiving telescope and transmitting optics are monostatic (axially aligned), while the CLidar's receiver and transmitter are bistatic (separated by a non-zero perpendicular distance). The CLidar signal strength is also much more constant than a lidar signal, which can change by many orders of magnitude over a profile. The technique offers very high altitude resolution in the lower atmosphere, and the instrument components are typically simpler than those of a lidar.

Disadvantages include poor altitude resolution in the upper atmosphere, difficulty designing optics that gather substantial amounts of light, and reduced noise rejection (lower signal-to-noise ratio).
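The trade-off between fine low-altitude resolution and poor high-altitude resolution follows directly from the bistatic geometry: with a camera-to-beam baseline D and per-pixel angular step dθ, one pixel spans roughly D·sec²(θ)·dθ of altitude. A small sketch, assuming an illustrative 122 m baseline and 0.1-degree pixel step:

```python
import math

# Why a bistatic CLidar resolves the lower atmosphere finely but the upper
# atmosphere poorly. Baseline and angular step are illustrative assumptions.
D = 122.0                      # camera-to-beam baseline (m)
DTHETA = math.radians(0.1)     # per-pixel angular step (rad)

def per_pixel_altitude_span(z):
    """Altitude range (m) covered by one pixel looking at beam altitude z."""
    theta = math.atan2(z, D)
    return D * DTHETA / math.cos(theta) ** 2

# Near the ground one pixel spans a fraction of a meter; at 5 km it spans
# hundreds of meters.
low = per_pixel_altitude_span(10)      # ~0.2 m
high = per_pixel_altitude_span(5000)   # hundreds of meters
```

The sec²θ factor grows without bound as the viewing angle approaches zenith, which is the geometric source of the upper-atmosphere limitation.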

Related Research Articles

Charge-coupled device: device for the movement of electrical charge

A charge-coupled device (CCD) is an integrated circuit containing an array of linked, or coupled, capacitors. Under the control of an external circuit, each capacitor can transfer its electric charge to a neighboring capacitor. CCD sensors are a major technology used in digital imaging.

Lidar: method of spatial measurement using laser scanning

Lidar is a method for determining ranges by targeting an object with a laser and measuring the time for the reflected light to return to the receiver. Lidar can also be used to make digital 3-D representations of areas on the earth's surface and ocean bottom, due to differences in laser return times, and by varying laser wavelengths. It has terrestrial, airborne, and mobile applications.
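The ranging principle reduces to one formula, d = c·t/2 (half the round-trip time multiplied by the speed of light). The echo time below is illustrative.

```python
# Lidar ranging: distance is half the round-trip time times the speed of light.
C = 299_792_458.0  # speed of light (m/s)

def lidar_range(round_trip_seconds):
    """Target distance (m) from a measured round-trip echo time."""
    return C * round_trip_seconds / 2.0

# An echo returning after about 6.67 microseconds corresponds to roughly 1 km.
d = lidar_range(6.67e-6)
```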

Clementine (spacecraft): American space project

Clementine was a joint space project between the Ballistic Missile Defense Organization and NASA, launched on January 25, 1994. Its objective was to test sensors and spacecraft components in long-term exposure to space and to make scientific observations of both the Moon and the near-Earth asteroid 1620 Geographos.

Adaptive optics: technique used to improve performance of optical systems

Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effect of incoming wavefront distortions, typically by deforming a mirror to compensate for them. It is used in astronomical telescopes and laser communication systems to remove the effects of atmospheric distortion, and in microscopy, optical fabrication, and retinal imaging systems to reduce optical aberrations. Adaptive optics works by measuring the distortions in a wavefront and compensating for them with a device that corrects those errors, such as a deformable mirror or a liquid crystal array.

Ceilometer: ground-based lidar for cloud height measurement

A ceilometer is a device that uses a laser or other light source to determine the height of a cloud ceiling or cloud base. Ceilometers can also be used to measure the aerosol concentration within the atmosphere. A ceilometer that uses laser light is a type of atmospheric lidar instrument.

Diffraction-limited system: optical system with resolution performance at the instrument's theoretical limit

The resolution of an optical imaging system – a microscope, telescope, or camera – can be limited by factors such as imperfections in the lenses or misalignment. However, there is a principal limit to the resolution of any optical system, due to the physics of diffraction. An optical system with resolution performance at the instrument's theoretical limit is said to be diffraction-limited.
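For a circular aperture, the usual quantitative form of this limit is the Rayleigh criterion, θ ≈ 1.22·λ/D. The wavelength and aperture below are illustrative values, not tied to any instrument in this article.

```python
import math

# Rayleigh criterion: diffraction-limited angular resolution of a circular
# aperture. Wavelength and aperture diameter are illustrative.
def rayleigh_limit_rad(wavelength_m, aperture_m):
    """Smallest resolvable angle (radians) for a circular aperture."""
    return 1.22 * wavelength_m / aperture_m

# Green light (550 nm) through a 100 mm aperture, expressed in arcseconds:
theta_arcsec = math.degrees(rayleigh_limit_rad(550e-9, 0.1)) * 3600
```

For these values the limit works out to roughly 1.4 arcseconds; a larger aperture or shorter wavelength tightens it proportionally.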

Particle image velocimetry (PIV) is an optical method of flow visualization used in education and research. It is used to obtain instantaneous velocity measurements and related properties in fluids. The fluid is seeded with tracer particles which, for sufficiently small particles, are assumed to faithfully follow the flow dynamics. The fluid with entrained particles is illuminated so that particles are visible. The motion of the seeding particles is used to calculate speed and direction of the flow being studied.

Imaging radar

Imaging radar is an application of radar used to create two-dimensional images, typically of landscapes. Imaging radar provides its own illumination, taking a picture of an area on the ground at radio wavelengths. It uses an antenna and digital computer storage to record its images. In a radar image, one can see only the energy that was reflected back toward the radar antenna. The radar moves along a flight path, and the area illuminated by the radar, or footprint, moves along the surface in a swath, building up the image as it does so.

A total internal reflection fluorescence microscope (TIRFM) is a type of microscope with which a thin region of a specimen, usually less than 200 nanometers thick, can be observed.

Vignetting: reduction of an image's brightness or saturation toward the periphery compared to the image center

In photography and optics, vignetting is a reduction of an image's brightness or saturation toward the periphery compared to the image center. The word vignette, from the same root as vine, originally referred to a decorative border in a book. Later, the word came to be used for a photographic portrait that is clear at the center and fades off toward the edges. A similar effect is visible in photographs of projected images or videos off a projection screen, resulting in a so-called "hotspot" effect.

Flat-field correction

Flat-field correction (FFC) is a technique used to improve quality in digital imaging. It cancels the effects of image artifacts caused by variations in the pixel-to-pixel sensitivity of the detector and by distortions in the optical path. It is a standard calibration procedure in everything from personal digital cameras to large telescopes.
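A minimal sketch of the correction, using the common form corrected = (raw − dark) × mean(flat − dark) / (flat − dark). The pixel values here are synthetic, chosen only to show the arithmetic.

```python
import numpy as np

# Flat-field correction: divide out per-pixel gain variations measured with
# a uniformly illuminated "flat" frame, after subtracting the dark frame.
def flat_field_correct(raw, flat, dark):
    gain = flat - dark                      # per-pixel response
    return (raw - dark) * gain.mean() / gain

# Synthetic 2x2 frames for illustration.
raw  = np.array([[110.0,  95.0], [105.0, 100.0]])
flat = np.array([[210.0, 190.0], [205.0, 195.0]])
dark = np.full((2, 2), 10.0)
corrected = flat_field_correct(raw, flat, dark)
```

After correction, a pixel that was uniformly illuminated in the flat frame contributes equally regardless of its individual sensitivity.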

Three-CCD camera

A three-CCD (3CCD) camera is a camera whose imaging system uses three separate charge-coupled devices (CCDs), each one receiving filtered red, green, or blue color ranges. Light coming in from the lens is split by a complex prism into three beams, which are then filtered to produce colored light in three color ranges or "bands". The system is employed by high quality still cameras, telecine systems, professional video cameras and some prosumer video cameras.

Image sensor: device that converts an optical image into an electronic signal

An image sensor or imager is a sensor that detects and conveys information used to make an image. It does so by converting the variable attenuation of light waves into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.

The following are common definitions related to the machine vision field.

Planar laser-induced fluorescence

Planar laser-induced fluorescence (PLIF) is an optical diagnostic technique widely used for flow visualization and quantitative measurements. PLIF has been used for velocity, concentration, temperature, and pressure measurements.

Image sensor format: shape and size of a digital camera's image sensor

In digital photography, the image sensor format is the shape and size of the image sensor.

Laser beam profiler

A laser beam profiler captures, displays, and records the spatial intensity profile of a laser beam at a particular plane transverse to the beam propagation path. Since there are many types of lasers — ultraviolet, visible, infrared, continuous wave, pulsed, high-power, low-power — there is an assortment of instrumentation for measuring laser beam profiles. No single laser beam profiler can handle every power level, pulse duration, repetition rate, wavelength, and beam size.

Optical heterodyne detection is a method of extracting information encoded as modulation of the phase, frequency or both of electromagnetic radiation in the wavelength band of visible or infrared light. The light signal is compared with standard or reference light from a "local oscillator" (LO) that would have a fixed offset in frequency and phase from the signal if the latter carried null information. "Heterodyne" signifies more than one frequency, in contrast to the single frequency employed in homodyne detection.

Time-of-flight camera: range imaging camera system

A time-of-flight camera is a range imaging camera system employing time-of-flight techniques to resolve the distance between the camera and the subject for each point of the image, by measuring the round-trip time of an artificial light signal provided by a laser or an LED. Laser-based time-of-flight cameras are part of a broader class of scannerless lidar, in which the entire scene is captured with each laser pulse, as opposed to point-by-point with a laser beam as in scanning lidar systems. Time-of-flight camera products for civil applications began to emerge around 2000, as semiconductor processes allowed the production of components fast enough for such devices. The systems cover ranges from a few centimeters up to several kilometers.
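One common time-of-flight scheme, continuous-wave modulation, recovers distance from the phase shift of the returned signal: d = c·Δφ/(4π·f). The modulation frequency and phase shift below are illustrative values.

```python
import math

# Continuous-wave time-of-flight ranging: distance from the phase shift of
# an amplitude-modulated light signal. Values are illustrative.
C = 299_792_458.0  # speed of light (m/s)

def cw_tof_distance(phase_shift_rad, mod_freq_hz):
    """Distance (m) from measured phase shift at a given modulation frequency."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# At 20 MHz modulation the unambiguous range is c/(2f), about 7.5 m;
# a phase shift of pi corresponds to half of that.
d = cw_tof_distance(math.pi, 20e6)
```

The unambiguous range shrinks as the modulation frequency rises, which is why practical cameras trade range against depth precision.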

Atmospheric lidar is a class of instruments that uses laser light to study atmospheric properties from the ground up to the top of the atmosphere. Such instruments have been used to study, among others, atmospheric gases, aerosols, clouds, and temperature.
