Time delay and integration

Time delay and integration or time delay integration (TDI) is a forward motion compensation (FMC) technique for capturing images of moving objects at low light levels. It is a type of line scanning in which multiple linear arrays are placed side by side. After the first array is exposed, its charge is transferred to the neighboring line. When the object has moved the distance separating the lines, a second exposure is taken on top of the first with the next array, and so on. Each line of the object is thus imaged repeatedly, and the exposures are added together. The technique depends on synchronized mechanical and electronic scanning, so that light from a dim imaging target can be integrated on the sensor over a longer period of time.
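The shift-and-add principle described above can be illustrated with a small simulation (an idealized sketch, not the behavior of any particular sensor; all names and numbers here are invented for illustration):

```python
import numpy as np

def tdi_expose(scene_line, n_stages, read_noise, rng):
    """Expose one line of a moving scene n_stages times, shifting the
    accumulated charge in step with the motion, and sum the exposures."""
    exposures = scene_line + rng.normal(0.0, read_noise,
                                        (n_stages, scene_line.size))
    # The signal adds linearly with the number of stages; uncorrelated
    # noise grows only with the square root of that number.
    return exposures.sum(axis=0)

rng = np.random.default_rng(0)
line = np.full(8, 0.1)                  # a dim target: 0.1 units per stage
summed = tdi_expose(line, n_stages=16, read_noise=0.0, rng=rng)
# With zero noise the summed signal is simply 16 x 0.1 = 1.6 per pixel.
```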

TDI is more an operating mode of an image sensor than a separate type of imaging device, although hardware optimized for the mode also exists. The most common way to perform TDI is digital time delay integration (dTDI), which is implemented in software and is independent of the type of underlying imaging sensor. The principle behind TDI, the constructive addition of separate observations, applies to other sensor technologies as well, making it comparable to any long-term integrating mode of imaging, such as speckle imaging, adaptive optics, and especially long-exposure astronomical observation.

Detailed operation

It is perhaps easiest to understand TDI devices by contrast with better-known types of CCD sensors. The best known is the staring array. It consists of hundreds or thousands of adjacent rows of specially engineered semiconductor that react to light by accumulating charge; slightly separated from them in depth by insulation sits a tightly spaced array of gate electrodes, whose electric fields can drive the accumulated charge around in a predictable and almost lossless fashion. In a staring array configuration, the image is exposed on the two-dimensional semiconductor surface, and the resulting charge distribution is then moved sideways line by line, to be rapidly and sequentially read out by an electronic read amplifier. Done fast enough, this produces a snapshot of the photon flux over the sensor; the readout can proceed in parallel over several lines and yields a two-dimensional image of the light received. Along with CMOS detectors, which sense the photocharge accumulation pixel by pixel instead of moving the charge out line by line, such sensors are familiar as the imaging elements of digital cameras, from the small to the large.
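The staring-array readout described above, with charge rows marching toward a serial register and out through a read amplifier, can be sketched as a toy model (real CCDs interleave parallel and serial clocking; the function name is invented):

```python
import numpy as np

def staring_array_readout(charge):
    """Toy model of full-frame CCD readout: on each parallel clock the
    rows shift one step toward the serial register; the row that falls
    into the register is then read out pixel by pixel."""
    frame = charge.astype(float).copy()
    n_rows = frame.shape[0]
    image = np.zeros_like(frame)
    for step in range(n_rows):
        serial_register = frame[-1].copy()   # bottom row enters the register
        frame = np.roll(frame, 1, axis=0)    # parallel shift toward the register
        frame[0] = 0.0                       # an empty row shifts in at the top
        image[n_rows - 1 - step] = serial_register  # amplifier digitizes the line
    return image
```

Feeding in any 2D charge pattern returns the same pattern line by line, which is exactly the near-lossless sequential transfer the text describes.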

A scanning array, on the other hand, involves just one such CCD line, or at most a few of them. Its principle of operation relies on mechanical scanning: a single linear CCD element is sequentially exposed to different parts of the object to be imaged, and the whole image is then assembled from equally spaced lines across the field of view. Typical examples of this scanning mode are fax machines and other document scanners, where the imaging target is fed through at a constant linear velocity, and satellite remote sensing, where the constant orbital velocity of the satellite naturally exposes one line of the underlying terrain after another to the transversely mounted sensor.
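The scanning mode just described, assembling an image from equally spaced line exposures, reduces to stacking one readout per clock tick. A minimal sketch (the function name is invented for illustration):

```python
import numpy as np

def push_broom_scan(scene, n_ticks):
    """Single-line scanning: constant relative motion exposes one scene
    row per clock tick; stacking the readouts reassembles the image,
    as in a fax machine or a mapping satellite."""
    readouts = [scene[t] for t in range(n_ticks)]  # one line per tick
    return np.vstack(readouts)
```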

The advantage of using a CCD sensor this way is reduced complexity, and thus price, or conversely the possibility of using much more refined, and so more expensive, CCD technology for the single line array, for higher fidelity. CCDs can also be manufactured in configurations that tolerate the wide fluctuations in radiation and temperature characteristic of space environments, and scanning arrays can be made extra robust by the inclusion of multiple lines. Since the out-clocking of a well-phased CCD line is a continuous process, not divided into pixels, the eventual line-wise resolution of the image can also exceed the resolution of the gating infrastructure, yielding higher resolution than a pixel-based sensor. CCDs are also easier to build for cryogenic temperatures, such as are needed e.g. for far-infrared astronomy.

Motion

At the same time, the continuous operation and slow, line-discrete readout lead to a problem: if anything moves within the scene being imaged, there will be blurring and tearing between lines. Wherever an accumulated packet of charge is located on the sensor chip, any extra light shone upon it produces more charge, even if that light comes from the wrong direction or from a later moment of acquisition than intended. It registers just the same, integrating over time into whatever is eventually read out. This leads to what cinematography calls motion blur, and since the readout of the multiple lines of a typical CCD array occurs at successive times, it also causes an effect akin to screen tearing.

In TDI mode, motion blur and the pseudo-analogue nature of CCDs are turned from a fault into a special-purpose asset. The line or 2D array is rotated 90 degrees so that the lines of the CCD sensor follow the expected trajectory of the object of interest across the field of view. The readout speed of the sensor is then adjusted so that the charge packets in the imaging plane track the object, accumulating charge over time. This is effectively the same as slewing the spacecraft or other platform to keep the viewing angle locked on the object; it performs the time integration within the sensor instead of by physical motion. Physical tracking and superimposition of images, the more traditional forms of motion compensation, can be applied in addition.
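The difference between letting the scene smear and clocking the charge in step with it can be shown directly. The following is an idealized sketch of a point source drifting along a 1D line of pixels (all parameters invented for illustration):

```python
import numpy as np

def expose_moving_point(width, n_steps, object_speed, clock_speed):
    """Accumulate charge in one CCD line while a point source drifts
    across it. If the charge is clocked at the object's speed, the
    signal piles up in one packet; otherwise it smears (motion blur)."""
    line = np.zeros(width)
    for t in range(n_steps):
        line = np.roll(line, clock_speed)   # shift the charge packets
        src = (t * object_speed) % width    # where the light now lands
        line[src] += 1.0                    # photons add to whatever charge sits there
    return line

blurred = expose_moving_point(16, 4, object_speed=1, clock_speed=0)
tracked = expose_moving_point(16, 4, object_speed=1, clock_speed=1)
# tracked concentrates all four exposures in a single pixel;
# blurred spreads them over four pixels.
```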

With the high sensitivity of CCD sensors, extending into the photon-counting regime, this can lead to extremely high detection and measurement sensitivity.[1] Additionally, comparable coherent measurement gains are difficult to achieve with digital technologies other than CCDs, because those suffer from more prominent aliasing.
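The sensitivity gain can be quantified: summing N stages multiplies the signal by N while uncorrelated read noise grows only as the square root of N, so the signal-to-noise ratio improves by roughly sqrt(N). A Monte Carlo sketch of the read-noise-limited case (all numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
signal, read_noise, n_stages, trials = 0.5, 2.0, 64, 20_000

# One plain exposure vs. 64 summed TDI stages of the same dim source.
single = signal + rng.normal(0.0, read_noise, trials)
tdi = n_stages * signal + rng.normal(0.0, read_noise,
                                     (n_stages, trials)).sum(axis=0)

snr_single = single.mean() / single.std()
snr_tdi = tdi.mean() / tdi.std()
print(snr_tdi / snr_single)   # close to sqrt(64) = 8
```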

Technology specific to TDI CCD

While the basic theory of TDI involves only single-row CCDs, specifically designed parts and algorithms utilize everything from a few lines to entire staring arrays, with the integration taking place over multiple lines, or in software. A dedicated TDI CCD improves upon the single-line-scan system by adding up multiple measured photocharges over its more complicated sensor, and by a more comprehensive analysis of the interaction between the continuous lines and the discrete column structure. This helps, for example, with integration over physical tracking errors, imperfect lensing, background rejection, and multi-object tracking.
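Software dTDI over a staring array amounts to shift-and-add: realign each frame of a burst against the target's known motion, then sum. A minimal sketch (the per-frame velocity is assumed known; the names are invented for illustration):

```python
import numpy as np

def dtdi_stack(frames, rows_per_frame):
    """Digital TDI: undo the target's known per-frame motion
    (rows_per_frame) in each staring-array frame, then sum. The target
    adds coherently while uncorrelated noise only averages down."""
    acc = np.zeros_like(frames[0], dtype=float)
    for k, frame in enumerate(frames):
        acc += np.roll(frame, -k * rows_per_frame, axis=0)  # realign, then add
    return acc

# A bright row drifting down one row per frame...
frames = []
for k in range(5):
    f = np.zeros((8, 8))
    f[k] = 1.0
    frames.append(f)
stacked = dtdi_stack(frames, rows_per_frame=1)
# ...ends up concentrated in one row with five times the per-frame signal.
```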

CCD technology, and with it TDI, is also used in X-ray astronomy. There a different set of challenges prevails: TDI is used because high-energy photons tend to be lost in imaging, and the ones that are fortuitously captured tend, one by one, to wreak havoc with the imaging element. CCDs are often used here because they can be manufactured in radiation-hardened configurations, and are rather tolerant of radiation even as-is. This is especially important in solutions using coherent addition, because they focus on and track intense radiation sources for a span of time, so that the total radiative dose from the source per given area reaches high levels over time.

Applications

TDI CCDs are especially used in the scanning of moving objects, for example letter and film scanning, or in scanning from a moving platform, for example aerial reconnaissance.[2]

See also

  - Charge-coupled device
  - Raster graphics
  - Photodiode
  - Digital camera
  - Astrophotography
  - Thermography
  - Video camera
  - Digital image
  - Digital camera back
  - Yohkoh
  - Photodetector
  - Staring array
  - High-speed photography
  - Image sensor
  - Active-pixel sensor
  - Resurs-DK No.1
  - Medipix
  - Rolling shutter
  - Time-of-flight camera
  - Detectors for transmission electron microscopy

References

  1. Ostman, Brad (15 January 2010). "TDI CCDs are still the sensors of choice for demanding applications". Laser Focus World. PennWell Corporation. Retrieved 22 May 2013.
  2. "TDI CCDs are still the sensors of choice for demanding applications". www.laserfocusworld.com. 15 January 2010. Retrieved 19 May 2016.
  3. Rabinowitz, David. "Drift Scanning (Time-Delay Integration)" (PDF). Yale University Center for Astronomy and Astrophysics. Retrieved 17 May 2016.
  4. Holdsworth, D. W.; Gerson, R. K.; Fenster, A. (7 June 1990). "A time-delay integration charge-coupled device camera for slot-scanned digital radiography". Medical Physics. 17 (5): 876–886. Bibcode:1990MedPh..17..876H. doi:10.1118/1.596578. PMID 2233575. Retrieved 22 May 2013.
  5. "TDI CCD Array | Products & Suppliers | Engineering360".