Wavefront coding

In optics and signal processing, wavefront coding refers to the use of a phase-modulating element in conjunction with deconvolution to extend the depth of field of a digital imaging system such as a video camera.

Optics

Optics is the branch of physics that studies the behaviour and properties of light, including its interactions with matter and the construction of instruments that use or detect it. Optics usually describes the behaviour of visible, ultraviolet, and infrared light. Because light is an electromagnetic wave, other forms of electromagnetic radiation such as X-rays, microwaves, and radio waves exhibit similar properties.

Signal processing

Signal processing is a subfield of mathematics, information and electrical engineering that concerns the analysis, synthesis, and modification of signals, which are broadly defined as functions conveying "information about the behavior or attributes of some phenomenon", such as sound, images, and biological measurements. For example, signal processing techniques are used to improve signal transmission fidelity, storage efficiency, and subjective quality, and to emphasize or detect components of interest in a measured signal.

In mathematics, deconvolution is an algorithm-based process used to reverse the effects of convolution on recorded data. The concept of deconvolution is widely used in the techniques of signal processing and image processing. Because these techniques are in turn widely used in many scientific and engineering disciplines, deconvolution finds many applications.
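
As an illustration of how deconvolution reverses a known blur, below is a minimal frequency-domain (Wiener-style) sketch in Python; NumPy, the centred-kernel convention and the regularization constant k are assumptions of the example, not details from the article.

    import numpy as np

    def wiener_deconvolve(blurred, kernel, k=1e-2):
        # blurred: 2-D recorded image; kernel: 2-D PSF of the same shape, centred.
        # k: regularization constant standing in for the noise-to-signal ratio.
        H = np.fft.fft2(np.fft.ifftshift(kernel))     # transfer function of the blur
        G = np.fft.fft2(blurred)                      # spectrum of the recorded image
        # conj(H) / (|H|^2 + k) suppresses frequencies where H is close to zero
        F_hat = np.conj(H) * G / (np.abs(H) ** 2 + k)
        return np.real(np.fft.ifft2(F_hat))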

Wavefront coding falls under the broad category of computational photography as a technique to enhance the depth of field.

Computational photography

Computational photography refers to digital image capture and processing techniques that use digital computation instead of optical processes. Computational photography can improve the capabilities of a camera, or introduce features that were not possible at all with film based photography, or reduce the cost or size of camera elements. Examples of computational photography include in-camera computation of digital panoramas, high-dynamic-range images, and light field cameras. Light field cameras use novel optical elements to capture three dimensional scene information which can then be used to produce 3D images, enhanced depth-of-field, and selective de-focusing. Enhanced depth-of-field reduces the need for mechanical focusing systems. All of these features use computational imaging techniques.

Encoding

The wavefront of a light wave passing through the camera system is modulated using optical elements that introduce a spatially varying optical path length. The modulating elements must be placed at or near the plane of the aperture stop or pupil so that the same modulation is introduced for all field angles across the field-of-view. This modulation corresponds to a change in complex argument of the pupil function of such an imaging device, and it can be engineered with different goals in mind: e.g. extending the depth of focus.

The pupil function or aperture function describes how a light wave is affected upon transmission through an optical imaging system such as a camera, microscope, or the human eye. More specifically, it is a complex function of the position in the pupil or aperture that indicates the relative change in amplitude and phase of the light wave. Sometimes this function is referred to as the generalized pupil function, in which case pupil function only indicates whether light is transmitted or not. Imperfections in the optics typically have a direct effect on the pupil function; it is therefore an important tool for studying optical imaging systems and their performance.
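
A minimal numerical sketch of this relationship, assuming NumPy and an arbitrary grid size, aperture and phase profile, builds a generalized pupil function and computes the corresponding incoherent point spread function by Fourier transform:

    import numpy as np

    N = 256
    u = np.linspace(-1, 1, N)
    U, V = np.meshgrid(u, u)
    aperture = (U**2 + V**2 <= 1.0).astype(float)       # clear circular pupil

    # Phase introduced at the pupil; plain defocus is quadratic in the pupil
    # coordinates, and wavefront coding adds an engineered profile on top of it.
    defocus_waves = 2.0                                  # assumed focus error
    phase = 2 * np.pi * defocus_waves * (U**2 + V**2)

    pupil = aperture * np.exp(1j * phase)                # generalized pupil function
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2                             # incoherent point spread function
    psf /= psf.sum()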

Linear phase mask

Wavefront coding with linear phase masks works by creating an optical transfer function that encodes distance information. [1]

Cubic phase mask

Wavefront coding with cubic phase masks works by blurring the image uniformly with a cubic-shaped waveplate, so that the intermediate image, and hence the optical transfer function, is out of focus by a constant amount over a range of object distances. Digital image processing then removes the blur, introducing noise that depends upon the physical characteristics of the processor. Dynamic range is sacrificed to extend the depth of field, depending upon the type of filter used. The technique can also correct optical aberration. [2]
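
A toy numerical sketch of the cubic-mask idea follows; the grid size, the cubic strength alpha and the defocus values are illustrative assumptions, not parameters from the cited work:

    import numpy as np

    def coded_psf(defocus_waves, alpha, n=256):
        # PSF of a circular pupil carrying a defocus term plus a cubic phase mask.
        u = np.linspace(-1, 1, n)
        U, V = np.meshgrid(u, u)
        aperture = (U**2 + V**2 <= 1.0).astype(float)
        phase = 2 * np.pi * (defocus_waves * (U**2 + V**2)   # focus error
                             + alpha * (U**3 + V**3))        # cubic phase mask
        pupil = aperture * np.exp(1j * phase)
        psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))) ** 2
        return psf / psf.sum()

    # With a strong cubic term the PSF barely changes as the focus error varies,
    # so one fixed deconvolution filter can restore the whole depth range;
    # with alpha = 0 the same two PSFs differ dramatically.
    psf_a = coded_psf(defocus_waves=0.0, alpha=20.0)
    psf_b = coded_psf(defocus_waves=3.0, alpha=20.0)
    print(np.abs(psf_a - psf_b).sum())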

Waveplate

A waveplate or retarder is an optical device that alters the polarization state of a light wave travelling through it. Two common types of waveplates are the half-wave plate, which shifts the polarization direction of linearly polarized light, and the quarter-wave plate, which converts linearly polarized light into circularly polarized light and vice versa. A quarter-wave plate can be used to produce elliptical polarization as well.

Focus (optics)

In geometrical optics, a focus, also called an image point, is the point where light rays originating from a point on the object converge. Although the focus is conceptually a point, physically the focus has a spatial extent, called the blur circle. This non-ideal focusing may be caused by aberrations of the imaging optics. In the absence of significant aberrations, the smallest possible blur circle is the Airy disc, which is caused by diffraction from the optical system's aperture. Aberrations tend to get worse as the aperture diameter increases, while the Airy circle is smallest for large apertures.

In computer science, digital image processing is the use of computer algorithms to perform image processing on digital images. As a subcategory or field of digital signal processing, digital image processing has many advantages over analog image processing. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and signal distortion during processing. Since images are defined over two dimensions, digital image processing may be modeled in the form of multidimensional systems.

The mask was developed using the ambiguity function and the stationary phase method.

In pulsed radar and sonar signal processing, an ambiguity function is a two-dimensional function of time delay and Doppler frequency showing the distortion of a returned pulse, as seen at the output of the receiver's matched filter, caused by the Doppler shift of the return from a moving target. The ambiguity function is determined by the properties of the pulse and the matched filter, not by any particular target scenario. Many definitions of the ambiguity function exist; some are restricted to narrowband signals and others are suitable for describing the propagation delay and Doppler relationship of wideband signals. Often the definition of the ambiguity function is given as the magnitude squared of other definitions (Weiss). For a given complex baseband pulse s(t), the narrowband ambiguity function is given by χ(τ, f) = ∫ s(t) s*(t - τ) e^{i2πft} dt, where the integral runs over all time and * denotes complex conjugation.
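
A discrete approximation of this definition can be sketched as follows; the chirp pulse, sample rate and delay/Doppler grids are illustrative assumptions:

    import numpy as np

    def ambiguity(s, fs, delays, dopplers):
        # Discrete approximation of |chi(tau, f)| for a sampled baseband pulse s.
        n = len(s)
        t = np.arange(n) / fs
        chi = np.zeros((len(delays), len(dopplers)), dtype=complex)
        for i, tau in enumerate(delays):
            s_delayed = np.roll(s, int(round(tau * fs)))   # circular shift as a crude delay
            prod = s * np.conj(s_delayed)
            for j, f in enumerate(dopplers):
                chi[i, j] = np.sum(prod * np.exp(1j * 2 * np.pi * f * t)) / fs
        return np.abs(chi)

    fs = 1e6                                               # assumed 1 MHz sample rate
    t = np.arange(0, 100e-6, 1 / fs)
    pulse = np.exp(1j * np.pi * 1e9 * t**2)                # simple linear-FM (chirp) pulse
    grid = ambiguity(pulse, fs,
                     delays=np.linspace(-20e-6, 20e-6, 41),
                     dopplers=np.linspace(-50e3, 50e3, 41))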

History

The technique was pioneered by the radar engineer Edward Dowski and his thesis adviser Thomas Cathey at the University of Colorado in the United States in the 1990s. After the university showed little interest in the research, [3] the two founded a company, CDM-Optics, to commercialize the method. The company was acquired in 2005 by OmniVision Technologies, which has since released wavefront-coding-based mobile camera chips as TrueFocus sensors.

TrueFocus sensors are able to simulate older autofocus technologies that use rangefinders and narrow depths of field. [4] In principle, the technology allows any number of combinations of focal points per pixel for effect, and it is the only such technology not limited to extended depth of field (EDoF).

Related Research Articles

Depth of field

Depth of field is the distance between the nearest and the furthest objects that are in acceptably sharp focus in an image. The depth of field is determined by focal length, distance to subject, the acceptable circle of confusion size, and aperture. A particular depth of field may be chosen for technical or artistic purposes. Some post-processing methods, such as focus stacking, allow extended depth of field that would be impossible with traditional techniques.
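
Those dependencies can be made concrete with the usual thin-lens approximations; the hyperfocal-distance formula and the example numbers below are assumptions of this sketch rather than values from the article:

    def depth_of_field(focal_mm, f_number, subject_mm, coc_mm):
        # Thin-lens approximations: hyperfocal distance, then near and far limits.
        hyperfocal = focal_mm**2 / (f_number * coc_mm) + focal_mm
        near = hyperfocal * subject_mm / (hyperfocal + (subject_mm - focal_mm))
        if subject_mm >= hyperfocal:
            return near, float("inf")                  # far limit at infinity
        far = hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
        return near, far

    # 50 mm lens at f/8, subject at 3 m, 0.03 mm circle of confusion
    near, far = depth_of_field(50, 8, 3000, 0.03)
    print(f"sharp from about {near / 1000:.2f} m to {far / 1000:.2f} m")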

Photonics

Photonics is the physical science of light (photon) generation, detection, and manipulation through emission, transmission, modulation, signal processing, switching, amplification, and sensing. Though covering all light's technical applications over the whole spectrum, most photonic applications are in the range of visible and near-infrared light. The term photonics developed as an outgrowth of the first practical semiconductor light emitters invented in the early 1960s and optical fibers developed in the 1970s.

Adaptive optics

Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effect of incoming wavefront distortions by deforming a mirror in order to compensate for the distortion. It is used in astronomical telescopes and laser communication systems to remove the effects of atmospheric distortion, in microscopy, optical fabrication and in retinal imaging systems to reduce optical aberrations. Adaptive optics works by measuring the distortions in a wavefront and compensating for them with a device that corrects those errors such as a deformable mirror or a liquid crystal array.

Schlieren photography

Schlieren photography is a visual process that is used to photograph the flow of fluids of varying density. Invented by the German physicist August Toepler in 1864 to study supersonic motion, it is widely used in aeronautical engineering to photograph the flow of air around objects.

Point spread function

The point spread function (PSF) describes the response of an imaging system to a point source or point object. A more general term for the PSF is a system's impulse response, the PSF being the impulse response of a focused optical system. The PSF in many contexts can be thought of as the extended blob in an image that represents an unresolved object. In functional terms it is the spatial domain version of the optical transfer function of the imaging system. It is a useful concept in Fourier optics, astronomical imaging, medical imaging, electron microscopy and other imaging techniques such as 3D microscopy and fluorescence microscopy. The degree of spreading (blurring) of the point object is a measure of the quality of an imaging system. In non-coherent imaging systems such as fluorescence microscopes, telescopes or optical microscopes, the image formation process is linear in power and described by linear system theory. This means that when two objects A and B are imaged simultaneously, the result is equal to the sum of the independently imaged objects. In other words: the imaging of A is unaffected by the imaging of B and vice versa, owing to the non-interacting property of photons. The image of a complex object can then be seen as a convolution of the true object and the PSF. However, when the detected light is coherent, image formation is linear in the complex field. Recording the intensity image then can lead to cancellations or other non-linear effects.
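
The convolution picture described above can be sketched numerically; NumPy, SciPy, the random point-source scene and the Gaussian stand-in for the PSF are assumptions of the example:

    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(0)
    scene = np.zeros((128, 128))
    scene[rng.integers(0, 128, 20), rng.integers(0, 128, 20)] = 1.0   # point sources

    x = np.linspace(-3, 3, 15)
    X, Y = np.meshgrid(x, x)
    psf = np.exp(-(X**2 + Y**2))                  # Gaussian stand-in for the system PSF
    psf /= psf.sum()

    # Linear, incoherent image formation: every point source becomes a copy of the
    # PSF, and the recorded image is the sum of those copies.
    image = fftconvolve(scene, psf, mode="same")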

Autofocus optical system

An autofocus optical system uses a sensor, a control system and a motor to focus on an automatically or manually selected point or area. An electronic rangefinder has a display instead of the motor; the adjustment of the optical system has to be done manually until the display indicates correct focus. Autofocus methods are distinguished by their type as being either active, passive or hybrid variants.

Vignetting

In photography and optics, vignetting (French: vignette) is a reduction of an image's brightness or saturation toward the periphery compared to the image center. The word vignette, from the same root as vine, originally referred to a decorative border in a book. Later, the word came to be used for a photographic portrait that is clear at the center and fades off toward the edges. A similar effect is visible in photographs of projected images or videos off a projection screen, resulting in a so-called "hotspot" effect.

Apodization

Apodization is an optical filtering technique. Its literal translation is "removing the foot". It is the technical term for changing the shape of a mathematical function, an electrical signal, an optical transmission or a mechanical structure. In optics, it is primarily used to remove Airy disks caused by diffraction around an intensity peak, improving the focus.
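
A one-dimensional numerical sketch of the effect, assuming NumPy and a Gaussian taper, compares the diffraction sidelobes of a hard-edged pupil with those of an apodized one:

    import numpy as np

    n = 4096
    x = np.linspace(-8, 8, n)
    uniform = (np.abs(x) <= 1.0).astype(float)         # unapodized (hard-edged) pupil
    apodized = uniform * np.exp(-4 * x**2)             # Gaussian taper toward the edges

    def farfield_intensity(pupil):
        psf = np.abs(np.fft.fftshift(np.fft.fft(pupil))) ** 2
        return psf / psf.max()

    wing = slice(0, n // 2 - 40)                       # region well outside the central peak
    print(farfield_intensity(uniform)[wing].max(),     # noticeable diffraction sidelobes
          farfield_intensity(apodized)[wing].max())    # sidelobes strongly suppressed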

Light-field camera

A light field camera, also known as plenoptic camera, captures information about the light field emanating from a scene; that is, the intensity of light in a scene, and also the direction that the light rays are traveling in space. This contrasts with a conventional camera, which records only light intensity.

Optical transfer function

The optical transfer function (OTF) of an optical system such as a camera, microscope, human eye, or projector specifies how different spatial frequencies are handled by the system. It is used by optical engineers to describe how the optics project light from the object or scene onto a photographic film, detector array, retina, screen, or simply the next item in the optical transmission chain. A variant, the modulation transfer function (MTF), neglects phase effects, but is equivalent to the OTF in many situations.
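
The relationship between PSF, OTF and MTF can be sketched as follows; the Gaussian PSF and grid are assumptions of the example:

    import numpy as np

    x = np.linspace(-3, 3, 64)
    X, Y = np.meshgrid(x, x)
    psf = np.exp(-(X**2 + Y**2))                       # assumed Gaussian PSF
    psf /= psf.sum()

    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    otf /= otf[otf.shape[0] // 2, otf.shape[1] // 2]   # normalize so OTF(0) = 1
    mtf = np.abs(otf)                                  # MTF: magnitude only, phase discarded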

Defocus aberration

In optics, defocus is the aberration in which an image is simply out of focus. This aberration is familiar to anyone who has used a camera, videocamera, microscope, telescope, or binoculars. Optically, defocus refers to a translation of the focus along the optical axis away from the detection surface. In general, defocus reduces the sharpness and contrast of the image. What should be sharp, high-contrast edges in a scene become gradual transitions. Fine detail in the scene is blurred or even becomes invisible. Nearly all image-forming optical devices incorporate some form of focus adjustment to minimize defocus and maximize image quality.

Outline of photography

The following outline is provided as an overview of and topical guide to photography.

Digital holography

Digital holography refers to the acquisition and processing of holograms with a digital sensor array, typically a CCD camera or a similar device. Image rendering, or reconstruction of object data is performed numerically from digitized interferograms. Digital holography offers a means of measuring optical phase data and typically delivers three-dimensional surface or optical thickness images. Several recording and processing schemes have been developed to assess optical wave characteristics such as amplitude, phase, and polarization state, which make digital holography a very powerful method for metrology applications.

Coded aperture

Coded apertures or coded-aperture masks are grids, gratings, or other patterns of materials opaque to various wavelengths of electromagnetic radiation. The wavelengths are usually high-energy radiation such as X-rays and gamma rays. By blocking radiation in a known pattern, a coded "shadow" is cast upon a plane. The properties of the original radiation sources can then be mathematically reconstructed from this shadow. Coded apertures are used in X- and gamma ray imaging systems, because these high-energy rays cannot be focused with lenses or mirrors that work for visible light.
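
A one-dimensional toy model of the encode/decode idea, assuming NumPy, a random binary mask and a circular-convolution shadow model (real systems use designed patterns such as MURAs), looks like this:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 211
    mask = (rng.random(n) < 0.5).astype(float)         # open/closed aperture cells

    scene = np.zeros(n)
    scene[[40, 90, 150]] = [1.0, 0.5, 2.0]             # a few point-like sources

    # Each source casts a shifted copy of the mask pattern; the detector records
    # their superposition (modelled here as a circular convolution).
    shadow = np.real(np.fft.ifft(np.fft.fft(scene) * np.fft.fft(mask)))

    # Correlating with a zero-mean decoding pattern concentrates each source back
    # into a peak (exact for designed patterns, approximate for a random mask).
    decode = 2 * mask - 1
    recon = np.real(np.fft.ifft(np.fft.fft(shadow) * np.conj(np.fft.fft(decode))))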

Computer-generated holography (CGH) is the method of digitally generating holographic interference patterns. A holographic image can be generated, for example, by digitally computing a holographic interference pattern and printing it onto a mask or film for subsequent illumination by a suitable coherent light source.

Focus stacking

Focus stacking is a digital image processing technique which combines multiple images taken at different focus distances to give a resulting image with a greater depth of field (DOF) than any of the individual source images. Focus stacking can be used in any situation where individual images have a very shallow depth of field; macro photography and optical microscopy are two typical examples. Focus stacking can also be useful in landscape photography.
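
A minimal per-pixel focus-stacking sketch follows; NumPy/SciPy, the Laplacian-based sharpness measure and the window size are assumptions, and real pipelines also align frames and blend across seams:

    import numpy as np
    from scipy.ndimage import laplace, uniform_filter

    def focus_stack(images):
        # images: list of 2-D float arrays of the same scene at different focus distances.
        stack = np.stack(images)                                   # shape (k, H, W)
        sharpness = np.stack([uniform_filter(np.abs(laplace(img)), size=9)
                              for img in images])                  # local focus measure
        best = np.argmax(sharpness, axis=0)                        # sharpest frame per pixel
        return np.take_along_axis(stack, best[None], axis=0)[0]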

Range imaging is the name for a collection of techniques that are used to produce a 2D image showing the distance to points in a scene from a specific point, normally associated with some type of sensor device.

Boston Micromachines Corporation is a US company operating out of Cambridge, Massachusetts. Boston Micromachines manufactures and develops instruments based on MEMS technology to perform open and closed-loop adaptive optics. The technology is applied in astronomy, beam shaping, vision science, retinal imaging, microscopy, laser communications, and national defense. The instruments developed at Boston Micromachines include deformable mirrors, optical modulators, and retinal imaging systems, all of which utilize adaptive optics technology to enable wavefront manipulation capabilities which enhance the quality of the final image.

References

  1. US Patent 7218448 - Extended depth of field optical systems
  2. Extended depth of field through wave-front coding
  3. Wavefront coding keeps a focus on applications
  4. Multi-matrix depth of field image sensor