Distortion (optics)

Wine glasses creating a non-uniform distortion of their background

In geometric optics, distortion is a deviation from rectilinear projection, a projection in which straight lines in a scene remain straight in an image. It is a form of optical aberration.

Radial distortion

Although distortion can be irregular or follow many patterns, the most commonly encountered distortions are radially symmetric, or approximately so, arising from the symmetry of a photographic lens. These radial distortions can usually be classified as either barrel distortions or pincushion distortions. [1]

Barrel distortion

In barrel distortion, image magnification decreases with distance from the optical axis. The apparent effect is that of an image which has been mapped around a sphere (or barrel). Fisheye lenses, which take hemispherical views, utilize this type of distortion as a way to map an infinitely wide object plane into a finite image area. In a zoom lens, barrel distortion appears in the middle of the lens's focal length range and is worst at the wide-angle end of the range. [2]

Pincushion distortion

In pincushion distortion, image magnification increases with the distance from the optical axis. The visible effect is that lines that do not go through the centre of the image are bowed inwards, towards the centre of the image, like a pincushion.

Mustache distortion

A mixture of both types, sometimes referred to as mustache distortion (moustache distortion) or complex distortion, is less common but not rare. It starts out as barrel distortion close to the image center and gradually turns into pincushion distortion towards the image periphery, making horizontal lines in the top half of the frame look like a handlebar mustache.

Mathematically, barrel and pincushion distortion are quadratic, meaning they increase as the square of distance from the center. In mustache distortion the quartic (degree 4) term is significant: in the center, the degree 2 barrel distortion is dominant, while at the edge the degree 4 distortion in the pincushion direction dominates. Other distortions are in principle possible – pincushion in center and barrel at the edge, or higher order distortions (degree 6, degree 8) – but do not generally occur in practical lenses, and higher order distortions are small relative to the main barrel and pincushion effects.
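A minimal mathematical sketch of this behaviour, using a generic single-axis radial model (the coefficients $k_1, k_2$ are illustrative, not a specific lens calibration): the distorted radius $r_\mathrm{d}$ relates to the undistorted radius $r_\mathrm{u}$ as

$$r_\mathrm{d} = r_\mathrm{u}\left(1 + k_1 r_\mathrm{u}^2 + k_2 r_\mathrm{u}^4\right)$$

Here $k_1 < 0$ gives barrel distortion, $k_1 > 0$ gives pincushion distortion, and $k_1 < 0$ with $k_2 > 0$ large enough to dominate near the image edge gives mustache distortion.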

Occurrence

Simulated animation of globe effect (right) compared with a simple pan (left)

In photography, distortion is particularly associated with zoom lenses, especially large-range zooms, but may also be found in prime lenses, and depends on focal distance; for example, the Canon EF 50mm f/1.4 exhibits barrel distortion at extremely short focal distances. Barrel distortion may be found in wide-angle lenses and is often seen at the wide-angle end of zoom lenses, while pincushion distortion is often seen in older or low-end telephoto lenses. Mustache distortion is observed particularly on the wide end of zooms, with certain retrofocus lenses, and more recently on large-range zooms such as the Nikon 18–200 mm.

A certain amount of pincushion distortion is often found with visual optical instruments, e.g., binoculars, where it serves to eliminate the globe effect.

Radial distortions can be understood by their effect on concentric circles, as in an archery target.

To understand these distortions, remember that they are radial defects: the optical systems in question have rotational symmetry (omitting non-radial defects), so the didactically correct test image is a set of concentric circles with even separation, like a shooter's target. These common distortions then turn out to imply a nonlinear radius mapping from object to image: what appears as pincushion distortion is simply an exaggerated radius mapping for large radii compared with small radii, so a graph of the radius transformation (from object to image) is steeper at its upper (rightmost) end. Conversely, barrel distortion is a diminished radius mapping for large radii compared with small radii, and the corresponding graph is less steep at its upper (rightmost) end.
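The following is a minimal numerical sketch of this radius mapping, assuming the one-coefficient model $r \mapsto r(1 + k r^2)$ with illustrative values of $k$ (not taken from any real lens):

import numpy as np

def radius_map(r, k):
    """Map object-space radius r to image-space radius r*(1 + k*r^2)."""
    return r * (1 + k * r**2)

# Normalized radii of evenly spaced target circles
r = np.linspace(0.0, 1.0, 6)
print(radius_map(r, k=-0.2))  # k < 0: barrel -- growth slows at large r (less steep)
print(radius_map(r, k=+0.2))  # k > 0: pincushion -- growth accelerates at large r (steeper)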

Chromatic aberration

Radial distortion that depends on wavelength is called "lateral chromatic aberration" – "lateral" because radial, "chromatic" because dependent on color (wavelength). This can cause colored fringes in high-contrast areas in the outer parts of the image. This should not be confused with axial (longitudinal) chromatic aberration, which causes aberrations throughout the field, particularly purple fringing.

Origin of terms

The names for these distortions come from familiar objects which are visually similar.

Software correction

Radial distortion, whilst primarily dominated by low order radial components, [3] can be corrected using Brown's distortion model, [4] also known as the Brown–Conrady model based on earlier work by Conrady. [5] The Brown–Conrady model corrects both for radial distortion and for tangential distortion caused by physical elements in a lens not being perfectly aligned. The latter is also known as decentering distortion. See Zhang [6] for additional discussion of radial distortion.

$$x_\mathrm{d} = x_\mathrm{u}\left(1 + K_1 r^2 + K_2 r^4 + \cdots\right) + \left(P_2\left(r^2 + 2x_\mathrm{u}^2\right) + 2P_1 x_\mathrm{u} y_\mathrm{u}\right)\left(1 + P_3 r^2 + P_4 r^4 + \cdots\right)$$

$$y_\mathrm{d} = y_\mathrm{u}\left(1 + K_1 r^2 + K_2 r^4 + \cdots\right) + \left(P_1\left(r^2 + 2y_\mathrm{u}^2\right) + 2P_2 x_\mathrm{u} y_\mathrm{u}\right)\left(1 + P_3 r^2 + P_4 r^4 + \cdots\right)$$

[3]

where:

$(x_\mathrm{d},\ y_\mathrm{d})$ = distorted image point as projected on image plane using specified lens,
$(x_\mathrm{u},\ y_\mathrm{u})$ = undistorted image point as projected by an ideal pinhole camera,
$(x_\mathrm{c},\ y_\mathrm{c})$ = distortion center,
$K_n$ = $n$-th radial distortion coefficient,
$P_n$ = $n$-th tangential distortion coefficient,
$r = \sqrt{(x_\mathrm{u} - x_\mathrm{c})^2 + (y_\mathrm{u} - y_\mathrm{c})^2}$, and
$\cdots$ = an infinite series.

Barrel distortion typically will have a negative term for $K_1$, whereas pincushion distortion will have a positive value. Moustache distortion will have a non-monotonic radial geometric series where, for some $r$, the sequence will change sign.

To model radial distortion, the division model [7] typically provides a more accurate approximation than the Brown–Conrady even-order polynomial model: [8]

$$x_\mathrm{u} = x_\mathrm{c} + \frac{x_\mathrm{d} - x_\mathrm{c}}{1 + K_1 r^2 + K_2 r^4 + \cdots}$$

$$y_\mathrm{u} = y_\mathrm{c} + \frac{y_\mathrm{d} - y_\mathrm{c}}{1 + K_1 r^2 + K_2 r^4 + \cdots}$$

[7]

using the same parameters previously defined, but with $r$ now measured from the distorted point. For radial distortion, this division model is often preferred over the Brown–Conrady model, as it requires fewer terms to describe severe distortion more accurately. [8] Using this model, a single term is usually sufficient for most cameras. [9]
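As a sketch, single-coefficient division-model undistortion is only a few lines of Python (the coefficient, center, and test point below are illustrative assumptions, not a real calibration):

def undistort_division(xd, yd, xc, yc, k1):
    """Single-parameter division model: map a distorted point to its
    undistorted position, with r measured from the distorted point."""
    r2 = (xd - xc)**2 + (yd - yc)**2
    denom = 1.0 + k1 * r2
    return xc + (xd - xc) / denom, yc + (yd - yc) / denom

# Example: point at (800, 600) px, distortion center (640, 480), mild barrel term
print(undistort_division(800.0, 600.0, 640.0, 480.0, k1=-1e-7))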

Software can correct these distortions by warping the image with a reverse distortion. This involves determining which distorted pixel corresponds to each undistorted pixel, which is non-trivial when the distortion equation must be inverted, due to its non-linearity. [3] Lateral chromatic aberration (purple/green fringing) can be significantly reduced by applying such warping for red, green, and blue separately.
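A sketch of such a warp, assuming the calibrated model maps undistorted to distorted coordinates as in the Brown–Conrady form above, so each output pixel's source location follows from a forward evaluation (the k1, xc, yc parameters are illustrative). Running this separately per color channel, with slightly different coefficients, would likewise reduce lateral chromatic aberration:

import numpy as np
from scipy.ndimage import map_coordinates

def undistort_image(img, k1, xc, yc):
    """Resample a 2-D image so it appears undistorted: for each output
    (undistorted) pixel, evaluate the forward radial model
    r_d = r_u * (1 + k1 * r_u**2) to find where to sample the source."""
    yy, xx = np.indices(img.shape, dtype=np.float64)
    r2 = (xx - xc)**2 + (yy - yc)**2
    scale = 1.0 + k1 * r2
    src_x = xc + (xx - xc) * scale
    src_y = yc + (yy - yc) * scale
    return map_coordinates(img, [src_y, src_x], order=1, mode='nearest')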

Distorting or undistorting requires either both sets of coefficients or inverting the non-linear problem which, in general, lacks an analytical solution. Standard approaches such as approximating, locally linearizing and iterative solvers all apply. Which solver is preferable depends on the accuracy required and the computational resources available.
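One such iterative solver is plain fixed-point iteration, sketched here for the single-coefficient radial model (illustrative; for the mild distortion of typical photographic lenses a handful of iterations suffices):

def invert_radial(xd, yd, xc, yc, k1, iters=10):
    """Invert r_d = r_u*(1 + k1*r_u**2): start from the distorted
    coordinates and repeatedly divide out the distortion factor."""
    xu, yu = xd, yd  # initial guess: no distortion
    for _ in range(iters):
        r2 = (xu - xc)**2 + (yu - yc)**2
        scale = 1.0 + k1 * r2
        xu = xc + (xd - xc) / scale
        yu = yc + (yd - yc) / scale
    return xu, yu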

Calibrated

Calibrated systems work from a table of lens/camera transfer functions; examples include PTlens, [10] Lensfun, [11] and OpenCV. [12]

Manual

Manual systems allow manual adjustment of distortion parameters, as in this ImageMagick example: [14]

# ImageMagick's barrel operator maps Rsrc = r*(A*r^3 + B*r^2 + C*r + D); with D omitted it is chosen so A+B+C+D = 1 [14]
convert distorted_image.jpg -distort barrel "0.06335 -0.18432 -0.13009" corrected_image.jpg

Besides these systems that address still images, there are some that also adjust distortion parameters for videos, such as FFmpeg's lenscorrection filter. [16]

Related phenomena

Radial distortion is a failure of a lens to be rectilinear: a failure to image lines into lines. If a photograph is not taken straight-on, then even with a perfect rectilinear lens, rectangles will appear as trapezoids: lines are imaged as lines, but the angles between them are not preserved (tilt is not a conformal map). This effect can be controlled by using a perspective control lens, or corrected in post-processing, as sketched below.
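As a sketch, such a post-processing correction is a projective transform (homography); in OpenCV it can look like the following, where the file name and corner coordinates are made-up examples:

import cv2
import numpy as np

img = cv2.imread('building.jpg')  # hypothetical input photograph
# Four corners of a keystoned rectangle in the photo, and where they should land
src = np.float32([[120, 80], [520, 95], [40, 460], [600, 450]])
dst = np.float32([[80, 80], [560, 80], [80, 460], [560, 460]])
H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography
corrected = cv2.warpPerspective(img, H, (img.shape[1], img.shape[0]))
cv2.imwrite('corrected.jpg', corrected)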

Due to perspective, cameras image a cube as a square frustum (a truncated pyramid, with trapezoidal sides)—the far end is smaller than the near end. This creates perspective, and the rate at which this scaling happens (how quickly more distant objects shrink) creates a sense of a scene being deep or shallow. This cannot be changed or corrected by a simple transform of the resulting image, because it requires 3D information, namely the depth of objects in the scene. This effect is known as perspective distortion; the image itself is not distorted, but is perceived as distorted when viewed from a normal viewing distance.

Note that if the center of the image is closer than the edges (for example, a straight-on shot of a face), then barrel distortion and wide-angle distortion (taking the shot from close) both increase the size of the center, while pincushion distortion and telephoto distortion (taking the shot from far) both decrease the size of the center. However, radial distortion bends straight lines (out or in), while perspective distortion does not bend lines, and these are distinct phenomena. Fisheye lenses are wide-angle lenses with heavy barrel distortion and thus exhibit both these phenomena, so objects in the center of the image (if shot from a short distance) are particularly enlarged: even if the barrel distortion is corrected, the resulting image is still from a wide-angle lens, and will still have a wide-angle perspective.


References

  1. van Walree, Paul. "Distortion". Photographic optics. Archived from the original on 29 January 2009. Retrieved 2 February 2009.
  2. "Tamron 18-270mm f/3.5-6.3 Di II VC PZD". Retrieved 20 March 2013.
  3. de Villiers, J. P.; Leuschner, F. W.; Geldenhuys, R. (17–19 November 2008). "Centi-pixel accurate real-time inverse distortion correction" (PDF). 2008 International Symposium on Optomechatronic Technologies. SPIE. doi:10.1117/12.804771.
  4. Brown, Duane C. (May 1966). "Decentering distortion of lenses" (PDF). Photogrammetric Engineering. 32 (3): 444–462. Archived from the original (PDF) on 12 March 2018.
  5. Conrady, A. E. (1919). "Decentred Lens-Systems". Monthly Notices of the Royal Astronomical Society. 79 (5): 384. Bibcode:1919MNRAS..79..384C. doi:10.1093/mnras/79.5.384.
  6. Zhang, Zhengyou (1998). A Flexible New Technique for Camera Calibration (PDF) (Technical report). Microsoft Research. MSR-TR-98-71.
  7. Fitzgibbon, A. W. (2001). "Simultaneous linear estimation of multiple view geometry and lens distortion". Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR). IEEE. doi:10.1109/CVPR.2001.990465.
  8. Bukhari, F.; Dailey, M. N. (2013). "Automatic Radial Distortion Estimation from a Single Image" (PDF). Journal of Mathematical Imaging and Vision. Springer. doi:10.1007/s10851-012-0342-2.
  9. Wang, J.; Shi, F.; Zhang, J.; Liu, Y. (2008). "A new calibration model of camera lens distortion". Pattern Recognition. Elsevier. doi:10.1016/j.patcog.2007.06.012.
  10. "PTlens". Retrieved 2 January 2012.
  11. "lensfun - Rev 246 - /trunk/README". Archived from the original on 13 October 2013. Retrieved 13 October 2013.
  12. "OpenCV". opencv.org. Retrieved 22 January 2018.
  13. Wiley, Carlisle. "Articles: Digital Photography Review". Dpreview.com. Archived from the original on 7 July 2012. Retrieved 3 July 2013.
  14. "ImageMagick v6 Examples -- Lens Corrections".
  15. "Hugin tutorial – Simulating an architectural projection". Retrieved 9 September 2009.
  16. "FFmpeg Filters Documentation".