In geometric optics, distortion is a deviation from rectilinear projection, a projection in which straight lines in a scene remain straight in an image. It is a form of optical aberration that may be distinguished from other aberrations such as spherical aberration, coma, chromatic aberration, field curvature, and astigmatism in the sense that those degrade image sharpness without changing the shape or structure of objects in the image (a straight line in the scene remains a straight line in the image, even if its sharpness is degraded), whereas distortion changes the structure of objects in the image – hence the name.
Although distortion can be irregular or follow many patterns, the most commonly encountered distortions are radially symmetric, or approximately so, arising from the symmetry of a photographic lens. These radial distortions can usually be classified as either barrel distortions or pincushion distortions. [1]
Mathematically, barrel and pincushion distortion are quadratic, meaning they increase as the square of distance from the center. In mustache distortion the quartic (degree 4) term is significant: in the center, the degree 2 barrel distortion is dominant, while at the edge the degree 4 distortion in the pincushion direction dominates. Other distortions are in principle possible – pincushion in center and barrel at the edge, or higher order distortions (degree 6, degree 8) – but do not generally occur in practical lenses, and higher order distortions are small relative to the main barrel and pincushion effects.
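The interplay of the quadratic and quartic terms can be illustrated with a short sketch. The coefficients below are made up for demonstration, not taken from any real lens: a negative quadratic term dominates near the center (barrel) while a positive quartic term dominates at the edge (pincushion), reproducing the mustache pattern described above.

```python
def radial_scale(r, k2, k4):
    """Scale factor applied to radius r by a distortion model with a
    quadratic (k2) and a quartic (k4) term:
    r_distorted = r * (1 + k2*r**2 + k4*r**4)."""
    return 1 + k2 * r**2 + k4 * r**4

# Illustrative, made-up coefficients: negative quadratic term (barrel)
# dominates near the centre, positive quartic term (pincushion)
# dominates at the edge -- i.e. mustache distortion.
k2, k4 = -0.30, 0.35

for r in (0.25, 0.5, 0.75, 1.0):
    s = radial_scale(r, k2, k4)
    kind = "barrel (scale < 1)" if s < 1 else "pincushion (scale > 1)"
    print(f"r = {r:.2f}: scale = {s:.4f} -> {kind}")
```

With these values the scale factor is below 1 for small and middle radii and rises above 1 only near the edge, which is exactly the sign change that characterizes mustache distortion.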
The names for these distortions come from familiar objects which are visually similar.
In photography, distortion is particularly associated with zoom lenses, especially large-range zooms, but may also be found in prime lenses, and depends on focal distance – for example, the Canon EF 50mm f/1.4 exhibits barrel distortion at extremely short focal distances. Barrel distortion may be found in wide-angle lenses, and is often seen at the wide-angle end of zoom lenses, while pincushion distortion is often seen in older or low-end telephoto lenses. Mustache distortion is observed particularly on the wide end of zooms, with certain retrofocus lenses, and more recently on large-range zooms such as the Nikon 18–200 mm.
A certain amount of pincushion distortion is often found with visual optical instruments, e.g., binoculars, where it serves to counteract the globe effect.
In order to understand these distortions, it should be remembered that these are radial defects; the optical systems in question have rotational symmetry (omitting non-radial defects), so the didactically correct test image would be a set of concentric circles having even separation – like a shooter's target. It will then be observed that these common distortions actually imply a nonlinear radius mapping from the object to the image: What is seemingly pincushion distortion, is actually simply an exaggerated radius mapping for large radii in comparison with small radii. A graph showing radius transformations (from object to image) will be steeper in the upper (rightmost) end. Conversely, barrel distortion is actually a diminished radius mapping for large radii in comparison with small radii. A graph showing radius transformations (from object to image) will be less steep in the upper (rightmost) end.
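The radius-mapping view can be checked numerically. The sketch below uses an arbitrary positive coefficient to build a pincushion-like object-to-image radius mapping and estimates its slope at a small and a large radius; the curve is steeper at the upper (large-radius) end, as described.

```python
def radius_map(r, k=0.2):
    # Object-to-image radius mapping with a positive coefficient k,
    # which exaggerates large radii (pincushion-like behaviour).
    return r * (1 + k * r**2)

# Numerical slope of the mapping at a small and at a large radius:
h = 1e-6
slope_small = (radius_map(0.1 + h) - radius_map(0.1)) / h
slope_large = (radius_map(0.9 + h) - radius_map(0.9)) / h
# Steeper slope at the large-radius end is the pincushion signature;
# a negative k would give the opposite (barrel) behaviour.
print(slope_small, slope_large)
```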
Radial distortion that depends on wavelength is called "lateral chromatic aberration" – "lateral" because radial, "chromatic" because dependent on color (wavelength). This can cause colored fringes in high-contrast areas in the outer parts of the image. This should not be confused with axial (longitudinal) chromatic aberration, which occurs throughout the field and is particularly visible as purple fringing.
Radial distortion, whilst primarily dominated by low-order radial components, [3] can be corrected using Brown's distortion model, [4] also known as the Brown–Conrady model, based on earlier work by Conrady. [5] The Brown–Conrady model corrects both for radial distortion and for tangential distortion caused by physical elements in a lens not being perfectly aligned. The latter is also known as decentering distortion. See Zhang [6] for additional discussion of radial distortion. The Brown–Conrady distortion model is

x_d = x_u(1 + K1 r^2 + K2 r^4 + ...) + (P2(r^2 + 2 x_u^2) + 2 P1 x_u y_u)(1 + P3 r^2 + P4 r^4 + ...)
y_d = y_u(1 + K1 r^2 + K2 r^4 + ...) + (P1(r^2 + 2 y_u^2) + 2 P2 x_u y_u)(1 + P3 r^2 + P4 r^4 + ...)

where

(x_d, y_d) is the distorted image point,
(x_u, y_u) is the undistorted image point as projected by an ideal pinhole camera,
Kn is the nth radial distortion coefficient,
Pn is the nth tangential distortion coefficient, and
r = sqrt(x_u^2 + y_u^2) is the distance of the undistorted point from the distortion center (taken here as the origin).

Barrel distortion typically will have a negative value for K1, whereas pincushion distortion will have a positive value. Moustache distortion will have a non-monotonic radial profile: for some radius r the series of radial terms changes sign.
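A minimal sketch of the Brown–Conrady model in code follows. The coefficient values are illustrative, not from a calibrated camera, and conventions for which tangential coefficient pairs with which axis vary between references; the higher-order tangential factor is omitted for brevity.

```python
def brown_conrady_distort(xu, yu, K, P, xc=0.0, yc=0.0):
    """Map an undistorted point (xu, yu) to its distorted position.

    K: radial coefficients (K1, K2, ...); P: tangential coefficients
    (P1, P2); (xc, yc): distortion centre. Sketch only -- sign and
    axis conventions for P1/P2 differ between references."""
    x, y = xu - xc, yu - yc
    r2 = x * x + y * y
    # Radial factor: 1 + K1*r^2 + K2*r^4 + ...
    radial = 1.0 + sum(k * r2 ** (i + 1) for i, k in enumerate(K))
    p1, p2 = P
    # Tangential (decentering) terms
    xt = p2 * (r2 + 2 * x * x) + 2 * p1 * x * y
    yt = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xc + x * radial + xt, yc + y * radial + yt

# Mild barrel distortion (negative K1, illustrative value) pulls an
# off-centre point toward the centre:
xd, yd = brown_conrady_distort(0.5, 0.5, K=(-0.1,), P=(0.0, 0.0))
print(xd, yd)
```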
To model radial distortion, the division model [7] typically provides a more accurate approximation than the Brown–Conrady even-order polynomial model: [8]

x_u = x_d / (1 + K1 r^2 + K2 r^4 + ...)
y_u = y_d / (1 + K1 r^2 + K2 r^4 + ...)

using the same parameters defined above, except that r = sqrt(x_d^2 + y_d^2) is now the distance of the distorted point from the distortion center. For radial distortion, this division model is often preferred over the Brown–Conrady model, as it requires fewer terms to more accurately describe severe distortion. [8] Using this model, a single term is usually sufficient to model most cameras. [9]
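The division model can be sketched in a few lines. Note the direction of the mapping: it takes a distorted point back to its undistorted position, and the coefficient below is illustrative only.

```python
def division_undistort(xd, yd, K, xc=0.0, yc=0.0):
    """Division model: map a distorted point (xd, yd) to its undistorted
    position, x_u = x_c + (x_d - x_c) / (1 + K1*r^2 + K2*r^4 + ...),
    with r measured from the distortion centre in the distorted image."""
    x, y = xd - xc, yd - yc
    r2 = x * x + y * y
    denom = 1.0 + sum(k * r2 ** (i + 1) for i, k in enumerate(K))
    return xc + x / denom, yc + y / denom

# With a single illustrative barrel coefficient (negative K1), an
# off-centre distorted point maps to a slightly larger undistorted radius:
xu, yu = division_undistort(0.5, 0.5, K=(-0.1,))
print(xu, yu)
```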
Software can correct those distortions by warping the image with a reverse distortion. This involves determining which distorted pixel corresponds to each undistorted pixel, which is non-trivial due to the non-linearity of the distortion equation. [3] Lateral chromatic aberration (purple/green fringing) can be significantly reduced by applying such warping for red, green and blue separately.
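A toy sketch of such reverse warping follows, using nearest-neighbour sampling on a tiny image represented as a list of rows; a real implementation would interpolate between pixels and use calibrated coefficients, and k1 here is arbitrary.

```python
def undistort_image(img, k1):
    """Nearest-neighbour reverse warp (sketch): for each output
    (undistorted) pixel, apply the forward distortion to find the
    corresponding source pixel in the distorted image, then copy it.
    img is a list of rows; coordinates are normalised to [-1, 1]."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for j in range(h):
        for i in range(w):
            # Normalised undistorted coordinates, centre at (0, 0)
            xu = 2 * i / (w - 1) - 1
            yu = 2 * j / (h - 1) - 1
            r2 = xu * xu + yu * yu
            # Forward model: where this pixel lies in the distorted image
            xd, yd = xu * (1 + k1 * r2), yu * (1 + k1 * r2)
            si = round((xd + 1) * (w - 1) / 2)
            sj = round((yd + 1) * (h - 1) / 2)
            if 0 <= si < w and 0 <= sj < h:
                out[j][i] = img[sj][si]
    return out
```

Running the same warp separately on the red, green, and blue channels, each with its own coefficient, is the per-channel correction for lateral chromatic aberration described above.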
Distorting or undistorting requires either both sets of coefficients or inverting the non-linear problem, which, in general, lacks an analytical solution. Standard approaches, such as approximation, local linearization, and iterative solvers, all apply. Which solver is preferable depends on the accuracy required and the computational resources available.
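One common iterative approach, sketched here for a one-term radial model with an arbitrary coefficient, is fixed-point iteration: guess that the undistorted radius equals the distorted one, then repeatedly refine.

```python
def distort_radius(ru, k1):
    # Forward one-term radial model: r_d = r_u * (1 + k1 * r_u^2)
    return ru * (1 + k1 * ru * ru)

def undistort_radius(rd, k1, iters=25):
    """Invert the forward model by fixed-point iteration: start from
    r_u ~ r_d, then repeatedly evaluate r_u = r_d / (1 + k1 * r_u^2).
    Converges for moderate distortion; k1 here is arbitrary."""
    ru = rd
    for _ in range(iters):
        ru = rd / (1 + k1 * ru * ru)
    return ru

# Round trip: distort a radius, then recover it iteratively.
k1 = -0.2
print(abs(undistort_radius(distort_radius(0.8, k1), k1) - 0.8))
```

The residual shrinks geometrically with each iteration, so a couple of dozen iterations are ample at this distortion strength; stronger distortion may need a damped or Newton-style update instead.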
In addition to usually being sufficient to model most cameras, as mentioned, the single-term division model has an analytical solution to the reverse-distortion problem. [8] In this case, the distorted pixels are given by

x_d = x_u (1 - sqrt(1 - 4 K1 r_u^2)) / (2 K1 r_u^2)
y_d = y_u (1 - sqrt(1 - 4 K1 r_u^2)) / (2 K1 r_u^2)

where r_u = sqrt(x_u^2 + y_u^2) is the distance of the undistorted point from the distortion center.
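A sketch of this closed-form inverse for the single-term division model follows; the coefficient k1 is an arbitrary sketch value, and the round trip back through the division model recovers the original point.

```python
import math

def division_distort(xu, yu, k1, xc=0.0, yc=0.0):
    """Closed-form reverse distortion for the single-term division model:
    solves k1*r_u*r_d^2 - r_d + r_u = 0 for r_d, taking the root that
    tends to r_d = r_u as k1 -> 0. Sketch only; for positive k1 the
    square root requires 4*k1*r_u^2 <= 1."""
    x, y = xu - xc, yu - yc
    ru2 = x * x + y * y
    if ru2 == 0.0 or k1 == 0.0:
        return xu, yu
    scale = (1.0 - math.sqrt(1.0 - 4.0 * k1 * ru2)) / (2.0 * k1 * ru2)
    return xc + x * scale, yc + y * scale

# Round trip: distort analytically, then undo with the division model.
k1 = -0.2
xd, yd = division_distort(0.4, 0.3, k1)
rd2 = xd * xd + yd * yd
print(xd / (1 + k1 * rd2), yd / (1 + k1 * rd2))  # ~ (0.4, 0.3)
```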
Calibrated software works from a table of lens/camera transfer functions.
Manual software allows manual adjustment of distortion parameters; for example, ImageMagick's -distort barrel operator takes the correction coefficients directly:

convert distorted_image.jpg -distort barrel "0.06335 -0.18432 -0.13009" corrected_image.jpg
Besides these systems that address images, there are some that also adjust distortion parameters for videos.
Radial distortion is a failure of a lens to be rectilinear: a failure to image lines into lines. If a photograph is not taken straight-on then, even with a perfect rectilinear lens, rectangles will appear as trapezoids: lines are imaged as lines, but the angles between them are not preserved (tilt is not a conformal map). This effect can be controlled by using a perspective control lens, or corrected in post-processing.
Due to perspective, cameras image a cube as a square frustum (a truncated pyramid, with trapezoidal sides) – the far end is smaller than the near end. This creates perspective, and the rate at which this scaling happens (how quickly more distant objects shrink) creates a sense of a scene being deep or shallow. This cannot be changed or corrected by a simple transform of the resulting image, because it requires 3D information, namely the depth of objects in the scene. This effect is known as perspective distortion; the image itself is not distorted but is perceived as distorted when viewed from a normal viewing distance.
Note that if the center of the image is closer than the edges (for example, a straight-on shot of a face), then barrel distortion and wide-angle distortion (taking the shot from close) both increase the size of the center, while pincushion distortion and telephoto distortion (taking the shot from far) both decrease the size of the center. However, radial distortion bends straight lines (out or in), while perspective distortion does not bend lines, and these are distinct phenomena. Fisheye lenses are wide-angle lenses with heavy barrel distortion and thus exhibit both these phenomena, so objects in the center of the image (if shot from a short distance) are particularly enlarged: even if the barrel distortion is corrected, the resulting image is still from a wide-angle lens, and will still have a wide-angle perspective.
In optics, aberration is a property of optical systems, such as lenses, that causes light to be spread out over some region of space rather than focused to a point. Aberrations cause the image formed by a lens to be blurred or distorted, with the nature of the distortion depending on the type of aberration. Aberration can be defined as a departure of the performance of an optical system from the predictions of paraxial optics. In an imaging system, it occurs when light from one point of an object does not converge into a single point after transmission through the system. Aberrations occur because the simple paraxial theory is not a completely accurate model of the effect of an optical system on light, rather than due to flaws in the optical elements.
A lens is a transmissive optical device that focuses or disperses a light beam by means of refraction. A simple lens consists of a single piece of transparent material, while a compound lens consists of several simple lenses (elements), usually arranged along a common axis. Lenses are made from materials such as glass or plastic and are ground, polished, or molded to the required shape. A lens can focus light to form an image, unlike a prism, which refracts light without focusing. Devices that similarly focus or disperse waves and radiation other than visible light are also called "lenses", such as microwave lenses, electron lenses, acoustic lenses, or explosive lenses.
In optics, chromatic aberration (CA), also called chromatic distortion, color aberration, color fringing, or purple fringing, is a failure of a lens to focus all colors to the same point. It is caused by dispersion: the refractive index of the lens elements varies with the wavelength of light. The refractive index of most transparent materials decreases with increasing wavelength. Since the focal length of a lens depends on the refractive index, this variation in refractive index affects focusing. Since the focal length of the lens varies with the color of the light, different colors of light are brought to focus at different distances from the lens or with different levels of magnification. Chromatic aberration manifests itself as "fringes" of color along boundaries that separate dark and bright parts of the image.
In photography, angle of view (AOV) describes the angular extent of a given scene that is imaged by a camera. It is used interchangeably with the more general term field of view.
Optics is the branch of physics which involves the behavior and properties of light, including its interactions with matter and the construction of instruments that use or detect it. Optics usually describes the behavior of visible, ultraviolet, and infrared light. Because light is an electromagnetic wave, other forms of electromagnetic radiation such as X-rays, microwaves, and radio waves exhibit similar properties.
In photography and cinematography, perspective distortion is a warping or transformation of an object and its surrounding area that differs significantly from what the object would look like with a normal focal length, due to the relative scale of nearby and distant features. Perspective distortion is determined by the relative distances at which the image is captured and viewed, and is due to the angle of view of the image being either wider or narrower than the angle of view at which the image is viewed, hence the apparent relative distances differing from what is expected. Related to this concept is axial magnification – the perceived depth of objects at a given magnification.
In optics, the Airy disk and Airy pattern are descriptions of the best-focused spot of light that a perfect lens with a circular aperture can make, limited by the diffraction of light. The Airy disk is of importance in physics, optics, and astronomy.
Curvilinear perspective, also five-point perspective, is a graphical projection used to draw 3D objects on 2D surfaces, for which (straight) lines on the 3D object are projected to curves on the 2D surface that are typically not straight. It was formally codified in 1968 by the artists and art historians André Barre and Albert Flocon in the book La Perspective curviligne, which was translated into English in 1987 as Curvilinear Perspective: From Visual Space to the Constructed Image and published by the University of California Press.
Gradient-index (GRIN) optics is the branch of optics covering optical effects produced by a gradient of the refractive index of a material. Such gradual variation can be used to produce lenses with flat surfaces, or lenses that do not have the aberrations typical of traditional spherical lenses. Gradient-index lenses may have a refraction gradient that is spherical, axial, or radial.
A fisheye lens is an ultra wide-angle lens that produces strong visual distortion intended to create a wide panoramic or hemispherical image. Fisheye lenses achieve extremely wide angles of view, well beyond any rectilinear lens. Instead of producing images with straight lines of perspective, fisheye lenses use a special mapping, which gives images a characteristic convex non-rectilinear appearance.
The point spread function (PSF) describes the response of a focused optical imaging system to a point source or point object. A more general term for the PSF is the system's impulse response; the PSF is the impulse response or impulse response function (IRF) of a focused optical imaging system. The PSF in many contexts can be thought of as the extended blob in an image that represents a single point object, that is considered as a spatial impulse. In functional terms, it is the spatial domain version of the optical transfer function (OTF) of an imaging system. It is a useful concept in Fourier optics, astronomical imaging, medical imaging, electron microscopy and other imaging techniques such as 3D microscopy and fluorescence microscopy.
Image stitching or photo stitching is the process of combining multiple photographic images with overlapping fields of view to produce a segmented panorama or high-resolution image. Commonly performed through the use of computer software, most approaches to image stitching require nearly exact overlaps between images and identical exposures to produce seamless results, although some stitching algorithms actually benefit from differently exposed images by doing high-dynamic-range imaging in regions of overlap. Some digital cameras can stitch their photos internally.
The optical transfer function (OTF) of an optical system such as a camera, microscope, human eye, or projector specifies how different spatial frequencies are captured or transmitted. It is used by optical engineers to describe how the optics project light from the object or scene onto a photographic film, detector array, retina, screen, or simply the next item in the optical transmission chain. A variant, the modulation transfer function (MTF), neglects phase effects, but is equivalent to the OTF in many situations.
Camera resectioning is the process of estimating the parameters of a pinhole camera model approximating the camera that produced a given photograph or video; it determines which incoming light ray is associated with each pixel on the resulting image. Basically, the process determines the pose of the pinhole camera.
In photography, a rectilinear lens is a photographic lens that yields images where straight features, such as the edges of walls of buildings, appear with straight lines, as opposed to being curved. In other words, it is a lens with little or no barrel or pincushion distortion. At particularly wide angles, however, the rectilinear perspective will cause objects to appear increasingly stretched and enlarged as they near the edge of the frame. These types of lenses are often used to create forced perspective effects.
The globe effect, also known as rolling ball effect, is an optical illusion which can occur with optical instruments used visually, in particular binoculars or telescopes. If such an instrument is rectilinear, or free of rectilinear distortion, some observers get the impression of an image rolling on a convex surface when the instrument is panned.
The Sigma 8–16mm lens is an enthusiast-level, ultra wide-angle rectilinear zoom lens made by Sigma Corporation specifically for use with APS-C small format digital SLRs. It is the first ultrawide rectilinear zoom lens with a minimum focal length of 8 mm, designed specifically for APS-C size image sensors. The lens was introduced at the February 2010 Photo Marketing Association International Convention and Trade Show. At its release it was the widest viewing angle focal length available commercially for APS-C cameras. It is part of Sigma's DC line of lenses, meaning it was designed to have an image circle tailored to work with APS-C format cameras. The lens has a constant length regardless of optical zoom and focus with inner lens tube elements responding to these parameters. The lens has hypersonic zoom autofocus.
The smc Pentax-DA 10-17mm f/3.5-4.5 ED (IF) Fish-Eye lens is a fisheye zoom lens for the Pentax K-mount. It offers an up to 180 degree view, and allows quick shift focus.
The pupil function or aperture function describes how a light wave is affected upon transmission through an optical imaging system such as a camera, microscope, or the human eye. More specifically, it is a complex function of the position in the pupil or aperture that indicates the relative change in amplitude and phase of the light wave. Sometimes this function is referred to as the generalized pupil function, in which case pupil function only indicates whether light is transmitted or not. Imperfections in the optics typically have a direct effect on the pupil function; it is therefore an important tool to study optical imaging systems and their performance.
DxO ViewPoint is image geometry and lens defect correction software developed by DxO. It is designed to automatically straighten distorted perspectives caused by the lens used and the position of the photographer. The software claims to be able to make precise corrections to lens flaws through its use of DxO's database of calibrations which have been created through laboratory tests.