Adaptive histogram equalization

Adaptive histogram equalization (AHE) is a computer image processing technique used to improve contrast in images. It differs from ordinary histogram equalization in the respect that the adaptive method computes several histograms, each corresponding to a distinct section of the image, and uses them to redistribute the lightness values of the image. It is therefore suitable for improving the local contrast and enhancing the definitions of edges in each region of an image.

However, AHE has a tendency to overamplify noise in relatively homogeneous regions of an image. A variant of adaptive histogram equalization called contrast limited adaptive histogram equalization (CLAHE) prevents this by limiting the amplification.

Motivation and explanation of the method

Ordinary histogram equalization uses the same transformation derived from the image histogram to transform all pixels. This works well when the distribution of pixel values is similar throughout the image. However, when the image contains regions that are significantly lighter or darker than most of the image, the contrast in those regions will not be sufficiently enhanced.
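
For reference, ordinary (global) histogram equalization can be sketched as follows in Python/NumPy; the 8-bit grayscale input and 256-bin resolution are illustrative assumptions rather than part of any particular implementation:

```python
import numpy as np

def global_histogram_equalization(img):
    """Equalize an 8-bit grayscale image with one global transformation."""
    # A single histogram describes the whole image.
    hist = np.bincount(img.ravel(), minlength=256)
    # The transformation is the scaled cumulative distribution function (CDF).
    cdf = np.cumsum(hist) / img.size
    lut = np.round(255 * cdf).astype(np.uint8)
    # Every pixel is mapped through the same lookup table.
    return lut[img]
```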

Adaptive histogram equalization (AHE) improves on this by transforming each pixel with a transformation function derived from a neighbourhood region. It was first developed for use in aircraft cockpit displays ([1], cited in [2]). In its simplest form, each pixel is transformed based on the histogram of a square surrounding the pixel, as in the figure below. The derivation of the transformation functions from the histograms is exactly the same as for ordinary histogram equalization: the transformation function is proportional to the cumulative distribution function (CDF) of pixel values in the neighbourhood.

[Figure: AHE-neighbourhoods.svg]

Pixels near the image boundary have to be treated specially, because their neighbourhood would not lie completely within the image. This applies, for example, to the pixels to the left of or above the blue pixel in the figure. It can be handled by extending the image, mirroring pixel rows and columns with respect to the image boundary. Simply copying the pixel lines on the border is not appropriate, as it would lead to a highly peaked neighbourhood histogram.
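A minimal sketch of this per-pixel form of AHE, including the mirror-based boundary treatment described above, is given below in Python/NumPy. The window size and the 8-bit grayscale input are illustrative assumptions, and the code favours clarity over speed:

```python
import numpy as np

def ahe_naive(img, window=65):
    """Per-pixel AHE on an 8-bit grayscale image (window should be odd)."""
    r = window // 2
    # Mirror rows and columns across the border so every pixel has a full neighbourhood.
    padded = np.pad(img, r, mode='reflect')
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            region = padded[y:y + window, x:x + window]
            # Local histogram and CDF of the neighbourhood.
            hist = np.bincount(region.ravel(), minlength=256)
            cdf = np.cumsum(hist) / region.size
            # Map the centre pixel through the local CDF, as in ordinary equalization.
            out[y, x] = int(round(255 * cdf[img[y, x]]))
    return out
```

This direct form requires a full histogram per pixel; the interpolated and sliding-window schemes described below avoid that cost.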

Contrast Limited AHE

Ordinary AHE tends to overamplify the contrast in near-constant regions of the image, since the histogram in such regions is highly concentrated. As a result, AHE may cause noise to be amplified in near-constant regions. Contrast Limited AHE (CLAHE) is a variant of adaptive histogram equalization in which the contrast amplification is limited, so as to reduce this problem of noise amplification. [3]

In AHE, the contrast amplification in the vicinity of a given pixel value is given by the slope of the transformation function. This is proportional to the slope of the neighbourhood cumulative distribution function (CDF) and therefore to the value of the histogram at that pixel value. CLAHE limits the amplification by clipping the histogram at a predefined value before computing the CDF. This limits the slope of the CDF and therefore of the transformation function. The value at which the histogram is clipped, the so-called clip limit, depends on the normalization of the histogram and thereby on the size of the neighbourhood region. Common values limit the resulting amplification to between 3 and 4.

It is advantageous not to discard the part of the histogram that exceeds the clip limit but to redistribute it equally among all histogram bins. [3]

[Figure: Clahe-redist.svg]

The redistribution will push some bins over the clip limit again (the region shaded green in the figure), resulting in an effective clip limit that is larger than the prescribed limit, with the exact value depending on the image. If this is undesirable, the redistribution procedure can be repeated recursively until the excess is negligible.
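
The clipping and redistribution of a neighbourhood histogram might be sketched as follows (an illustrative Python/NumPy fragment rather than a reference implementation; the tolerance and iteration cap used to stop the repeated redistribution are assumptions):

```python
import numpy as np

def clip_histogram(hist, clip_limit, tol=1.0, max_iter=10):
    """Clip a histogram at clip_limit and redistribute the excess over all bins."""
    hist = hist.astype(np.float64)
    for _ in range(max_iter):
        # Total histogram mass above the clip limit.
        excess = np.sum(np.maximum(hist - clip_limit, 0.0))
        if excess <= tol:
            break
        # Clip, then spread the removed mass evenly over all bins.  This may push
        # some bins over the limit again, so the step is repeated until the
        # remaining excess is negligible.
        hist = np.minimum(hist, clip_limit) + excess / hist.size
    return hist
```

The CDF and transformation function are then computed from the clipped histogram exactly as before.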

Efficient computation by interpolation

Adaptive histogram equalization in its straightforward form presented above, both with and without contrast limiting, requires the computation of a different neighbourhood histogram and transformation function for each pixel in the image. This makes the method very expensive computationally.

Interpolation allows a significant improvement in efficiency without compromising the quality of the result. [3] The image is partitioned into equally sized rectangular tiles as shown in the right part of the figure below (64 tiles in 8 columns and 8 rows is a common choice [4]). A histogram, CDF and transformation function are then computed for each of the tiles. The transformation functions are appropriate for the tile center pixels (black squares in the left part of the figure). All other pixels are transformed with up to four transformation functions of the tiles whose center pixels are closest to them, and are assigned interpolated values. Pixels in the bulk of the image (shaded blue) are bilinearly interpolated, pixels close to the boundary (shaded green) are linearly interpolated, and pixels near corners (shaded red) are transformed with the transformation function of the corner tile. The interpolation coefficients reflect the location of pixels between the closest tile center pixels, so that the result is continuous as the pixel approaches a tile center.

[Figure: Clahe-tileinterpol.svg]

This procedure dramatically reduces the number of transformation functions that have to be computed, and imposes only the small additional cost of linear interpolation.
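
In practice this tiled, interpolated form of CLAHE is usually taken from a library rather than implemented by hand. As one example, OpenCV exposes it through cv2.createCLAHE; the file names, clip limit and tile grid size below are illustrative choices:

```python
import cv2

# Load an 8-bit, single-channel image (the path is a placeholder).
img = cv2.imread('input.png', cv2.IMREAD_GRAYSCALE)

# CLAHE with the common 8x8 tile grid; clipLimit controls the contrast amplification.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
result = clahe.apply(img)

cv2.imwrite('output.png', result)
```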

Efficient computation by incremental update of histogram

An alternative to tiling the image is to "slide" the rectangle one pixel at a time, and only incrementally update the histogram for each pixel [5] by adding the new pixel row and subtracting the row left behind. The algorithm is denoted SWAHE (Sliding Window Adaptive Histogram Equalization) by the original authors. The computational complexity of the histogram calculation is then reduced from O(N²) to O(N) (with N = pixel width of the surrounding rectangle); and since there is no tiling, a final interpolation step is not required.
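
The incremental update for a window moving down by one pixel might be sketched as follows (an illustrative Python/NumPy fragment, not the original SWAHE implementation; the function name and the use of a pre-padded image are assumptions):

```python
import numpy as np

def slide_window_down(hist, padded, y, x, window):
    """Update a 256-bin window histogram as the window moves down one pixel.

    `hist` describes the window whose top-left corner in the padded image is
    (y, x); after the call it describes the window at (y + 1, x).  Only
    2 * window pixels are touched instead of window * window.
    """
    # Subtract the row that leaves the window at the top...
    hist -= np.bincount(padded[y, x:x + window], minlength=256)
    # ...and add the row that enters at the bottom.
    hist += np.bincount(padded[y + window, x:x + window], minlength=256)
    return hist
```

Each pixel is then mapped through the CDF of its current window histogram, exactly as in the per-pixel form shown earlier.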

References

  1. D. J. Ketcham, R. W. Lowe & J. W. Weber: Image Enhancement Techniques for Cockpit Displays. Tech. rep., Hughes Aircraft, 1974.
  2. R. A. Hummel: Image Enhancement by Histogram Transformation. Computer Graphics and Image Processing 6 (1977) 184–195.
  3. S. M. Pizer, E. P. Amburn, J. D. Austin, et al.: Adaptive Histogram Equalization and Its Variations. Computer Vision, Graphics, and Image Processing 39 (1987) 355–368.
  4. K. Zuiderveld: Contrast Limited Adaptive Histogram Equalization. In: P. Heckbert: Graphics Gems IV, Academic Press, 1994, ISBN 0-12-336155-9.
  5. T. Sund & A. Møystad: Sliding Window Adaptive Histogram Equalization of Intra-oral Radiographs: Effect on Diagnostic Quality. Dentomaxillofacial Radiology 35 (2006) 133–138.
  6. G. R. Vidhya & H. Ramesh: Effectiveness of Contrast Limited Adaptive Histogram Equalization Technique on Multispectral Satellite Imagery. Proc. Int. Conf. Video Image Process. (2017) 234–239.