Photo response non-uniformity


Photo response non-uniformity, pixel response non-uniformity, or PRNU, is a form of fixed-pattern noise related to digital image sensors, as used in cameras and optical instruments. Both CCD and CMOS sensors are two-dimensional arrays of photosensitive cells, each broadly corresponding to an image pixel. Due to the non-uniformity of image sensors, each cell responds with a different voltage level when illuminated with a uniform light source, and this leads to luminance inaccuracy at the pixel level.
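The behaviour described above can be sketched as multiplicative fixed-pattern noise: each pixel applies its own fixed gain, so a uniform light source produces pixel-to-pixel differences that repeat identically between captures. This is a minimal illustration, not a physical sensor model; the function name and the 2% gain spread are assumptions chosen for the example.

```python
# Minimal sketch of PRNU as multiplicative fixed-pattern noise.
# Each pixel's gain deviates slightly from 1.0; under uniform
# illumination the recorded values therefore differ pixel to pixel.
import random

def simulate_prnu_frame(width, height, illumination, spread=0.02, seed=0):
    """Return a 2-D list of pixel readings under uniform illumination.

    Each pixel applies its own fixed gain (1 + k), where k is a small
    per-pixel deviation; the pattern is 'fixed' because the same seed
    (i.e. the same sensor) always yields the same gains.
    """
    rng = random.Random(seed)
    gains = [[1.0 + rng.uniform(-spread, spread) for _ in range(width)]
             for _ in range(height)]
    return [[illumination * g for g in row] for row in gains]

frame_a = simulate_prnu_frame(4, 3, illumination=100.0)
frame_b = simulate_prnu_frame(4, 3, illumination=100.0)
# The pattern repeats exactly between captures (fixed-pattern noise) ...
assert frame_a == frame_b
# ... yet pixels disagree with one another despite uniform light.
flat = [v for row in frame_a for v in row]
assert max(flat) != min(flat)
```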

High-end and metrology camera vendors typically characterize this non-uniformity during instrument manufacture: the sensor is illuminated with a standardized light source and a two-dimensional table of correction factors is generated. This table is either stored in the camera's non-volatile memory and applied to the image on each capture, or shipped with the camera for use in an external image-processing pipeline.
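One common form such a correction table can take is a per-pixel gain factor derived from a flat-field capture: each pixel's factor scales its response back to the mean level. This is a hedged sketch of that idea, assuming a simple mean-normalization scheme; the function names are hypothetical.

```python
# Sketch of the calibration step described above: a frame captured
# under a uniform light source yields a per-pixel table of correction
# factors, which is then multiplied into every subsequent capture.

def build_correction_table(flat_field):
    """Correction factor = (mean flat-field level) / (pixel level)."""
    values = [v for row in flat_field for v in row]
    mean = sum(values) / len(values)
    return [[mean / v for v in row] for row in flat_field]

def apply_correction(image, table):
    return [[p * c for p, c in zip(img_row, cor_row)]
            for img_row, cor_row in zip(image, table)]

# A 2x2 sensor whose pixels respond unevenly to the same light:
flat_field = [[ 98.0, 101.0],
              [102.0,  99.0]]
table = build_correction_table(flat_field)

# Correcting the flat-field frame itself yields a uniform frame
# (every pixel equal to the mean level, 100.0 here).
corrected = apply_correction(flat_field, table)
assert all(abs(v - 100.0) < 1e-9 for row in corrected for v in row)
```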


Related Research Articles

Charge-coupled device – Device for the movement of electrical charge

A charge-coupled device (CCD) is an integrated circuit containing an array of linked, or coupled, capacitors. Under the control of an external circuit, each capacitor can transfer its electric charge to a neighboring capacitor. CCD sensors are a major technology used in digital imaging.

Pixel – Physical point in a raster image

In digital imaging, a pixel, pel, or picture element is the smallest addressable element in a raster image, or the smallest addressable element in an all points addressable display device; so it is the smallest controllable element of a picture represented on the screen.

RGB color model – Additive color model based on combining red, green, and blue

The RGB color model is an additive color model in which the red, green, and blue primary colors of light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three additive primary colors, red, green, and blue.

Photodiode – Converts light into current

A photodiode is a semiconductor p-n junction device that converts light into an electrical current. The current is generated when photons are absorbed in the photodiode. Photodiodes may contain optical filters, built-in lenses, and may have large or small surface areas. Photodiodes usually have a slower response time as their surface area increases. The common, traditional solar cell used to generate electric solar power is a large area photodiode.

Gamma correction, or gamma, is a nonlinear operation used to encode and decode luminance or tristimulus values in video or still image systems. Gamma correction is, in the simplest cases, defined by the power-law expression V_out = A · V_in^γ, where the non-negative input value V_in is raised to the power γ and multiplied by the constant A (commonly A = 1).
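In the simplest case (normalized values in [0, 1] and a constant of 1), the power law can be applied and inverted directly. A small sketch, assuming the common approximation γ = 1/2.2 for encoding:

```python
# Gamma encoding and decoding in their simplest form:
# V_out = V_in ** gamma, with the reciprocal exponent undoing it.

def gamma_encode(v, gamma=1 / 2.2):
    """Encode a linear-light value in [0, 1] with a power law."""
    return v ** gamma

def gamma_decode(v, gamma=1 / 2.2):
    """Invert the encoding, recovering the linear-light value."""
    return v ** (1.0 / gamma)

linear = 0.5
encoded = gamma_encode(linear)
assert encoded > linear                       # encoding lifts mid-tones
assert abs(gamma_decode(encoded) - linear) < 1e-9  # round-trips
```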

Diffraction-limited system – Optical system with resolution performance at the instrument's theoretical limit

The resolution of an optical imaging system – a microscope, telescope, or camera – can be limited by factors such as imperfections in the lenses or misalignment. However, there is a principal limit to the resolution of any optical system, due to the physics of diffraction. An optical system with resolution performance at the instrument's theoretical limit is said to be diffraction-limited.

Bayer filter – Color filter array

A Bayer filter mosaic is a color filter array (CFA) for arranging RGB color filters on a square grid of photosensors. Its particular arrangement of color filters is used in most single-chip digital image sensors found in digital cameras, camcorders, and scanners to create a color image. The filter pattern is half green, one quarter red, and one quarter blue; depending on the ordering of the repeating 2×2 tile, it is also called BGGR, RGBG, GRBG, or RGGB.
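The half-green, quarter-red, quarter-blue proportion follows directly from tiling a 2×2 pattern across the grid. A small sketch of the RGGB variant (function name is illustrative):

```python
# The RGGB variant of the Bayer mosaic: each 2x2 tile carries one red,
# two green, and one blue filter, so the array as a whole is half
# green, one quarter red, and one quarter blue.

def bayer_rggb(width, height):
    """Return the filter color ('R', 'G', or 'B') at each photosite."""
    tile = [['R', 'G'],
            ['G', 'B']]
    return [[tile[y % 2][x % 2] for x in range(width)]
            for y in range(height)]

mask = bayer_rggb(4, 4)
flat = [c for row in mask for c in row]
assert flat.count('G') == 8                      # half green
assert flat.count('R') == flat.count('B') == 4   # one quarter each
```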

Photodetector – Sensors of light or other electromagnetic energy

Photodetectors, also called photosensors, are sensors of light or other electromagnetic radiation. There is a wide variety of photodetectors which may be classified by mechanism of detection, such as photoelectric or photochemical effects, or by various performance metrics, such as spectral response. Semiconductor-based photodetectors typically have a p–n junction that converts light photons into current. The absorbed photons make electron–hole pairs in the depletion region. Photodiodes and phototransistors are a few examples of photodetectors. Solar cells convert some of the light energy absorbed into electrical energy.

Flat-field correction

Flat-field correction (FFC) is a technique used to improve quality in digital imaging. It cancels the effects of image artifacts caused by variations in the pixel-to-pixel sensitivity of the detector and by distortions in the optical path. It is a standard calibration procedure in everything from personal digital cameras to large telescopes.
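One widely used formulation of flat-field correction combines a raw frame R, a dark frame D, and a flat-field frame F as C = (R − D) · m / (F − D), where m is the mean of (F − D). A hedged sketch of that formula on small 2-D lists (the function name and sample values are illustrative):

```python
# Flat-field correction, C = (R - D) * m / (F - D), where m is the
# mean of the dark-subtracted flat field.  Dividing by (F - D) cancels
# per-pixel gain differences; multiplying by m restores the overall level.

def flat_field_correct(raw, dark, flat):
    fd = [[f - d for f, d in zip(fr, dr)] for fr, dr in zip(flat, dark)]
    m = sum(v for row in fd for v in row) / sum(len(row) for row in fd)
    return [[(r - d) * m / g
             for r, d, g in zip(rr, dr, gr)]
            for rr, dr, gr in zip(raw, dark, fd)]

dark = [[10.0, 10.0], [10.0, 10.0]]
flat = [[110.0, 105.0], [115.0, 110.0]]
# A raw frame of a uniform scene suffers the same per-pixel gains:
raw  = [[60.0, 57.5], [62.5, 60.0]]

corrected = flat_field_correct(raw, dark, flat)
# After correction every pixel reports the same gain-free level.
first = corrected[0][0]
assert all(abs(v - first) < 1e-9 for row in corrected for v in row)
```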

Fixed-pattern noise (FPN) is a particular noise pattern on digital imaging sensors, often noticeable during longer-exposure shots, in which particular pixels give brighter intensities above the general background noise.

Image noise

Image noise is random variation of brightness or color information in images, and is usually an aspect of electronic noise. It can be produced by the image sensor and circuitry of a scanner or digital camera. Image noise can also originate in film grain and in the unavoidable shot noise of an ideal photon detector. Image noise is an undesirable by-product of image capture that obscures the desired information.

A staring array, also known as staring-plane array or focal-plane array (FPA), is an image sensor consisting of an array of light-sensing pixels at the focal plane of a lens. FPAs are used most commonly for imaging purposes, but can also be used for non-imaging purposes such as spectrometry, LIDAR, and wave-front sensing.

Image sensor – Device that converts an optical image into an electronic signal

An image sensor or imager is a sensor that detects and conveys information used to make an image. It does so by converting the variable attenuation of light waves into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.

Optical transfer function – Function that specifies how different spatial frequencies are handled by an optical system

The optical transfer function (OTF) of an optical system such as a camera, microscope, human eye, or projector specifies how different spatial frequencies are handled by the system. It is used by optical engineers to describe how the optics project light from the object or scene onto a photographic film, detector array, retina, screen, or simply the next item in the optical transmission chain. A variant, the modulation transfer function (MTF), neglects phase effects, but is equivalent to the OTF in many situations.

A glossary of machine vision collects common definitions related to the machine vision field.

Active-pixel sensor – Image sensor consisting of an integrated circuit

An active-pixel sensor (APS) is an image sensor where each pixel sensor unit cell has a photodetector and one or more active transistors. In a metal–oxide–semiconductor (MOS) active-pixel sensor, MOS field-effect transistors (MOSFETs) are used as amplifiers. There are different types of APS, including the early NMOS APS and the much more common complementary MOS (CMOS) APS, also known as the CMOS sensor, which is widely used in digital camera technologies such as cell phone cameras, web cameras, most modern digital pocket cameras, most digital single-lens reflex cameras (DSLRs), and mirrorless interchangeable-lens cameras (MILCs). CMOS sensors emerged as an alternative to charge-coupled device (CCD) image sensors and eventually outsold them by the mid-2000s.

Color filter array

In digital imaging, a color filter array (CFA), or color filter mosaic (CFM), is a mosaic of tiny color filters placed over the pixel sensors of an image sensor to capture color information.

The merits of digital versus film photography were considered by photographers and filmmakers in the early 21st century after consumer digital cameras became widely available. Digital photography and digital cinematography have both advantages and disadvantages relative to still film and motion picture film photography. In the 21st century, photography came to be predominantly digital, but traditional photochemical methods continue to serve many users and applications.

Large-screen television technology – Technology rapidly developed in the late 1990s and 2000s

Large-screen television technology developed rapidly in the late 1990s and 2000s. Prior to the development of thin-screen technologies, rear-projection television was used for many larger displays, and jumbotron, a non-projection video display technology, was used at stadiums and concerts. Various thin-screen technologies are being developed, but only liquid crystal display (LCD), plasma display (PDP) and Digital Light Processing (DLP) have been released on the public market. However, recently released technologies like organic light-emitting diode (OLED), and not-yet-released technologies like surface-conduction electron-emitter display (SED) or field emission display (FED), are on their way to replacing the first flat-screen technologies in picture quality.

Back-illuminated sensor

A back-illuminated sensor, also known as backside illumination sensor, is a type of digital image sensor that uses a novel arrangement of the imaging elements to increase the amount of light captured and thereby improve low-light performance.