Images and video use transfer functions to describe the relationship between the electrical signal, the scene light, and the displayed light.
The opto-electronic transfer function (OETF) takes scene light as input and converts it into the picture or video signal; this conversion is typically performed within a camera. [1]
The electro-optical transfer function (EOTF) takes the picture or video signal as input and converts it into the linear light output of the display; this conversion is performed within the display device. [1]
The opto-optical transfer function (OOTF) takes scene light as input and produces displayed light as output. The OOTF is the composition of the OETF and the EOTF and is usually non-linear. [1]
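Written as an equation (a notational sketch; E here denotes scene light and F_D displayed light):

```latex
\mathrm{OOTF} = \mathrm{EOTF} \circ \mathrm{OETF},
\qquad
F_D = \mathrm{EOTF}\bigl(\mathrm{OETF}(E)\bigr)
```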
New transfer functions, such as the perceptual quantizer (PQ) and hybrid log–gamma (HLG) described below, have been developed to allow HDR display.
Chroma subsampling is the practice of encoding images by implementing less resolution for chroma information than for luma information, taking advantage of the human visual system's lower acuity for color differences than for luminance.
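As an illustrative sketch (not a normative implementation), 4:2:0 subsampling can be approximated by keeping the luma plane at full resolution while averaging each 2×2 block of the chroma planes; the function names below are hypothetical.

```python
import numpy as np

def subsample_420(y, cb, cr):
    """Toy 4:2:0 chroma subsampling: luma stays full resolution,
    each chroma plane is averaged over non-overlapping 2x2 blocks."""
    def downsample(c):
        h, w = c.shape  # assumes even dimensions
        return c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, downsample(cb), downsample(cr)

# A 4x4 frame keeps 16 luma samples but only 4 samples per chroma plane.
y  = np.arange(16, dtype=float).reshape(4, 4)
cb = np.full((4, 4), 0.25)
cr = np.full((4, 4), -0.25)
y2, cb2, cr2 = subsample_420(y, cb, cr)
print(y2.shape, cb2.shape, cr2.shape)  # (4, 4) (2, 2) (2, 2)
```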
SMPTE color bars are a television test pattern used where the NTSC video standard is utilized, including countries in North America. The Society of Motion Picture and Television Engineers (SMPTE) refers to the pattern as Engineering Guideline (EG) 1-1990. Its components are a known standard, and created by test pattern generators. Comparing it as received to the known standard gives video engineers an indication of how an NTSC video signal has been altered by recording or transmission and what adjustments must be made to bring it back to specification. It is also used for setting a television monitor or receiver to reproduce NTSC chrominance and luminance information correctly.
YCbCr, Y′CbCr, or Y Pb/Cb Pr/Cr, also written as YCBCR or Y′CBCR, is a family of color spaces used as a part of the color image pipeline in video and digital photography systems. Y′ is the luma component and CB and CR are the blue-difference and red-difference chroma components. Y′ is distinguished from Y, which is luminance: the prime indicates that light intensity is nonlinearly encoded, based on gamma-corrected RGB primaries.
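For illustration, a minimal sketch (in the analog, unquantized form) of deriving Y′, CB, and CR from gamma-corrected R′G′B′ using the Rec. 709 luma coefficients; the function name is chosen for this example.

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Gamma-corrected R'G'B' in [0, 1] -> Y'CbCr (analog form,
    Rec. 709 luma coefficients, no offset or quantization)."""
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luma Y'
    cb = (b - y) / 1.8556                      # blue-difference chroma
    cr = (r - y) / 1.5748                      # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr_bt709(1.0, 1.0, 1.0))  # white -> approximately (1, 0, 0)
```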
sRGB is a standard RGB color space that HP and Microsoft created cooperatively in 1996 for use on monitors, printers, and the World Wide Web. It was subsequently standardized by the International Electrotechnical Commission (IEC) as IEC 61966-2-1:1999. sRGB is the defined standard color space for the web, and it is usually the assumed color space for images that are neither tagged for a color space nor have an embedded color profile.
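As a small sketch of the piecewise sRGB transfer function (encoding and decoding directions; function names are illustrative):

```python
def srgb_encode(linear):
    """sRGB encoding of a linear-light value in [0, 1]:
    a linear segment near black, then a power segment with exponent 1/2.4."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """Inverse (decoding) direction, back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

print(round(srgb_encode(0.18), 3))              # ~0.461 for 18% grey
print(round(srgb_decode(srgb_encode(0.5)), 6))  # ~0.5, round trip
```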
The Adobe RGB (1998) color space or opRGB is a color space developed by Adobe Inc. in 1998. It was designed to encompass most of the colors achievable on CMYK color printers, but by using RGB primary colors on a device such as a computer display. The Adobe RGB (1998) color space encompasses roughly 50% of the visible colors specified by the CIELAB color space, improving upon the gamut of the sRGB color space primarily in cyan-green hues. It was subsequently standardized by the IEC as IEC 61966-2-5:1999 under the name opRGB and is used in HDMI.
A color solid is the three-dimensional representation of a color space or model and can be thought of as an analog of, for example, the one-dimensional color wheel, which depicts the variable of hue, or the two-dimensional chromaticity diagram, which depicts the variables of hue and colorfulness. The added spatial dimension allows a color solid to depict the three dimensions of color: lightness, hue, and colorfulness, allowing the solid to depict all conceivable colors in an organized three-dimensional structure.
xvYCC or extended-gamut YCbCr is a color space that can be used in the video electronics of television sets to support a gamut 1.8 times as large as that of the sRGB color space. xvYCC was proposed by Sony, specified by the IEC in October 2005 and published in January 2006 as IEC 61966-2-4. xvYCC extends the ITU-R BT.709 tone curve by defining over-ranged values. xvYCC-encoded video retains the same color primaries and white point as BT.709, and uses either a BT.601 or BT.709 RGB-to-YCC conversion matrix and encoding. This allows it to travel through existing digital limited-range YCC data paths, and any colors within the normal gamut remain compatible. It works by allowing negative RGB inputs and expanding the output chroma: more saturated colors are encoded by using a greater part of the value range that the YCbCr signal can carry than broadcast-safe levels allow. The extra-gamut colors can then be displayed by a device whose underlying technology is not limited by the standard primaries.
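A rough sketch of the mechanism just described, reusing the Rec. 709 conversion but permitting a negative R′ input; the values and function name are illustrative only:

```python
def ycbcr_bt709(r, g, b):
    """Rec. 709 R'G'B' -> Y'CbCr (analog form). xvYCC permits inputs
    outside [0, 1], which pushes chroma beyond its nominal range."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

# An out-of-gamut color with a negative red component:
y, cb, cr = ycbcr_bt709(-0.2, 1.0, 0.0)
print(round(cr, 3))  # ~ -0.554, outside the nominal +/-0.5 chroma range
```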
Rec. 709, also known as Rec.709, BT.709, and ITU 709, is a standard developed by ITU-R for image encoding and signal characteristics of high-definition television.
ITU-R Recommendation BT.2020, more commonly known by the abbreviations Rec. 2020 or BT.2020, defines various aspects of ultra-high-definition television (UHDTV) with standard dynamic range (SDR) and wide color gamut (WCG), including picture resolutions, frame rates with progressive scan, bit depths, color primaries, RGB and luma-chroma color representations, chroma subsamplings, and an opto-electronic transfer function. The first version of Rec. 2020 was posted on the International Telecommunication Union (ITU) website on August 23, 2012, and two further editions have been published since then.
The hybrid log–gamma (HLG) transfer function is a transfer function jointly developed by the BBC and NHK for high dynamic range (HDR) display. It is backward compatible with the transfer function of SDR. It was approved as ARIB STD-B67 by the Association of Radio Industries and Businesses (ARIB). It is also defined in ATSC 3.0, Digital Video Broadcasting (DVB) UHD-1 Phase 2, and International Telecommunication Union (ITU) Rec. 2100.
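A minimal sketch of the HLG OETF as specified in ARIB STD-B67 / Rec. 2100 (constant and function names here are chosen for the example):

```python
import math

# HLG OETF constants (ARIB STD-B67 / Rec. 2100).
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map normalized scene linear light E in [0, 1] to the HLG signal:
    a square-root (gamma-like) segment below 1/12, a log segment above."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(round(hlg_oetf(1 / 12), 3))  # 0.5 at the segment boundary
print(round(hlg_oetf(1.0), 3))     # ~1.0 at peak scene light
```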
HDR10 Media Profile, more commonly known as HDR10, is an open high-dynamic-range (HDR) video standard announced on August 27, 2015, by the Consumer Technology Association. It is the most widespread HDR format.
Standard-dynamic-range video is a video technology which represents light intensity based on the brightness, contrast and color characteristics and limitations of a cathode ray tube (CRT) display. SDR video is able to represent a video or picture's colors with a maximum luminance around 100 cd/m2, a black level around 0.1 cd/m2 and Rec.709 / sRGB color gamut. It uses the gamma curve as its electro-optical transfer function.
The Ultra HD Forum is an organization whose goal is to help solve the real-world hurdles of deploying Ultra HD video and thus to help promote UHD deployment. The Ultra HD Forum helps navigate among the standards related to high dynamic range (HDR), high frame rate (HFR), next generation audio (NGA), and wide color gamut (WCG). It is an industry organization complementary to the UHD Alliance, covering different aspects of the UHD ecosystem.
ICTCP, ICtCp, or ITP is a color representation format specified in the Rec. ITU-R BT.2100 standard that is used as a part of the color image pipeline in video and digital photography systems for high dynamic range (HDR) and wide color gamut (WCG) imagery. It was developed by Dolby Laboratories from the IPT color space by Ebner and Fairchild. The format is derived from an associated RGB color space by a coordinate transformation that includes two matrix transformations and an intermediate nonlinear transfer function that is informally known as gamma pre-correction. The transformation produces three signals called I, CT, and CP. The ICTCP transformation can be used with RGB signals derived from either the perceptual quantizer (PQ) or hybrid log–gamma (HLG) nonlinearity functions, but is most commonly associated with the PQ function.
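The pipeline described above (matrix, nonlinearity, matrix) can be sketched as follows for the PQ case. The matrix coefficients are the Rec. 2100 values as recalled here and should be checked against the Recommendation; the function names are illustrative.

```python
import numpy as np

# RGB (linear BT.2020) -> LMS, and PQ-encoded L'M'S' -> ICtCp (Rec. 2100 values).
RGB_TO_LMS = np.array([[1688, 2146,  262],
                       [ 683, 2951,  462],
                       [  99,  309, 3688]]) / 4096
LMS_TO_ICTCP = np.array([[ 2048,   2048,    0],
                         [ 6610, -13613, 7003],
                         [17933, -17390, -543]]) / 4096

def pq_encode(y):
    """PQ encoding of linear light normalized so that 1.0 = 10000 cd/m2."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    yp = np.power(y, m1)
    return np.power((c1 + c2 * yp) / (1 + c3 * yp), m2)

def rgb_to_ictcp(rgb_linear):
    """Linear BT.2020 RGB -> LMS -> PQ nonlinearity -> I, CT, CP."""
    lms_pq = pq_encode(RGB_TO_LMS @ rgb_linear)
    return LMS_TO_ICTCP @ lms_pq

# A neutral grey at 100 cd/m2 yields zero chroma (CT = CP = 0).
print(rgb_to_ictcp(np.array([0.01, 0.01, 0.01])))
```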
ITU-R Recommendation BT.2100, more commonly known by the abbreviations Rec. 2100 or BT.2100, introduced high-dynamic-range television (HDR-TV) by recommending the use of the perceptual quantizer (PQ) or hybrid log–gamma (HLG) transfer functions instead of the traditional "gamma" previously used for SDR-TV.
The perceptual quantizer (PQ), published by SMPTE as SMPTE ST 2084, is a transfer function that allows for HDR display by replacing the gamma curve used in SDR. It is capable of representing luminance levels up to 10,000 cd/m2 (nits) and down to 0.0001 nits. It was developed by Dolby and standardized in 2014 by SMPTE and in 2016 by the ITU in Rec. 2100. The ITU specifies the use of PQ or HLG as transfer functions for HDR-TV. PQ is the basis of HDR video formats and is also used for HDR still-picture formats. PQ is not backward compatible with the BT.1886 EOTF, whereas HLG is.
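A minimal sketch of the PQ EOTF with the SMPTE ST 2084 constants, mapping a normalized signal value to absolute luminance; the function name is chosen for this example.

```python
def pq_eotf(signal):
    """PQ (SMPTE ST 2084) EOTF: nonlinear signal value in [0, 1]
    -> absolute displayed luminance in cd/m2 (up to 10000)."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    ep = signal ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000.0 * y

print(round(pq_eotf(1.0), 1))  # 10000.0 cd/m2 at full signal
print(round(pq_eotf(0.5), 1))  # ~92 cd/m2: code values are perceptually spaced
```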
High-dynamic-range television (HDR-TV) is a technology that uses high dynamic range (HDR) to improve the quality of display signals. It is contrasted with the retroactively-named standard dynamic range (SDR). HDR changes the way the luminance and colors of videos and images are represented in the signal, and allows brighter and more detailed highlight representation, darker and more detailed shadows, and more intense colors.
ITU-R BT.1886 is the reference EOTF of SDR-TV. It is a gamma-2.4 transfer function considered a satisfactory approximation of the response characteristic of a CRT to the electrical signal. It was standardized by the ITU in March 2011. It is used with Rec. 709 (HD-TV) and Rec. 2020 (UHD-TV).
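A minimal sketch of the BT.1886 reference EOTF, parameterized by display white and black luminance; the default levels below are assumptions matching the SDR figures mentioned earlier (100 and 0.1 cd/m2), and the names are illustrative.

```python
def bt1886_eotf(v, lw=100.0, lb=0.1):
    """BT.1886 reference EOTF: gamma-2.4 response of an idealized CRT.
    v is the normalized signal; lw and lb are white and black luminance in cd/m2."""
    gamma = 2.4
    w = lw ** (1 / gamma)
    k = lb ** (1 / gamma)
    a = (w - k) ** gamma     # overall gain
    b = k / (w - k)          # black-level lift
    return a * max(v + b, 0.0) ** gamma

print(round(bt1886_eotf(1.0), 2))  # 100.0 cd/m2 (reference white)
print(round(bt1886_eotf(0.0), 2))  # 0.1 cd/m2 (black level)
```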
The EBU colour bars is a television test card used to check if a video signal has been altered by recording or transmission, and what adjustments must be made to bring it back to specification. It is also used for setting a television monitor or receiver to reproduce chrominance and luminance information correctly. The EBU bars are most commonly shown arranged side-by-side in a vertical manner, though some broadcasters – such as TVP in Poland, and Gabon Télévision in Gabon – were known to have aired a horizontal version of the EBU bars.