The Academy Color Encoding System (ACES) is a color image encoding system created under the auspices of the Academy of Motion Picture Arts and Sciences. ACES is characterised by a color-accurate workflow, with "seamless interchange of high quality motion picture images regardless of source".[1]
The system defines its own color primaries relative to the spectral locus defined by the CIE xyY specification. The white point approximates the chromaticity of CIE Daylight with a correlated color temperature (CCT) of 6000 K.[2] Most ACES-compliant image files are encoded as 16-bit half-floats, allowing ACES OpenEXR files to encode 30 stops of scene information.[1] The ACESproxy format uses integers with a log encoding. ACES supports both high dynamic range (HDR) and wide color gamut (WCG).[1]
The version 1.0 release occurred in December 2014. ACES received a Primetime Engineering Emmy Award in 2012.[3] The system is standardized in part by the Society of Motion Picture and Television Engineers (SMPTE) standards body.
The ACES project began its development in 2004 in collaboration with 50 industry technologists.[4] The project was prompted by the arrival of digital technologies in the motion picture industry. The traditional motion picture workflow had been based on film negatives; the digital transition added negative scanning and digital camera acquisition. The industry lacked a color management scheme for the diverse sources coming from a variety of digital motion picture cameras and film. The ACES system is designed to control the complexity inherent in managing the multitude of file formats, image encodings, metadata transfers, color reproduction steps, and image interchanges present in the current motion picture workflow.
The following versions are available for the reference implementation:[5]
The system comprises several components which are designed to work together to create a uniform workflow:
ACES 1.0 is a color encoding system defining one core archival color space, four additional working color spaces, and additional file protocols. The ACES system is designed to cover the needs of film and television production relating to the capture, generation, transport, exchange, grading, processing, and short- and long-term storage of motion picture and still image data. These color spaces all have a few common characteristics:
The five color spaces use one of two defined sets of RGB color primaries, called AP0 and AP1 ("ACES Primaries" #0 and #1). Their chromaticity coordinates are listed in the table below:
| CIE 1931 | AP0: ACES 2065-1 | | | White point | AP1: cg, cc, cct, proxy | | |
|---|---|---|---|---|---|---|---|
| | red | green | blue | | red | green | blue |
| x | 0.7347 | 0.0000 | 0.0001 | 0.32168 | 0.713 | 0.165 | 0.128 |
| y | 0.2653 | 1.0000 | -0.0770 | 0.33767 | 0.293 | 0.830 | 0.044 |
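The RGB-to-XYZ conversion matrix for either primary set follows directly from these chromaticities: each xy pair is expanded to XYZ, and the columns are scaled so that equal RGB code values reproduce the white point. A minimal sketch in Python (NumPy), using the AP0 values from the table; the function name is illustrative, not part of any ACES API:

```python
import numpy as np

# CIE 1931 xy chromaticities of the AP0 primaries and white point,
# taken from the table above
ap0 = np.array([[0.7347, 0.2653],    # red
                [0.0000, 1.0000],    # green
                [0.0001, -0.0770]])  # blue
white = np.array([0.32168, 0.33767])

def rgb_to_xyz_matrix(prim, wp):
    """Derive the 3x3 RGB -> XYZ matrix from xy primaries and a white point."""
    # Expand each primary to XYZ with Y normalized to 1; one column per primary
    cols = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in prim]).T
    # White point XYZ, also normalized so that Y = 1
    w_xyz = np.array([wp[0], wp[1], 1 - wp[0] - wp[1]]) / wp[1]
    # Scale each column so that RGB = (1, 1, 1) maps exactly to the white point
    scale = np.linalg.solve(cols, w_xyz)
    return cols * scale

M = rgb_to_xyz_matrix(ap0, white)
```

For AP0 the result should match the ACES2065-1-to-XYZ matrix published in SMPTE ST 2065-1 to within rounding; the middle (luminance) row of `M` sums to 1 by construction.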
AP0 is defined as the smallest set of primaries that encloses the entire CIE 1931 standard-observer spectral locus, thus theoretically including, and exceeding, all the color stimuli that can be seen by the average human eye. The concept of using non-realizable or imaginary primaries is not new, and is often employed by color systems that wish to cover a larger portion of the visible spectral locus; ProPhoto RGB (developed by Kodak) and ARRI Wide Gamut are two such color spaces. Values outside the spectral locus are maintained with the assumption that they will later be manipulated, through color timing or other image-interchange operations, to eventually lie within the locus. As a result, color values are not "clipped" or "crushed" by post-production manipulation.
The AP1 gamut is smaller than that of AP0, but is still considered "wide gamut". The AP1 primaries are much closer to realizable primaries and, unlike AP0's, none of their chromaticity coordinates are negative. This is important for use as a working space, for a number of practical reasons:
This is the core ACES color space, and the only one using the AP0 RGB primaries. It uses photometrically linear transfer characteristics (i.e. a gamma of 1.0), and is the only ACES space intended for interchange among facilities and, most importantly, for archiving image/video files.
ACES2065-1 code values are linear values scaled in an Input Transform so that:
ACES2065-1 code values often exceed 1.0 for ordinary scenes, and a very high range of speculars and highlights can be maintained in the encoding. The internal processing and storage of ACES2065-1 code values must use floating-point arithmetic with at least 16 bits per channel. Pre-release versions of ACES, i.e. those prior to 1.0, defined ACES2065-1 as the only color space. Legacy applications might therefore mean ACES2065-1 when referring to "the ACES color space". Furthermore, because of its importance, its linear characteristics, and its being the one space based on AP0 primaries, it is also improperly referred to as "Linear ACES", "ACES.lin", "SMPTE2065-1", or even "the AP0 color space".
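The 30-stop figure quoted earlier can be sanity-checked from the numeric range of an IEEE 754 half-float, whose smallest positive normal value is 2⁻¹⁴ and whose largest finite value is 65504, just under 2¹⁶:

```python
import numpy as np

info = np.finfo(np.float16)        # IEEE 754 half-float
lo = float(info.tiny)              # smallest positive normal value: 2**-14
hi = float(info.max)               # largest finite value: 65504
stops = np.log2(hi) - np.log2(lo)  # dynamic range in stops (doublings of light)
print(round(stops, 1))             # → 30.0
```

Denormal half-floats extend the range further still, at reduced precision, which is one reason highlight and shadow detail survives interchange in ACES OpenEXR files.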
Standards are defined for storing images in the ACES2065-1 color space, particularly on the metadata side, so that applications honoring the ACES framework can identify the color space encoding from metadata rather than inferring it from other cues. For example:
ACEScg is a scene-linear encoding like ACES2065-1, but it uses the AP1 primaries, which are closer to realizable primaries. ACEScg was developed for use in visual effects work, after it became clear that ACES2065-1 was not a practical working space due to its negative blue primary and the extreme distance of its other imaginary primaries.
The AP1 primaries lie much closer to the spectral locus of real colors and, importantly, none of their chromaticity coordinates are negative. This matters for rendering and compositing image data as needed for visual effects.
Like ACEScg, ACEScc and ACEScct use the AP1 primaries. What sets them apart is that, instead of a scene-linear transfer encoding, ACEScc and ACEScct use logarithmic curves, which makes them better suited to color grading. Grading workflows have traditionally used log-encoded image data, in large part because the physical film used in cinematography has a logarithmic response to light.
ACEScc is a pure log function, but ACEScct has a "toe" near black, to simulate the minimum density of photographic negative film, and the legacy DPX or Cineon log curve.
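As a sketch of the difference, the forward (linear-to-log) halves of both transforms can be written from the published AMPAS transform documents (S-2014-003 for ACEScc, S-2016-001 for ACEScct); above the ACEScct toe, the two curves are identical:

```python
import math

def lin_to_acescc(x):
    """Linear AP1 value -> ACEScc (pure log; constants from AMPAS S-2014-003)."""
    if x <= 0.0:
        return (math.log2(2.0 ** -16) + 9.72) / 17.52
    if x < 2.0 ** -15:  # blend toward the floor for very small positives
        return (math.log2(2.0 ** -16 + x * 0.5) + 9.72) / 17.52
    return (math.log2(x) + 9.72) / 17.52

def lin_to_acescct(x):
    """Linear AP1 value -> ACEScct (log with a "toe"; AMPAS S-2016-001)."""
    if x <= 0.0078125:  # below the break point, a linear "toe" segment is used
        return 10.5402377416545 * x + 0.0729055341958355
    return (math.log2(x) + 9.72) / 17.52
```

Both functions map 18% grey (0.18) to about 0.4136, and the ACEScct toe joins the log segment continuously at x = 0.0078125; below that point ACEScct stays above ACEScc, mimicking film's minimum density.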
ACES is defined by several SMPTE standards (the ST 2065 family) and AMPAS documentation, which include:[8]
A SMPTE standard is also under development to allow ACES code streams to be mapped to the Material Exchange Format (MXF) container. [9]