Academy Color Encoding System

The Academy Color Encoding System (ACES) is a color image encoding system created under the auspices of the Academy of Motion Picture Arts and Sciences. ACES is characterized by a color-accurate workflow, with "seamless interchange of high quality motion picture images regardless of source". [1]

The system defines its own color primaries, which enclose the spectral locus defined by the CIE 1931 xyY specification. The white point approximates the chromaticity of CIE Daylight with a correlated color temperature (CCT) of about 6000 K. [2] Most ACES-compliant image files are encoded as 16-bit half-floats, allowing ACES OpenEXR files to encode about 30 stops of scene information. [1] The ACESproxy format uses integer values with a log encoding. ACES supports both high dynamic range (HDR) and wide color gamut (WCG) imagery. [1]

Version 1.0 was released in December 2014. ACES received a Primetime Engineering Emmy Award in 2012. [3] The system is standardized in part by the Society of Motion Picture and Television Engineers (SMPTE).

History

Background

The ACES project began in 2004 as a collaboration among some 50 industry technologists. [4] It was prompted by the incursion of digital technologies into the motion picture industry. The traditional workflow had been based on film negatives; the digital transition added negative scanning and digital camera acquisition, and the industry lacked a color management scheme for the diverse sources coming from a variety of digital motion picture cameras and film. ACES is designed to control the complexity inherent in managing the multitude of file formats, image encodings, metadata transfers, color reproduction, and image interchanges present in the modern motion picture workflow.

Versions

The following versions are available for the reference implementation: [5]

System overview

The system comprises several components which are designed to work together to create a uniform workflow:

ACES Color Spaces

CIE 1931 chromaticity diagram comparing ACES and sRGB gamuts

ACES 1.0 is a color encoding system defining one core archival color space, four additional working color spaces, and associated file protocols. The system is designed to cover the needs of film and television production relating to the capture, generation, transport, exchange, grading, processing, and short- and long-term storage of motion picture and still image data. These color spaces share a few common characteristics:

  1. They are based on the RGB color model.
  2. The image data is scene-referred, i.e. the numerical values are related to the original scene lighting, as reflected or emitted from the real objects & lights on the set at the time of filming. The space refers to a "standard reference camera", an imaginary camera that can capture all of human visual perception. Scene-referred code values captured by a real camera are directly related to luminous exposure.
  3. They are capable of holding 30 stops of exposure.
  4. The reference white point is sometimes, and incorrectly, referred to as "D60" though there is no such thing as a CIE D60 standard illuminant. Further, the white point is not on the CIE Daylight Locus nor the Planckian Locus, and does not define the neutral axis. Filmmakers are allowed to choose whatever effective whitepoint they need for technical or artistic reasons.
  5. The white point serves only as a mathematical reference for transforms, and should not be confused with a scene or display reference. It was chosen through an experiment: film containing a LAD test patch was projected onto a theater screen using a projector with a xenon bulb. That measured white point was then adjusted to be close to, but not on, the CIE daylight locus. The CCT is close to 6000 K, with CIE 1931 xy chromaticities of (0.32168, 0.33767). [7]

The five color spaces use one of two defined sets of RGB color primaries, called AP0 and AP1 ("ACES Primaries #0 and #1"). The chromaticity coordinates are listed in the table below:

ACES Primary and White Point Coordinates (CIE 1931)

          AP0: ACES 2065-1            White      AP1: cg, cc, cct, proxy
          red      green     blue     Point      red      green    blue
    x     0.7347   0.0000    0.0001   0.32168    0.713    0.165    0.128
    y     0.2653   1.0000   -0.0770   0.33767    0.293    0.830    0.044

AP0 is defined as the smallest set of primaries that encloses the entire CIE 1931 standard-observer spectral locus, thus theoretically including, and exceeding, all the color stimuli that can be seen by the average human eye. The concept of using non-realizable or imaginary primaries is not new, and is often employed by color systems that aim to cover a larger portion of the visible spectral locus; ProPhoto RGB (developed by Kodak) and ARRI Wide Gamut (developed by ARRI) are two such color spaces. Values outside the spectral locus are maintained on the assumption that they will later be manipulated, through color timing or other image-interchange steps, to eventually lie within the locus. As a result, color values are not "clipped" or "crushed" by post-production manipulation.
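The claim that AP0 encloses smaller gamuts can be spot-checked geometrically. The sketch below takes the AP0 chromaticities from the table above and tests whether the sRGB/Rec. 709 primaries (well-known values, not given in this article's table, so an assumption here) fall inside the AP0 triangle using a cross-product side test:

```python
# AP0 chromaticities from the table above (R, G, B as xy pairs);
# note the blue primary's negative y: AP0 is an imaginary gamut.
AP0 = [(0.7347, 0.2653), (0.0000, 1.0000), (0.0001, -0.0770)]
# sRGB / Rec. 709 primaries (assumed standard values, for comparison).
SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

def cross(o, a, b):
    """z-component of (a - o) x (b - o); its sign says which side of edge o->a the point b is on."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside(tri, p):
    """True if p lies strictly on the same side of all three triangle edges."""
    signs = [cross(tri[i], tri[(i + 1) % 3], p) for i in range(3)]
    return all(s > 0 for s in signs) or all(s < 0 for s in signs)

print(all(inside(AP0, p) for p in SRGB))  # True: the sRGB gamut fits entirely inside AP0
```

The same test with any chromaticity outside the spectral locus and outside AP0, such as (0.8, 0.3), returns False.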

The AP1 gamut is smaller than the AP0 gamut, but is still considered "wide gamut". The AP1 primaries are much closer to realizable primaries and, unlike AP0, none of their chromaticity coordinates are negative. This is important for use as a working space, for a number of practical reasons:

ACES2065-1

This is the core ACES color space, and the only one using the AP0 RGB primaries. It uses a photometrically linear transfer characteristic (i.e. a gamma of 1.0), and is the only ACES space intended for interchange among facilities and, most importantly, for archiving image/video files.

ACES2065-1 code values are linear values scaled in an Input Transform so that:

ACES2065-1 code values often exceed 1.0 for ordinary scenes, so a very high range of speculars and highlights can be maintained in the encoding. Internal processing and storage of ACES2065-1 code values must use floating-point arithmetic with at least 16 bits per channel. Pre-release versions of ACES, i.e. those prior to 1.0, defined ACES2065-1 as the only color space, so legacy applications may refer to ACES2065-1 as "the ACES color space". Furthermore, because of its importance, its linear characteristics, and its use of the AP0 primaries, it is also improperly referred to as "Linear ACES", "ACES.lin", "SMPTE2065-1", or even "the AP0 color space".
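The roughly 30-stop figure quoted earlier follows from the dynamic range of the IEEE 754 half-float format itself. A quick check using only Python's standard library (struct supports the binary16 'e' format):

```python
import math
import struct

HALF_MAX = 65504.0            # largest finite binary16 value: (2 - 2**-10) * 2**15
HALF_MIN_NORMAL = 2.0 ** -14  # smallest *normal* binary16 value

# Both extremes survive a round trip through a real binary16 encoding:
for v in (HALF_MAX, HALF_MIN_NORMAL):
    assert struct.unpack('<e', struct.pack('<e', v))[0] == v

# Dynamic range in photographic stops (factors of two):
stops = math.log2(HALF_MAX / HALF_MIN_NORMAL)
print(f"~{stops:.0f} stops")  # ~30 stops (denormals extend this further)
```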

Standards define how images in the ACES2065-1 color space are stored, particularly on the metadata side, so that applications honoring the ACES framework can determine the color space encoding from metadata rather than inferring it from other cues. For example:

ACEScg

ACEScg is a scene-linear encoding like ACES2065-1, but it uses the AP1 primaries, which are closer to realizable primaries. ACEScg was developed for visual effects work, when it became clear that ACES2065-1 was not a useful working space because of its negative blue primary and the extreme distance of its other imaginary primaries.

The AP1 primaries lie much closer to the spectral locus of real colors, and importantly, none of their chromaticity coordinates are negative. This matters for rendering and compositing image data, as needed for visual effects.

ACEScc & ACEScct

Like ACEScg, ACEScc and ACEScct use the AP1 primaries. What sets them apart is that, instead of a scene-linear transfer encoding, ACEScc and ACEScct use logarithmic curves, which makes them better suited to color grading. The grading workflow has traditionally used log-encoded image data, in large part because the physical film used in cinematography has a logarithmic response to light.

ACEScc is a pure log function, but ACEScct has a "toe" near black, to simulate the minimum density of photographic negative film, and the legacy DPX or Cineon log curve.
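The two curves can be sketched directly from the Academy specifications (S-2014-003 for ACEScc, S-2016-001 for ACEScct); a minimal Python version, with the spec constants reproduced as published:

```python
import math

def lin_to_acescc(x: float) -> float:
    """Scene-linear AP1 value -> ACEScc: a pure log curve (per S-2014-003)."""
    if x <= 0.0:
        # Non-positive input clamps to the encoding of 2**-16.
        return (math.log2(2.0 ** -16) + 9.72) / 17.52
    if x < 2.0 ** -15:
        return (math.log2(2.0 ** -16 + x / 2.0) + 9.72) / 17.52
    return (math.log2(x) + 9.72) / 17.52

def lin_to_acescct(x: float) -> float:
    """Scene-linear AP1 value -> ACEScct: the same log curve, plus a linear
    "toe" below 0.0078125 that mimics film's minimum density (per S-2016-001)."""
    if x <= 0.0078125:
        return 10.5402377416545 * x + 0.0729055341958355
    return (math.log2(x) + 9.72) / 17.52

# Above the toe the two curves are identical; 18% grey lands at ~0.414:
print(round(lin_to_acescc(0.18), 4), round(lin_to_acescct(0.18), 4))  # 0.4136 0.4136
# They differ at black: ACEScc goes negative, ACEScct keeps a finite toe value:
print(round(lin_to_acescc(0.0), 4), round(lin_to_acescct(0.0), 4))    # -0.3584 0.0729
```

The toe constants are chosen so the two branches of ACEScct meet continuously at linear 0.0078125 (2^-7).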

Converting ACES2065-1 RGB values to CIE XYZ values


Converting CIE XYZ values to ACES2065-1 values


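Both directions are 3×3 matrix multiplications. A minimal sketch using the AP0-to-XYZ matrix and its inverse as given in SMPTE ST 2065-1 (the coefficients are reproduced from the standard; the helper function is illustrative, not part of any standard):

```python
# AP0 RGB <-> CIE XYZ conversion matrices, per SMPTE ST 2065-1.
ACES_TO_XYZ = (
    ( 0.9525523959, 0.0000000000,  0.0000936786),
    ( 0.3439664498, 0.7281660966, -0.0721325464),
    ( 0.0000000000, 0.0000000000,  1.0088251844),
)
XYZ_TO_ACES = (
    ( 1.0498110175, 0.0000000000, -0.0000974845),
    (-0.4959030231, 1.3733130458,  0.0982400361),
    ( 0.0000000000, 0.0000000000,  0.9912520182),
)

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

# The AP0 "unity" vector (1, 1, 1) must land on the ACES white point:
X, Y, Z = mat_vec(ACES_TO_XYZ, (1.0, 1.0, 1.0))
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(round(x, 5), round(y, 5))  # 0.32168 0.33767, the white point from the table above

# A round trip through both matrices recovers the input to float precision:
rgb = mat_vec(XYZ_TO_ACES, mat_vec(ACES_TO_XYZ, (0.18, 0.18, 0.18)))
print(all(abs(c - 0.18) < 1e-6 for c in rgb))  # True
```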
Standards

ACES is defined by several SMPTE standards (the ST 2065 family) and AMPAS documentation, which include: [8]

A SMPTE standard is also under development to allow ACES code streams to be mapped to the Material Exchange Format (MXF) container. [9]


References

  1. "What are the Advantages of using ACES for Color Correction?". Oscars.org. 19 November 2015. Retrieved 2016-12-02.
  2. "Derivation of the ACES White Point CIE Chromaticity Coordinates". docs.acescentral.com. Retrieved 2022-07-01.
  3. "Winners of the 64th Primetime Emmy Engineering Awards Announced - InteractiveTV Today". Itvt.com. Archived from the original on 2013-05-09. Retrieved 2013-03-08.
  4. "Academy Color Encoding System | Science & Technology Council | Academy of Motion Picture Arts & Sciences". Oscars.org. 2012-08-24. Retrieved 2013-12-20.
  5. "aces-dev/CHANGELOG.md at dev · ampas/aces-dev". GitHub.
  6. Tobenkin, Steve (3 May 2021). "ACES 1.3 is Available!". ACESCentral.
  7. "TB-2018-001 Derivation of the ACES White Point CIE Chromaticity Coordinates". Retrieved 26 June 2018.
  8. "ACES Documentation". Oscars.org. 29 April 2015. Retrieved 2016-09-24.
  9. "31FS ACES Codestreams in MXF". Oscars.org. Archived from the original on 2016-09-27. Retrieved 2016-09-24.