HD-MAC

HD-MAC (High Definition Multiplexed Analogue Components) was a broadcast television standard proposed by the European Commission in 1986 as part of the Eureka 95 project. [1] It belonged to the MAC (Multiplexed Analogue Components) family of standards and was an early attempt by the EEC to provide high-definition television (HDTV) in Europe. The system was a complex mix of an analogue video signal (based on the Multiplexed Analogue Components standard), multiplexed with digital sound and assistance data for decoding (DATV). The video signal (1250 lines/50 fields per second in 16:9 aspect ratio, with 1152 visible lines) was encoded with a modified D2-MAC encoder.

HD-MAC could be decoded by normal D2-MAC standard-definition receivers, but no extra resolution was obtained and certain artifacts were visible. To decode the signal in full resolution, a dedicated HD-MAC tuner was required.

Naming convention

The European Broadcasting Union describes video formats as follows: width × height [scan type: i or p] / number of full frames per second. [2]

European standard-definition digital broadcasts use 720×576i/25, meaning 25 interlaced frames per second, each 720 pixels wide and 576 pixels high: odd lines (1, 3, 5, ...) are grouped to build the odd field, which is transmitted first, followed by the even field containing lines 2, 4, 6, ... Thus there are two fields per frame, giving a field frequency of 25 × 2 = 50 Hz.

The visible part of the video signal provided by an HD-MAC receiver was 1152i/25, exactly doubling the vertical resolution of standard definition. The amount of information was multiplied by four, since the encoder started its operations from a 1440×1152i/25 sampling grid.
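The arithmetic behind these format names can be sketched in a few lines. This is only an illustration of the figures quoted above (the variable names are my own):

```python
# Sketch of the EBU naming arithmetic described above (values from the text).
sd_width, sd_height = 720, 576      # European SD sampling grid
hd_width, hd_height = 1440, 1152    # HD-MAC encoder input grid

frames_per_second = 25
fields_per_frame = 2                # interlaced: odd field + even field
field_frequency = frames_per_second * fields_per_frame  # 25 × 2 = 50 Hz

pixels_sd = sd_width * sd_height    # pixels per SD frame
pixels_hd = hd_width * hd_height    # pixels per HD frame

print(field_frequency)              # 50
print(pixels_hd // pixels_sd)       # 4: HD carries four times the information
```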

Standard history

Work on the HD-MAC specification officially started in May 1986. The purpose was to react to a Japanese proposal, supported by the US, which aimed to establish the NHK-designed Hi-Vision (also known as MUSE) system as a world standard. Besides preserving the European electronics industry, there was also a need for a standard compatible with the 50 Hz field-frequency systems used by a large majority of countries in the world. In truth, the exact 60 Hz field rate of the Japanese proposal also worried the US, as its NTSC-M standard-definition infrastructure used a practical field rate of 59.94 Hz, potentially leading to incompatibility problems.

In September 1988, the Japanese performed the first high-definition broadcasts of the Olympic Games, using their Hi-Vision system (NHK had been producing material in this format since 1982). That same month, Europe showed a credible alternative for the first time: a complete HD-MAC broadcasting chain, at IBC 88 in Brighton. [3] The show included the first progressive-scan HD video camera prototypes (Thomson/LER). [4]

Golden SCART socket (right) on a Nordmende Space System 92 HS TV with HD-MAC support.

Golden SCART, a special and backward-compatible implementation of the normal SCART connection, was developed as a transmission interface for consumer devices. [5] Some television sets from Philips and Telefunken are said to have been equipped with it.

For the Albertville 1992 Winter Olympics and Barcelona 1992 Summer Olympics, a public demonstration of HD-MAC broadcasting took place. [1] 60 HD-MAC receivers for the Albertville games and 700 for the Barcelona games were set up in "Eurosites" to show the capabilities of the standard. [6] 1250-line (1152 visible) CRT projectors were used to create images a few meters wide in public spaces in Barcelona. [7] There were some Thomson "Space System" [8] 16:9 CRT TV sets as well, and the project sometimes used rear-projection televisions. In addition, some 80,000 viewers with D2-MAC receivers were able to watch the channel (though not in HD). It is estimated that 350,000 people across Europe saw this demonstration of European HDTV, which was financed by the EEC. The PAL-converted signal was used by mainstream broadcasters such as SWR, BR and 3sat.

The HD-MAC standard was also demonstrated at Seville Expo '92, exclusively using equipment designed to work with the standard: Plumbicon and CCD cameras, direct-view and rear-projection CRT TVs, BCH 1000 Type B VTRs, single-mode fiber-optic cables, and Laserdisc players with their respective discs. Production equipment was visible to the public through windows. [9]

Because spare UHF bandwidth was very scarce, HD-MAC was de facto usable only by cable and satellite providers, [1] whose bandwidth was less constrained; similarly, Hi-Vision was broadcast only by NHK through a dedicated satellite channel called BShi. However, the standard never became popular among broadcasters. As a result, analogue HDTV could not replace conventional terrestrial SDTV (PAL/SECAM), making HD-MAC sets unattractive to potential consumers.

From 1986, all high-powered satellite broadcasters were required to use MAC. However, the launch of medium-powered satellites by SES and the use of PAL allowed broadcasters to bypass HD-MAC, reducing their transmission costs. HD-MAC remained in use for transcontinental satellite links, however.

The HD-MAC standard was abandoned in 1993, and since then all EU and EBU efforts have focused on the DVB system (Digital Video Broadcasting), which allows both SDTV and HDTV.

An article about IFA 1993 provides a view of the project's status close to its end. It mentions "a special BBC compilation encoded in HD-MAC and replayed from a D1 Video Tape Recorder". [10]

HD-MAC development was stopped alongside the EUREKA project in 1996: picture quality was not deemed good enough, receiving TVs did not have enough resolution, the 16:9 aspect ratio that would later become standard was seen as exotic, and sets large enough to exhibit the image quality of the standard were CRTs, which made them extremely heavy. [11]

Technical details

Simulated MAC signal. From left to right: digital data, chrominance and luminance.

Transmission

PAL/SECAM analogue SDTV broadcasts use 6 or 7 MHz (VHF) or 8 MHz (UHF) channels. The French 819-line System E used 14 MHz-wide VHF channels. For HD-MAC, the transmission medium had to guarantee a baseband bandwidth of at least 11.14 MHz, [12] which translates to a 12 MHz channel spacing in cable networks. The specification allows 8 MHz channels, but in that case the assistance data can no longer be correctly decoded, and only a standard-definition signal can be extracted using a D2-MAC receiver. For satellite broadcasting, FM modulation expands the spectrum so that an entire satellite transponder is used, i.e. 27 to 36 MHz of bandwidth. [13] The situation is much the same for analogue standard definition: a given transponder can only support one analogue channel, so from this point of view going to HD was no extra inconvenience.
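The channel requirements above can be summarised in a small decision helper. This is a hypothetical illustration (the function and its return strings are mine, not part of any receiver API); the thresholds come from the figures in the text:

```python
# Hypothetical helper illustrating the channel requirements quoted above:
# HD-MAC needs at least 11.14 MHz of baseband (12 MHz cable spacing); in an
# 8 MHz channel only the D2-MAC standard-definition part can be decoded.
def decodable_service(channel_bandwidth_mhz: float) -> str:
    """Return which service a receiver could extract from the channel."""
    if channel_bandwidth_mhz >= 11.14:
        return "HD-MAC (full high definition)"
    if channel_bandwidth_mhz >= 8.0:
        return "D2-MAC (standard definition only)"
    return "none"

print(decodable_service(12))   # cable HD-MAC channel
print(decodable_service(8))    # SD fallback via a D2-MAC receiver
```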

Bandwidth reduction

BRE (Bandwidth Reduction Encoding) started from analogue HD video (even when the source was a digital recorder, the signal was converted back to analogue to feed the encoder [14]). The input was specified to have a 50 Hz field frequency. It could be interlaced, with 25 frames per second (called 1250/50/2 in the recommendation), or progressively scanned, with 50 full frames per second (called 1250/50/1). The interlaced version was the one used in practice. In either case, the number of visible lines was 1152, twice the standard 576-line vertical definition. The total number of lines in a frame period, including those that cannot be displayed, was 1250. This made for a 32 µs line period. According to the ITU recommendation for HDTV standard parameters, [15] the active part of the line was 26.67 µs long (see also the LDK 9000 camera document [16]).

Had the modern preference for square pixels applied, this would have yielded a 2048×1152 sampling grid. There was no such requirement in the standard, though, since CRT monitors need no extra scaling to show non-square pixels. According to the specification, the sampling rate for the interlaced input was 72 MHz, resulting in 72 × 26.67 ≈ 1920 horizontal samples. This was then reconverted to 1440 samples within the sampled domain. The input signal often originated from sources sampled at only 54 MHz for economic reasons, and therefore already contained no more than the analogue equivalent of 1440 samples per line. Ultimately, the starting point for BRE was a 1440×1152 sampling grid (twice the horizontal and vertical resolution of digital SD), interlaced, at 25 fps. [17]
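The sampling arithmetic above can be checked directly from the quoted figures (1250 total lines, 25 fps, 26.67 µs active line, 72 and 54 MHz sampling rates); the script below is only a re-derivation of those numbers:

```python
# Reproducing the sampling arithmetic from the specification figures above.
total_lines = 1250
frame_rate = 25
line_period_us = 1e6 / (total_lines * frame_rate)   # 32 µs per line

active_line_us = 26.67      # ITU active-line duration, µs
f72 = 72e6                  # specified sampling rate, Hz
f54 = 54e6                  # cheaper source sampling rate, Hz

samples_72 = round(f72 * active_line_us * 1e-6)     # samples per active line
samples_54 = round(f54 * active_line_us * 1e-6)

print(line_period_us)   # 32.0
print(samples_72)       # 1920
print(samples_54)       # 1440
```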

To improve the horizontal resolution of the D2-MAC norm, only its bandwidth had to be increased. This was easily done because, unlike PAL, the sound is not sent on a sub-carrier but multiplexed with the picture. Increasing vertical resolution was more complex, however, as the line frequency had to stay at 15.625 kHz to remain compatible with D2-MAC. This left three coding modes, named after the time taken to transmit a complete picture:

- the 20 ms mode, which transmitted new picture content every field, favouring temporal resolution;
- the 40 ms mode, which dropped one of the two HD fields and sent motion-compensation data so the receiver could reconstruct it;
- the 80 ms mode, which spread the content of a full HD frame over 80 ms, providing full spatial resolution for static content.

As none of the three modes would have been sufficient on its own, the choice during encoding was made not for the whole picture but for small blocks of 16×16 pixels. The signal then carried hints (the DATV digital stream) telling the decoder which de-interlacing method to use for each block.

The 20 ms mode offered improved temporal resolution, but the 80 ms mode was the only one that provided high spatial definition in the usual sense. The 40 ms mode threw away one of the HD fields and reconstructed it in the receiver with the assistance of motion-compensation data. Some indications were also provided in the case of whole-frame movement (camera panning, etc.) to improve the quality of the reconstruction.

The encoder could work in a "camera" operating mode, using all three coding modes, or in a "film" mode, in which the 20 ms coding mode was not used.

The 80 ms mode took advantage of its reduced 12.5 fps frame rate to spread the contents of an HD frame over two SD frames, meaning four 20 ms fields = 80 ms, hence the name.

But that was not enough, as a single HD frame contains the equivalent of four SD frames. This could have been "solved" by doubling the bandwidth of the D2-MAC signal, thus increasing the allowed horizontal resolution by the same factor. Instead, the standard D2-MAC channel bandwidth was preserved, and one pixel out of two was dropped from each line. This sub-sampling followed a quincunx pattern: with the pixels on each line numbered from 1 to 1440, only pixels 1, 3, 5, ... were retained from the first line, pixels 2, 4, 6, ... from the second, 1, 3, 5, ... again from the third, and so on. That way, information from all the columns of the HD frame was conveyed to the receiver. Each missing pixel was surrounded by four transmitted ones (except at the edges) and could be interpolated from them. The resulting 720-sample horizontal resolution was further truncated to the 697 samples per line allowed by the D2-HDMAC video multiplex. [18]
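The quincunx pattern described above is easy to sketch. The helper below is a toy illustration of the retained-pixel rule (odd pixels on odd lines, even pixels on even lines), not production code:

```python
# Sketch of the quincunx sub-sampling pattern described above: on odd
# lines the odd-numbered pixels are kept, on even lines the even ones.
def quincunx_keep(line: int, pixel: int) -> bool:
    """1-based line/pixel numbering, as in the text."""
    return pixel % 2 == 1 if line % 2 == 1 else pixel % 2 == 0

width = 1440
kept_per_line = sum(quincunx_keep(1, p) for p in range(1, width + 1))
print(kept_per_line)          # 720: half the samples survive on each line

# Every column is still represented: pixels dropped on one line are
# transmitted on the neighbouring lines and can be interpolated.
columns_covered = {p for line in (1, 2) for p in range(1, width + 1)
                   if quincunx_keep(line, p)}
print(len(columns_covered))   # 1440: all columns reach the receiver
```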

As a consequence of those operations, a 4:1 reduction factor was achieved, allowing the high definition video signal to be transported in a standard D2-MAC channel. The samples retained by the BRE were assembled into a valid standard definition D2-MAC vision signal and finally converted to analogue for transmission. The modulation parameters were such that the independence of the samples was preserved. [19]

To fully decode the picture, the receiver had to sample the signal again and then read it from memory several times. The BRD (Bandwidth Restoration Decoder) in the receiver would then reconstruct a 1394×1152 sampling grid from it, under the control of the DATV stream, to be fed into its DAC.

The final output was a 1250 (1152 visible) lines, 25 fps, interlaced, analogue HD video signal, with a 50 Hz field frequency.

Progressive scanning

European systems are generally referred to as 50 Hz standards (field frequency), the two fields being 20 ms apart in time. The Eu95 project [20] [21] stated it would evolve towards 1152p/50, and this format is accepted as a possible source in the D2-HDMAC specification. In it, a full frame is captured every 20 ms, preserving television's quality of motion while adding solid, artifact-free frames that each represent a single instant in time, as in cinema. The 24 fps frame rate of cinema is rather low, though, and a generous amount of motion smear is required for the eye to perceive smooth motion. 50 Hz is more than twice that rate, so the motion smear can be reduced in proportion, allowing sharper pictures.

In practice, 50p was not used very much. Some tests were even done with film shot at 50 fps and subsequently telecined. [22]

Thomson/LER presented a progressive camera. However, it used a form of quincunx sampling and therefore had some bandwidth constraints. [23]

This requirement pushed the technology boundaries of the time, and added to the notorious lack of sensitivity of some Eu95 cameras (particularly CRT-based ones). This thirst for light was one of the problems that plagued the operators shooting the French film L'affaire Seznec (The Seznec Case) in 1250i. Some CCD cameras were developed in the context of the project; see for example the LDK 9000: 50 dB signal-to-noise ratio at 30 MHz, 1000 lux at f/4.

The Eu95 system would have provided better compatibility with cinema technology than its competitor, first because of progressive scanning, and second because of the convenience and quality of transfers between 50 Hz standards and film (no motion artifacts; one just needs to invert the usual "PAL speed-up" process by slowing the frame rate down in a 25/24 ratio). Taking one frame out of two from a 50p stream would have provided a suitable 25p video as a starting point for this operation. A sequence shot at 50p with a fully open shutter produces the same amount of motion smear as 25p shot with a half-open shutter, a common setting when shooting with a standard movie camera.

In practice, Hi-Vision seems to have been more successful in that regard, having been used for films such as Giulia e Giulia (1987) and Prospero's Books (1991).

Recording

Reel-to-reel BCH 1000 HD-MAC VTR.

Consumer

A consumer tape recorder prototype was presented in 1988. It had an 80-minute recording time and used a 1.25 cm "metal" tape. Bandwidth was 10.125 MHz and signal to noise ratio 42 dB. [24]

HD-MAC videodisc prototype.

An HD-MAC videodisc prototype had been designed as well. [25] The version that was presented in 1988 could record 20 min per side of a 30 cm disc. Bandwidth was 12 MHz and S/N 32 dB. [26] This media was used for several hours at Expo 92. [27]

Professional equipment

On the studio and production side, things were entirely different. HD-MAC bandwidth-reduction techniques bring the HD pixel rate down to the level of SD, so in theory an SD digital video recorder could have been used, provided it left enough room for the DATV assistance stream, which requires less than 1.1 Mbit/s. SD video in 4:2:0 format (12 bits per pixel) needs 720 × 576 × 25 × 12 bits per second, which is slightly less than 125 Mbit/s, to be compared with the 270 Mbit/s available from a D-1 machine.
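The bit-rate claim above is easy to verify; the few lines below simply recompute the figures quoted in the text:

```python
# Checking the bit-rate figures quoted above for 4:2:0 SD video.
width, height, fps = 720, 576, 25
bits_per_pixel = 12                  # 8-bit luma plus quarter-resolution chroma

video_rate = width * height * fps * bits_per_pixel   # bits per second
datv_rate = 1.1e6                                    # DATV assistance stream

print(video_rate / 1e6)                  # 124.416 Mbit/s, just under 125
print((video_rate + datv_rate) < 270e6)  # True: fits well inside a D-1 channel
```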

But there is no real reason for the studio equipment to be constrained by HD-MAC, as the latter is only a transmission standard, used to convey the HD material from the transmitter to the viewers. Furthermore, technical and financial resources are available to store the HD video with better quality, for editing and archiving.

So in practice, other methods were used. At the start of the Eureka 95 project, the only means of recording the HD signal from a camera was a massive 1-inch reel-to-reel tape machine, the BTS BCH 1000, based on the Type B videotape format but with 8 video heads instead of the usual two, together with a higher linear tape speed of 66 cm/s, thus accommodating the higher bandwidth requirements of HD video.

The plan within the Eureka 95 project was to develop an uncompressed digital recorder with 72 MHz sampling, dubbed the "Gigabit" recorder. It was expected to take a year to develop, so in the interim two alternative digital recording systems were assembled, both using the standard-definition D1 uncompressed digital component recorder as a starting point.

The quincunx-subsampled, or double/dual-D1, system developed by Thomson used two D-1 digital recorders synchronized in a master/slave relationship: odd fields were recorded on one D-1 and even fields on the other. Horizontally, the system recorded just half the bandwidth, with samples taken on a quincunx grid. This gave full-bandwidth performance in the diagonal direction, but halved it horizontally or vertically depending on the exact temporal-spatial characteristics of the image.

The Quadriga [28] system was developed by the BBC in 1988 using four synchronised D1 recorders and 54 MHz sampling; it distributed the signal so that blocks of 4 pixels were sent to each recorder in turn. If a single tape was viewed, the image appeared as a fair but distorted representation of the whole, so edit decisions could be taken on a single recording. A three-machine edit was possible on a single Quadriga by processing one of the four channels first, with identical edits then made on the other three channels under the control of a programmed edit controller.
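The interleaving described above can be sketched as follows. The round-robin block assignment is an assumption consistent with "blocks of 4 pixels were sent to each recorder in turn"; the exact ordering used by the BBC hardware may have differed:

```python
# Sketch (assumed details) of Quadriga-style interleaving: successive
# blocks of 4 pixels on a line go to the four D1 recorders in turn,
# so each tape alone holds a coarse but complete view of the picture.
def recorder_for_block(block_index: int) -> int:
    """Round-robin assignment of 4-pixel blocks to recorders 0-3."""
    return block_index % 4

line = list(range(32))                 # one toy line of 32 pixel values
tapes = [[] for _ in range(4)]
for i in range(0, len(line), 4):       # step through blocks of 4 pixels
    tapes[recorder_for_block(i // 4)].extend(line[i:i + 4])

print(tapes[0])   # every 4th block: a subsampled view of the whole line
```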

The original D1 recorders were restricted to a parallel video interface with very bulky, short cables, but this was not a problem, since the digital signals stayed within the five half-height racks (four D1s plus the interface/control/interleaving rack) that made up the Quadriga, and initially all external signals were analogue components. The introduction of SDI (the 270 Mbit/s Serial Digital Interface) had simplified cabling by the time the BBC constructed a second Quadriga.

Philips also constructed a Quadriga but used a slightly different format, with the HD image divided into four quadrants, each going to one of the four recorders. Apart from a slightly longer processing delay, it worked much like the BBC approach, and both versions of the Quadriga equipment were made interoperable, switchable between interleaved and quadrant modes.

In about 1993 Philips, in a joint venture with Bosch (BTS), produced a "BRR" (Bit Rate Reduction) recording system that allowed the full HD signal to be recorded on a single D1 (or D5 HD) recorder. If the tape was replayed on a conventional D1 machine, a low-resolution version of the image could be viewed in the centre of the screen, surrounded by what looked like noise but was in fact coded/compressed data, similar in spirit to later MPEG digital compression, with a compression ratio of 5:1 starting from 72 MHz sampling. Some BRR equipment also contained Quadriga interfaces for easy conversion between recording formats, switchable between the BBC and Philips versions of the Quadriga format. By this time, Quadriga signals were carried on four SDI cables.

Finally, with help from Toshiba, around 2000 the Gigabit recorder, by then known as the D6 HDTV VTR "Voodoo", was produced, some years after work on the 1250-line system had ceased in favour of the Common Image Format, the basis of HDTV as it is known today.

Hence the quality of Eureka 95 archives is higher than what viewers could see at the output of an HD-MAC decoder.

Transfer to film

For the making of the HD-originated movie L'affaire Seznec, Thomson certified that it would be able to transfer HD to 35 mm film, but none of the attempts succeeded (shooting was done on dual-D1). However, another French movie shot in 1994, Du fond du coeur: Germaine et Benjamin, allegedly achieved such a transfer; it is said to have been shot in digital high definition in 1250 lines. [29] [30] If so, it would arguably be the first digital high-definition movie, using a film-friendly 50 Hz field rate, 7 years before Vidocq and 8 years before Star Wars: Episode II – Attack of the Clones. [citation needed] For historical perspective on HD-originated movies, one can mention early attempts such as Harlow, shot in 1965 using a near-HD analogue 819-line process that later evolved to higher resolutions (see Electronovision).

Project's afterlife

Experience was gained on important building blocks such as HD digital recording, digital processing including motion compensation, and HD CCD cameras, as well as on the factors driving acceptance or rejection of a new format by professionals. All of this was put to good use in the subsequent Digital Video Broadcasting project, which, in contrast to HD-MAC, is a great worldwide success. Despite early claims by competitors that DVB could not do HD, it was soon deployed in Australia for exactly that purpose.

The cameras and tape recorders were reused for early experiments in digital high definition cinema.

The US brought home some of the Eu95 cameras to be studied in the context of their own HDTV standard development effort.

In France, a company called VTHR (Video Transmission Haute Resolution) used the Eu95 hardware for some time to retransmit cultural events to small villages (later, they switched to upscaled 15 Mbit/s MPEG2 SD).

In 1993, Texas Instruments built a 2048×1152 DMD prototype. [31] The papers give no rationale for choosing this resolution over the Japanese 1035-active-line system, or over doubling the 480 lines of standard US TV to 960, but a chip of that size could cover every resolution then expected on the market, including the European one, which happened to be the highest. Some legacy of this development may be seen in the "2K" and "4K" digital movie projectors using TI DLP chips, which run a slightly wider than usual 2048×1080 or 4096×2160 resolution. This gives a 1.896:1 aspect ratio without anamorphic stretching (versus the 1.778:1 of regular 16:9, with 1920 or 3840 horizontal pixels), a little (6.7%) more horizontal resolution with anamorphic lenses when showing 2.21:1 (or wider) movies specifically prepared for them, and a further gain (~13.78%) through reduced letterboxing when used without such lenses.
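The aspect-ratio figures above follow directly from the chip dimensions; the snippet below only re-derives the 1.896:1 and 6.7% numbers quoted in the text:

```python
# Verifying the DLP-chip aspect-ratio figures quoted above.
dlp_2k = (2048, 1080)   # TI DLP "2K" chip
hd = (1920, 1080)       # regular 16:9 HD

print(round(dlp_2k[0] / dlp_2k[1], 3))        # 1.896 : 1
print(round(hd[0] / hd[1], 3))                # 1.778 : 1 (16:9)

extra = dlp_2k[0] / hd[0] - 1                 # extra horizontal pixels
print(round(extra * 100, 1))                  # 6.7 (percent)
```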

As of 2010, some computer monitors with 2048×1152 resolution were available (e.g. the Samsung 2343BWX or Dell SP2309W). This is unlikely to be a reference to Eu95, especially as their refresh rate generally defaults to 60 Hz (or 59.94 Hz). It is simply a convenient "HD+" resolution made for bragging rights over ubiquitous 1920×1080 HD panels, with the slimmest possible resolution improvement while keeping the same 16:9 shape for video playback without cropping or letterboxing (the next nearest "convenient" step up being the comparatively much larger, and so much more expensive, 2560×1600 "2.5K" used in e.g. Apple Cinema and Retina displays). It is also a "neat" power-of-two width, twice that of the one-time standard XGA (so, for example, websites designed for that width can be smoothly zoomed to 200%), and exactly four times the size of the 1024×576 panels commonly used in cheaper netbooks and mobile tablets (much as 2.5K is four times the 1280×800 WXGA used in ultraportable laptops and midrange tablets). In this sense it can be considered a form of convergent specification evolution: although there is little chance the two standards are directly related, their particulars were arrived at by broadly similar methods.

Although the fact is now mainly of historical interest, most larger-tube CRT PC monitors had a maximum horizontal scan rate of 70 kHz or higher, which means they could have handled 2048×1152 at 60 Hz progressive if set to a custom resolution (with slimmer vertical blanking margins than HD-MAC/Eu95 itself for those rated below 75 kHz). Smaller models incapable of 70 kHz but good for at least 58 kHz (preferably 62.5 kHz), provided they supported the lower vertical refresh rate, could instead run 50 Hz progressive, or even 100 Hz interlaced to avert the flicker that 50 Hz would otherwise cause.
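The scan-rate claims above can be sanity-checked with a rough model. The 4% vertical-blanking allowance below is an assumption (actual CRT modelines varied); the active line count and refresh rates come from the text:

```python
# Rough horizontal-scan-rate check for the figures above, assuming the
# monitor scans all 1152 active lines each refresh plus ~4% vertical
# blanking (an assumed figure; real modelines varied).
def h_scan_khz(active_lines: int, refresh_hz: float, blanking: float = 1.04) -> float:
    """Approximate horizontal scan rate in kHz for a progressive mode."""
    return active_lines * blanking * refresh_hz / 1000

print(round(h_scan_khz(1152, 60), 1))   # 71.9 kHz: needs a 70+ kHz monitor
print(round(h_scan_khz(1152, 50), 1))   # 59.9 kHz: within a 62.5 kHz model
```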


Related Research Articles

<span class="mw-page-title-main">Interlaced video</span> Technique for doubling the perceived frame rate of a video display

Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances motion perception to the viewer, and reduces flicker by taking advantage of the phi phenomenon.

<span class="mw-page-title-main">Chroma subsampling</span> Practice of encoding images

Chroma subsampling is the practice of encoding images by implementing less resolution for chroma information than for luma information, taking advantage of the human visual system's lower acuity for color differences than for luminance.

<span class="mw-page-title-main">ATSC standards</span> Standards for digital television in the US

Advanced Television Systems Committee (ATSC) standards are an American set of standards for digital television transmission over terrestrial, cable and satellite networks. It is largely a replacement for the analog NTSC standard and, like that standard, is used mostly in the United States, Mexico, Canada, South Korea and Trinidad & Tobago. Several former NTSC users, such as Japan, have not used ATSC during their digital television transition, because they adopted other systems such as ISDB developed by Japan, and DVB developed in Europe, for example.

<span class="mw-page-title-main">D-1 (Sony)</span> Magnetic tape-based videocassette format

D-1 or 4:2:2 Component Digital is an SMPTE digital recording video standard, introduced in 1986 through efforts by SMPTE engineering committees. It started as a Sony and Bosch - BTS product and was the first major professional digital video format. SMPTE standardized the format within ITU-R 601, also known as Rec. 601, which was derived from SMPTE 125M and EBU 3246-E standards.

<span class="mw-page-title-main">Serial digital interface</span> Family of digital video interfaces

Serial digital interface (SDI) is a family of digital video interfaces first standardized by SMPTE in 1989. For example, ITU-R BT.656 and SMPTE 259M define digital video interfaces used for broadcast-grade video. A related standard, known as high-definition serial digital interface (HD-SDI), is standardized in SMPTE 292M; this provides a nominal data rate of 1.485 Gbit/s.

Enhanced-definition television, or extended-definition television (EDTV) is a Consumer Electronics Association (CEA) marketing shorthand term for certain digital television (DTV) formats and devices. Specifically, this term defines an extension of the standard-definition television (SDTV) format that enables a clearer picture during high-motion scenes compared to previous iterations of SDTV, but not producing images as detailed as high-definition television (HDTV).

PALplus is an analogue television broadcasting system aimed to improve and enhance the PAL format by allowing 16:9 aspect ratio broadcasts, while remaining compatible with existing television receivers, defined by International Telecommunication Union (ITU) recommendation BT.1197-1. Introduced in 1993, it followed experiences with the HD-MAC and D2-MAC, hybrid analogue-digital widescreen formats that were incompatible with PAL receivers. It was developed at the University of Dortmund in Germany, in cooperation with German terrestrial broadcasters and European and Japanese manufacturers. The system had some adoption across Europe during the late 1990s and helped introduce widescreen TVs in the market, but never became mainstream.

<span class="mw-page-title-main">720p</span> Video resolution

720p is a progressive HD signal format with 720 horizontal lines/1280 columns and an aspect ratio (AR) of 16:9, normally known as widescreen HD (1.78:1). All major HD broadcasting standards include a 720p format, which has a resolution of 1280×720p.

1080i is a combination of frame resolution and scan type. 1080i is used in high-definition television (HDTV) and high-definition video. The number "1080" refers to the number of horizontal lines on the screen. The "i" is an abbreviation for "interlaced"; this indicates that only the even lines of each frame, then only the odd lines, are drawn alternately, so that only half the number of lines are ever updated at once. A related display resolution is 1080p, which also has 1080 lines of resolution; the "p" refers to progressive scan, which indicates that each full frame appears on the screen in sequence.

High-definition video is video of higher resolution and quality than standard-definition. While there is no standardized meaning for high-definition, generally any video image with considerably more than 480 vertical scan lines or 576 vertical lines (Europe) is considered high-definition. 480 scan lines is generally the minimum even though the majority of systems greatly exceed that. Images of standard resolution captured at rates faster than normal, by a high-speed camera may be considered high-definition in some contexts. Some television series shot on high-definition video are made to look as if they have been shot on film, a technique which is often known as filmizing.

<span class="mw-page-title-main">1080p</span> Video mode

1080p is a set of HDTV high-definition video modes characterized by 1,920 pixels displayed across the screen horizontally and 1,080 pixels down the screen vertically; the p stands for progressive scan, i.e. non-interlaced. The term usually assumes a widescreen aspect ratio of 16:9, implying a resolution of 2.1 megapixels. It is often marketed as Full HD or FHD, to contrast 1080p with 720p resolution screens. Although 1080p is sometimes informally referred to as 2K, these terms reflect two distinct technical standards, with differences including resolution and aspect ratio.

<span class="mw-page-title-main">Kell factor</span>

The Kell factor, named after RCA engineer Raymond D. Kell, is a parameter used to limit the bandwidth of a sampled image signal to avoid the appearance of beat frequency patterns when displaying the image in a discrete display device, usually taken to be 0.7. The number was first measured in 1934 by Raymond D. Kell and his associates as 0.64 but has suffered several revisions given that it is based on image perception, hence subjective, and is not independent of the type of display. It was later revised to 0.85 but can go higher than 0.9, when fixed pixel scanning and fixed pixel displays are used, or as low as 0.7 for electron gun scanning.

<span class="mw-page-title-main">Multiplexed Analogue Components</span> 1980s analog television standard

Multiplexed Analogue Components (MAC) was an analog television standard where luminance and chrominance components were transmitted separately. This was an evolution from older color TV systems where there was interference between chrominance and luminance.

Analog high-definition television has referred to a variety of analog video broadcast television systems with various display resolutions throughout history.

MUSE, commercially known as Hi-Vision was a Japanese analog high-definition television system, with design efforts going back to 1979.

<span class="mw-page-title-main">D-MAC</span>

Among the family of MAC (Multiplexed Analogue Components) systems for television broadcasting, D-MAC is a reduced-bandwidth variant designed for transmission over cable.

D2-MAC

D2-MAC is a satellite television transmission standard and a member of the Multiplexed Analogue Components family. It was created to address D-MAC's bandwidth requirements by reducing them further, allowing the system to be used for both cable and satellite broadcasting. It could carry four high-quality sound channels or eight lower-quality ones. It was adopted by Scandinavian, German and French satellite broadcasters, and remained in use until July 2006 in Scandinavia and until the mid-1990s in Germany and France.

Television standards conversion is the process of changing a television transmission or recording from one video system to another. Converting video between different numbers of lines, frame rates, and color models is a complex technical problem. However, the international exchange of television programming makes standards conversion necessary so that video may be viewed in a nation with a differing standard. Typically, video is fed into a video standards converter, which produces a copy conforming to a different video standard. One of the most common conversions is between the NTSC and PAL standards.
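The frame-rate half of an NTSC-to-PAL conversion can be sketched as a timing problem: each 25 Hz output frame must be sourced from the nearest frame of the ≈29.97 Hz input. This is only a nearest-neighbour sketch under that assumption; real converters also interpolate motion and rescale the active lines (480 to 576), which is omitted here:

```python
# Minimal sketch of frame-rate mapping for NTSC -> PAL conversion.
# Each PAL output frame is matched to the nearest NTSC source frame
# by comparing timestamps; some source frames are necessarily skipped.
NTSC_FPS = 30000 / 1001   # ≈ 29.97 Hz (practical NTSC frame rate)
PAL_FPS = 25.0            # PAL frame rate

def source_frame_for(target_frame: int) -> int:
    """Nearest NTSC source frame index for a given PAL output frame."""
    t = target_frame / PAL_FPS          # output timestamp in seconds
    return round(t * NTSC_FPS)

# One second of PAL output (25 frames) draws on ~30 NTSC frames,
# so roughly 5 source frames per second are dropped.
mapping = [source_frame_for(n) for n in range(25)]
print(mapping)
```

A motion-compensated converter would blend or interpolate between neighbouring source frames instead of picking one, which is what avoids the judder this simple mapping introduces.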

Broadcast-safe video is a term used in the broadcast industry to define video and audio compliant with the technical or regulatory broadcast requirements of the target area or region the feed might be broadcasting to. In the United States, the Federal Communications Commission (FCC) is the regulatory authority; in most of Europe, standards are set by the European Broadcasting Union (EBU).

High-definition television (HDTV) describes a television or video system which provides a substantially higher image resolution than the previous generation of technologies. The term has been used since at least 1933; in more recent times, it refers to the generation following standard-definition television (SDTV). It is the current de facto standard video format used in most broadcasts: terrestrial, cable and satellite television.

References

  1. Pauchon (1992). "Analogue HDTV in Europe" (PDF). EBU Technical Review (Autumn ed.). EBU. pp. 6–19.
  2. Ive, John (July 2004). EBU TECHNICAL REVIEW - Image formats for HDTV (PDF). EBU.
  3. Keys, Geoff (Autumn 1988). "THE HDTV VEHICLE" (PDF). Eng Inf: 14. Archived from the original (PDF) on 2014-02-22.
  4. Cianci, Philip J. High Definition Television: The Creation, Development and Implementation of HDTV Technology.
  5. Slater, Jim (1991). Modern Television Systems to HDTV and Beyond. Pitman Publishing. ISBN 0-203-16851-8. pp. 180–181.
  6. Scott, B. (Autumn 1992). "HDTV programme production" (PDF). EBU Technical Review: 52.
  7. "The Cathode ray Tube site. Television CRT's". www.crtsite.com. Retrieved 2023-01-15.
  8. "'SPACE SYSTEM' GIVES TV A MOVIE-SCREEN LOOK". Chicago Tribune. 1991.
  9. Tejerina; Visintin (1992). "The HDTV demonstrations at Expo 92" (PDF). EBU Technical Review (Winter ed.). EBU. pp. 25–32.
  10. "Fernsehmuseum1- Sie sind im Bereich : EUREKA Report 1993 (2)". www.fernsehmuseum.info.
  11. "A Brief Review on HDTV in Europe in the early 90's | LIVE-PRODUCTION.TV". www.live-production.tv. Retrieved 2023-01-15.
  12. ETSI specification of the D2-HDMAC/Packet system (ETS 300 352), section 4.1
  13. ETSI specification of the D2-HDMAC/Packet system (ETS 300 352), section 10.3
  14. Burtfield (1989). A digital HDTV recorder using a cluster of four D1 DVTRs (PDF). BBC Research Department.
  15. Recommendation ITU-R BT.709-5 - Parameter values for the HDTV standards for production and international programme exchange (PDF). ITU-R. 2009. Archived from the original (PDF) on 2012-10-22.
  16. "A high-performance, full-bandwidth HDTV camera applying the first 2.2 million pixel frame transfer CCD sensor" (PDF). SMPTE Journal: 319. May 1994.
  17. ETSI specification of the D2-HDMAC/Packet system (ETS 300 352), section 5.2.1
  18. ETSI specification of the D2-HDMAC/Packet system (ETS 300 352), 5.3.6 Baseband format
  19. ETSI specification of the D2-HDMAC/Packet system (ETS 300 352), 10.2.2 Nyquist filtering
  20. Grimaldi, J.; Thoumy, F.; Duhamel, H. (September 10, 1990). "Up conversion from interlace to progressive using motion detection and quincunx filtering". pp. 111–115 via IEEE Xplore.
  21. Hayward, Jack (May 29, 2008). Leaderless Europe. OUP Oxford. ISBN 978-0-19-156014-9 via Google Books.
  22. http://www.bbceng.info/Eng_Inf/EngInf_34.pdf Archived 2014-02-22 at the Wayback Machine BBC engineering at IBC 88
  23. Eouzan, J.Y.; Boyer, R. (September 10, 1988). "A progressive scanning 1250/501 HDTV colour camera and processing based on quincunx sampling". pp. 174–176 via IEEE Xplore.
  24. http://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2003-1994-PDF-F.pdf Harmonisation des normes de TVHD..., Section 3.3.5.1 (in French)
  25. Horstman, R.A. (September 10, 1988). "Videodisc and player for HDMAC". pp. 224–227 via IEEE Xplore.
  26. http://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2003-1994-PDF-F.pdf Harmonisation des normes de TVHD..., Section 3.3.5.2 (in French)
  27. https://tech.ebu.ch/docs/techreview/trev_254-tejerina.pdf Expo 92, section 6.1.1
  28. http://downloads.bbc.co.uk/rd/pubs/reports/1989-13.pdf BBC R&D document about the Quadriga
  29. "Du fond du coeur (1994) - Technical Specifications". IMDb.
  30. http://www.lesechos.fr/01/03/1994/LesEchos/16593-96-ECH_vision-1250-parie-sur-la-video-haute-definition-dans-le-cinema.htm Germaine et Benjamin produced in the Vision 1250 format (in French)
  31. "Digital Micromirror Device™ (DMD) Based Projection Display Systems". 2002-01-06. Archived from the original on 2002-01-06. Retrieved 2023-01-15.