Word clock

In digital audio electronics, a word clock or wordclock (sometimes called a sample clock, a term which can have a broader meaning) is a clock signal used to synchronise other devices, such as digital audio tape machines and compact disc players, which interconnect via digital audio signals. The word clock is so named because it clocks each audio sample; samples are represented as data words.
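Because the word clock ticks once per sample, its frequency equals the audio sample rate and its period is simply the sample interval. A minimal illustration (the function name here is hypothetical, not from any standard):

```python
# The word clock runs at the audio sample rate, so its period is 1 / sample_rate.
def word_clock_period_us(sample_rate_hz: float) -> float:
    """Return the word clock period in microseconds for a given sample rate."""
    return 1e6 / sample_rate_hz

for rate in (44_100, 48_000, 96_000):
    print(f"{rate} Hz -> {word_clock_period_us(rate):.3f} µs per sample")
```

For example, at the CD sample rate of 44.1 kHz, one word clock cycle lasts roughly 22.7 µs.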

S/PDIF, AES/EBU, MADI, ADAT, and TDIF are some of the formats that use a word clock. Various audio over Ethernet systems use communication protocols to distribute word clock. The device which generates the word clock is the clock source for all the other audio devices.

The signal is used for synchronizing digital audio signals between devices, such as CD players, audio I/O cards, etc.[1] It allows all the components in the signal path to process the data and remain synchronized with each other.[2]

Comparison to timecode

Word clock should not be confused with timecode. Word clock exists entirely to maintain a perfectly timed, constant bit rate and so avoid the timing errors that can cause data transmission errors, whereas timecode is metadata about the media being transmitted. Timecode can, however, serve as an initial phase reference for jam sync, with the word clock as the frequency reference.

Over coax cable

Professional digital audio equipment may provide a word clock input or output to synchronize timing between multiple devices. Although the electrical characteristics of the word clock signal have not been completely standardized, some characteristics should always apply: a TTL signal level, a 75 Ω output impedance, 75 Ω cable, and a 75 Ω terminating resistor at the end of a chain or cable.

Proper termination of the word clock signal with a 75 Ω resistor is important: it prevents the clock signal from reflecting back into the cable and causing false detection of extra 1s and 0s. Some digital equipment includes a switchable terminator, some a hardwired terminator, and others no terminator at all. Unfortunately, some equipment manuals do not indicate whether a hardwired terminator is included.[3]
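The effect of termination can be illustrated with the standard transmission-line voltage reflection coefficient, Γ = (Z_load − Z_line) / (Z_load + Z_line). A matched 75 Ω terminator yields zero reflection, while an unterminated (effectively open-circuit) input reflects nearly the full signal back down the cable. A sketch of the arithmetic (the function name is illustrative only):

```python
# Voltage reflection coefficient at the end of a transmission line.
# 0.0 means a perfect match (no reflection); 1.0 means total reflection.
def reflection_coefficient(z_load: float, z_line: float = 75.0) -> float:
    return (z_load - z_line) / (z_load + z_line)

print(reflection_coefficient(75.0))   # matched 75-ohm terminator: no reflection
print(reflection_coefficient(1e9))    # ~open circuit: nearly total reflection
```

This is why an unterminated chain can produce echoes of clock edges strong enough to be misread as extra transitions.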

Daisy-chaining the signal from the source through successive receivers may increase jitter; distributing the clock in parallel from a dedicated clock-distribution device is a better method. The length and quality of the coaxial cables are also important.

Over AES3

The AES11 standard defines a means for carrying a word clock over an AES3 connection. In this context, the word clock is known as a Digital Audio Reference Signal (DARS).

In Annex B, the AES11 standard also describes common practice for transmitting and receiving a plain word clock signal. This is not an attempt to standardize it; the annex is informative only.

References

  1. "What is digital audio synchronization? How does it differ from time code synchronization?". Sweetwater. 27 April 2007. Retrieved 31 March 2021.
  2. Shelton, Tim (1 May 1989). "Synchronization of Digital Audio". Audio Engineering Society. Retrieved 31 March 2021.
  3. Section 9.1.3 of the Ardour manual.