Quantization (music)

In digital music processing technology, quantization is the studio-software process of transforming performed musical notes, which may have some imprecision due to expressive performance, into an underlying musical representation that eliminates this imprecision. The process results in notes being set on beats and on exact fractions of beats. [1]

The purpose of quantization in music processing is to provide more beat-accurate timing of sounds. [2] Quantization is frequently applied to a recording of MIDI notes created with a musical keyboard or drum machine. Additionally, the phrase "pitch quantization" can refer to pitch correction used in audio production, such as the use of Auto-Tune.

Description

A frequent application of quantization in this context lies within MIDI application software or hardware. MIDI sequencers typically include quantization in their menu of edit commands. In this case, the resolution of the timing grid is set beforehand. When the user instructs the music application to quantize a given group of MIDI notes in a song, the program moves each note to the closest point on the timing grid. Quantization in MIDI is usually applied to Note On messages and sometimes to Note Off messages; some digital audio workstations shift the entire note by moving both messages together. Quantization can also be applied as a percentage, which only partially aligns the notes to a given beat. Using a percentage of quantization allows for the subtle preservation of natural human timing nuances.
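The sketch below illustrates this mechanism in Python. It is a minimal illustration rather than the behaviour of any particular sequencer: the Note class, the 480-ticks-per-quarter resolution, and the strength parameter are assumptions made for the example. Each Note On time is moved toward the nearest grid point, the note's duration is preserved so the Note Off moves with it, and the strength value controls how far toward the grid the note is pulled.

    from dataclasses import dataclass

    PPQ = 480  # ticks per quarter note; a common sequencer resolution (assumed here)

    @dataclass
    class Note:
        start: int     # Note On time in ticks
        duration: int  # ticks between Note On and the matching Note Off

    def quantize(notes, grid_ticks=PPQ // 4, strength=1.0):
        """Move each Note On toward the nearest point on a timing grid.

        grid_ticks is the grid spacing (PPQ // 4 gives a sixteenth-note grid).
        strength=1.0 snaps notes fully onto the grid; smaller values move them
        only part of the way, preserving some of the performer's timing.
        Duration is kept, so the Note Off shifts together with the Note On.
        """
        result = []
        for note in notes:
            target = round(note.start / grid_ticks) * grid_ticks
            new_start = round(note.start + (target - note.start) * strength)
            result.append(Note(start=new_start, duration=note.duration))
        return result

    # A sixteenth-note grid at 480 PPQ is 120 ticks wide. A note played 40 ticks
    # late snaps back onto the grid at full strength, but only halfway at 50%.
    played = [Note(start=520, duration=200)]
    print(quantize(played, strength=1.0))  # [Note(start=480, duration=200)]
    print(quantize(played, strength=0.5))  # [Note(start=500, duration=200)]

At full strength this is ordinary grid quantization; at 50% the note keeps half of its original deviation from the grid.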

The most difficult problem in quantization is determining which rhythmic fluctuations are mere imprecision, and should be removed by the quantization process, and which are expressive and should be represented in the output score. For instance, a simple children's song should probably have very coarse quantization, resulting in few different note values in the output. Quantizing a performance of a piano piece by Arnold Schoenberg, on the other hand, should result in many smaller note values, tuplets, etc.
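As a rough illustration of how grid resolution affects the result, the short sketch below, with invented onset times and grid sizes, quantizes the same slightly uneven performance against a coarse and a fine grid: the coarse grid discards the deviations as noise, while the fine grid preserves them as distinct rhythmic values.

    def quantize_onsets(onsets, grid):
        """Snap each onset time (in beats) to the nearest multiple of the grid size."""
        return [round(t / grid) * grid for t in onsets]

    # Onset times from a slightly uneven performance, measured in beats.
    performance = [0.0, 0.58, 0.92, 1.5]

    # A coarse eighth-note grid (0.5 beat) treats the unevenness as noise:
    print(quantize_onsets(performance, grid=0.5))    # [0.0, 0.5, 1.0, 1.5]

    # A fine 32nd-note grid (0.125 beat) keeps it as distinct rhythmic values:
    print(quantize_onsets(performance, grid=0.125))  # [0.0, 0.625, 0.875, 1.5]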

In recent years audio quantization has also come into use: the Beat Detective plug-in, included with all versions of Pro Tools, is regularly used on modern records to tighten the playing of drums, guitar, bass, and other parts. [3]

Related Research Articles

MIDI: Means of connecting electronic musical instruments

MIDI is a technical standard that describes a communications protocol, digital interface, and electrical connectors that connect a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing and recording music. The specification originates in a paper titled Universal Synthesizer Interface, published by Dave Smith and Chet Wood, then of Sequential Circuits, at the October 1981 Audio Engineering Society conference in New York City.

Zeta Instrument Processor Interface (ZIPI) was a research project initiated by Zeta Instruments and UC Berkeley's CNMAT. Introduced in 1994 in a series of publications in Computer Music Journal from MIT Press, ZIPI was intended as the next-generation transport protocol for digital musical instruments, designed with compliance to the OSI model.

Metronome

A metronome, from ancient Greek μέτρον and νέμω, is a device that produces an audible click or other sound at a regular interval that can be set by the user, typically in beats per minute (BPM). Musicians use the device to practise playing to a regular pulse. Metronomes typically include synchronized visual motion.

A music sequencer is a device or application software that can record, edit, or play back music, by handling note and performance information in several forms, typically CV/Gate, MIDI, or Open Sound Control (OSC), and possibly audio and automation data for DAWs and plug-ins.

A click track is a series of audio cues used to synchronize sound recordings, sometimes for synchronization to a moving image. The click track originated in early sound movies, where optical marks were made on the film to indicate precise timings for musical accompaniment. It can also serve a purpose similar to a metronome, as in the music industry, where it is often used during recording sessions and live performances.

Csound is a domain-specific computer programming language for audio programming. It is called Csound because it is written in C, as opposed to some of its predecessors.

CV/gate

CV/gate is an analog method of controlling synthesizers, drum machines and other similar equipment with external sequencers. The control voltage typically controls pitch and the gate signal controls note on-off.

Steinberg Cubase: Digital audio workstation

Cubase is a digital audio workstation (DAW) developed by Steinberg for music and MIDI recording, arranging and editing. The first version, which was originally only a MIDI sequencer and ran on the Atari ST computer, was released in 1989. Cut-down versions of Cubase are included with almost all Yamaha audio and MIDI hardware, as well as hardware from other manufacturers. These versions can be upgraded to a more advanced version at a discount.

Virtual Studio Technology: Software plug-in interface used in computer-based audio production

Virtual Studio Technology (VST) is an audio plug-in software interface that integrates software synthesizers and effects units into digital audio workstations. VST and similar technologies use digital signal processing to simulate traditional recording studio hardware in software. Thousands of plugins exist, both commercial and freeware, and many audio applications support VST under license from its creator, Steinberg.

Pro Tools: Digital audio workstation

Pro Tools is a digital audio workstation (DAW) developed and released by Avid Technology for Microsoft Windows and macOS used for music creation and production, sound for picture and, more generally, sound recording, editing, and mastering processes.

GarageBand: Digital audio software for macOS and iOS

GarageBand is a line of digital audio workstations for macOS, iPadOS, and iOS devices that allows users to create music or podcasts. GarageBand is developed and sold by Apple for macOS, and is part of the iLife software suite, along with iMovie and iDVD. Its music and podcast creation system enables users to create multiple tracks with pre-made MIDI keyboards, pre-made loops, an array of various instrumental effects, and voice recordings.

ChucK: Audio programming language for real-time synthesis, composition, and performance

ChucK is a concurrent, strongly timed audio programming language for real-time synthesis, composition, and performance, which runs on Linux, Mac OS X, Microsoft Windows, and iOS. It is designed to favor readability and flexibility for the programmer over other considerations such as raw performance. It natively supports deterministic concurrency and multiple, simultaneous, dynamic control rates. Another key feature is the ability to live code: adding, removing, and modifying code on the fly, while the program is running, without stopping or restarting. It has a highly precise timing/concurrency model, allowing for arbitrarily fine granularity. It offers composers and researchers a powerful and flexible programming tool for building and experimenting with complex audio synthesis programs and real-time interactive control.

DirectMusic is a deprecated component of the Microsoft DirectX API that allows music and sound effects to be composed and played and provides flexible interactive control over the way they are played. Architecturally, DirectMusic is a high-level set of objects, built on top of DirectSound, that allow the programmer to play sound and music without needing to get quite as low-level as DirectSound. DirectSound allows for the capture and playback of digital sound samples, whereas DirectMusic works with message-based musical data. Music can be synthesized either in hardware, in the Microsoft GS Wavetable SW Synth, or in a custom synthesizer.

Novation Digital Music Systems: British musical equipment manufacturer

Novation Digital Music Systems Ltd. is a British musical equipment manufacturer, founded in 1992 by Ian Jannaway and Mark Thompson as Novation Electronic Music Systems. Today the company specializes in MIDI controllers with and without keyboards, both analog and virtual analog performance synthesizers, grid-based performance controllers, and audio interfaces. At present, Novation products are primarily manufactured in China.

Ableton Live: Digital audio workstation

Ableton Live is a digital audio workstation developed by Ableton for macOS and Windows. In contrast to many other software sequencers, Ableton Live is designed to be an instrument for live performances as well as a tool for composing, recording, arranging, mixing, and mastering. It is also used by DJs, as it offers a suite of controls for beatmatching, crossfading, and other effects used by turntablists, and was one of the first music applications to automatically beatmatch songs. Live is available in three editions: Intro, Standard, and Suite.

MIDI controller

A MIDI controller is any hardware or software that generates and transmits Musical Instrument Digital Interface (MIDI) data to MIDI-enabled devices, typically to trigger sounds and control parameters of an electronic music performance.

Transcription (music)

In music, transcription is the practice of notating a piece or a sound that was previously unnotated or otherwise unavailable as written music, for example, a jazz improvisation or a video game soundtrack. When a musician is tasked with creating sheet music from a recording and writes down the notes that make up the piece in music notation, it is said that they have created a musical transcription of that recording. Transcription may also mean rewriting a piece of music, either solo or ensemble, for an instrument or instruments other than those for which it was originally intended. The Beethoven symphonies transcribed for solo piano by Franz Liszt are an example. Transcription in this sense is sometimes called arrangement, although strictly speaking transcriptions are faithful adaptations, whereas arrangements change significant aspects of the original piece.

Computer audition (CA) or machine listening is the general field of study of algorithms and systems for audio understanding by machine. Since the notion of what it means for a machine to "hear" is very broad and somewhat vague, computer audition attempts to bring together several disciplines that originally dealt with specific problems or had a concrete application in mind. The engineer Paris Smaragdis, interviewed in Technology Review, describes these systems as "software that uses sound to locate people moving through rooms, monitor machinery for impending breakdowns, or activate traffic cameras to record accidents."

The stutter edit is the rhythmic repetition of small fragments of audio, most commonly as 16th-note repetitions but also as 64th notes and beyond, with layers of digital signal processing operations applied rhythmically in time with the host tempo. The Stutter Edit audio software VST plug-in implements forms of granular synthesis, sample retriggering, and various effects to create a distinctive audible manipulation of the sound run through it, in which fragments of audio are repeated at rhythmic intervals. The plug-in allows musicians to manipulate audio in real time, slicing it into small fragments and sequencing the pieces into rhythmic effects, recreating techniques that formerly took hours to achieve in the studio. Electronic musician Brian Transeau is widely recognized for pioneering the stutter edit as a musical technique; he coined the term, developed the Stutter Edit software plug-in, and holds multiple patents related to it.

Director Musices is computer software produced by the Department of Speech, Music and Hearing at KTH Royal Institute of Technology. It aims to give an expressive, human-like performance to a musical score by varying the volume and timing of the notes. Director Musices is written in CMU Common Lisp and distributed as free software. It processes MIDI files.

References

  1. Childs, G.W., IV (March 7, 2018). "A Music Producer's Guide to Quantizing MIDI". Ask.Audio. Retrieved April 26, 2019.
  2. "Quantization". Mediacollege.com. Quantization can also refer to the process of correcting the timing of a musical performance. The music track is analysed and stretched in time so that beats are evenly distributed, eliminating timing errors. Some manufacturers refer to quantizing features as autocorrect.
  3. Price, Simon (August 2003). "Pro Tools: Using Beat Detective". Sound on Sound.