| Developer(s) | Alex McLean, others |
|---|---|
| Initial release | 2009 |
| Stable release | 1.9.5 / 7 April 2024 |
| Repository | https://github.com/tidalcycles/ |
| Written in | Haskell |
| Operating system | Linux, macOS, Windows |
| Type | Live coding environment, algorave |
| License | GPLv3 |
| Website | tidalcycles.org |
TidalCycles (also known as Tidal) is a live coding environment designed for improvising and composing music. Technically, it is a domain-specific language embedded in the functional programming language Haskell, focused on generating and manipulating audiovisual patterns. [1] [2] [3] It was originally designed for heavily percussive, polyrhythmic grid-based music, but it now uses a flexible, functional reactive representation for patterns based on rational time. [4] Tidal can therefore be applied to a wide range of musical styles, although its cyclic approach to time lends itself particularly to repetitive styles such as algorave. [5]
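For illustration, a minimal pattern sketch in Tidal's Haskell-embedded notation might look like the following. This is a hedged example assuming a standard TidalCycles session with SuperDirt running, where `d1`, `sound`, `every`, `fast` and `cutoff` are provided by the stock boot file:

```haskell
-- Evaluated line by line in a live TidalCycles session (not a standalone program).

-- One cycle of mini-notation: kick, snare, two kicks, snare.
d1 $ sound "bd sn bd*2 sn"

-- Patterns are transformed by ordinary Haskell functions: here the same
-- pattern is doubled in speed every second cycle and given a low-pass
-- filter control, which SuperDirt interprets when rendering the sound.
d1 $ every 2 (fast 2) $ sound "bd sn bd*2 sn" # cutoff 800
```

Re-evaluating a line replaces the running pattern on that channel without stopping playback, which is what makes the language suitable for improvisation.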
TidalCycles was created by Alex McLean, who also coined the term algorave. [6] Tidal's representation of rhythm is based on metrical cycles, [7] inspired by Indian classical music, [8] and supports polyrhythmic and polymetric structures through its flexible, functional reactive representation of patterns and rational time. The program does not produce sound itself; instead it controls the SuperCollider sound environment through the SuperDirt framework, or other synthesizers and devices via MIDI or Open Sound Control.
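The cyclic model can be sketched with the mini-notation's grouping operators. The example below is illustrative only; the final line assumes a MIDI device has been registered in SuperDirt under the (hypothetical) target name "midi":

```haskell
-- Polyrhythm: both groups share one cycle, so three kicks sound against two claps.
d1 $ sound "[bd bd bd, cp cp]"

-- Polymeter: {...}%4 steps each group at four steps per cycle, so the
-- three-step clap pattern drifts against the four-step kick pattern.
d2 $ sound "{bd bd bd bd, cp cp cp}%4"

-- Tidal itself only emits control messages; here notes are routed to a
-- MIDI device registered in SuperDirt under the assumed name "midi".
d3 $ note "c e g" # sound "midi"
```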
Tidal is also used widely in academic research, including representation in music AI, [9] [10] as a language in network music, [11] and in electronic literature. [12]
Tidal is widely used at algorave algorithmic dance music events, [13] [14] and on high profile music releases. [15] [16] [17] It has been featured on BBC Radio 3's New Music Show. [18]
Since January 2022, an official port of Tidal's pattern engine has been developed as the web-based live coding environment Strudel, [19] created by Felix Roos and Alex McLean. [20]
Audio signal processing is a subfield of signal processing that is concerned with the electronic manipulation of audio signals. Audio signals are electronic representations of sound waves—longitudinal waves which travel through air, consisting of compressions and rarefactions. The energy contained in audio signals or sound power level is typically measured in decibels. As audio signals may be represented in either digital or analog format, processing may occur in either domain. Analog processors operate directly on the electrical signal, while digital processors operate mathematically on its digital representation.
SuperCollider is an environment and programming language originally released in 1996 by James McCartney for real-time audio synthesis and algorithmic composition.
The Roland TR-909 Rhythm Composer, commonly known as the 909, is a drum machine introduced by Roland Corporation in 1983, succeeding the TR-808. It was the first Roland drum machine to use samples for some sounds, and the first with MIDI functionality, allowing it to synchronize with other devices. Though a commercial failure, it influenced the development of electronic dance music genres such as techno, house and acid house.
Ableton Live is a digital audio workstation for macOS and Windows developed by the German company Ableton.
Live coding, sometimes referred to as on-the-fly programming, just in time programming and conversational programming, makes programming an integral part of the running program.
Scott Wilson is a Canadian composer. He studied music and composition in Canada, the U.S., and Germany, and his teachers include Barry Truax, Wolfgang Rihm, Christos Hatzis, Gary Kulesha, Ron Kuivila, Alvin Lucier, Owen Underhill, Neely Bruce and David Gordon Duke. Since 2004 he has lived in Birmingham, UK, where he is Reader in Electronic Music and Director of Birmingham ElectroAcoustic Sound Theatre and the Electroacoustic Studios at the University of Birmingham.
Live electronic music is a form of music that can include traditional electronic sound-generating devices, modified electric musical instruments, hacked sound generating technologies, and computers. Initially the practice developed in reaction to sound-based composition for fixed media such as musique concrète, electronic music and early computer music. Musical improvisation often plays a large role in the performance of this music. The timbres of various sounds may be transformed extensively using devices such as amplifiers, filters, ring modulators and other forms of circuitry. Real-time generation and manipulation of audio using live coding is now commonplace.
Nick Collins is a British academic and computer music composer. From 2006 to 2013 he lived in Brighton, UK, and ran the music informatics degrees at the University of Sussex. In 2013 he became Reader at the University of Durham.
Alex McLean is a British musician and researcher. He is notable for his key role in developing live coding as a musical practice, including for creating TidalCycles, a live-coding environment that allows programmer musicians to code simply and quickly, and for coining the term Algorave with Nick Collins.
Slub is an algorave group formed in 2000 by Adrian Ward and Alex McLean, joined by Dave Griffiths in 2005 and Alexandra Cárdenas in 2017. They are known for making their music exclusively from their own generative software, projecting their screens so their audience can see their handmade interfaces. Their music is improvised, and advertised as falling within the ambient gabba genre.
Benoît and the Mandelbrots, named after the French American mathematician Benoît Mandelbrot, is a computer music band formed in 2009 in Karlsruhe, Germany. They are known for their live-coded and algorave performances, a digital arts practice of improvising with programming languages that gradually dissolves the distinction between composer and performer.
An algorave is an event where people dance to music generated from algorithms, often using live coding techniques. Alex McLean of Slub and Nick Collins coined the word "algorave" in 2011, and the first event under such a name was organised in London, England. It has since become a movement, with algoraves taking place around the world.
Sonic Pi is a live coding environment based on Ruby, originally designed to support both computing and music lessons in schools, developed by Sam Aaron at the University of Cambridge Computer Laboratory in collaboration with the Raspberry Pi Foundation.
Alexandra Cárdenas is a Colombian musician, composer and improviser now based in Berlin, who has followed a path from Western classical composition to improvisation and live electronics. Her recent work has included live coding performance, including performances at the forefront of the Algorave scene, and she has also co-organised a live coding community in Mexico City. At the 2014 Kurukshetra Festival Cárdenas was a keynote speaker and hosted a music live coding workshop, the first of its kind in India. Cárdenas has been invited to talk about and perform live coding at events such as the Berlin-based Transmediale festival and the Ableton-sponsored Loop symposium, and has held residencies including at Tokyo Wonder Site in Japan and the Centre for the Arts in Mexico City.
Ixi lang is a programming language for live coding musical expression. It is taught at diverse levels of musical education and used in Algorave performances. Like many other live coding languages, such as TidalCycles, ixi lang is a domain-specific language that embraces simplicity and constraints in design.
Joanne Armitage is a composer, improviser and researcher based in Leeds, England, notable for her practice in live coded music and her research into haptics in music performance. She performs internationally using the SuperCollider language, including as half of the live coding duo ALGOBABEZ with Shelly Knotts, associated with the Algorave movement. Her music is often performed in a club setting, while embracing error and uncertainty. She is also known as an advocate for diversity in music and technology, including through invited workshops. Armitage is a lecturer in Digital Media at the School of Media and Communication, University of Leeds, UK.