| New Interfaces for Musical Expression | |
| --- | --- |
| Abbreviation | NIME |
| Discipline | Electronic music |
| Publication details | |
| History | 2001–present |
| Website | www |
New Interfaces for Musical Expression, also known as NIME, is an international conference dedicated to scientific research on the development of new technologies and their role in musical expression and artistic performance.
The conference began as a workshop (NIME 01) at the ACM Conference on Human Factors in Computing Systems (CHI) in 2001 in Seattle, Washington, with the concert and demonstration sessions held at the Experience Music Project museum. Since then, the conference has been held annually at venues around the world.
The NIME conference covers a broad range of topics concerning new technologies for musical expression and artistic performance.
Other similarly themed conferences include the International Computer Music Conference (ICMC), the Sound and Music Computing (SMC) Conference, and the conference of the International Society for Music Information Retrieval (ISMIR).
Open Sound Control (OSC) is a protocol for networking sound synthesizers, computers, and other multimedia devices for purposes such as musical performance or show control. OSC's advantages include interoperability, accuracy, flexibility and enhanced organization and documentation. Its disadvantages include inefficient coding of information, increased load on embedded processors, and lack of standardized messages/interoperability. The first specification was released in March 2002.
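The OSC wire format encodes each message as a null-terminated address pattern, a type tag string beginning with a comma, and big-endian binary arguments, with every element padded to a multiple of four bytes. A minimal sketch of that encoding (the address `/synth/freq` is an illustrative example, not part of any standard namespace):

```python
import struct

def osc_string(s):
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    b = s.encode("ascii") + b"\x00"
    b += b"\x00" * (-len(b) % 4)
    return b

def osc_message(address, *args):
    # The type tag string starts with ',' followed by one tag per argument
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # 32-bit big-endian int
        elif isinstance(a, str):
            tags += "s"
            payload += osc_string(a)
    return osc_string(address) + osc_string(tags) + payload

msg = osc_message("/synth/freq", 440.0)
```

The resulting bytes could be sent to a synthesizer over UDP; the 4-byte alignment is what makes OSC messages easy to parse on embedded processors despite the verbose string addressing.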
Live coding, sometimes referred to as on-the-fly programming, just in time programming and conversational programming, makes programming an integral part of the running program.
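The essence of live coding is that the running program looks up its own definitions as it executes, so a performer's edit takes effect immediately. A minimal sketch of this idea (the pattern values are arbitrary frequencies chosen for illustration):

```python
# The loop looks up `pattern` on every tick, so rebinding it
# mid-"performance" changes the output without stopping the program.
pattern = lambda step: [440, 660][step % 2]  # initial two-note pattern

events = []
for step in range(4):
    if step == 2:
        # A live edit: the performer replaces the pattern while it runs
        pattern = lambda step: 880
    events.append(pattern(step))
# events == [440, 660, 880, 880]
```

Real live-coding environments such as SuperCollider or TidalCycles apply the same late-binding principle to running audio processes rather than a simple list.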
A MIDI controller is any hardware or software that generates and transmits Musical Instrument Digital Interface (MIDI) data to MIDI-enabled devices, typically to trigger sounds and control parameters of an electronic music performance. They most often use a musical keyboard to send data about the pitch of notes to play, although a MIDI controller may trigger lighting and other effects. A wind controller has a sensor that converts breath pressure to volume information and lip pressure to control pitch. Controllers for percussion and stringed instruments exist, as well as specialized and experimental devices. Some MIDI controllers are used in association with specific digital audio workstation software. The original MIDI specification has been extended to include a greater range of control features.
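The data a controller transmits is compact: a channel voice message such as note-on is three bytes, a status byte (message type in the high nibble, channel in the low nibble) followed by two 7-bit data bytes. A minimal sketch of constructing one, following the MIDI 1.0 byte layout:

```python
def note_on(channel, pitch, velocity):
    # Status byte 0x90 = note-on, ORed with the 4-bit channel number;
    # pitch and velocity are 7-bit values (0-127)
    return bytes([0x90 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

msg = note_on(0, 60, 100)  # middle C on channel 1 at moderate velocity
```

A keyboard, wind, or percussion controller ultimately emits streams of such messages; control-change and pitch-bend messages use the same three-byte pattern with different status nibbles.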
Donald Buchla was an American pioneer in the field of sound synthesis. Buchla popularized the "West Coast" style of synthesis. He was co-inventor of the voltage controlled modular synthesizer along with Robert Moog, the two working independently in the early 1960s.
The Hyperbow is an electronic violin bow interface developed through an in-depth research project by students at MIT. The instrument is intended for use only by accomplished players and was designed to amplify their gestures, which leads to supplementary sound and musical control possibilities. As an augmented bow controller, it offers the violin player a range of expressive possibilities and lends itself to the control of bowed-string physical models.
A hydraulophone is a tonal acoustic musical instrument played by direct physical contact with water where sound is generated or affected hydraulically. The hydraulophone was described and named by Steve Mann in 2005, and patented in 2011. Typically, sound is produced by the same hydraulic fluid in contact with the player's fingers. It has been used as a sensory exploration device for low-vision individuals.
Harris Wulfson was an American composer, instrumentalist and software engineer in Brooklyn, New York. His work employed algorithmic processes and gestural controllers to explore the boundary where humans encounter their machines.
Live electronic music is a form of music that can include traditional electronic sound-generating devices, modified electric musical instruments, hacked sound generating technologies, and computers. Initially the practice developed in reaction to sound-based composition for fixed media such as musique concrète, electronic music and early computer music. Musical improvisation often plays a large role in the performance of this music. The timbres of various sounds may be transformed extensively using devices such as amplifiers, filters, ring modulators and other forms of circuitry. Real-time generation and manipulation of audio using live coding is now commonplace.
Artificial intelligence and music (AIM) is a common subject in the International Computer Music Conference, the Computing Society Conference and the International Joint Conference on Artificial Intelligence. The first International Computer Music Conference (ICMC) was held in 1974 at Michigan State University. Current research includes the application of AI in music composition, performance, theory and digital sound processing.
In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants such as Alexa and Siri, touch and multitouch interactions on today's mobile phones and tablets, and touch interfaces invisibly integrated into textiles and furniture.
Alexander Refsum Jensenius is a Norwegian researcher and musician. He is Professor of music technology and was Head of the Department of Musicology, University of Oslo, from 2013 to 2016. He is currently Deputy Director of RITMO, the Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo, and serves as Chair of the Steering Committee for NIME, the International Conference on New Interfaces for Musical Expression. He is the grandson of politician Marie Borge Refsum.
Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic or emotional qualities in interactive contexts. It sits at the intersection of interaction design and sound and music computing. If interaction design concerns objects people interact with, and such interactions are facilitated by computational means, then in sonic interaction design sound mediates the interaction, either as a display of processes or as an input medium.
The Miburi is a wearable musical instrument which was released commercially by the Yamaha Corporation’s Tokyo-based experimental division in 1995.
Sergi Jordà is a Catalan innovator, installation artist, digital musician and Associate Professor at the Music Technology Group, Universitat Pompeu Fabra in Barcelona. He is best known for directing the team that invented the Reactable. He is also a trained physicist.
Sound and music computing (SMC) is a research field that studies the whole sound and music communication chain from a multidisciplinary point of view. By combining scientific, technological and artistic methodologies it aims at understanding, modeling and generating sound and music through computational approaches.
The International Society for Music Information Retrieval (ISMIR) is an international forum for research on the organization of music-related data. It began in 2000 as an informal group steered by an ad hoc committee, which established a yearly symposium; hence the acronym ISMIR, originally standing for International Symposium on Music Information Retrieval. It became a conference in 2002 while retaining the acronym. ISMIR was incorporated in Canada on July 4, 2008.
An immersive virtual musical instrument, or immersive virtual environment for music and sound, represents sound processes and their parameters as 3D entities in a virtual reality, so that they can be perceived not only through auditory feedback but also visually in 3D, and possibly through tactile and haptic feedback. Its 3D interface metaphors consist of interaction techniques such as navigation, selection and manipulation (NSM). It builds on the trend in electronic musical instruments toward new ways of controlling sound and performing music, as explored in conferences such as NIME.
In computing, scratch input is an acoustic-based method of human-computer interaction (HCI) that takes advantage of the characteristic sound produced when a fingernail or other object is dragged over a surface, such as a table or wall. The technique is not limited to fingers; a stick or writing implement can also be used. The sound is often inaudible to the unaided ear, but specialized microphones can digitize it for interactive purposes. Scratch input was invented by Mann et al. in 2007, though the term was first used by Chris Harrison et al.
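The acoustic signature of a scratch is a burst of high-frequency energy. A hypothetical sketch of how a detector might flag such bursts in a sampled signal; the window size and threshold here are illustrative assumptions, not values from the published work:

```python
def high_freq_energy(samples):
    # The first difference acts as a crude high-pass filter:
    # smooth signals yield small diffs, scratchy ones large diffs
    return sum((b - a) ** 2 for a, b in zip(samples, samples[1:]))

def detect_scratch(samples, window=4, threshold=1.0):
    # Return start indices of windows whose high-frequency energy
    # exceeds the threshold (assumed values, for illustration only)
    hits = []
    for start in range(0, len(samples) - window + 1, window):
        if high_freq_energy(samples[start:start + window]) > threshold:
            hits.append(start)
    return hits

quiet = detect_scratch([0.0] * 8)            # smooth signal: no events
noisy = detect_scratch([1.0, -1.0] * 4)      # rapidly alternating: events
```

A real system would use a proper microphone front end and spectral features, but the thresholding idea is the same.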
Eric Singer is a multi-disciplinary artist, musician and software, electrical, computer, robotics, and medical device engineer. He is known for his interactive art and technology works, robotic and electronic musical instruments, fire art, and guerilla art.
Bruno Zamborlin is an AI researcher, entrepreneur and artist based in London, working in the field of human-computer interaction. His work focuses on converting physical objects into touch-sensitive, interactive surfaces using vibration sensors and artificial intelligence. In 2013 he founded Mogees Limited, a start-up that transforms everyday objects into musical instruments and games using a vibration sensor and a mobile phone. With HyperSurfaces, he converts physical surfaces of any material, shape and form into data-enabled interactive surfaces using a vibration sensor and a coin-sized chipset. As an artist, he has created art installations around the world; his most recent work comprises a series of "sound furniture" pieces showcased at the Italian Pavilion of the Venice Biennale 2023. He has regularly performed with the UK-based electronic music duo Plaid. He is also an honorary visiting research fellow at Goldsmiths, University of London.