New Interfaces for Musical Expression

Abbreviation: NIME
Discipline: Electronic music
History: 2001–present
Website: www.nime.org
Three musicians playing the hydraulophone, an instrument that is similar to a woodwind instrument but makes sound from an incompressible fluid (water) rather than a compressible fluid (air). Photo from the concert programme of the NIME-07 conference in New York City.

New Interfaces for Musical Expression, also known as NIME, is an international conference dedicated to scientific research on the development of new technologies and their role in musical expression and artistic performance.


History

The conference began as a workshop (NIME 01) at the ACM Conference on Human Factors in Computing Systems (CHI) in 2001 in Seattle, Washington, with the concert and demonstration sessions being held at the Experience Music Project museum. Since then, international conferences have been held annually around the world:

NIME locations by year
Year | Host institution | City | Country
2001 | ACM CHI'01 and Experience Music Project | Seattle | USA
2002 | Media Lab Europe | Dublin | Ireland
2003 | McGill University | Montreal | Canada
2004 | Shizuoka University of Art and Culture | Hamamatsu | Japan
2005 | University of British Columbia | Vancouver | Canada
2006 | IRCAM | Paris | France
2007 | Harvestworks Digital Media Arts Center, New York University's Music Technology Program and the Interactive Telecommunications Program in the Tisch School of the Arts | New York City | USA
2008 [1] | InfoMus Lab at the University of Genova | Genoa | Italy
2009 | Carnegie Mellon School of Music | Pittsburgh | USA
2010 [2] | University of Technology, Sydney | Sydney | Australia
2011 | University of Oslo | Oslo | Norway
2012 [3] | University of Michigan | Ann Arbor | USA
2013 | Graduate School of Culture Technology at KAIST (Korea Advanced Institute of Science and Technology) | Daejeon and Seoul | South Korea
2014 | Goldsmiths, University of London | London | UK
2015 [4] | Louisiana State University | Baton Rouge | USA
2016 [5] | Griffith University | Brisbane | Australia
2017 [6] | Aalborg University | Copenhagen | Denmark
2018 [7] | Virginia Tech and the University of Virginia | Blacksburg | USA
2019 [8] | Federal University of Rio Grande do Sul | Porto Alegre | Brazil
2020 [9] | Royal Birmingham Conservatoire | virtual conference, due to COVID-19 | UK
2021 [10] | NYU Shanghai | Shanghai; virtual | China
2022 [11] | University of Auckland | Auckland; virtual | New Zealand
2023 [12] | Monterrey Institute of Technology and Higher Education and Universidad Autónoma Metropolitana | Mexico City; virtual | Mexico

Areas of application

The NIME conference covers a broad range of topics at the intersection of music, technology and human-computer interaction, including the design, construction, evaluation and performance of new interfaces and instruments for musical expression.

Other similarly themed conferences include the International Computer Music Conference (ICMC) and the Sound and Music Computing (SMC) conference.


Related Research Articles

Open Sound Control (OSC) is a protocol for networking sound synthesizers, computers, and other multimedia devices for purposes such as musical performance or show control. OSC's advantages include interoperability, accuracy, flexibility, and enhanced organization and documentation. Its disadvantages include inefficient coding of information, increased load on embedded processors, and the lack of a standardized message namespace, which can hinder interoperability between particular devices. The first specification was released in March 2002.
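As an illustration of how simple an OSC exchange can be, the following is a minimal sketch in Python using the third-party python-osc package. The address patterns, the port 9000, and the receiving synthesizer are all hypothetical, since every OSC-aware application defines its own address namespace.

```python
# Minimal OSC client sketch using the third-party python-osc package.
# The "/synth/..." address patterns and port 9000 are hypothetical; an
# actual OSC receiver documents its own namespace.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)    # host and port of the OSC receiver
client.send_message("/synth/freq", 440.0)      # one float argument: frequency in Hz
client.send_message("/synth/play", [60, 100])  # multiple arguments: note and velocity
```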

<span class="mw-page-title-main">Live coding</span> Integration of programming as part of running program

Live coding, sometimes referred to as on-the-fly programming, just-in-time programming, or conversational programming, makes programming an integral part of the running program.
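The following toy sketch illustrates the idea in Python: a loop keeps playing a pattern that it looks up by name on every beat, so redefining the pattern at an interactive prompt re-programs the piece while it runs. Printing note numbers stands in for driving a real synthesizer, and the pattern contents and tempo are arbitrary choices for the example.

```python
# Toy live-coding illustration: the running loop looks up `pattern` on
# every beat, so rebinding it at the interactive prompt changes the
# output without stopping the program. Intended for a REPL session.
import threading
import time

pattern = lambda beat: [60, 64, 67][beat % 3]   # C major arpeggio, as MIDI notes

def run(bpm=120):
    beat = 0
    while True:
        print("note:", pattern(beat))           # global lookup happens each tick
        time.sleep(60 / bpm)
        beat += 1

threading.Thread(target=run, daemon=True).start()
# At the prompt, rebinding `pattern` re-programs the loop in place:
# >>> pattern = lambda beat: [60, 62, 64, 65][beat % 4]
```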

<span class="mw-page-title-main">MIDI controller</span> Device that produces MIDI data

A MIDI controller is any hardware or software that generates and transmits Musical Instrument Digital Interface (MIDI) data to MIDI-enabled devices, typically to trigger sounds and control parameters of an electronic music performance. They most often use a musical keyboard to send data about the pitch of notes to play, although a MIDI controller may trigger lighting and other effects. A wind controller has a sensor that converts breath pressure to volume information and lip pressure to control pitch. Controllers for percussion and stringed instruments exist, as well as specialized and experimental devices. Some MIDI controllers are used in association with specific digital audio workstation software. The original MIDI specification has been extended to include a greater range of control features.
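At the wire level, the data such a controller transmits is compact. The following Python sketch assembles raw note-on and note-off messages by hand to show the three-byte channel-message layout defined by the MIDI 1.0 specification; the channel, note, and velocity values are arbitrary example choices.

```python
# A MIDI channel message is three bytes: a status byte (0x90 for note-on
# or 0x80 for note-off, OR-ed with the 4-bit channel number), followed by
# two 7-bit data bytes. The values below are arbitrary examples.
channel, note, velocity = 0, 60, 100           # channel 1, middle C, fairly loud
note_on = bytes([0x90 | channel, note, velocity])
note_off = bytes([0x80 | channel, note, 0])    # velocity 0 on release
print(note_on.hex(), note_off.hex())           # -> 903c64 803c00
```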

<span class="mw-page-title-main">Don Buchla</span> Musical artist

Donald Buchla was an American pioneer in the field of sound synthesis who popularized the "West Coast" style of synthesis. He and Robert Moog independently developed the voltage-controlled modular synthesizer in the early 1960s.

The Hyperbow is an electronic violin bow interface developed through an in-depth research project by students at MIT. The instrument is intended for use by accomplished players and was designed to amplify their gestures, opening up supplementary sound and musical control possibilities. As an augmented bow controller, it offers the violin player a range of expressive possibilities and lends itself to the control of bowed-string physical models.

<span class="mw-page-title-main">Hydraulophone</span> Hydraulic musical instrument

A hydraulophone is a tonal acoustic musical instrument played by direct physical contact with water where sound is generated or affected hydraulically. The hydraulophone was described and named by Steve Mann in 2005, and patented in 2011. Typically, sound is produced by the same hydraulic fluid in contact with the player's fingers. It has been used as a sensory exploration device for low-vision individuals.

<span class="mw-page-title-main">Harris Wulfson</span>

Harris Wulfson was an American composer, instrumentalist and software engineer in Brooklyn, New York. His work employed algorithmic processes and gestural controllers to explore the boundary where humans encounter their machines.

Live electronic music is a form of music that can include traditional electronic sound-generating devices, modified electric musical instruments, hacked sound generating technologies, and computers. Initially the practice developed in reaction to sound-based composition for fixed media such as musique concrète, electronic music and early computer music. Musical improvisation often plays a large role in the performance of this music. The timbres of various sounds may be transformed extensively using devices such as amplifiers, filters, ring modulators and other forms of circuitry. Real-time generation and manipulation of audio using live coding is now commonplace.

Artificial intelligence and music (AIM) is a common subject in the International Computer Music Conference, the Computing Society Conference and the International Joint Conference on Artificial Intelligence. The first International Computer Music Conference (ICMC) was held in 1974 at Michigan State University. Current research includes the application of AI in music composition, performance, theory and digital sound processing.

In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants such as Alexa and Siri, touch and multitouch interactions on today's mobile phones and tablets, and touch interfaces invisibly integrated into textiles and furniture.

Alexander Refsum Jensenius is a Norwegian researcher and musician. He is Professor of music technology at the University of Oslo and was Head of its Department of Musicology from 2013 to 2016. He is currently Deputy Director of RITMO, the Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo, and serves as Chair of the Steering Committee for NIME, the International Conference on New Interfaces for Musical Expression. He is the grandson of politician Marie Borge Refsum.

Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic or emotional qualities in interactive contexts. It sits at the intersection of interaction design and sound and music computing. Where interaction design concerns objects that people interact with through computational means, in sonic interaction design sound mediates the interaction, either as a display of processes or as an input medium.

The Miburi is a wearable musical instrument which was released commercially by the Yamaha Corporation’s Tokyo-based experimental division in 1995.

Sergi Jordà is a Catalan innovator, installation artist, digital musician and Associate Professor at the Music Technology Group, Universitat Pompeu Fabra in Barcelona. He is best known for directing the team that invented the Reactable. He is also a trained physicist.

Sound and music computing (SMC) is a research field that studies the whole sound and music communication chain from a multidisciplinary point of view. By combining scientific, technological and artistic methodologies it aims at understanding, modeling and generating sound and music through computational approaches.

<span class="mw-page-title-main">International Society for Music Information Retrieval</span>

The International Society for Music Information Retrieval (ISMIR) is an international forum for research on the organization of music-related data. It started in 2000 as an informal group steered by an ad hoc committee, which established a yearly symposium; hence the acronym "ISMIR", which originally stood for International Symposium on Music Information Retrieval. The symposium became a conference in 2002 while retaining the acronym. ISMIR was incorporated in Canada on July 4, 2008.

An immersive virtual musical instrument, or immersive virtual environment for music and sound, represents sound processes and their parameters as 3D entities in a virtual reality, so that they can be perceived not only through auditory feedback but also visually in 3D, and possibly through tactile and haptic feedback as well. Interaction relies on 3D interface metaphors consisting of techniques such as navigation, selection and manipulation (NSM). The concept builds on the trend in electronic musical instruments toward new ways of controlling sound and performing music, as explored at conferences such as NIME.

<span class="mw-page-title-main">Scratch input</span> Acoustic-based human-computer interface

In computing, scratch input is an acoustic method of human-computer interaction (HCI) that takes advantage of the characteristic sound produced when a fingernail or other object is dragged over a surface, such as a table or wall. The technique is not limited to fingers; a stick or writing implement can also be used. The sound is often inaudible to the unaided ear; however, specialized microphones can digitize the sounds for interactive purposes. Scratch input was invented by Mann et al. in 2007, though the term was first used by Chris Harrison et al.
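A rough sketch of how such a system might detect scratch events is shown below, assuming the microphone signal has already been captured into a NumPy array: scratching produces broadband high-frequency energy, so the sketch high-passes the buffer with a first difference and thresholds its short-time energy. The window size and threshold are hypothetical tuning parameters, not values from the published work.

```python
# Rough scratch-detection sketch: high-pass the buffer with a first
# difference, then flag frames whose short-time energy exceeds a threshold.
import numpy as np

def detect_scratch(signal, sr=44100, window=1024, threshold=1e-3):
    """Return approximate onset times (seconds) of high-frequency bursts."""
    hp = np.diff(signal)                      # crude high-pass filter
    n = len(hp) // window
    frames = hp[: n * window].reshape(n, window)
    energy = (frames ** 2).mean(axis=1)       # short-time energy per frame
    return np.nonzero(energy > threshold)[0] * window / sr

# Synthetic test: quiet background noise with a louder "scratch" burst.
rng = np.random.default_rng(0)
sig = 0.001 * rng.standard_normal(44100)      # one second at 44.1 kHz
sig[20000:24000] += 0.2 * rng.standard_normal(4000)
print(detect_scratch(sig))                    # frames inside the burst, ~0.45-0.55 s
```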

<span class="mw-page-title-main">Eric Singer (artist)</span> American artist

Eric Singer is a multi-disciplinary artist, musician and software, electrical, computer, robotics, and medical device engineer. He is known for his interactive art and technology works, robotic and electronic musical instruments, fire art, and guerilla art.

<span class="mw-page-title-main">Bruno Zamborlin</span> Italian researcher, entrepreneur and artist

Bruno Zamborlin is an AI researcher, entrepreneur and artist based in London, working in the field of human-computer interaction. His work focuses on converting physical objects into touch-sensitive, interactive surfaces using vibration sensors and artificial intelligence. In 2013 he founded Mogees Limited, a start-up that transforms everyday objects into musical instruments and games using a vibration sensor and a mobile phone. With HyperSurfaces, he converts physical surfaces of any material, shape and form into data-enabled interactive surfaces using a vibration sensor and a coin-sized chipset. As an artist, he has created installations around the world; his most recent work, a series of "sound furnitures", was showcased at the Italian Pavilion of the Venice Biennale 2023. He has performed regularly with the UK-based electronic music duo Plaid, and he is an honorary visiting research fellow at Goldsmiths, University of London.

References

  1. "Nime 2008, 8th International Conference on New Interfaces for Musical Expression". Archived from the original on 2008-11-07. Retrieved 2008-12-03.
  2. "NIME++ 2010 International Conference". Educ.dab.uts.edu.au. Retrieved 28 June 2022.
  3. "New Interfaces for Musical Expression (NIME) 2012, University of Michigan". Archived from the original on 2014-07-18. Retrieved 2011-09-16.
  4. "EMDM » NIME 2015". Emdm.cct.lsu.edu. Retrieved 28 June 2022.
  5. "Archived copy". Archived from the original on 2016-01-11. Retrieved 2016-01-04.{{cite web}}: CS1 maint: archived copy as title (link)
  6. "NIME 2017 | New Interfaces for Musical Expression". Nime2017.org. Retrieved 28 June 2022.
  7. "NIME Conference 2018". Nime2018.icat.vt.edu. Retrieved 28 June 2022.
  8. "New Interfaces for Musical Expression | NIME 2019". Ufrgs.br. Retrieved 28 June 2022.
  9. "NIME2020". Nime2020.bcu.ac.uk. Retrieved 2021-02-10.
  10. "NIME 2021". Nime2021.org. Retrieved 2021-02-10.
  11. "NIME 2022". Nime2022.org. Retrieved 2024-02-24.
  12. "NIME 2023". Nime2023.org. Retrieved 2024-02-24.
