Immersive virtual musical instrument

An immersive virtual musical instrument, or immersive virtual environment for music and sound, represents sound processes and their parameters as 3D entities in a virtual reality, so that they can be perceived not only through auditory feedback but also visually in 3D, and possibly through tactile and haptic feedback as well. Users interact with these entities through 3D interface metaphors built from interaction techniques such as navigation, selection and manipulation (NSM). [1] The concept builds on the trend in electronic musical instruments to develop new ways of controlling sound and performing music, as explored at conferences such as NIME.
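
As a minimal sketch of this idea, the following Python example (all names and mappings are invented for illustration, not taken from any cited system) shows how the pose of a selected 3D entity might be mapped onto the parameters of a sound process:

```python
from dataclasses import dataclass

@dataclass
class SoundProcess:
    """Parameters of a sound process exposed to the 3D environment."""
    frequency_hz: float = 440.0
    amplitude: float = 0.5
    pan: float = 0.0  # -1.0 (left) .. 1.0 (right)

@dataclass
class VirtualObject3D:
    """A 3D entity whose pose is set by NSM interaction techniques."""
    x: float = 0.0       # left-right position in the virtual room
    y: float = 0.0       # height above the floor
    z: float = 0.0       # distance from the performer
    selected: bool = False

def manipulate(obj: VirtualObject3D, process: SoundProcess) -> None:
    """Map the manipulated object's pose to the sound process:
    height controls pitch, distance controls loudness, and
    left-right position controls stereo panning."""
    if not obj.selected:
        return  # in NSM, selection precedes manipulation
    octaves = max(0.0, min(obj.y, 4.0))                # clamp to a 4-octave range
    process.frequency_hz = 220.0 * 2.0 ** octaves
    process.amplitude = 1.0 / (1.0 + max(obj.z, 0.0))  # fades with distance
    process.pan = max(-1.0, min(obj.x, 1.0))
```

In a complete system, a rendering loop would draw each entity and a synthesis engine would read the updated parameters on every audio block; both are omitted here for brevity.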

State of the art

Florent Berthaut created a variety of 3D reactive widgets with novel representations of musical events and sound; interacting with them required a special 3D input device and adapted 3D interaction techniques. [2]

Jared Bott created an environment that used 3D spatial control techniques borrowed from familiar musical instruments, combined with symbolic 2D visual representations of musical events. [3]

Richard Polfreman built a 3D virtual environment for musical composition whose visual representations of musical and sound data resemble those of 2D composition environments, but are arranged in 3D space. [4]

Leonel Valbom created an immersive 3D virtual environment featuring visual 3D representations of musical events and audio spatialization, which could be manipulated using NSM interaction techniques. [5]

Teemu Mäki-Patola explored interaction metaphors based on existing musical instruments as seen in his Virtual Xylophone, Virtual Membrane, and Virtual Air Guitar implementations. [6]

Sutoolz from su-Studio Barcelona used real-time 3D video-game technology to allow a live performer to construct and play a fully audiovisual immersive environment. [7]

Axel Mulder [8] explored the sculpting interaction metaphor by creating a 3D virtual environment that allowed interaction with abstract deformable shapes, such as a sheet and a sphere, whose parameters were mapped to sound effects in innovative ways. The work focused on proving the technical feasibility of 3D virtual musical instruments. Gestural control was based on 3D object manipulation, specifically a subset of prehension.
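
Mulder's thesis does not publish an implementation here; a hypothetical sketch of the sculpting metaphor might map a deformable sphere's shape parameters onto sound-effect parameters like this (all parameter names and mappings are illustrative assumptions):

```python
import math

def sphere_to_effects(radius: float, stretch: float, roughness: float) -> dict:
    """Map a deformable sphere's shape to effect parameters:
    a larger sphere darkens the sound by lowering a filter cutoff,
    stretching detunes a chorus, and surface roughness adds distortion."""
    return {
        "filter_cutoff_hz": 8000.0 * math.exp(-radius),    # falls as the sphere grows
        "chorus_detune_cents": 50.0 * abs(stretch - 1.0),  # 1.0 = undeformed
        "distortion_amount": max(0.0, min(roughness, 1.0)),
    }

# Example: a slightly stretched, smooth sphere of unit radius
print(sphere_to_effects(radius=1.0, stretch=1.2, roughness=0.1))
```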

Early work was done by Jaron Lanier with his Chromatophoria band, and separately by Niko Bolas, who developed the Soundsculpt Toolkit, a software interface that allows the world of music to communicate with the graphical elements of virtual reality.

Related Research Articles

User interface

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

Augmented reality

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (adding to the natural environment) or destructive (masking it). This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one.

WIMP (computing)

In human–computer interaction, WIMP stands for "windows, icons, menus, pointer", denoting a style of interaction using these elements of the user interface. Other expansions are sometimes used, such as substituting "mouse" and "mice" for menus, or "pull-down menu" and "pointing" for pointer.

Computer-mediated reality

Computer-mediated reality refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device such as a smartphone.

Visualization (graphics)

Visualization or visualisation is any technique for creating images, diagrams, or animations to communicate a message. Visualization through visual imagery has been an effective way to communicate both abstract and concrete ideas since the dawn of humanity. Examples from history include cave paintings, Egyptian hieroglyphs, Greek geometry, and Leonardo da Vinci's revolutionary methods of technical drawing for engineering and scientific purposes.

Mixed reality

Mixed reality (MR) is a term used to describe the merging of a real-world environment and a computer-generated one. Physical and virtual objects may co-exist in mixed reality environments and interact in real time.

Gesture recognition

Gesture recognition is an area of research and development in computer science and language technology concerned with the recognition and interpretation of human gestures. A subdiscipline of computer vision, it employs mathematical algorithms to interpret gestures.
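
As a concrete illustration of such algorithms, the sketch below uses dynamic time warping, one common matching technique in gesture recognizers (the templates and the choice of technique are illustrative, not tied to any specific system), to classify a captured 2D gesture against stored templates:

```python
import math

def dtw_distance(gesture: list, template: list) -> float:
    """Dynamic-time-warping distance between two sequences of (x, y)
    points; lower values mean the shapes match more closely."""
    n, m = len(gesture), len(template)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(gesture[i - 1], template[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a gesture point
                                 cost[i][j - 1],      # skip a template point
                                 cost[i - 1][j - 1])  # match both points
    return cost[n][m]

def classify(gesture: list, templates: dict) -> str:
    """Return the name of the closest stored template."""
    return min(templates, key=lambda name: dtw_distance(gesture, templates[name]))

# Example: distinguish a horizontal swipe from a vertical one
templates = {
    "swipe_right": [(0, 0), (1, 0), (2, 0), (3, 0)],
    "swipe_up": [(0, 0), (0, 1), (0, 2), (0, 3)],
}
print(classify([(0, 0), (1.1, 0.1), (2.0, -0.1), (2.9, 0.0)], templates))
```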

The term color organ refers to a tradition of mechanical devices built to represent sound and accompany music in a visual medium. The earliest color organs were manual instruments based on the harpsichord design. By the 1900s they were electromechanical. In the early 20th century, a silent color organ tradition (Lumia) developed. In the 1960s and 1970s, the term "color organ" became popularly associated with electronic devices that responded to music input with light shows. The term "light organ" is increasingly used for these devices, allowing "color organ" to reassume its original meaning.

In computing, post-WIMP comprises work on user interfaces, mostly graphical user interfaces, which attempt to go beyond the paradigm of windows, icons, menus and a pointing device, i.e. WIMP interfaces.

Immersion (virtual reality)

Immersion into virtual reality (VR) is a perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system in images, sound or other stimuli that provide an engrossing total environment.

A projection augmented model is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic looking object. Importantly, the physical model is the same geometric shape as the object that the PA model depicts.

Interaction technique

An interaction technique, user interface technique or input technique is a combination of hardware and software elements that provides a way for computer users to accomplish a single task. For example, one can go back to the previously visited page on a Web browser by either clicking a button, pressing a key, performing a mouse gesture or uttering a speech command. It is a widely used term in human-computer interaction. In particular, the term "new interaction technique" is frequently used to introduce a novel user interface design idea.

In computing, 3D interaction is a form of human-machine interaction in which users are able to move and perform interaction in 3D space. Both the human and the machine process information in which the physical position of elements in 3D space is relevant.

In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants such as Alexa and Siri, touch and multitouch interactions on today's mobile phones and tablets, and touch interfaces invisibly integrated into textiles and furniture.

A networked music performance or network musical performance is a real-time interaction over a computer network that enables musicians in different locations to perform as if they were in the same room. These interactions can include performances, rehearsals, improvisation or jamming sessions, and situations for learning such as master classes. Participants may be connected by "high fidelity multichannel audio and video links" as well as MIDI data connections and specialized collaborative software tools. While not intended to be a replacement for traditional live stage performance, networked music performance supports musical interaction when co-presence is not possible and allows for novel forms of music expression. Remote audience members and possibly a conductor may also participate.
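
At the data-transport level, the simplest such connection sends note events as datagrams. The sketch below transmits raw 3-byte MIDI note-on messages over UDP; the peer address is a placeholder, and real systems such as RTP-MIDI add sequence numbers and timestamps for loss and jitter recovery, which this sketch omits:

```python
import socket
import struct
import time

def send_note_on(sock: socket.socket, addr: tuple, note: int, velocity: int) -> None:
    """Send a 3-byte MIDI note-on message: status byte 0x90
    (note-on, channel 1), note number, velocity."""
    sock.sendto(struct.pack("3B", 0x90, note & 0x7F, velocity & 0x7F), addr)

if __name__ == "__main__":
    peer = ("192.0.2.1", 5004)  # placeholder address of a remote performer
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for note in (60, 64, 67):  # a C major arpeggio
            send_note_on(sock, peer, note, velocity=100)
            time.sleep(0.25)
```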

Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts. Sonic interaction design is at the intersection of interaction design and sound and music computing. If interaction design is about designing objects people interact with, and such interactions are facilitated by computational means, in sonic interaction design, sound is mediating interaction either as a display of processes or as an input medium.

Scratch input

In computing, scratch input is an acoustic-based method of human-computer interaction (HCI) that takes advantage of the characteristic sound produced when a fingernail or other object is dragged over a surface, such as a table or wall. The technique is not limited to fingers; a stick or writing implement can also be used. The sound is often inaudible to the naked ear, but specialized microphones can digitize the sounds for interactive purposes. Scratch input was invented by Mann et al. in 2007, though the term was first used by Chris Harrison et al.
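
A minimal detector along these lines (the 3 kHz cutoff and the threshold are illustrative assumptions, not the published method) can high-pass filter a frame of microphone samples and flag frames whose high-frequency energy exceeds a threshold:

```python
import numpy as np

def detect_scratch(frame: np.ndarray, sample_rate: int,
                   threshold: float = 0.01) -> bool:
    """Flag a frame of audio samples as a scratch event when its
    energy above 3 kHz exceeds a fixed threshold; dragging a
    fingernail over a surface produces broadband high-frequency noise."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.fft.rfft(windowed)
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    highband = spectrum[freqs > 3000.0]          # keep only components above 3 kHz
    energy = np.sqrt(np.mean(np.abs(highband) ** 2)) / len(frame)
    return energy > threshold
```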

Visual computing is a generic term for all computer science disciplines dealing with images and 3D models, such as computer graphics, image processing, visualization, computer vision, virtual and augmented reality and video processing. Visual computing also includes aspects of pattern recognition, human computer interaction, machine learning and digital libraries. The core challenges are the acquisition, processing, analysis and rendering of visual information. Application areas include industrial quality control, medical image processing and visualization, surveying, robotics, multimedia systems, virtual heritage, special effects in movies and television, and computer games.

Virtual reality (VR) is a computer application which allows users to experience immersive, three-dimensional visual and audio simulations. According to Pinho (2004), virtual reality is characterized by immersion in the 3D world, interaction with virtual objects, and involvement in exploring the virtual environment. The feasibility of virtual reality in education has been debated due to several obstacles, such as the affordability of VR software and hardware. The psychological effects of virtual reality are also a negative consideration. However, recent technological progress has made VR more viable and promises new learning models and styles for students. These facets of virtual reality have found applications within the primary education sphere in enhancing student learning, increasing engagement, and creating new opportunities for addressing learning preferences.

Joseph J. LaViola Jr. is an American computer scientist, author, consultant, and academic. He holds the Charles N. Millican Professorship in Computer Science and leads the Interactive Computing Experiences Research Cluster at the University of Central Florida (UCF). He also serves as a visiting scholar in the Computer Science Department at Brown University, as a consultant at JJL Interface Consultants, and as co-founder of Fluidity Software.

References

  1. Bowman, Doug; Kruijff, Ernst; LaViola, Joseph; Poupyrev, Ivan; Stuerzlinger, Wolfgang (2010). "3D User Interfaces: Design, Implementation, Usability". In Mynatt, Elizabeth D.; Hudson, Scott E.; Fitzpatrick, Geraldine; Hudson, S.; Edwards, K.; Rodden, T. (eds.). Proceedings of the 28th International Conference on Human Factors in Computing Systems (CHI '10). New York: Association for Computing Machinery. ISBN 978-1-60558-929-9.
  2. Berthaut, Florent; Desainte-Catherine, Myriam; Hachet, Martin (2010). "DRILE: An Immersive Environment for Hierarchical Live-Looping". Proceedings of the 2010 International Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, NSW, Australia. pp. 192–197.
  3. Bott, Jared N.; Crowley, James G.; LaViola, Joseph J. (2009). "One Man Band: A 3D Gestural Interface for Collaborative Music Creation". Proceedings of the 2009 IEEE Virtual Reality Conference (VR '09). IEEE. pp. 273–274. ISBN 978-1-4244-3943-0.
  4. Polfreman, Richard (2009). "Frameworks 3D: Composition in the Third Dimension". Proceedings of the 2009 International Conference on New Interfaces for Musical Expression (NIME 2009). pp. 226–229.
  5. Valbom, Leonel; Marcos, Adérito (2007). "An Immersive Musical Instrument Prototype". IEEE Computer Graphics and Applications. 27 (4): 14–19. doi:10.1109/MCG.2007.76. ISSN 0272-1716.
  6. Mäki-Patola, Teemu; Laitinen, Juha; Kanerva, Aki; Takala, Tapio (2005). "Experiments with Virtual Reality Instruments". Proceedings of the 2005 International Conference on New Interfaces for Musical Expression (NIME 2005). pp. 11–16.
  7. Wynnychuk, J.; Porcher, R.; Brajovic, L.; Brajovic, M.; Platas, N. (2002). "sutoolz 1.0 alpha: 3D Software Music Interface". Proceedings of the 2002 International Conference on New Interfaces for Musical Expression (NIME 2002). pp. 1–2.
  8. Mulder, A. G. E. (1998). Design of Virtual Three-Dimensional Instruments for Sound Control. PhD thesis, Simon Fraser University, Canada.