In music, gesture is any movement, either physical (bodily) or mental (imaginary). As such, "gesture" covers both the categories of movement required to produce sound and the categories of perceptual movement associated with them. The concept of musical gestures has received much attention in various musicological disciplines (e.g. music analysis, music therapy, music psychology, NIME) in recent years.
For example, the "musical" movement from a close-position tonic C major chord to a close-position dominant G major chord requires on the piano the physical movement from each white key of the first chord to the right (in space, upwards in pitch) four white keys or steps. Thus gesture includes both characteristic physical movements by performers and characteristic melodies, phrases, chord progressions, and arpeggiations produced by (or producing) those movements.
The concept of musical gestures encompasses a large territory stretching from details of sound production to more global emotive and aesthetic images of music, and also includes considerations of cultural-stylistic versus more universal modes of expression. In all cases, it is believed that musical gestures manifest the primordial role of human movement in music. For this reason, scholars speak of embodied music cognition in the sense that listeners relate musical sound to mental images of gestures, i.e. that listening to (or even merely imagining) music is also a process of incessant mental re-enactment of musical gestures.
Acknowledging the multimodal nature of music perception, embodied music cognition could represent a paradigm shift in music theory and other music-related research, which has often tended to exclude bodily movement from its conceptual apparatus in favour of more abstract, notation-based elements of music. Focusing on musical gestures provides a coherent and unifying perspective for a renewal of music theory and other music research.
A subset of musical gestures is what could be called music-related body movement, which can be seen from either the performer's or the perceiver's point of view.
The first mathematical definition of gesture was given in the paper "Formulas, Diagrams, and Gestures in Music" (Journal of Mathematics and Music, vol. 1, no. 1, 2007) by Guerino Mazzola (University of Minnesota) and Moreno Andreatta (IRCAM, Paris). A gesture is a configuration of curves in space and time. More formally, a gesture is a digraph morphism from a "skeleton" of addressed points to a "body", a spatial digraph of a topological category (in the musical case: time, position and pitch). Since the set of gestures with a given skeleton and topological category itself forms a topological category, one may define gestures of gestures, so-called hypergestures.
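In the simpler setting of a topological space, the construction can be sketched as follows; the notation is only approximate and loosely follows Mazzola and Andreatta, so the symbols may differ in detail from the published paper.

```latex
% Rough sketch of the definition over a topological space; notation is
% approximate and may differ from Mazzola and Andreatta (2007) in detail.
Let $\Delta$ be a digraph (the \emph{skeleton}) and let $X$ be a topological
space, e.g. the space of time, position and pitch. Write $\vec{X}$ for the
\emph{spatial digraph} of $X$: its vertices are the points of $X$ and its
arrows are the continuous curves $c : [0,1] \to X$. A \emph{gesture} with
skeleton $\Delta$ and body in $X$ is a digraph morphism
\[
  g : \Delta \longrightarrow \vec{X},
\]
which sends every arrow of the skeleton to a curve in $X$. The set of all
such gestures, written $\Delta\,\vec{@}\,X$, carries a natural topology, so
the construction can be iterated: a \emph{hypergesture} is an element of
$\Gamma\,\vec{@}\,(\Delta\,\vec{@}\,X)$, i.e. a gesture whose body is itself
a space of gestures.
```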
Indian vocalists move their hands while improvising melody. Although every vocalist has an idiosyncratic gestural style, the motions of hand and voice are connected through various logics, and many students gesturally resemble their teachers. Nikki Moran, at the University of London, has done research on this topic, and it is one of the subjects of Martin Clayton and Laura Leante's Musical Experience Project at the Open University.
Clayton has published a paper on gestural interaction in Indian music performance: "Time, Gesture and Attention in a Khyal Performance", Asian Music 38(2), 71–96.
Matt Rahaim, a vocalist and ethnomusicologist, has published a book on the relationship between vocalization and gesture in Indian vocal music: Musicking Bodies: Gesture and Voice in Hindustani Music. Rahaim's work approaches gesture and vocalization as parallel expressions of melody, investigates isomorphisms between gesture space and raga space, and studies the transmission and inheritance of "paramparic bodies": vocal/postural/gestural dispositions handed down through teaching lineages.
Robert Hatten (2004) [1] has used the concept of musical gestures to denote inner-musical qualities:
"Musical gesture is biologically and culturally grounded in communicative human movement. Gesture draws upon the close interaction (and intermodality) of a range of human perceptual and motor systems to synthesize the energetic shaping of motion through time into significant events with unique expressive force. The biological and cultural motivations of musical gesture are further negotiated within the conventions of a musical style, whose elements include both the discrete (pitch, rhythm, meter) and the analog (dynamics, articulation, temporal pacing). Musical gestures are emergent gestalts that convey affective motion, emotion, and agency by fusing otherwise separate elements into continuities of shape and force." [2]
Music is the arrangement of sound to create some combination of form, harmony, melody, rhythm, or otherwise expressive content. Music is generally agreed to be a cultural universal that is present in all human societies. Definitions of music vary widely in substance and approach. While scholars agree that music is defined by a small number of specific elements, there is no consensus as to what these necessary elements are. Music is often characterized as a highly versatile medium for expressing human creativity. Diverse activities are involved in the creation of music, and are often divided into categories of composition, improvisation, and performance. Music may be performed using a wide variety of musical instruments, including the human voice.
Music theory is the study of the practices and possibilities of music. The Oxford Companion to Music describes three interrelated uses of the term "music theory": the first is the "rudiments" needed to understand music notation; the second is learning scholars' views on music from antiquity to the present; the third is a sub-topic of musicology that "seeks to define processes and general principles in music". The musicological approach to theory differs from music analysis "in that it takes as its starting-point not the individual work or performance but the fundamental materials from which it is built."
In music, timbre, also known as tone color or tone quality, is the perceived sound quality of a musical note, sound or tone. Timbre distinguishes different types of sound production, such as choir voices and musical instruments. It also enables listeners to distinguish different instruments in the same category.
Auditory imagery is a form of mental imagery that is used to organize and analyze sounds when no external auditory stimulus is present. This form of imagery is divided into modalities such as verbal imagery and musical imagery, and it differs from other sensory images such as motor imagery or visual imagery. The vividness and detail of auditory imagery can vary from person to person depending on their background and the condition of their brain. Through research developed to understand auditory imagery, behavioral neuroscientists have found that the auditory images developed in subjects' minds are generated in real time and consist of fairly precise information about quantifiable auditory properties as well as melodic and harmonic relationships. These findings have more recently gained confirmation and recognition with the arrival of positron emission tomography and fMRI scans, which can confirm a physiological and psychological correlation.
Eurythmy is an expressive movement art originated by Rudolf Steiner in conjunction with his wife, Marie, in the early 20th century. Primarily a performance art, it is also used in education, especially in Waldorf schools, and – as part of anthroposophic medicine – for claimed therapeutic purposes.
Evolutionary musicology is a subfield of biomusicology that grounds the cognitive mechanisms of music appreciation and music creation in evolutionary theory. It covers vocal communication in other animals, theories of the evolution of human music, and holocultural universals in musical ability and processing.
Music psychology, or the psychology of music, may be regarded as a branch of both psychology and musicology. It aims to explain and understand musical behaviour and experience, including the processes through which music is perceived, created, responded to, and incorporated into everyday life. Modern music psychology is primarily empirical; its knowledge tends to advance on the basis of interpretations of data collected by systematic observation of and interaction with human participants. Music psychology is a field of research with practical relevance for many areas, including music performance, composition, education, criticism, and therapy, as well as investigations of human attitude, skill, performance, intelligence, creativity, and social behavior.
Speech perception is the process by which the sounds of language are heard, interpreted, and understood. The study of speech perception is closely linked to the fields of phonology and phonetics in linguistics and cognitive psychology and perception in psychology. Research in speech perception seeks to understand how human listeners recognize speech sounds and use this information to understand spoken language. Speech perception research has applications in building computer systems that can recognize speech, in improving speech recognition for hearing- and language-impaired listeners, and in foreign-language teaching.
Embodied music cognition is a direction within systematic musicology interested in studying the role of the human body in relation to all musical activities.
Computer audition (CA) or machine listening is the general field of study of algorithms and systems for audio interpretation by machines. Since the notion of what it means for a machine to "hear" is very broad and somewhat vague, computer audition attempts to bring together several disciplines that originally dealt with specific problems or had a concrete application in mind. The engineer Paris Smaragdis, interviewed in Technology Review, talks about these systems — "software that uses sound to locate people moving through rooms, monitor machinery for impending breakdowns, or activate traffic cameras to record accidents."
The neuroscience of music is the scientific study of brain-based mechanisms involved in the cognitive processes underlying music. These behaviours include music listening, performing, composing, reading, writing, and ancillary activities. It also is increasingly concerned with the brain basis for musical aesthetics and musical emotion. Scientists working in this field may have training in cognitive neuroscience, neurology, neuroanatomy, psychology, music theory, computer science, and other relevant fields.
Cognitive musicology is a branch of cognitive science concerned with computationally modeling musical knowledge with the goal of understanding both music and cognition.
In music, instrumental idiom refers to writing, parts, and performance being idiomatic or non-idiomatic depending on how well each is suited to the specific instrument intended, in terms of both ease of playing and quality of music, given the inherent tendencies and limitations of specific instruments. The analogy is with linguistic idiomaticity, that is, a form or structure peculiar to one language but not another.
Embodied cognition is the concept suggesting that many features of cognition are shaped by the state and capacities of the organism. The cognitive features include a wide spectrum of cognitive functions, such as perception biases, memory recall, comprehension, high-level mental constructs, and performance on various cognitive tasks. The bodily aspects involve the motor system, the perceptual system, the body's interactions with the environment (situatedness), and the assumptions about the world built into the functional structure of the organism's brain and body.
Research into music and emotion seeks to understand the psychological relationship between human affect and music. The field, a branch of music psychology, covers numerous areas of study, including the nature of emotional reactions to music, how characteristics of the listener may determine which emotions are felt, and which components of a musical composition or performance may elicit certain reactions.
Richard Parncutt is an Australian-born academic. He has been professor of systematic musicology at Karl Franzens University Graz in Austria since 1998.
In music cognition, melodic fission is a phenomenon in which one line of pitches is heard as two or more separate melodic lines. This occurs when a phrase contains groups of pitches at two or more distinct registers or with two or more distinct timbres.
Interaction theory (IT) is an approach to questions about social cognition, or how one understands other people, that focuses on bodily behaviors and environmental contexts rather than on mental processes. IT argues against two other contemporary approaches to social cognition, namely theory theory (TT) and simulation theory (ST). For TT and ST, the primary way of understanding others is by means of ‘mindreading’ or ‘mentalizing’ – processes that depend on either theoretical inference from folk psychology, or simulation. In contrast, for IT, the minds of others are understood primarily through our embodied interactive relations. IT draws on interdisciplinary studies and appeals to evidence developed in developmental psychology, phenomenology, and neuroscience.
The bi-directional hypothesis of language and action proposes that the sensorimotor and language comprehension areas of the brain exert reciprocal influence over one another. This hypothesis argues that areas of the brain involved in movement and sensation, as well as movement itself, influence cognitive processes such as language comprehension. In addition, the reverse effect is argued, where it is proposed that language comprehension influences movement and sensation. Proponents of the bi-directional hypothesis of language and action conduct and interpret linguistic, cognitive, and movement studies within the framework of embodied cognition and embodied language processing. Embodied language developed from embodied cognition, and proposes that sensorimotor systems are not only involved in the comprehension of language, but that they are necessary for understanding the semantic meaning of words.
Joshua Banks Mailman is an American music theorist, as well as an analyst, composer, improviser, philosopher, critic, and technologist of music.