Musical literacy is the reading, writing, and playing of music, as well as an understanding of cultural practice and historical and social contexts.
Music literacy and music education are frequently discussed in relational and causal terms; however, they are not interchangeable, as complete musical literacy also concerns an understanding of the diverse practices involved in music pedagogy and their impact on literacy. Even then, there are those who argue [1] against a relational and causal link between music education and literacy, instead advocating solely for an interactional relationship between social characteristics and music styles. "Musical communications, like verbal ones, must be put in the right contexts by receivers, if their meanings are to come through unobscured," [2] which is why the pedagogical influence of teaching an individual to become musically literate might be confused with overarching ‘literacy’ itself.
‘Musical literacy’ is likewise not to be confused with ‘music theory’ or ‘musicology.’ These two components are aspects of music education that ultimately act as a means to the end of achieving such literacy. Even then, many scholars [3] debate the relevance of these educational elements to musical literacy at all. ‘Musicality,’ too, is a distinct term, separate from the concept of ‘musical literacy,’ as the way in which a musician expresses emotions through performance is not indicative of their music-reading ability. [4]
Given that musical literacy involves mechanical and descriptive processes (such as reading, writing, and playing), as well as a broader cultural understanding of both historical and contemporary practice (i.e. listening to, playing, and interpreting music), education in these visual, reading/writing, auditory, and kinesthetic areas can work in tandem to achieve literacy as a whole.
Understanding of what the term ‘musical literacy’ encompasses has developed over time through scholarly research and debate. A brief timeline — as collated by Csikos & Dohany (2016) [5] — is as follows:
Scholars such as Waller (2010) [13] also delve further into distinguishing the relational benefit of different mechanical processes, stating that "reading and writing are necessary concurrent processes". [14] The experience of learning how to "read to write and write to read" [15] allows students to become both a consumer and producer where "the music was given back to them to form their own musical ideas, as full participants in their musical development". [16]
The mechanical and factual elements of musical literacy can be taught in an educational environment with ‘music theory’ and ‘musicology,’ in order to use these "certain bits of articulate information... [to] trigger or activate the right perceptual sets and interpretive frameworks". [17] Both teaching how to read and write standard Western notation (i.e. music theory), [18] and reading about the social, political, and historical contexts in which the music was written, as well as the ways in which it was practiced/performed (i.e. musicology), [19] constitute the visual and reading/writing approaches to learning. While the "factual knowledge and ability components are developed culturally, within a given social context", [20] signs and symbols on printed sheet music are also used for ‘symbolic interaction,’ [21] "which enable [the musician] to understand [broader musical] discourse". [22] Asmus Jr. (2004) [23] proposes that "most educators would agree that the ability to perform from musical notation is paramount", [24] and that the only way to become a "better music reader is to read music". [25]
Auditory learning is equally — if not more (as claimed by Herbst, de Wet & Rijsdijk, 2005 [26] ) — important, however, as "neither the ‘extramusical’ nor the ‘purely musical’ content of [any piece of] music can come across for a listener who brings nothing to it from [their] previous experience of related music and of the world". [27] Listening is "through and through contextual: for the music to be heard or experienced is for it to be related to — brought in some fashion into juxtaposition with — patterns, norms, phenomena, facts, lying outside the specific music itself". [28] Auditory-oriented education teaches comprehensive listening and aural perception against the "backdrop of a host of norms associated with the style, genre, and period categories, and the individual compositional corpus". [29] This frames "appropriate reactions and registerings on the order of tension and release, or expectation and fulfillment, or implication and realization during the course of the music[al piece]". [30] It is in this department that conventional classroom education often fails the individual in their acquisition of complete musical literacy, as not only have "researchers pointed out that children coming to school do not have the foundational aural experiences with music to the extent that they have had with language", [31] but the "exclusive concentration on reading [and thus lack of listening] has held back the progress of countless learners, while putting many others off completely". [32] It is in this regard that musical literacy operates independently of music education: while the quality of education affects the outcome of an individual's literacy, it does not define it.
Furthermore, the kinesthetic aspect of music education plays a role in the achievement of musical literacy, as "human interaction is mediated by the use of symbols, by interpretation, [and] by ascertaining the meaning of one another’s actions". [33] "The different ways human emotions embody themselves, in gesture and stance, sets of cultural associations carried by particular rhythms, motifs, timbres, and instruments [and] aspects of a composer’s life, work, and setting" [34] form both the musician's understanding of a work's historical context, as well as any new meaning attached to it by its recontextualization in their contemporary musical settings and practices.
These aspects of musical literacy development coalesce into various educational practices that approach these types of visual, auditory, reading/writing, and kinesthetic learning in different ways. Unfortunately, "fluent music literacy is a rarely acquired ability in Western culture" [35] as "many children are failed by the ways in which they are taught to read music". [36] As such, many scholars debate over the best way to approach musical pedagogy.
For many scholars, the acquisition of aural skills prior to learning the conventions of print music — a ‘sound before symbol’ [37] approach — serves as the "basis for making musical meaning". [38] Much like pedagogical approaches in language development, Mills & McPherson (2015) [39] observe that "children should become competent with spoken verbal language [i.e. aural skills] before they grapple with written verbal language [i.e. visual/written notation skills]". [40] Others find a ‘language- and speech-based’ approach more effective, but only "after the basic structure and vocabulary of the language has first been established". [41] Gudmundsdottir [42] recommends that the "age of students should be considered when choosing a method for teaching" [43] given the changing receptiveness of a developing brain.
In-field research collated by Gudmundsdottir [44] on this topic notes that:
Moreover, Mills & McPherson [47] conclude that:
Burton [49] found "play-based orientation... appeal[ed] to the natural way children learn[ed]", [50] and that the process of learning how to read, write, and play/verbalise music paralleled the process of learning language. [51] Creating an outlet for the energy of children while using the conceptual framework of other school classes to develop their understanding of print music appears to enrich all areas of brain development. [52] As such, Koopman (1996) [53] is of the opinion that "[the] rich musical experience alone justifies the teaching of music at schools". [54]
Stewart, Walsh & Frith (2004) [55] state that "music reading is an automatic process in trained musicians" [56] whereby the speed of information and psychomotor processing occurs at a high level (Kopiez, Weihs, Ligges & Lee, 2006). [57] The coding of visual information, motor responses, and visual-motor integration [58] make up several processes that occur both dependently and independently of one another; while "the ability to play by ear may have a moderate positive correlation to music reading abilities", [59] studies also demonstrate that concepts of pitch and timing are perceived separately. [60]
The development of pitch recognition also varies within itself depending on the context of the music and what mechanical skills an instrument or setting may require. Gudmundsdottir [61] references Fine, Berry & Rosner [62] when she notes that "successful music reading on an instrument does not necessarily require internal representations of pitch as sight-singing does" [63] and proficiency in one area does not guarantee skill in the other. The ability to link the sound of a note with its printed notation counterpart is a cornerstone in highly developed musical readers [64] and allows them to ‘read ahead’ when ‘sight-reading’ a piece due to such aural recollections. [65] Less-developed readers — or, "button pushers" [66] — contrastingly overly-rely on the visual-mechanical processes of musical literacy (i.e., "going directly from the visual image to the fingering required [on the instrument]"), [67] rather than an inclusive auditory/cultural understanding (i.e. how to also listen to and interpret music in addition to the mechanical processes). While musically literate and -illiterate individuals may be equally-able to identify singular notes, "the experts outperform the novices in their ability to identify a group of pitches as a particular chord or scale... and instantly translate that knowledge into a motor output". [68]
Contrastingly, "rhythm production is [universally] difficult without auditory coding" [69] as all musicians "rely on internal mental representations of musical metre [and temporal events] as they perform". [70] In the context of reading and writing music in the school classroom, Burton [71] saw that "[students] were making their own sense of rhythm in print" [72] and would self-correct when they realised that their aural perception of a rhythmic pattern did not match what they had transcribed on the manuscript. [73] Shehan (1987) [74] notes that successful strategies for teaching rhythm — much like pitch — benefit from the teachings of language literacy, as "written patterns... associated with aural labels in the form of speech cues... [tend] to be a successful strategy for teaching rhythm reading". [75]
Scholars, Mills & McPherson, [76] identified stages of development in reading music notation and recommend correlating a pedagogical approach to a stage that is best-received by the neurological development/age of a student. For instance, encouraging young beginners to invent their own visual representations of pieces they know aurally provides them with the "metamusical awareness that will enhance their progress toward understanding why staff notation looks and works the way it does". [77] Similarly, for children younger than six years old, translating prior aural knowledge of melodies into fingerings on an instrument (i.e. kinesthetic learning) sets the foundation for introducing visual notation later and maintains the ‘fun’ element of developing musical literacy. [78]
These stages of development in reading music notation are outlined by Mills & McPherson [79] as follows:
There are various schools of thought/pedagogy that translate these principles into practical teaching methods. Many of these pedagogical approaches also attempt to simultaneously address the "deficiency in research that considers the ability to read and write music with musical comprehension [i.e. cultural-historical knowledge in the context of visual and auditory learning] as a developmental domain". [81] One of the most well-known of these teaching frameworks is the ‘Kodály Method’.
Zoltán Kodály claims that there are four fundamental aspects to a musician that must develop both simultaneously and at the same rate in order to achieve fluent musical literacy; "(1) a well-trained ear, (2) a well-trained intellect, (3) a well-trained heart (aesthetic/emotional understanding), and (4) well-trained hands (technique)". [82] He was one of the first educators to claim that music literacy involved "the ability to read and write musical notation and to read notation at sight without the aid of an instrument... [as well as] a person’s knowledge of and appreciation for a wide range of musical examples and styles". [83]
Kodály's education techniques utilise elements from language and the educational structure of language development to complement pedagogical efforts in the field of musical literacy development. In rhythm, the Kodály method assigns ‘names’ — originally adapted from the Galin-Paris-Chevé system (French time-name system pioneered by Galin, Paris and Chevé) — to beat values; correlating the number of beats in a note to the number of syllables in its respective name.
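The beats-to-syllables correspondence described above can be sketched programmatically. This is an illustrative model only: the exact syllable set varies between Kodály practitioners, and the duration-to-name table below is an assumption chosen to show how the syllable count mirrors the beat count.

```python
# Illustrative (not authoritative) Kodály-style rhythm names, keyed by
# duration in beats. The syllable count of each name matches its beat count,
# as the text describes.
RHYTHM_NAMES = {
    0.5: "ti",       # quaver: half a beat (commonly spoken in pairs, "ti-ti")
    1.0: "ta",       # crotchet: one beat, one syllable
    2.0: "ta-a",     # minim: two beats, two syllables
    4.0: "ta-a-a-a", # semibreve: four beats, four syllables
}

def speak_rhythm(beats: list[float]) -> str:
    """Render a rhythm, given as a list of beat values, as spoken names."""
    return " ".join(RHYTHM_NAMES[b] for b in beats)
```

For example, `speak_rhythm([1.0, 0.5, 0.5, 2.0])` renders a crotchet, two quavers, and a minim as "ta ti ti ta-a".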
Analogous to rhythm, the Kodály method uses syllables to represent the sounds of notes in a scale as a mnemonic device to train singers. This technique was adapted from the teachings of Guido d’Arezzo, an 11th-century monk, who used the tones ‘Ut, Re, Mi, Fa, So, La’ from the ‘Hymn to St. John’ (International Kodály Society, 2014) as syllabic representations of pitch. The idea in contemporary Kodály teaching is that each successive pitch in a musical scale is assigned a syllable:
Scale degree: 1    2    3    4    5    6    7    8/1
Syllable:     Doh  Ray  Me   Fah  Soh  Lah  Te   Doh
This can be applied in ‘Absolute’ (or, ‘Fixed-Doh’) form — also known as ‘Solfège’ — or in a ‘Relative’ (or, ‘Movable-Doh’) form — also known as ‘Solfa’ — where ‘Doh’ starts on the first pitch of the scale (i.e. for A Major, ‘Doh’ is ‘A’; for G Major, ‘Doh’ is ‘G’; for E Major, ‘Doh’ is ‘E’; and so on).
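The ‘Movable-Doh’ mapping can be made concrete with a short sketch. This is an illustration, not part of any formal Kodály curriculum; for simplicity it uses sharp-only note spellings (so flat keys such as F Major would be spelled enharmonically, e.g. ‘A#’ rather than ‘Bb’).

```python
# ‘Movable-Doh’ (tonic sol-fa): pair each syllable with the notes of the
# major scale built on a given tonic.
SYLLABLES = ["Doh", "Ray", "Me", "Fah", "Soh", "Lah", "Te", "Doh"]
CHROMATIC = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11, 12]  # semitones above the tonic

def movable_doh(tonic: str) -> list[tuple[str, str]]:
    """Return (syllable, note) pairs for the major scale on `tonic`."""
    root = CHROMATIC.index(tonic)
    return [
        (syllable, CHROMATIC[(root + step) % 12])
        for syllable, step in zip(SYLLABLES, MAJOR_STEPS)
    ]
```

Consistent with the examples in the text, `movable_doh("A")` pairs ‘Doh’ with ‘A’, and `movable_doh("G")` pairs ‘Doh’ with ‘G’ (and ‘Te’ with ‘F#’).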
The work of Sarah Glover and (continued by) John Curwen throughout England in the 19th century meant that ‘Movable-Doh’ solfa became the "favoured pedagogical tool to teach singers to read music". [84] In addition to the auditory-linguistic aid of syllable-to-pitch, John Curwen also introduced a kinesthetic element where different hand signs were applied to each tone of the scale.
Csikos & Dohany [85] affirm the popularity of the Kodály method over history and cite Barkoczi & Pleh [86] and Hallam [87] on "the powerfulness of the Kodály method in Hungary... [and] abroad" [88] in the context of achieving musical literacy within the school curriculum.
Methods such as Kodály's, however — which rely on sound to inform the visual element of conventional staff notation — fall short for learners who are unable to see. "No matter how brilliant the ear and how good the memory, literacy is essential for the blind student too", [89] and unfortunately conventional staff notation fails to cater to visually-impaired needs.
To rectify this, enlarged print music or Braille music scores can be supplied to low vision, legally blind, and totally blind individuals so that they can replace the visual aspect of learning with a tactile one (i.e. enhanced kinesthetic learning). According to Conn, [90] in order for blind students to "fully develop their aural skills... fully participate in music... become an independent and lifelong learner... have a chance to completely analyse the music... make full use of [their] own interpretive ability... share [their] composition... [and] gain employment/career path", [91] they must learn how to read and write Braille music.
As in sighted education, the teaching of reading language in braille parallels the teaching of braille musical literacy. Toussaint & Tiger [92] cite Mangold [93] and Crawford & Elliott [94] on the "novel relation between the tactile stimulus (i.e., a braille symbol) and an auditory or vocal stimulus (i.e., the spoken letter name)". [95] This mirrors Kodály's visual (i.e., conventional staff notation)-to-auditory (i.e., similarly, the spoken letter name) approach in music education.
Despite the pedagogical similarities, however, Braille music literacy is far lower than musical literacy in sighted individuals. In this sense, too, the comparative percentages of sighted versus blind individuals who are literate in language versus music follow a similar trajectory — for instance, language literacy for sighted year five school students in Australia is at 93.9% [96] compared to a 6.55% rate of HSC students studying music, [97] while language literacy for blind individuals is at approximately 12%. [98] Ianuzzi [99] comments on this disparity when she asks, "How much music would students learn to play if their music teachers couldn't read the notes? Unfortunately, not very many teachers of blind children are fluent in reading and writing Braille themselves." [100]
Although the core of musical literacy is arguably in relation to "extensive and repeated listening", [101] "there is still a need for explicit theories of music reading that would organise knowledge and research about music reading into a system of assumptions, principles, and procedures" [102] that would benefit poor-to-zero vision individuals. It is via the fundamental elements of reading literacy and "the ability to understand the majority of... utterances in a given tradition" [103] that musical literacy can be achieved. [104]
Nonetheless, sighted or not, the various teaching methods and learning approaches required to achieve musical literacy evidence the spectrum of psychological, neurological, multi-sensory, and motor skills functioning within an individual when they come into contact with music. Many fMRI studies have correspondingly demonstrated the impact of music and advanced music literacy on brain development.
Both the processing of music and performance with musical instruments require the involvement of both hemispheres of the brain. [105] Structural differences (i.e. increased grey matter) are found in the brain regions of musical individuals which are both directly linked to musical skills learned during instrumental training (e.g. independent fine motor skills in both hands, auditory discrimination of pitch), and also indirectly linked with improvements in language and mathematical skills. [106]
Many studies demonstrate that "music can have constructive outcomes on our mindsets that may make learning simpler". [107] For instance, young children exhibited a 46% increase in spatial IQ — essential for higher mental functions involving complex arithmetic and science — after developing aspects of their musical literacy. [108] Such mathematical skills are enhanced in the brain due to the spatial training involved in learning music notation, because "understanding rhythmic notation actually requires math-specific skills, such as pattern recognition and an understanding of proportion, ratio, fractions, and subdivision [of note values]". [109]
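The proportion and subdivision arithmetic that the quotation likens to mathematics can be illustrated with a brief sketch; the bar below is a hypothetical example, not drawn from the cited study.

```python
from fractions import Fraction

# Note values in one 4/4 bar, expressed as fractions of a whole note:
# a crotchet, two quavers, and a minim. Verifying that they fill the bar
# is exactly the ratio/subdivision arithmetic the quotation describes.
bar = [Fraction(1, 4), Fraction(1, 8), Fraction(1, 8), Fraction(1, 2)]
assert sum(bar) == 1  # the values sum to one whole note, filling the bar
```

Exact rational arithmetic (rather than floating point) mirrors how note values subdivide cleanly into halves, quarters, and eighths.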
Superior "dialect capacity, including vocabulary, expressiveness, and simplicity of correspondence" [110] can also be seen in musically literate individuals. This is due to "both music and language processing requir[ing] the ability to segment streams of sound into small perceptual units". [111] Research confirms the relationship between musical literacy and reading and reasoning, [112] as well as non-cognitive skills such as leisure and emotional development, [113] coordination and innovativeness, attention and focus, memory, creativity, self-confidence, and empathetic interpersonal relationships. [114] Due to these various factors and impacts, Williams (1987) [115] finds herself of the opinion that "[musical] literacy gives dignity as well as competence, and is of the utmost importace to self-image and success... [it gives the] great joy of learning... the thrill of participation, and the satisfaction of informed listening". [116]