Musical literacy

Musical literacy is the reading, writing, and playing of music, as well as an understanding of cultural practice and historical and social contexts.

Music literacy and music education are frequently discussed as related, even causally linked, concepts; however, they are not interchangeable terms, as complete musical literacy also concerns an understanding of the diverse practices involved in teaching music and their impact on literacy. Even then, there are those who argue [1] against a relational and causal link between music education and literacy, instead advocating for a solely interactional relationship between social characteristics and music styles. "Musical communications, like verbal ones, must be put in the right contexts by receivers, if their meanings are to come through unobscured," [2] which is why the pedagogical influence of teaching an individual to become musically literate might be confused with overarching ‘literacy’ itself.

‘Musical literacy’ is likewise not to be confused with ‘music theory’ or ‘musicology.’ These two are aspects of music education that ultimately act as means to the end of achieving such literacy. Even then, many scholars [3] debate the relevance of these educational elements to musical literacy at all. ‘Musicality,’ again, is a distinct concept, separate from ‘musical literacy,’ as the way in which a musician expresses emotions through performance is not indicative of their music-reading ability. [4]

Given that musical literacy involves mechanical and descriptive processes (such as reading, writing, and playing), as well as a broader cultural understanding of both historical and contemporary practice (i.e. interpretation while listening and/or playing), education in the visual, reading/writing, auditory, and kinesthetic areas can work in tandem to achieve literacy as a whole.

‘Musical literacy’: A history of definitions

Understanding of what the term ‘musical literacy’ encompasses has developed over time through scholarly research and debate. A brief timeline of these definitions is collated by Csikos & Dohany (2016). [5]

Scholars such as Waller (2010) [13] also delve further into distinguishing the relational benefit of different mechanical processes, stating that "reading and writing are necessary concurrent processes". [14] The experience of learning how to "read to write and write to read" [15] allows students to become both a consumer and producer where "the music was given back to them to form their own musical ideas, as full participants in their musical development". [16]

Approaches to learning

The mechanical and factual elements of musical literacy can be taught in an educational environment through ‘music theory’ and ‘musicology,’ in order to use these "certain bits of articulate information... [to] trigger or activate the right perceptual sets and interpretive frameworks". [17] The descriptive nature of both teaching how to read and write standard Western notation (i.e. music theory), [18] and reading about the social, political, and historical contexts in which the music was written, as well as the ways in which it was practiced/performed (i.e. musicology), [19] constitutes the visual and reading/writing approaches to learning. While the "factual knowledge and ability components are developed culturally, within a given social context", [20] signs and symbols on printed sheet music are also used for ‘symbolic interaction,’ [21] "which enable [the musician] to understand [broader musical] discourse". [22] Asmus Jr. (2004) [23] proposes that "most educators would agree that the ability to perform from musical notation is paramount", [24] and that the only way to become a "better music reader is to read music". [25]

Auditory learning, however, is equally important — if not more so, as claimed by Herbst, de Wet & Rijsdijk (2005) [26] — as "neither the ‘extramusical’ nor the ‘purely musical’ content of [any piece of] music can come across for a listener who brings nothing to it from [their] previous experience of related music and of the world". [27] Listening is "through and through contextual: for the music to be heard or experienced is for it to be related to — brought in some fashion into juxtaposition with — patterns, norms, phenomena, facts, lying outside the specific music itself". [28] Auditory-oriented education teaches comprehensive listening and aural perception against the "backdrop of a host of norms associated with the style, genre, and period categories, and the individual compositional corpus". [29] This frames "appropriate reactions and registerings on the order of tension and release, or expectation and fulfillment, or implication and realization during the course of the music[al piece]". [30] It is in this department that conventional classroom education often fails the individual in their acquisition of complete musical literacy, as not only have "researchers pointed out that children coming to school do not have the foundational aural experiences with music to the extent that they have had with language", [31] but the "exclusive concentration on reading [and thus lack of listening] has held back the progress of countless learners, while putting many others off completely". [32] It is in this regard that musical literacy operates independently of music education: while education affects the outcome of an individual's literacy, literacy is not defined by the quality of that education.

Furthermore, the kinesthetic aspect of music education plays a role in the achievement of musical literacy, as "human interaction is mediated by the use of symbols, by interpretation, [and] by ascertaining the meaning of one another’s actions". [33] "The different ways human emotions embody themselves, in gesture and stance, sets of cultural associations carried by particular rhythms, motifs, timbres, and instruments [and] aspects of a composer’s life, work, and setting" [34] form both the musician's understanding of a work's historical context and any new meaning attached to it by its recontextualization in their contemporary musical settings and practices.

These aspects of musical literacy development coalesce into various educational practices that approach these types of visual, auditory, reading/writing, and kinesthetic learning in different ways. Unfortunately, "fluent music literacy is a rarely acquired ability in Western culture" [35] as "many children are failed by the ways in which they are taught to read music". [36] As such, many scholars debate over the best way to approach musical pedagogy.

Pedagogy

For many scholars, the acquisition of aural skills prior to learning the conventions of print music — a ‘sound before symbol’ [37] approach — serves as the "basis for making musical meaning". [38] Drawing on pedagogical approaches in language development, Mills & McPherson (2015) [39] observe that "children should become competent with spoken verbal language [i.e. aural skills] before they grapple with written verbal language [i.e. visual/written notation skills]". [40] Others find a ‘language- and speech-based’ approach more effective, but only "after the basic structure and vocabulary of the language has first been established". [41] Gudmundsdottir [42] recommends that the "age of students should be considered when choosing a method for teaching" [43] given the changing receptiveness of a developing brain.

Gudmundsdottir [44] collates further in-field research on this topic.

Moreover, Mills & McPherson [47] draw related conclusions on this point.

Burton [49] found "play-based orientation... appeal[ed] to the natural way children learn[ed]", [50] and that the process of learning how to read, write, and play/verbalise music paralleled the process of learning language. [51] Creating an outlet for the energy of children while using the conceptual framework of other school classes to develop their understanding of print music appears to enrich all areas of brain development. [52] As such, Koopman (1996) [53] is of the opinion that "[the] rich musical experience alone justifies the teaching of music at schools". [54]

Stewart, Walsh & Frith (2004) [55] state that "music reading is an automatic process in trained musicians" [56] whereby the speed of information and psychomotor processing occurs at a high level (Kopiez, Weihs, Ligges & Lee, 2006). [57] The coding of visual information, motor responses, and visual-motor integration [58] make up several processes that occur both dependently and independently of one another; while "the ability to play by ear may have a moderate positive correlation to music reading abilities", [59] studies also demonstrate that concepts of pitch and timing are perceived separately. [60]

The development of pitch recognition also varies depending on the context of the music and the mechanical skills an instrument or setting may require. Gudmundsdottir [61] references Fine, Berry & Rosner [62] when she notes that "successful music reading on an instrument does not necessarily require internal representations of pitch as sight-singing does" [63] and that proficiency in one area does not guarantee skill in the other. The ability to link the sound of a note with its printed notation counterpart is a cornerstone of highly developed musical readers [64] and allows them to ‘read ahead’ when ‘sight-reading’ a piece due to such aural recollections. [65] Less-developed readers — or "button pushers" [66] — by contrast over-rely on the visual-mechanical processes of musical literacy (i.e., "going directly from the visual image to the fingering required [on the instrument]"), [67] rather than an inclusive auditory/cultural understanding (i.e. how to also listen to and interpret music in addition to the mechanical processes). While musically literate and illiterate individuals may be equally able to identify single notes, "the experts outperform the novices in their ability to identify a group of pitches as a particular chord or scale... and instantly translate that knowledge into a motor output". [68]

Contrastingly, "rhythm production is [universally] difficult without auditory coding" [69] as all musicians "rely on internal mental representations of musical metre [and temporal events] as they perform". [70] In the context of reading and writing music in the school classroom, Burton [71] saw that "[students] were making their own sense of rhythm in print" [72] and would self-correct when they realised that their aural perception of a rhythmic pattern did not match what they had transcribed on the manuscript. [73] Shehan (1987) [74] notes that successful strategies for teaching rhythm — much like pitch — benefit from the teachings of language literacy, as "written patterns... associated with aural labels in the form of speech cues... [tend] to be a successful strategy for teaching rhythm reading". [75]

Mills & McPherson [76] identify stages of development in reading music notation and recommend matching the pedagogical approach to the stage best suited to a student's neurological development and age. For instance, encouraging young beginners to invent their own visual representations of pieces they know aurally provides them with the "metamusical awareness that will enhance their progress toward understanding why staff notation looks and works the way it does". [77] Similarly, for children younger than six years old, translating prior aural knowledge of melodies into fingerings on an instrument (i.e. kinesthetic learning) sets the foundation for introducing visual notation later and maintains the ‘fun’ element of developing musical literacy. [78]

These stages of development in reading music notation are outlined by Mills & McPherson [79] as follows:

  1. "Features: the markings on the page that form the basis of notation. These involve awareness of the features of the lines and curves of the musical symbols and notes, and knowledge that they are both systematic and meaningful.
  2. Letters/musical notes and signs: Consistent interpretation of features allows the child to attend to and recognize basic symbol units such as individual notes, clef signs, time signatures, dynamic markings, sharps, flats, and so forth.
  3. Syllables/intervals: Structural analysis of melodic patterns involves recognizing the systematic relationships between adjoining notes (e.g., intervals).
  4. Words/groups: The transition from individual notes to groups of notes occurs via structural analysis of the component intervals, or by visual scanning of the whole musical idea (e.g., chord, scale run). This represents the first level of musical meaning; however, at this level, the meanings attached to individual clusters are decontextualized and isolated.
  5. Word groups/motifs or note grouplets: Combinations of clusters form a motif or motif grouplet, a level of musical meaning equivalent to understanding individual phrases and clauses in text. These may vary in length according to their musical function.
  6. Idea/musical phrase or figure: In music, an individual idea is expressed by combining motifs into a musical phrase.
  7. Main idea/musical idea: The combination of musical phrases yields a musical idea, equivalent in text-processing terms to the construction of a main idea from a paragraph.
  8. Themes/musical subject: Understanding of the musical subject involves imposing a sense of musicality onto the score such that the component musical phrase and subject are taken beyond technical proficiency to include variations of sound, mood, dynamics, and so forth in ways that allow for individualized interpretation of the score (Cantwell & Millard, 1994, pp. 47–9)." [80]

There are various schools of thought/pedagogy that translate these principles into practical teaching methods. Many of these pedagogical approaches also attempt to address the "deficiency in research that considers the ability to read and write music with musical comprehension [i.e. cultural-historical knowledge in the context of visual and auditory learning] as a developmental domain". [81] One of the most well-known teaching frameworks is the ‘Kodály Method’.

The Kodály Method

Zoltán Kodály claims that there are four fundamental aspects to a musician that must develop both simultaneously and at the same rate in order to achieve fluent musical literacy: "(1) a well-trained ear, (2) a well-trained intellect, (3) a well-trained heart (aesthetic/emotional understanding), and (4) well-trained hands (technique)". [82] He was one of the first educators to claim that music literacy involved "the ability to read and write musical notation and to read notation at sight without the aid of an instrument... [as well as] a person’s knowledge of and appreciation for a wide range of musical examples and styles". [83]

Kodály's education techniques utilise elements from language and the educational structure of language development to complement pedagogical efforts in the field of musical literacy development. In rhythm, the Kodály method assigns ‘names’ — originally adapted from the Galin-Paris-Chevé system (a French time-name system pioneered by Galin, Paris, and Chevé) — to beat values, correlating the number of beats in a note to the number of syllables in its respective name.
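
As an illustration of this beat-to-syllable correlation, the sketch below pairs a few common note values with Kodály-style rhythm syllables. It is a minimal, illustrative sketch only: the exact spoken syllables vary between teaching traditions, and the mapping and helper function are assumptions of the example rather than a prescribed part of the method.

    from fractions import Fraction

    # Illustrative mapping of note values (as fractions of a whole note/semibreve)
    # to Kodaly-style rhythm syllables; the exact syllables vary by tradition.
    RHYTHM_SYLLABLES = {
        Fraction(1, 2): "ta-a",  # minim / half note: two beats, two syllables
        Fraction(1, 4): "ta",    # crotchet / quarter note: one beat, one syllable
        Fraction(1, 8): "ti",    # quaver / eighth note: paired quavers are spoken "ti-ti"
    }

    def speak_rhythm(durations):
        """Return the spoken syllables for a sequence of note durations."""
        return " ".join(RHYTHM_SYLLABLES.get(d, "?") for d in durations)

    # Two quavers, a crotchet, two quavers, a crotchet -> "ti ti ta ti ti ta"
    print(speak_rhythm([Fraction(1, 8), Fraction(1, 8), Fraction(1, 4),
                        Fraction(1, 8), Fraction(1, 8), Fraction(1, 4)]))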

Analogously, the Kodály method uses syllables to represent the sounds of notes in a scale as a mnemonic device to train singers. This technique was adapted from the teachings of Guido d’Arezzo, an 11th-century monk, who used the tones ‘Ut, Re, Mi, Fa, So, La’ from the ‘Hymn to St. John’ (International Kodaly Society, 2014) as syllabic representations of pitch. The idea in contemporary Kodály teaching is that each successive pitch in a musical scale is assigned a syllable:

Degree:     1     2     3     4     5     6     7     8/1
Syllable:   Doh   Ray   Me    Fah   Soh   Lah   Te    Doh

This can be applied in ‘Absolute’ (or ‘Fixed-Doh’) form — also known as ‘Solfège’ — or in ‘Relative’ (or ‘Movable-Doh’) form — also known as ‘Solfa’ — where ‘Doh’ starts on the first pitch of the scale (i.e. for A Major, ‘Doh’ is ‘A’; for G Major, ‘Doh’ is ‘G’; for E Major, ‘Doh’ is ‘E’; and so on).
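
The ‘Movable-Doh’ idea can be sketched in a few lines of code: given the tonic of a major key, each scale degree is paired with its solfa syllable. This is a minimal sketch under simplifying assumptions (sharp-only note spelling, major keys only); the function and names are illustrative and not drawn from any published Kodály resource.

    # Minimal 'Movable-Doh' sketch: pair each degree of a major scale with its solfa syllable.
    # Assumptions for illustration: sharp-only note spelling, major keys only.
    CHROMATIC = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]  # tone/semitone pattern of the major scale
    SOLFA = ["Doh", "Ray", "Me", "Fah", "Soh", "Lah", "Te", "Doh"]

    def movable_doh(tonic):
        """Return (syllable, note) pairs for the major scale starting on `tonic`."""
        index = CHROMATIC.index(tonic)
        notes = [tonic]
        for step in MAJOR_STEPS:
            index = (index + step) % 12
            notes.append(CHROMATIC[index])
        return list(zip(SOLFA, notes))

    # In G major, 'Doh' is G, 'Ray' is A, ..., 'Te' is F#, and 'Doh' returns to G.
    for syllable, note in movable_doh("G"):
        print(syllable, note)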

The work of Sarah Glover, continued by John Curwen, throughout England in the 19th century meant that ‘Movable-Doh’ solfa became the "favoured pedagogical tool to teach singers to read music". [84] In addition to the auditory-linguistic aid of syllable-to-pitch, John Curwen also introduced a kinesthetic element in which a different hand sign is applied to each tone of the scale.

Csikos & Dohany [85] affirm the popularity of the Kodály method over history and cite Barkoczi & Pleh [86] and Hallam [87] on "the powerfulness of the Kodály method in Hungary... [and] abroad" [88] in the context of achieving musical literacy within the school curriculum.

Braille music literacy

Methods such as Kodály's, however — which rely on sound to inform the visual element of conventional staff notation — fall short for learners who are unable to see. "No matter how brilliant the ear and how good the memory, literacy is essential for the blind student too", [89] and conventional staff notation unfortunately fails to cater to the needs of visually impaired learners.

To rectify this, enlarged print music or Braille music scores can be supplied to low vision, legally blind, and totally blind individuals so that they can replace the visual aspect of learning with a tactile one (i.e. enhanced kinesthetic learning). According to Conn, [90] in order for blind students to "fully develop their aural skills... fully participate in music... become an independent and lifelong learner... have a chance to completely analyse the music... make full use of [their] own interpretive ability... share [their] composition... [and] gain employment/career path", [91] they must learn how to read and write Braille music.

As in sighted education, the teaching of braille music literacy parallels the teaching of braille language reading. Toussaint & Tiger [92] cite Mangold [93] and Crawford & Elliott [94] on the "novel relation between the tactile stimulus (i.e., a braille symbol) and an auditory or vocal stimulus (i.e., the spoken letter name)". [95] This mirrors the visual-to-auditory mapping (i.e., from conventional staff notation to the spoken note name) in Kodály's approach to music education.

Despite the pedagogical similarities, however, Braille music literacy is far lower than musical literacy among sighted individuals. Here, too, the comparative rates of language and music literacy for sighted and blind individuals follow a similar trajectory — for instance, language literacy for sighted year five school students in Australia is 93.9% [96] compared with a 6.55% rate of HSC students studying music, [97] while language literacy for blind individuals is approximately 12%. [98] Ianuzzi [99] comments on this double standard when she asks, "How much music would students learn to play if their music teachers couldn't read the notes? Unfortunately, not very many teachers of blind children are fluent in reading and writing Braille themselves." [100]

Although the core of musical literacy arguably lies in "extensive and repeated listening", [101] "there is still a need for explicit theories of music reading that would organise knowledge and research about music reading into a system of assumptions, principles, and procedures" [102] that would benefit individuals with poor to no vision. It is via the fundamental elements of reading literacy and "the ability to understand the majority of... utterances in a given tradition" [103] that musical literacy can be achieved. [104]

Nonetheless, sighted or not, the various teaching methods and learning approaches required to achieve musical literacy evidence the spectrum of psychological, neurological, multi-sensory, and motor skills functioning within an individual when they come into contact with music. Many fMRI studies have correspondingly demonstrated the impact of music and advanced music literacy on brain development.

Brain development

Both the processing of music and performance with musical instruments require the involvement of both hemispheres of the brain. [105] Structural differences (i.e. increased grey matter) are found in the brain regions of musical individuals which are both directly linked to musical skills learned during instrumental training (e.g. independent fine motor skills in both hands, auditory discrimination of pitch), and also indirectly linked with improvements in language and mathematical skills. [106]

Many studies demonstrate that "music can have constructive outcomes on our mindsets that may make learning simpler". [107] For instance, young children exhibited a 46% increase in spatial IQ — essential for higher mind capacities involving complex arithmetic and science — after developing aspects of their musical literacy. [108] Such mathematical skills are enhanced in the brain due to the spatial training involved in learning music notation because "understanding rhythmic notation actually requires math-specific skills, such as pattern recognition and an understanding of proportion, ratio, fractions, and subdivision [of note values]". [109]
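
The quoted link between rhythmic notation and proportion, ratio, fractions, and subdivision can be made concrete with exact fraction arithmetic, as in the illustrative sketch below. The note names, the dotted-note helper, and the 4/4 default are assumptions of the example, not drawn from the cited studies.

    from fractions import Fraction

    # Note values expressed as fractions of a whole note (semibreve).
    NOTE_VALUES = {
        "half": Fraction(1, 2),
        "quarter": Fraction(1, 4),
        "eighth": Fraction(1, 8),
        "sixteenth": Fraction(1, 16),
    }

    def dotted(value):
        """A dot adds half the note's value again: a dotted quarter is 1/4 + 1/8 = 3/8."""
        return value + value / 2

    def fills_bar(durations, bar_length=Fraction(4, 4)):
        """Check whether a sequence of durations exactly fills one bar (default 4/4)."""
        return sum(durations, Fraction(0)) == bar_length

    # 3/8 + 1/8 + 1/2 = 1, so this rhythm exactly fills a 4/4 bar.
    print(fills_bar([dotted(NOTE_VALUES["quarter"]),
                     NOTE_VALUES["eighth"],
                     NOTE_VALUES["half"]]))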

Superior "dialect capacity, including vocabulary, expressiveness, and simplicity of correspondence" [110] can also be seen in musically literate individuals. This is due to "both music and language processing requir[ing] the ability to segment streams of sound into small perceptual units". [111] Research confirms the relationship between musical literacy and reading and reasoning, [112] as well as non-cognitive skills such as leisure and emotional development, [113] coordination and innovativeness, attention and focus, memory, creativity, self-confidence, and empathetic interpersonal relationships. [114] Due to these various factors and impacts, Williams (1987) [115] finds herself of the opinion that "[musical] literacy gives dignity as well as competence, and is of the utmost importace to self-image and success... [it gives the] great joy of learning... the thrill of participation, and the satisfaction of informed listening". [116]

Related Research Articles

Braille

Braille is a tactile writing system used by people who are visually impaired. It can be read either on embossed paper or by using refreshable braille displays that connect to computers and smartphone devices. Braille can be written using a slate and stylus, a braille writer, an electronic braille notetaker or with the use of a computer connected to a braille embosser.

Literacy

Literacy is the ability to read and write. Some researchers suggest that the study of "literacy" as a concept can be divided into two periods: the period before 1950, when literacy was understood solely as alphabetical literacy; and the period after 1950, when literacy slowly began to be considered as a wider concept and process, including the social and cultural aspects of reading and writing and functional literacy.

Whole language is a philosophy of reading and a discredited educational method originally developed for teaching literacy in English to young children. The method became a major model for education in the United States, Canada, New Zealand, and the UK in the 1980s and 1990s, despite there being no scientific support for the method's effectiveness. It is based on the premise that learning to read English comes naturally to humans, especially young children, in the same way that learning to speak develops naturally.

Phonics

Phonics is a method for teaching reading and writing to beginners. To use phonics is to teach the relationship between the sounds of the spoken language (phonemes), and the letters (graphemes) or groups of letters or syllables of the written language. Phonics is also known as the alphabetic principle or the alphabetic code. It can be used with any writing system that is alphabetic, such as that of English, Russian, and most other languages. Phonics is also sometimes used as part of the process of teaching Chinese people to read and write Chinese characters, which are not alphabetic, using pinyin, which is alphabetic.

Media literacy is an expanded conceptualization of literacy that includes the ability to access and analyze media messages as well as create, reflect and take action, using the power of information and communication to make a difference in the world. Media literacy applies to different types of media and is seen as an important skill for work, life, and citizenship.

The Association of College and Research Libraries defines information literacy as a "set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued and the use of information in creating new knowledge and participating ethically in communities of learning". In the United Kingdom, the Chartered Institute of Library and Information Professionals' definition also makes reference to knowing both "when" and "why" information is needed.

Visual literacy

Visual literacy is the ability to interpret, negotiate, and make meaning from information presented in the form of an image, extending the meaning of literacy, which commonly signifies interpretation of a written or printed text. Visual literacy is based on the idea that pictures can be "read" and that meaning can be discovered through a process of reading.

Synthetic phonics

Synthetic phonics, also known as blended phonics or inductive phonics, is a method of teaching English reading which first teaches the letter sounds and then builds up to blending these sounds together to achieve full pronunciation of whole words.

Music education

Music education is a field of practice in which educators are trained for careers as elementary or secondary music teachers, or as school or music conservatory ensemble directors. Music education is also a research area in which scholars do original research on ways of teaching and learning music. Music education scholars publish their findings in peer-reviewed journals, and teach undergraduate and graduate education students at university education or music schools, who are training to become music teachers.

The Kodály method, also referred to as the Kodály concept, is an approach to music education developed in Hungary during the mid-twentieth century by Zoltán Kodály. His philosophy of education served as inspiration for the method, which was then developed over a number of years by his associates. In 2016, the method was inscribed as a UNESCO Intangible Cultural Heritage.

Adolescent literacy refers to the ability of adolescents to read and write. Adolescence is a period of rapid psychological and neurological development, during which children develop morally, cognitively, and socially. All of these three types of development have influence—to varying degrees—on the development of literacy skills.

High-frequency sight words are commonly used words that young children are encouraged to memorize as a whole by sight so that they can automatically recognize these words in print without having to use any strategies to decode. Sight words were introduced after whole language fell out of favor with the education establishment.

Colored music notation is a technique used to facilitate enhanced learning in young music students by adding visual color to written musical notation. It is based upon the concept that color can affect the observer in various ways, and combines this with standard learning of basic notation.

Justine Bayard Ward was a musical educator who developed a system for teaching music to children known as the Ward Method.

Simply Music

Simply Music is a music education organization licensing teachers at over 700 locations in twelve countries and serving an online self-study student community in 128 countries. Australian music educator Neil Moore founded it on the core belief that all humans are naturally musical. Simply Music offers programs for students from birth through old age, with the stated goal that "students acquire and retain music as a lifelong companion." Simply Music patterns its approach after primary language acquisition, where speaking comes first. In this it shares some philosophical ground with other developmental approaches like Kodály, Orff Schulwerk, and the Suzuki Method.

Reading

Reading is the process of taking in the sense or meaning of letters, symbols, etc., especially by sight or touch.

Multiliteracy is an approach to literacy theory and pedagogy coined in the mid-1990s by the New London Group. The approach is characterized by two key aspects of literacy – linguistic diversity and multimodal forms of linguistic expressions and representation. It was coined in response to two major changes in the globalized environment. One such change was the growing linguistic and cultural diversity due to increased transnational migration. The second major change was the proliferation of new mediums of communication due to advancement in communication technologies, e.g. the internet, multimedia, and digital media. As a scholarly approach, multiliteracy focuses on the new "literacy" that is developing in response to the changes in the way people communicate globally due to technological shifts and the interplay between different cultures and languages.

A sighted child who is reading at a basic level should be able to understand common words and answer simple questions about the information presented. They should also have enough fluency to get through the material in a timely manner. Over the course of a child's education, these foundations are built on to teach higher levels of math, science, and comprehension skills. Children who are blind not only have the education disadvantage of not being able to see: they also miss out on the very fundamental parts of early and advanced education if not provided with the necessary tools.

The field of music education contains a number of learning theories that specify how students learn music based on behavioral and cognitive psychology.

Gary E. McPherson

Gary Edward McPherson is an Australian music educator, academic and musician, who has researched various topics within the areas of musical development, music performance science and music psychology. He has served as the Ormond Chair of Music at the Melbourne Conservatorium of Music (MCM) since 2009 and between July 2009 and July 2019, served as director of the MCM at the University of Melbourne. McPherson's research primarily focuses on exploring the factors that influence the development of musical proficiency during childhood, later performance excellence, and the motivators of music participation in individuals of all ages and musical skill levels. Much of his research has been informed by his interest in the formation of musical abilities and identity in developing musicians. McPherson served as the foundation Professor of Creative Arts at the Hong Kong Institute of Education from 2002 to 2005. Prior to taking up his position at the MCM, he was a Professor of Music Education and the Marilyn Pflederer Zimmerman Endowed Chair in Music Education at the School of Music, University of Illinois at Urbana-Champaign from 2005 to 2009.

References

  1. Swanwick, 1999, as cited in Csikos & Dohany, 2016, p.6
  2. Levinson, 1990, p.24
  3. Levinson, 1990; Csapo, 2004; Waller, 2010; Hodges & Nolker, 2011; Burton, 2015; Mills & McPherson, 2015; Csikos & Dohany, 2016
  4. Wolf, 1976, as cited in Gudmundsdottir, 2010, p.4
  5. Csíkos, G.; Dohány (2016), "Connections between music literacy and music-related background variables: An empirical investigation" (PDF), Visions of Research in Music Education, 28, ISSN   1938-2065
  6. Csikos & Dohany, 2016, p.5
  7. Csikos & Dohany, 2016, p.2
  8. Csikos & Dohany, 2016, p.3
  9. Csikos & Dohany, 2016, p.2
  10. Csikos & Dohany, 2016, p.5
  11. Csikos & Dohany, 2016, p.2
  12. Csikos & Dohany, 2016, p.3
  13. Waller, D (2010). "Language Literacy and Music Literacy". Philosophy of Music Education Review. 18 (1): 26–44. doi:10.2979/pme.2010.18.1.26. S2CID   144062517.
  14. Burton, 2015, p.9
  15. Waller, 2010, as cited in Burton, 2015, p.9
  16. Burton, 2015, p.9
  17. Levinson, 1990, p.26
  18. Levinson, 1990; Burton, 2015
  19. Csapo, 2004; Levinson, 1990; Csikos & Dohany, 2016
  20. Csikos & Dohany, 2016, p.3
  21. Blumer, H (1986). "Symbolic Interactionism: Perspective and Method". Englewood Cliffs, NJ: Prentice-Hall.
  22. Levinson, 1990, p.18
  23. Asmus Jr., Edward P. (Spring 2004), "Music Teaching and Music Literacy", Journal of Music Teacher Education, 13 (2), SAGE Publications: 6–8, doi:10.1177/10570837040130020102, S2CID   143478361
  24. Asmus Jr., Edward P. (Spring 2004), "Music Teaching and Music Literacy", Journal of Music Teacher Education, 13 (2), SAGE Publications: 7, doi:10.1177/10570837040130020102, S2CID   143478361
  25. Asmus Jr., Edward P. (Spring 2004), "Music Teaching and Music Literacy", Journal of Music Teacher Education, 13 (2), SAGE Publications: 8, doi:10.1177/10570837040130020102, S2CID   143478361
  26. Herbst, A; de Wet, J; Rijsdijk, S (2005). "A survey of music education in the primary schools of South Africa's Cape Peninsula". Journal of Research in Music Education. 53 (3): 260–283. doi:10.1177/002242940505300307. S2CID   144468948.
  27. Levinson, 1990, p.23
  28. Gombrich, 1963; Wollheim, 1968, 1980; Goodman, 1968; Walton, 1970; Sagoff, 1978; Pettit in Schaper, 1983
  29. Meyer, 1956; Meyer, 1967
  30. Levinson, 1990, p.20
  31. Burton, 2015, p.4
  32. McPherson, 1993, 2005; Mills, 1991b,c, 2005; Priest, 1989; Schenck, 1989; as cited in Mills & McPherson, 2015, p.189
  33. Blumer, 1986, p.79, as cited in Burton, 2015
  34. Levinson, 1990, p.23
  35. Green, 2002, as cited in Gudmundsdottir, 2010, p.1
  36. Mills & McPherson, 2006, as cited in Gudmundsdottir, 2010, p.1
  37. McPherson & Gabrielsson, 2002; Mills & McPherson, 2006; Gordon, 2012; Mills & McPherson, 2015
  38. Burton, 2015, pp.1-2
  39. Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, pp. 1–16, ISBN   9780198530329
  40. Adams (1994) & Kirby (1988) in Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, p. 179, ISBN 9780198530329
  41. Cooper, 2003, as cited in Mills & McPherson, 2015, p.177
  42. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  43. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  44. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  45. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  46. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  47. Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, pp. 1–16, ISBN   9780198530329
  48. Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, p. 180, ISBN   9780198530329
  49. Burton, S. (2015), "Making music mine: the development of rhythmic literacy", Music Education Research, 19 (2), Taylor & Francis Online: 133–142, doi:10.1080/14613808.2015.1095720, S2CID   147486278
  50. Burton, S. (2015), "Making music mine: the development of rhythmic literacy", Music Education Research, 19 (2), Taylor & Francis Online: 133–142, doi:10.1080/14613808.2015.1095720, S2CID   147486278
  51. Burton, 2011; Gordon, 2012; Gruhn, 2002; Pinzino, 2007; Reynolds, Long & Valerio, 2007
  52. Burton, S. (2015), "Making music mine: the development of rhythmic literacy", Music Education Research, 19 (2), Taylor & Francis Online: 133–142, doi:10.1080/14613808.2015.1095720, S2CID   147486278
  53. As cited in Csíkos, G.; Dohány (2016), "Connections between music literacy and music-related background variables: An empirical investigation" (PDF), Visions of Research in Music Education, 28: 4, ISSN 1938-2065
  54. Csíkos, G.; Dohány (2016), "Connections between music literacy and music-related background variables: An empirical investigation" (PDF), Visions of Research in Music Education, 28: 4, ISSN   1938-2065
  55. As cited in Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID 143530523
  56. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  57. As cited in Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID 143530523
  58. Gudmundsdottir, H. R. (2007). Error analysis of young piano students' music reading performances. Paper presented at the 8th conference of the Society for Music Perception and Cognition. Concordia University, Montreal.
  59. Luce, 1965; Mishra, 1998
  60. Schön & Besson, 2002; Waters, Townsend & Underwood, 1998
  61. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, 12 (4): 331–338, doi:10.1080/14613808.2010.504809, S2CID   143530523
  62. Fine, P; Berry, A; Rosner, B (2006). "The effect of pattern recognition and tonal predictability on sight-singing ability". Psychology of Music. 34 (4): 431–447. doi:10.1177/0305735606067152. S2CID   145198686.
  63. Gudmundsdottir, H. R. (2010), "Advances in music-reading research", Music Education Research, ResearchGate, 12(4): 2, doi:10.1080/14613808.2010.504809
  64. McPherson, 1993, 1994b, 2005; Schleuter, 1997
  65. Mills & McPherson, 2015, p.181
  66. Schleuter, 1997
  67. Mills & McPherson, 2015, p.181
  68. Gudmundsdottir, 2010, p.6
  69. Dodson, 1983, p.4
  70. Palmer & Krumhansl, 1990; Sloboda, 1983; as cited in Gudmundsdottir, 2010, p.9
  71. Burton, S. (2015), "Making music mine: the development of rhythmic literacy", Music Education Research, 19 (2), Taylor & Francis Online: 133–142, doi:10.1080/14613808.2015.1095720, S2CID   147486278
  72. Burton, 2015, pp.6-7
  73. Burton, 2015, p.6
  74. As cited in Gudmundsdottir, 2010, p.10
  75. Gudmundsdottir, 2010, p.10
  76. Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, pp. 1–16, ISBN   9780198530329
  77. McPherson & Gabrielsson, 2002; Upitis, 1990, 1992; as cited in Mills & McPherson, 2015, p.180
  78. McPherson & Gabrielsson, 2002
  79. Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, pp. 1–16, ISBN   9780198530329
  80. Mills, J.; McPherson, G. (2015), "Chapter 9. Musical Literacy: Reading traditional clef notation" (PDF), The child as musician: A handbook of musical development, Oxford Scholarship Online, p. 182, ISBN   9780198530329
  81. Burton, 2015, p.1
  82. Bonis, 1974, p.197
  83. International Kodaly Society, 2014
  84. International Kodaly Society, 2014
  85. Csíkos, G.; Dohány (2016), "Connections between music literacy and music-related background variables: An empirical investigation" (PDF), Visions of Research in Music Education, 28, ISSN   1938-2065
  86. Barkóczi, I; Pléh, C (1977). Psychological examination of the Kodály method of musical education. Kecskemét, Hungary: Kodály Intézet.
  87. Hallam, S (2010). "The power of music: Its impact on the intellectual, social and personal development of children and young people". International Journal of Music Education. 28 (3): 269–289. doi:10.1177/0255761410370658. S2CID   5662260.
  88. Csíkos, G.; Dohány (2016), "Connections between music literacy and music-related background variables: An empirical investigation" (PDF), Visions of Research in Music Education, 28: 4, ISSN   1938-2065
  89. Cooper, 1994, as cited in Conn, 2001, p.4
  90. Conn, J. (2001), "Braille Music Literacy", Annual Conference of the Australian Braille Authority, Brisbane, Australia, pp. 1–4
  91. Conn, J. (2001), "Braille Music Literacy", Annual Conference of the Australian Braille Authority, Brisbane, Australia, pp. 2–3
  92. Toussaint, K. A.; Tiger, J. H. (2010). "Teaching early braille literacy skills within a stimulus equivalence paradigm to children with degenerative visual impairments". Journal of Applied Behavior Analysis. 43 (2): 181–194. doi:10.1901/jaba.2010.43-181. PMC   2884344 . PMID   21119894.
  93. Mangold, S. S. (1978). "Tactile perception and braille letter recognition: Effects of developmental teaching". Journal of Visual Impairment and Blindness. 72 (7): 259–266. doi:10.1177/0145482X7807200703. S2CID   140333019.
  94. Crawford, S; Elliott, R. T. (2007). "Analysis of phonemes, graphemes, onset-rimes, and words with braille-learning children". Journal of Visual Impairment & Blindness. 101 (9): 534–544. doi:10.1177/0145482X0710100903. S2CID   141528094.
  95. Toussaint, K. A.; Tiger, J. H. (2010). "Teaching early braille literacy skills within a stimulus equivalence paradigm to children with degenerative visual impairments". Journal of Applied Behavior Analysis. 43 (2): 182. doi:10.1901/jaba.2010.43-181. PMC   2884344 . PMID   21119894.
  96. Australian Institute for Health and Welfare, 2017
  97. Hoegh-Guldberg, Hans (2012). "Music Education Statistics".
  98. Toussaint & Tiger, 2010, p.181. There are no figures to show how low musical literacy is for this variable.
  99. Ianuzzi, J (1996). "Braille or Print: Why the debate?". Future Reflections. 15 (1).
  100. Ianuzzi, J (1996). "Braille or Print: Why the debate?". Future Reflections. 15 (1).
  101. Levinson, 1990, p.26
  102. Hodges & Nolker, 2011, p.80
  103. Levinson 1990, p.19
  104. Hirsch, 1983, on "cultural literacy"; as cited in Levinson 1990, p.19
  105. Sarker & Biswas, 2015; Norton, 2005, as cited in Ardila, 2010, p.108
  106. Schalug, Norton, Overy & Winner, 2005, pp.221, 226
  107. Sarker & Biswas, 2015, p.108
  108. Sarker & Biswas, 2015, p.109
  109. Schlaug, Norton, Overy & Winner, 2005, p.226
  110. Sarker & Biswas, 2015, p.110
  111. Schlaug, Norton, Overy & Winner, 2005, p.226
  112. Weinberger, 1998, as cited in Csikos & Dohany, 2016, p.4
  113. Pitts, 2000, as cited in Csikos & Dohany, 2016, p.4
  114. Sarkar & Biswas, 2015, p.110
  115. As cited in Conn, 2001, p.4
  116. As cited in Conn, 2001, p.4

Bibliography