Verbal intelligence

English alphabet. Letters form the basis for many languages, including English

Verbal intelligence is the ability to understand and reason using concepts framed in words. More broadly, it is linked to problem solving, abstract reasoning, [1] and working memory. Verbal intelligence is one of the most g-loaded abilities. [2]


Linguistic intelligence

To understand linguistic intelligence, it is important to understand the mechanisms that control speech and language. These mechanisms can be broken down into four major groups: speech generation (talking), speech comprehension (hearing), writing generation (writing), and writing comprehension (reading).

In a practical sense, linguistic intelligence is the extent to which an individual can use language, both written and verbal, to achieve goals. [3]

Linguistic intelligence is a part of Howard Gardner's theory of multiple intelligences that deals with individuals' ability to understand both spoken and written language, as well as their ability to speak and write.

Spoken language

Generation

Inferior frontal gyrus; a major part of the inferior frontal cortex

Speech production is the process by which a thought in the brain is converted into an understandable auditory form. [4] [5] [6] This is a multistage mechanism that involves many different areas of the brain. The first stage is planning, where the brain constructs words and sentences that turn the thought into an understandable form. [4] This occurs primarily in the inferior frontal cortex, specifically in an area known as Broca's area. [5] [6] [7] Next, the brain must plan how to physically create the sounds necessary for speech by linking the planned speech with known sounds, or phonemes. While the location of these associations is not known, it is known that the supplementary motor area plays a key role in this step. [4] [8] [ page needed ] Finally, the brain must signal for the words to actually be spoken. This is carried out by the premotor cortex and the motor cortex. [8]

Motor cortex with muscle localization shown

In most cases, speech production is controlled by the left hemisphere. In a series of studies, Wilder Penfield, among others, probed the brains of both right-handed (generally left-hemisphere dominant) and left-handed (generally right-hemisphere dominant) patients. They discovered that, regardless of handedness, the left hemisphere was almost always the speech-controlling side. However, in cases of neural stress (hemorrhage, stroke, etc.), the right hemisphere can take over speech functions. [9]

Comprehension

Verbal comprehension is a fairly complex process that is not fully understood. Studies have found that the superior temporal sulcus activates upon hearing human speech, and that speech processing seems to occur within Wernicke's area. [6] [8] [ page needed ]

Auditory feedback and feedforward

Hearing plays an important part in both speech generation and comprehension. When speaking, a person can hear their own speech, and the brain uses what it hears as a feedback mechanism to correct speech errors. [10] If a single feedback correction occurs multiple times, the brain begins to incorporate the correction into all future speech, making it a feedforward mechanism. [10] The importance of hearing is apparent in some deaf people: deafness, as well as smaller deficits in hearing, can greatly affect a person's ability to comprehend spoken language as well as to speak it. [11] However, most people who lose their hearing later in life maintain a normal level of verbal intelligence. This is thought to be because the brain's feedforward mechanism continues to correct speech errors even in the absence of auditory feedback. [10]
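
For illustration, the relationship between feedback and feedforward control described above can be sketched numerically. The following toy example is a hypothetical illustration only (it is not a model from the cited sources, and all names and numbers are assumptions): repeated feedback corrections are gradually folded into a stored feedforward command, which then keeps output near the target even after feedback is removed.

```python
# Toy sketch: repeated auditory-feedback corrections are folded into a stored
# feedforward command, so speech output can stay near the target even after
# auditory feedback is lost. All names and values here are illustrative only.

def speak(feedforward_command, acoustic_target, hearing=True):
    """Produce one utterance; return the output and the error the speaker hears."""
    output = feedforward_command             # speech is driven by the stored plan
    error = acoustic_target - output         # mismatch between target and output
    heard_error = error if hearing else 0.0  # the error is only heard with intact feedback
    return output, heard_error

target = 1.0       # idealized acoustic target for a speech sound
command = 0.6      # initially imperfect feedforward plan
learning_rate = 0.5

# With hearing: each heard error acts as feedback, and the correction is
# gradually incorporated into the feedforward command.
for _ in range(10):
    _, heard_error = speak(command, target, hearing=True)
    command += learning_rate * heard_error

# After hearing loss: no feedback is available, but the tuned feedforward
# command still produces near-target output.
output, _ = speak(command, target, hearing=False)
print(f"output without auditory feedback: {output:.3f} (target {target})")
```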

Written language

Generation

Generation of written language is thought to be closely related to speech generation. Neurophysiologically, Broca's area is believed to be crucial for early linguistic processing, while the inferior frontal gyrus is critical for semantic processing. [6] [8] According to Penfield, writing differs from spoken language in two major ways: first, instead of relating the thought to sounds, the brain must relate the thought to symbols or letters; and second, the motor cortex activates a different set of muscles to write than to speak. [8] [ page needed ]

Comprehension

Like spoken comprehension, written comprehension seems to occur primarily in Wernicke's area. [8] [ page needed ] However, instead of receiving language input through the auditory system, written comprehension relies on the visual system.

Protein NRXN1, which is created from the NRXN1 gene

While the capabilities of the physical structures involved are large factors in determining linguistic intelligence, several genes have been linked to individual linguistic ability. [12] The NRXN1 gene has been linked to general language ability, and mutations of this gene have been shown to cause major deficits in overall linguistic intelligence. [12] The CNTNAP2 gene is believed to affect language development and performance, and mutations in this gene are thought to be involved in autism spectrum disorders. [12] PCDH11 has been linked to language capacity and is believed to be one of the factors accounting for variation in linguistic intelligence. [12]

Measurement and testing

The Wechsler Adult Intelligence Scale III (WAIS-III) divides Verbal IQ (VIQ) into two categories: the Verbal Comprehension Index and the Working Memory Index. [13] [14]

Verbal fluency tests

In general, it is difficult to test for linguistic intelligence as a whole, so various types of verbal fluency tests are often used instead. [5] [7] [15]
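
As a simplified, hypothetical illustration of how such a test might be scored (the scoring rules and names below are assumptions, not taken from the cited normative studies), a phonemic fluency trial can be scored by counting unique, valid words that begin with a target letter:

```python
# Simplified, hypothetical scoring of a phonemic verbal fluency trial:
# the score is the number of valid, non-repeated words beginning with the
# target letter that the participant produced within the time limit.

def fluency_score(responses, target_letter, exclusions=()):
    """Count unique responses that start with target_letter and are not excluded."""
    seen = set()
    score = 0
    for word in responses:
        w = word.strip().lower()
        if not w.startswith(target_letter.lower()):
            continue          # wrong initial letter
        if w in seen or w in exclusions:
            continue          # repetition, or a disallowed word such as a proper noun
        seen.add(w)
        score += 1
    return score

# Example trial for the letter "f" (proper nouns excluded)
responses = ["fish", "farm", "Fish", "fork", "France", "flower"]
print(fluency_score(responses, "f", exclusions={"france"}))  # prints 4
```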

Verbal fluency in children

In one series of tests, children given verbal fluency tests activated a larger portion of their cortex than adults did, including activation of both the left and right hemispheres. This is most likely due to the high plasticity of the still-developing brain. [16]

Possible conflict

A recent study showed that verbal fluency test results can differ depending on the mental focus of the subject. In this study, focusing on the physical mechanisms of speech production slowed speech production times, whereas focusing on auditory feedback improved them. [17]

Disorders affecting linguistic intelligence

Since linguistic intelligence is based on several complex skills, many disorders and injuries can affect it.

Injuries

Damage and injury to the brain can severely lower one's ability to communicate, and therefore lower one's linguistic intelligence. Common sources of major damage are strokes, concussions, brain tumors, viral or bacterial damage, and drug-related damage. The three major linguistic disorders that result from these injuries are aphasia, alexia, and agraphia. [8] Aphasia is the inability to speak, and can be caused by damage to Broca's area or the motor cortex. [8] Alexia is the inability to read, which can arise from damage to Wernicke's area, among other places. [8] [ page needed ] Agraphia is the inability to write, which can also arise from damage to Broca's area or the motor cortex. [8] In addition, damage to large areas of the brain can result in any combination of these disorders, as well as a loss of other abilities. [8]

Pure language disorders

There are several disorders that primarily affect only language skills. Three major pure language disorders are developmental verbal dyspraxia, specific language impairment, and stuttering. [12] Developmental verbal dyspraxia (DVD) is a disorder in which children make errors in consonant and vowel production. [12] Specific language impairment (SLI) is a disorder in which the patient lacks language acquisition skills despite a seemingly normal intelligence level in other areas. [12] Stuttering is a fairly common disorder in which speech flow is interrupted by involuntary repetitions of syllables. [12]

Other disorders affecting language

Some disorders cause a wide array of effects, and language impairment is merely one of many possible symptoms. The two major disorders of this type are autism spectrum disorder and epilepsy. [12] Autism spectrum disorder (ASD) is a disorder in which the patient has decreased social skills and lowered mental flexibility. As a result, many patients with ASD also have language problems, arising from both the lack of social interaction and the lowered mental flexibility. [12] Epilepsy is a disorder in which electrical malfunctions or miscommunications in the brain cause seizures, leading to muscle spasms and activation of other organs and systems of the body. Over time, epilepsy can lead to cognitive and behavioral decline, which can eventually result in a loss of language and communication skills. [12] Some authors discuss the relationship between expressive language and auditory reception, and therefore between language disorders and auditory processing disorders.


Related Research Articles

Aphasia: Inability to comprehend or formulate language

In aphasia, a person may be unable to comprehend or unable to formulate language because of damage to specific brain regions. The major causes are stroke and head trauma; prevalence is hard to determine, but aphasia due to stroke is estimated to be 0.1–0.4% in the Global North. Aphasia can also be the result of brain tumors, epilepsy, autoimmune neurological diseases, brain infections, or neurodegenerative diseases.

Expressive aphasia: Language disorder involving inability to produce language

Expressive aphasia is a type of aphasia characterized by partial loss of the ability to produce language, although comprehension generally remains intact. A person with expressive aphasia will exhibit effortful speech. Speech generally includes important content words but leaves out function words that have more grammatical significance than physical meaning, such as prepositions and articles. This is known as "telegraphic speech". The person's intended message may still be understood, but their sentence will not be grammatically correct. In very severe forms of expressive aphasia, a person may only speak using single word utterances. Typically, comprehension is mildly to moderately impaired in expressive aphasia due to difficulty understanding complex grammar.

Language center: Speech processing areas of the brain

In neuroscience and psychology, the term language center refers collectively to the areas of the brain which serve a particular function for speech processing and production. Language is a core system that gives humans the capacity to solve difficult problems and provides them with a unique type of social interaction. Language allows individuals to attribute symbols to specific concepts, and utilize them through sentences and phrases that follow proper grammatical rules. Finally, speech is the mechanism by which language is orally expressed.

Receptive aphasia: Language disorder involving inability to understand language

Wernicke's aphasia, also known as receptive aphasia, sensory aphasia, fluent aphasia, or posterior aphasia, is a type of aphasia in which individuals have difficulty understanding written and spoken language. Patients with Wernicke's aphasia demonstrate fluent speech, which is characterized by typical speech rate, intact syntactic abilities and effortless speech output. Writing often reflects speech in that it tends to lack content or meaning. In most cases, motor deficits do not occur in individuals with Wernicke's aphasia. Therefore, they may produce a large amount of speech without much meaning. Individuals with Wernicke's aphasia often suffer from anosognosia – they are unaware of their errors in speech and do not realize their speech may lack meaning. They typically remain unaware of even their most profound language deficits.

Broca's area: Speech production region in the dominant hemisphere of the hominid brain

Broca's area, or the Broca area, is a region in the frontal lobe of the dominant hemisphere, usually the left, of the brain with functions linked to speech production.

Anomic aphasia

Anomic aphasia is a mild, fluent type of aphasia where individuals have word retrieval failures and cannot express the words they want to say. By contrast, anomia is a deficit of expressive language, and a symptom of all forms of aphasia, but patients whose primary deficit is word retrieval are diagnosed with anomic aphasia. Individuals with aphasia who display anomia can often describe an object in detail and maybe even use hand gestures to demonstrate how the object is used, but cannot find the appropriate word to name the object. Patients with anomic aphasia have relatively preserved speech fluency, repetition, comprehension, and grammatical speech.

Wernicke's area: Speech comprehension region in the dominant hemisphere of the hominid brain

Wernicke's area, also called Wernicke's speech area, is one of the two parts of the cerebral cortex that are linked to speech, the other being Broca's area. It is involved in the comprehension of written and spoken language, in contrast to Broca's area, which is primarily involved in the production of language. It is traditionally thought to reside in Brodmann area 22, which is located in the superior temporal gyrus in the dominant cerebral hemisphere, which is the left hemisphere in about 95% of right-handed individuals and 70% of left-handed individuals.

Conduction aphasia: Inability to repeat speech despite being able to perceive and produce it

In neurology, conduction aphasia, also called associative aphasia, is an uncommon form of difficulty in speaking (aphasia). It is caused by damage to the parietal lobe of the brain. An acquired language disorder, it is characterised by intact auditory comprehension, coherent speech production, but poor speech repetition. Affected people are fully capable of understanding what they are hearing, but fail to encode phonological information for production. This deficit is load-sensitive as the person shows significant difficulty repeating phrases, particularly as the phrases increase in length and complexity and as they stumble over words they are attempting to pronounce. People have frequent errors during spontaneous speech, such as substituting or transposing sounds. They are also aware of their errors and will show significant difficulty correcting them.

Global aphasia

Global aphasia is a severe form of nonfluent aphasia, caused by damage to the left side of the brain, that affects receptive and expressive language skills as well as auditory and visual comprehension. Acquired impairments of communicative abilities are present across all language modalities, impacting language production, comprehension, and repetition. Patients with global aphasia may be able to verbalize a few short utterances and use non-word neologisms, but their overall production ability is limited. Their ability to repeat words, utterances, or phrases is also affected. Due to the preservation of the right hemisphere, an individual with global aphasia may still be able to express themselves through facial expressions, gestures, and intonation. This type of aphasia often results from a large lesion of the left perisylvian cortex. The lesion is caused by an occlusion of the left middle cerebral artery and is associated with damage to Broca's area, Wernicke's area, and insular regions which are associated with aspects of language.

Transcortical sensory aphasia (TSA) is a kind of aphasia that involves damage to specific areas of the temporal lobe of the brain, resulting in symptoms such as poor auditory comprehension, relatively intact repetition, and fluent speech with semantic paraphasias present. TSA is a fluent aphasia similar to Wernicke's aphasia, with the exception of a strong ability to repeat words and phrases. The person may repeat questions rather than answer them ("echolalia").

Amusia is a musical disorder that appears mainly as a defect in processing pitch but also encompasses musical memory and recognition. Two main classifications of amusia exist: acquired amusia, which occurs as a result of brain damage, and congenital amusia, which results from a music-processing anomaly present since birth.

Brodmann area 22: Region of the brain's temporal lobe

Brodmann area 22 is a Brodmann's area that is cytoarchitecturally located in the posterior superior temporal gyrus of the brain. In the left cerebral hemisphere, it is one portion of Wernicke's area. The left hemisphere BA22 helps with generation and understanding of individual words. On the right side of the brain, BA22 helps to discriminate pitch and sound intensity, both of which are necessary to perceive melody and prosody. Wernicke's area is active in processing language and consists of the left Brodmann area 22 and Brodmann area 40, the supramarginal gyrus.

Transcortical motor aphasia (TMoA), also known as commissural dysphasia or white matter dysphasia, results from damage in the anterior superior frontal lobe of the language-dominant hemisphere. This damage is typically due to cerebrovascular accident (CVA). TMoA is generally characterized by reduced speech output, which is a result of dysfunction of the affected region of the brain. The left hemisphere is usually responsible for performing language functions, although left-handed individuals have been shown to perform language functions using either their left or right hemisphere depending on the individual. The anterior frontal lobes of the language-dominant hemisphere are essential for initiating and maintaining speech. Because of this, individuals with TMoA often present with difficulty in speech maintenance and initiation.

Speech: Human vocal communication using spoken language

Speech is the use of the human voice as a medium for language. Spoken language combines vowel and consonant sounds to form units of meaning like words, which belong to a language's lexicon. There are many different intentional speech acts, such as informing, declaring, asking, persuading, directing; acts may vary in various aspects like enunciation, intonation, loudness, and tempo to convey meaning. Individuals may also unintentionally communicate aspects of their social position through speech, such as sex, age, place of origin, physiological and mental condition, education, and experiences.

Auditory verbal agnosia (AVA), also known as pure word deafness, is the inability to comprehend speech. Individuals with this disorder lose the ability to understand language, repeat words, and write from dictation. Some patients with AVA describe hearing spoken language as meaningless noise, often as though the person speaking was doing so in a foreign language. However, spontaneous speaking, reading, and writing are preserved. The maintenance of the ability to process non-speech auditory information, including music, also remains relatively more intact than spoken language comprehension. Individuals who exhibit pure word deafness are also still able to recognize non-verbal sounds. The ability to interpret language via lip reading, hand gestures, and context clues is preserved as well. Sometimes, this agnosia is preceded by cortical deafness; however, this is not always the case. Researchers have documented that in most patients exhibiting auditory verbal agnosia, the discrimination of consonants is more difficult than that of vowels, but as with most neurological disorders, there is variation among patients.

Lateralization of brain function: Specialization of some cognitive functions in one side of the brain

The lateralization of brain function is the tendency for some neural functions or cognitive processes to be specialized to one side of the brain or the other. The median longitudinal fissure separates the human brain into two distinct cerebral hemispheres, connected by the corpus callosum. Although the macrostructure of the two hemispheres appears to be almost identical, different composition of neuronal networks allows for specialized function that is different in each hemisphere.

Auditory agnosia is a form of agnosia that manifests itself primarily in the inability to recognize or differentiate between sounds. It is not a defect of the ear or "hearing", but rather a neurological inability of the brain to process sound meaning. While auditory agnosia impairs the understanding of sounds, other abilities such as reading, writing, and speaking are not hindered. It is caused by bilateral damage to the anterior superior temporal gyrus, which is part of the auditory pathway responsible for sound recognition, the auditory "what" pathway.

Superior temporal sulcus: Part of the brain's temporal lobe

In the human brain, the superior temporal sulcus (STS) is the sulcus separating the superior temporal gyrus from the middle temporal gyrus in the temporal lobe of the brain. A sulcus is a deep groove that curves into the largest part of the brain, the cerebrum, and a gyrus is a ridge that curves outward of the cerebrum.

The temporal dynamics of music and language describes how the brain coordinates its different regions to process musical and vocal sounds. Both music and language feature rhythmic and melodic structure. Both employ a finite set of basic elements that are combined in ordered ways to create complete musical or lingual ideas.

Sign language in the brain

Sign language refers to any natural language which uses visual gestures produced by the hands and body language to express meaning. The brain's left side is the dominant side utilized for producing and understanding sign language, just as it is for speech. In 1861, Paul Broca studied patients with the ability to understand spoken languages but the inability to produce them. The damaged area was named Broca's area, and is located in the left hemisphere's inferior frontal gyrus. Soon after, in 1874, Carl Wernicke studied patients with the reverse deficits: patients could produce spoken language, but could not comprehend it. The damaged area was named Wernicke's area, and is located in the left hemisphere's posterior superior temporal gyrus.

References

1. Luwel, Koen; Ageliki Foustana; Patrick Onghena; Lieven Verschaffel (Apr 2013). "The role of verbal and performance intelligence in children's strategy selection and execution". Learning and Individual Differences. 24: 134–138. doi:10.1016/j.lindif.2013.01.010.
2. Wechsler, D. (1997). Wechsler Adult Intelligence Scale III (WAIS-III).
3. Fernandez-Martinez, Fernando; Kseniya Zablotskaya; Wolfgang Minker (Aug 2012). "Text categorization methods for automatic estimation of verbal intelligence". Expert Systems with Applications. 39 (10): 9807–9820. doi:10.1016/j.eswa.2012.02.173.
4. Bohland, Jason; Daniel Bullock; Frank Guenther (Jul 2010). "Neural Representations and Mechanisms for the Performance of Simple Speech Sequences". Journal of Cognitive Neuroscience. 22 (7): 1504–1529. doi:10.1162/jocn.2009.21306. PMC 2937837. PMID 19583476.
5. Dan, Haruka; Sano, Kyutoku; Oguro, Yokota; Tsuzuki, Watanabe (Aug 2013). "Language-specific cortical activation patterns for verbal fluency tasks in Japanese as assessed by multichannel functional near-infrared spectroscopy". Brain and Language. 126 (2): 208–216. doi:10.1016/j.bandl.2013.05.007. PMID 23800710. S2CID 22086622.
6. Rodd, J.M.; M.H. Davis; I.S. Johnsrude (Aug 2005). "The neural mechanisms of speech comprehension: fMRI studies of semantic ambiguity". Cerebral Cortex. 15 (8): 1261–1269. CiteSeerX 10.1.1.590.5918. doi:10.1093/cercor/bhi009. PMID 15635062.
7. Konrad, Andreas; Goran Vucurevic; Francesco Musso; Georg Winterer (Apr 2012). "VBM-DTI Correlates of Verbal Intelligence: A Potential Link to Broca's area". Journal of Cognitive Neuroscience. 24 (4): 888–895. doi:10.1162/jocn_a_00187. PMID 22220724. S2CID 28116411.
8. Penfield, Wilder (1959). Speech and Brain-Mechanisms. Atheneum.
9. Doidge, Norman (2007). The Brain That Changes Itself: Stories of Personal Triumph from the Frontiers of Brain Science. Penguin Books.
10. Perkell, Joseph (Sep 2012). "Movement goals and feedback and feedforward control mechanisms in speech production". Journal of Neurolinguistics. 25 (5): 382–407. doi:10.1016/j.jneuroling.2010.02.011. PMC 3361736. PMID 22661828.
11. Tourville, Jason; Kevin Reilly; Frank Guenther (1 February 2008). "Neural mechanisms underlying auditory feedback control of speech". NeuroImage. 39 (3): 1429–1443. doi:10.1016/j.neuroimage.2007.09.054. PMC 3658624. PMID 18035557.
12. Szalontai, Adam; Katalin Csiszar (September 2013). "Genetic insights into the functional elements of language". Human Genetics. 132 (9): 959–986. doi:10.1007/s00439-013-1317-0. PMID 23749164. S2CID 17320009.
13. Axelrod, Bradley N. (2001). "Administration duration for the Wechsler Adult Intelligence Scale-III and Wechsler Memory Scale-III". Archives of Clinical Neuropsychology. 16 (3): 293–301. doi:10.1093/arclin/16.3.293. PMID 14590179.
14. "Wechsler Adult Intelligence Scale - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 2020-09-16.
15. Casals-Coll, M.; Sanchez-Benavides; Quintana; Manero; Rognoni; Calvo; Palomo; Aranciva; Tamayo; Pena-Casanova (Jan–Feb 2013). "Spanish normative studies in young adults (NEURONORMA young adults project): Normative data: norms for verbal fluency tests". Neurologia. 28 (1): 33–40. doi:10.1016/j.nrleng.2012.02.003. PMID 22652141.
16. Gaillard, W.D.; Hertz-Pannier; Mott; Barnett; LeBihan; Theodore (Jan 2000). "Functional anatomy of cognitive development - fMRI of verbal fluency in children and adults". Neurology. 54 (1): 180–185. doi:10.1212/wnl.54.1.180. PMID 10636145. S2CID 36324874.
17. Lisman, Amanda; Sadagopan, Neeraja (May–Jun 2013). "Focus of attention and speech motor performance". Journal of Communication Disorders. 46 (3): 281–293. doi:10.1016/j.jcomdis.2013.02.002. PMID 23497961.