Head (linguistics)

In linguistics, the head or nucleus of a phrase is the word that determines the syntactic category of that phrase. For example, the head of the noun phrase boiling hot water is the noun (head noun) water. Analogously, the head of a compound is the stem that determines the semantic category of that compound. For example, the head of the compound noun handbag is bag, since a handbag is a bag, not a hand. The other elements of the phrase or compound modify the head, and are therefore the head's dependents.[1] Headed phrases and compounds are called endocentric, whereas exocentric ("headless") phrases and compounds (if they exist) lack a clear head. Heads are crucial to establishing the direction of branching. Head-initial phrases are right-branching, head-final phrases are left-branching, and head-medial phrases combine left- and right-branching.

Basic examples

Examine the following expressions:

big red dog
birdsong

The word dog is the head of big red dog since it determines that the phrase is a noun phrase, not an adjective phrase. Because the adjectives big and red modify this head noun, they are its dependents. [2] Similarly, in the compound noun birdsong, the stem song is the head since it determines the basic meaning of the compound. The stem bird modifies this meaning and is therefore dependent on song. Birdsong is a kind of song, not a kind of bird. Conversely, a songbird is a type of bird since the stem bird is the head in this compound. The heads of phrases can often be identified by way of constituency tests. For instance, substituting a single word in place of the phrase big red dog requires the substitute to be a noun (or pronoun), not an adjective.
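
The head-dependent relation described here can be made concrete with a minimal computational sketch. The following Python code is purely illustrative and not tied to any particular syntactic theory: a phrase is modelled as a head word plus its dependents, and the phrase's category is simply read off the head, so that big red dog comes out as a noun phrase. The Word and Phrase classes are hypothetical names introduced only for this example.

from dataclasses import dataclass, field

@dataclass
class Word:
    form: str
    category: str          # e.g. "N" for noun, "A" for adjective

@dataclass
class Phrase:
    head: Word
    dependents: list = field(default_factory=list)

    @property
    def category(self) -> str:
        # The head projects its category to the whole phrase.
        return self.head.category + "P"   # N -> NP, A -> AP, ...

big_red_dog = Phrase(
    head=Word("dog", "N"),
    dependents=[Word("big", "A"), Word("red", "A")],
)

print(big_red_dog.category)   # NP: the noun "dog" determines the phrase type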

Representing heads

Trees

Many theories of syntax represent heads by means of tree structures. These trees tend to be organized in terms of one of two relations: either in terms of the constituency relation of phrase structure grammars or the dependency relation of dependency grammars. Both relations are illustrated with the following trees:[3]

[Image: constituency (a) and dependency (b) trees for the phrase funny stories]

The constituency relation is shown on the left and the dependency relation on the right. The a-trees identify heads by way of category labels, whereas the b-trees use the words themselves as the labels.[4] The noun stories (N) is the head over the adjective funny (A). In the constituency trees on the left, the noun projects its category status up to the mother node, so that the entire phrase is identified as a noun phrase (NP). In the dependency trees on the right, the noun projects only a single node, and this node dominates the single node projected by the adjective, which likewise identifies the whole phrase as an NP. The constituency trees are structurally the same as their dependency counterparts, the only difference being that a different convention is used for marking heads and dependents. The conventions illustrated with these trees are just two of the various tools that grammarians employ to identify heads and dependents. While other conventions abound, they are usually similar to the ones illustrated here.
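
The two conventions can also be mimicked with ordinary data structures. The following Python sketch encodes the phrase funny stories once as a labelled constituency tree and once as a dependency structure; the particular encodings and the root helper are illustrative assumptions, not a standard format.

# Constituency: the noun projects its category up to the mother node (NP).
constituency_tree = ("NP", [("A", "funny"), ("N", "stories")])

# Dependency: each word points to its head; the root points to None.
dependency_tree = {
    "funny": "stories",    # the adjective depends on the noun
    "stories": None,       # the noun is the root, i.e. the head of the phrase
}

def root(dep_tree):
    """Return the word that depends on nothing, i.e. the head of the whole phrase."""
    return next(word for word, head in dep_tree.items() if head is None)

print(root(dependency_tree))   # stories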

More trees

The four trees above show a head-final structure. The following trees further illustrate head-final structures, as well as head-initial and head-medial structures. The constituency trees (= a-trees) appear on the left, and dependency trees (= b-trees) on the right. From here on, the convention of using the words themselves as node labels is employed. The next four trees are additional examples of head-final phrases:

[Image: trees of head-final phrases]

The following six trees illustrate head-initial phrases:

[Image: trees of head-initial phrases]

And the following six trees are examples of head-medial phrases:

[Image: trees of head-medial phrases]

The head-medial constituency trees here assume a more traditional n-ary branching analysis. Since some prominent phrase structure grammars (e.g. most work in Government and binding theory and the Minimalist Program) take all branching to be binary, these head-medial a-trees may be controversial.
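
The contrast between n-ary and strictly binary branching can be made explicit with a small sketch. The Python example below contrasts a flat ternary analysis of a hypothetical head-medial clause with a binary analysis of the same words; the sentence Sam read books and the max_branching helper are assumptions made purely for illustration.

# n-ary (here ternary) analysis: the verb and both nouns are sisters under
# a single flat node, giving a head-medial structure.
nary_tree = ("S", [("N", "Sam"), ("V", "read"), ("N", "books")])

# Strictly binary analysis: every node has at most two daughters, so the verb
# first combines with its object, and the result combines with the subject.
binary_tree = ("S", [("N", "Sam"),
                     ("VP", [("V", "read"), ("N", "books")])])

def max_branching(tree):
    """Largest number of daughters found at any node in the tree."""
    label, children = tree
    widths = [len(children)]
    for child in children:
        if isinstance(child, tuple) and isinstance(child[1], list):
            widths.append(max_branching(child))
    return max(widths)

print(max_branching(nary_tree))    # 3
print(max_branching(binary_tree))  # 2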

X-bar trees

Trees that are based on the X-bar schema also acknowledge head-initial, head-final, and head-medial phrases, although the depiction of heads is less direct. The standard X-bar schema for English is as follows:

[Image: the standard X-bar schema]

This structure is both head-initial and head-final, which makes it head-medial in a sense. It is head-initial insofar as the head X0 precedes its complement, but it is head-final insofar as the projection X' of the head follows its specifier.
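
The schema can be sketched as a simple tree-building function. The following Python code builds XP -> specifier X' and X' -> X0 complement as nested tuples; the function name xbar and the example noun phrase the picture of Mary (a textbook-style illustration with the determiner as specifier and the PP as complement) are assumptions made for this sketch, not part of any fixed notation.

def xbar(head_category, head_word, specifier=None, complement=None):
    """Build XP -> specifier X' ; X' -> X0 complement as nested tuples."""
    x0 = (head_category + "0", head_word)
    x_bar = (head_category + "'", [x0] + ([complement] if complement else []))
    daughters = ([specifier] if specifier else []) + [x_bar]
    return (head_category + "P", daughters)

np = xbar("N", "picture",
          specifier=("D", "the"),
          complement=("PP", "of Mary"))
print(np)
# ('NP', [('D', 'the'), ("N'", [('N0', 'picture'), ('PP', 'of Mary')])])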

Head-initial vs. head-final languages

Some language typologists classify language syntax according to a head directionality parameter in word order, that is, whether a phrase is head-initial (= right-branching) or head-final (= left-branching), assuming that it has a fixed word order at all. English is more head-initial than head-final, as illustrated with the following dependency tree of the first sentence of Franz Kafka's The Metamorphosis:

[Image: dependency tree of the opening sentence of The Metamorphosis in English]

The tree shows the extent to which English is primarily a head-initial language. Structure is descending as speech and processing move from left to right. Most dependencies have the head preceding its dependent(s), although there are also head-final dependencies in the tree. For instance, the determiner-noun and adjective-noun dependencies are head-final, as are the subject-verb dependencies. Most other dependencies in English are, however, head-initial, as the tree shows. Such a mix of head-initial and head-final structures is common across languages. In fact, purely head-initial or purely head-final languages probably do not exist, although some languages approach purity in this respect, for instance Japanese.
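
One way to quantify such claims is simply to count, for each dependency in a parsed sentence, whether the head precedes or follows its dependent. The Python sketch below does this for a toy dependency annotation of a hypothetical sentence (not the Kafka tree shown above); the annotation follows common dependency-grammar conventions but is only illustrative.

sentence = [
    # (position, word, head_position); 0 marks the root
    (1, "She", 2), (2, "put", 0), (3, "the", 4), (4, "book", 2),
    (5, "on", 2), (6, "the", 7), (7, "shelf", 5),
    (8, "in", 2), (9, "the", 10), (10, "morning", 8),
]

# A dependency is head-initial if the head precedes its dependent,
# and head-final if the head follows it.
head_initial = sum(1 for pos, _, head in sentence if head != 0 and head < pos)
head_final   = sum(1 for pos, _, head in sentence if head != 0 and head > pos)

print(head_initial, head_final)   # 5 4: mixed, but more head-initial than head-final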

The following tree shows the Japanese translation of the same sentence from Kafka's story. The glossing conventions are those established by Lehmann. One can easily see the extent to which Japanese is head-final:

[Image: dependency tree of the same sentence in Japanese]

A large majority of head-dependent orderings in Japanese are head-final, which is obvious in this tree: structure is strongly ascending as speech and processing move from left to right. The word order of Japanese is thus in a sense the opposite of that of English.

Head-marking vs. dependent-marking

It is also common to classify language morphology according to whether phrases are head-marking or dependent-marking. A given dependency is head-marking if the relation is marked on the head, i.e. something about the dependent influences the form of the head, and it is dependent-marking if the relation is marked on the dependent, i.e. something about the head influences the form of the dependent.

For instance, in the English possessive case, possessive marking ('s) appears on the dependent (the possessor), whereas in Hungarian possessive marking appears on the head noun:[5]

English: the man's house
Hungarian: az ember ház-a (the man house-POSSESSIVE)

Prosodic head

In a prosodic unit, the head is the part that extends from the first stressed syllable up to (but not including) the tonic syllable. A high head is the stressed syllable that begins the head and is high in pitch, usually higher than the beginning pitch of the tone on the tonic syllable. For example:

The bus was late. (the head begins on the stressed syllable bus and extends up to the tonic syllable late; in a high head, bus is pronounced with high pitch)

A low head is the syllable that begins the head and is low in pitch, usually lower than the beginning pitch of the tone on the tonic syllable.

The bus was late. (here bus, beginning the head, is pronounced with low pitch)

Notes

  1. For a good general discussion of heads, see Miller (2011:41ff.). Note, however, that Miller miscites Hudson's (1990) listing of Zwicky's criteria of headhood as if these were Matthews'.
  2. Discerning heads from dependents is not always easy. The exact criteria that one employs to identify the head of a phrase vary, and definitions of "head" have been debated in detail. See the exchange between Zwicky (1985, 1993) and Hudson (1987) in this regard.
  3. Dependency grammar trees similar to the ones produced in this article can be found, for instance, in Ágel et al. (2003/6).
  4. Using the words themselves as the labels on the nodes in trees is a convention that is consistent with bare phrase structure (BPS). See Chomsky (1995).
  5. See Nichols (1986).

