Null instantiation

In frame semantics, a theory of linguistic meaning, null instantiation is the name of a category used to annotate, or tag, absent semantic constituents or frame elements (Fillmore et al. 2003, Section 3.4). Frame semantics, best exemplified by the FrameNet project, views words as evoking frames of knowledge and frames as typically involving multiple components, called frame elements (e.g. buyer and goods in an acquisition). The term null indicates that the frame element in question is absent; instantiation refers to the frame element itself. Null instantiation is thus an empty instantiation of a frame element. Ruppenhofer and Michaelis [1] postulate an implicational regularity tying the interpretation type of an omitted argument to the frame membership of its predicator: "If a particular frame element role is lexically omissible under a particular interpretation (either anaphoric or existential) for one LU [lexical unit] in a frame, then for any other LUs in the same frame that allow the omission of this same FE [frame element], the interpretation of the missing FE is the same." (Ruppenhofer and Michaelis 2014: 66)
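The frame-and-frame-element structure described above can be sketched as a simple data model. The following is a hypothetical illustration only; the frame name, element names, and the `annotate` helper are invented for the example and do not reflect FrameNet's actual schema or API:

```python
# A minimal, hypothetical sketch of frames and frame elements
# (illustrative only; not FrameNet's real data model).

# A frame groups the participants (frame elements) that a word evokes.
commerce_buy = {
    "frame": "Commerce_buy",
    "evoking_words": ["buy", "purchase", "acquire"],
    "frame_elements": ["Buyer", "Seller", "Goods", "Money"],
}

def annotate(frame, realized):
    """Pair each frame element with its text span, or None if absent.

    A frame element mapped to None is a candidate for null-instantiation
    tagging: the frame expects it, but the sentence leaves it unexpressed.
    """
    return {fe: realized.get(fe) for fe in frame["frame_elements"]}

# "Kim bought a bike." -- Seller and Money go unexpressed.
tags = annotate(commerce_buy, {"Buyer": "Kim", "Goods": "a bike"})
absent = [fe for fe, span in tags.items() if span is None]
print(absent)  # ['Seller', 'Money']
```

Representing absent frame elements explicitly, rather than simply leaving them out, is what makes it possible to attach null-instantiation tags to them later.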



Definite null instantiation

Definite null instantiation is the absence of a frame element that is recoverable from the linguistic or discourse context, as in "She found out", where what was found out is understood from context. It is similar to a zero anaphor.

Indefinite null instantiation

Indefinite null instantiation is the absence of the object of a potentially transitive verb such as eat or drink, where the omitted object receives a nonspecific, existential interpretation, as in "We already ate".


Constructional null instantiation

Constructional null instantiation is the absence of a frame element due to a syntactic construction, e.g. the optional omission of agents in passive sentences.
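The three categories above can be contrasted in a small sketch. This is a hypothetical tagger written for illustration; the function, its parameters, and the example sentences are invented here and are not FrameNet's actual annotation scheme:

```python
# Hypothetical classifier for the three null-instantiation types
# discussed above (a sketch; not FrameNet's real annotation labels).

DNI = "Definite null instantiation"        # missing but recoverable from context
INI = "Indefinite null instantiation"      # nonspecific omitted object of e.g. "eat"
CNI = "Constructional null instantiation"  # licensed by a construction, e.g. passive

def classify_omission(recoverable_from_context, licensed_by_construction):
    """Pick a null-instantiation tag for an unexpressed frame element."""
    if licensed_by_construction:
        return CNI
    return DNI if recoverable_from_context else INI

# "The window was broken."  -> agent omitted by the passive construction
# "We already ate."         -> unspecified, existential object
# "She found out."          -> content understood from the prior discourse
examples = [
    ("The window was broken.", classify_omission(False, True)),
    ("We already ate.", classify_omission(False, False)),
    ("She found out.", classify_omission(True, False)),
]
for sentence, tag in examples:
    print(f"{sentence} -> {tag}")
```

The key distinction the sketch encodes is that CNI is licensed by the grammar (a construction such as the passive), while DNI and INI differ only in whether the missing element is interpreted anaphorically or existentially.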


References

  1. Fillmore, Charles J.; Johnson, Christopher R.; and Petruck, Miriam R. L. (2003). Background to FrameNet. International Journal of Lexicography 16(3): 235–250.
  2. Fillmore, Charles J. (1986). Pragmatically Controlled Zero Anaphora. Proceedings of the Twelfth Annual Meeting of the Berkeley Linguistics Society.
  3. Ruppenhofer, Josef and Michaelis, Laura A. (2014). Frames and the Interpretation of Omitted Arguments in English. In S. Katz Bourns and L. Myers (eds.), Linguistic Perspectives on Structure and Context: Studies in Honor of Knud Lambrecht. Amsterdam: Benjamins. 57–86.
Specific
  1. Ruppenhofer and Michaelis 2014