Frame semantics (linguistics)

Frame semantics is a theory of linguistic meaning developed by Charles J. Fillmore [1] that extends his earlier case grammar. It relates linguistic semantics to encyclopedic knowledge. The basic idea is that one cannot understand the meaning of a single word without access to all the essential knowledge that relates to that word. For example, one cannot understand the word "sell" without knowing something about the situation of commercial transfer, which involves, among other things, a seller, a buyer, goods, money, the relation between the money and the goods, the relations between the seller and the goods and the money, the relations between the buyer and the goods and the money, and so on. A word thus activates, or evokes, a frame of semantic knowledge relating to the specific concept to which it refers (or highlights, in frame-semantic terminology).

The idea of the encyclopedic organisation of knowledge itself is old and was discussed by Age of Enlightenment philosophers such as Denis Diderot [2] and Giambattista Vico. [3] Fillmore and other evolutionary and cognitive linguists, such as John Haiman and Adele Goldberg, use it to argue against generative grammar and truth-conditional semantics. As is fundamental to Lakoffian–Langackerian cognitive linguistics, they claim that knowledge of language is no different from other types of knowledge; there is therefore no grammar in the traditional sense, and language is not an independent cognitive function. [4] Instead, the spreading and survival of linguistic units is directly comparable to that of other units of cultural evolution, as in memetics and other cultural replicator theories. [5] [6] [7]

Use in cognitive linguistics and construction grammar

The theory applies the notion of a semantic frame also used in artificial intelligence: a collection of facts that specify "characteristic features, attributes, and functions of a denotatum, and its characteristic interactions with things necessarily or typically associated with it." [8] A semantic frame can also be defined as a coherent structure of related concepts, such that without knowledge of all of them one does not have complete knowledge of any one of them; frames are in that sense a type of gestalt. Frames are based on recurring experiences; the commercial transaction frame, for example, is based on recurring experiences of commercial transactions.
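
To make the frame notion concrete, the following is a minimal Python sketch of a semantic frame treated as an AI-style data structure. The class and the frame's elements and relations follow the commercial transaction example from the text; the specific names are illustrative assumptions, not taken from Fillmore or any lexical resource.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """A semantic frame: a named bundle of participant roles (frame
    elements) and the facts that hold among them whenever the frame
    applies."""
    name: str
    elements: list[str]                                  # participant roles
    relations: list[str] = field(default_factory=list)   # facts among roles

# Illustrative commercial transaction frame from the running example.
COMMERCIAL_TRANSACTION = Frame(
    name="Commercial_transaction",
    elements=["Buyer", "Seller", "Goods", "Money"],
    relations=[
        "Seller transfers Goods to Buyer",
        "Buyer transfers Money to Seller",
        "Money is the price of Goods",
    ],
)
```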

Words not only highlight individual concepts, but also specify a certain perspective from which the frame is viewed. For example, "sell" views the situation from the perspective of the seller and "buy" from the perspective of the buyer. This, according to Fillmore, explains the observed asymmetries in many lexical relations.
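
The perspective idea can be added to the same sketch: a word (a lexical unit) pairs a lemma with the frame it evokes and the frame element whose viewpoint it takes. Again, the names below are hypothetical and chosen only for exposition.

```python
from dataclasses import dataclass

@dataclass
class LexicalUnit:
    """A word sense, the frame it evokes, and the frame element whose
    viewpoint it takes."""
    lemma: str
    frame_name: str
    perspective: str

# "sell" and "buy" evoke the same frame from opposite viewpoints.
sell = LexicalUnit("sell", "Commercial_transaction", perspective="Seller")
buy = LexicalUnit("buy", "Commercial_transaction", perspective="Buyer")

assert sell.frame_name == buy.frame_name    # same scene ...
assert sell.perspective != buy.perspective  # ... viewed differently
```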

Though originally applied only to lexemes, frame semantics has since been extended to grammatical constructions and other larger, more complex linguistic units, and has more or less been integrated into construction grammar as its main semantic principle. Semantic frames are also increasingly used in information modeling, for example in Gellish, especially in the form of 'definition models' and 'knowledge models'.

Frame semantics has much in common with the semantic principle of profiling from Ronald W. Langacker's cognitive grammar. [9]

The concept of frames has been taken up repeatedly in philosophy and psycholinguistics, notably by Lawrence W. Barsalou [10] and, more recently, by Sebastian Löbner. [11] On this view, frames are a cognitive representation of the real world. From a computational-linguistics viewpoint, frames also serve as semantic models of whole sentences. This approach, which goes beyond the purely lexical level, is studied in particular at the collaborative research centre SFB 991 in Düsseldorf.

Applications

Google started a frame-semantic parser project, SLING, that aims to parse the information in Wikipedia and transfer it into Wikidata by inferring relevant relations using artificial intelligence. [12]
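
As a rough illustration of what such a parser produces (this is not SLING's actual API; the parse structure, frame label, and property name below are hypothetical), a frame-semantic parse assigns a sentence a frame and role fillers, from which a Wikidata-style statement can then be read off:

```python
# Hypothetical output of a frame-semantic parser for one sentence.
parse = {
    "sentence": "John sold a car to Mary.",
    "frame": "Commerce_sell",
    "roles": {"Seller": "John", "Goods": "a car", "Buyer": "Mary"},
}

def to_statement(p: dict) -> tuple[str, str, str]:
    """Read a subject-property-object statement off the frame roles."""
    return (p["roles"]["Seller"], "sold_to", p["roles"]["Buyer"])

print(to_statement(parse))  # -> ('John', 'sold_to', 'Mary')
```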

See also

Outline of linguistics
Natural language processing
Syntax
Semantics
Cognitive linguistics
Natural-language understanding
Lexical semantics
Head-driven phrase structure grammar
Construction grammar
Cognitive semantics
Charles J. Fillmore
FrameNet
Linguistic competence
Case grammar
Null instantiation
Distributional semantics
Thematic relation
Janet Dean Fodor
Linguistics
Subcategorization

References

  1. Fillmore, Charles J.; Baker, Collin F. (2001). "Frame semantics for text understanding". Proceedings of WordNet and Other Lexical Resources Workshop, NAACL.
  2. d'Alembert, J. L. R. (1995). Preliminary Discourse to the Encyclopedia of Diderot. University of Chicago Press. ISBN 978-0024074003.
  3. Mazzotta, Giuseppe (2014). The New Map of the World: The Poetic Philosophy of Giambattista Vico. Princeton University Press. ISBN 9781400864997.
  4. Dirven, René (2010). "Cognitive linguistics". In Malmkjaer, Kirsten (ed.). The Routledge Linguistics Encyclopedia (3rd ed.). Routledge. pp. 61–68. ISBN 978-0-203-87495-0. Retrieved 2020-06-15.
  5. Kirby, Simon (2013). "Transitions: the evolution of linguistic replicators". In Binder; Smith (eds.). The Language Phenomenon. Springer. pp. 121–138. doi:10.1007/978-3-642-36086-2_6. Retrieved 2020-03-04.
  6. Zehentner, Eva (2019). Competition in Language Change: The Rise of the English Dative Alternation. De Gruyter Mouton. ISBN 978-3-11-063385-6.
  7. MacWhinney, Brian (2015). "Introduction – language emergence". In MacWhinney, Brian; O'Grady, William (eds.). Handbook of Language Emergence. Wiley. pp. 1–31. ISBN 9781118346136.
  8. Allan, Keith (2001). Natural Language Semantics. Oxford: Blackwell Publishers. p. 251. ISBN 0-631-19296-4.
  9. Cruse, Alan (2004). Meaning in Language: An Introduction to Semantics and Pragmatics (2nd ed.). New York: Oxford University Press. pp. 137f. ISBN 978-0-19-926306-6.
  10. Barsalou, Lawrence W. (1992). "Frames, concepts, and conceptual fields". In Lehrer, Adrienne; Kittay, Eva Feder (eds.). Frames, Fields, and Contrasts. Hillsdale: Lawrence Erlbaum Associates. pp. 21–74.
  11. Löbner, Sebastian (2014). "Evidence for Frames from Human Language". In Gamerschlag, Thomas; Gerland, Doris; Osswald, Rainer; Petersen, Wiebke (eds.). Frames and Concept Types. Dordrecht: Springer. pp. 23–67.
  12. SLING – A natural language frame semantics parser. Google. 2021-11-14. Retrieved 2021-11-14.