Coherence (linguistics)

Coherence in linguistics is what makes a text semantically meaningful. It is studied especially in text linguistics. Coherence is achieved through syntactic features, such as the use of deictic, anaphoric and cataphoric elements or a logical tense structure, and through semantic features, such as presuppositions and implications connected to general world knowledge.

The purely linguistic elements that make a text coherent are encompassed under the term cohesion. However, text-based features which provide cohesion in a text do not necessarily help achieve coherence; that is, they do not always contribute to the meaningfulness of a text. A sequence such as "My car is red. Red is a colour. Colours are perceived by the eye." is perfectly cohesive, yet it hardly adds up to a meaningful whole. It has been stated that a text coheres only if the world around it is also coherent.

Robert De Beaugrande and Wolfgang U. Dressler define coherence as a "continuity of senses" and "the mutual access and relevance within a configuration of concepts and relations". [1] A textual world is thereby created that does not have to conform to the real world. Within this textual world, however, the arguments must be connected logically so that the reader or hearer can produce coherence.

"Continuity of senses" implies a link between cohesion and the theory of Schemata initially proposed by F. C. Bartlett in 1932 [2] [3] which creates further implications for the notion of a "text". Schemata, subsequently distinguished into Formal and Content Schemata (in the field of TESOL [4] ) are the ways in which the world is organized in our minds. In other words, they are mental frameworks for the organization of information about the world. It can thus be assumed that a text is not always one because the existence of coherence is not always a given. On the contrary, coherence is relevant because of its dependence upon each individual's content and formal schemata.

Sources

  1. De Beaugrande, Robert and Dressler, Wolfgang (1996). Introduction to Text Linguistics. New York. pp. 84–112.
  2. Bartlett, F.C. (1932). Remembering: A study in experimental and social psychology. Cambridge: Cambridge University Press.
  3. Wagoner, Brady. "Culture and mind in reconstruction: Bartlett's analogy between individual and group processes". Aalborg University, Denmark.
  4. Carrell, P.L. and Eisterhold, J.C. (1983) "Schema Theory and ESL Reading Pedagogy", in Carrell, P.L., Devine, J. and Eskey, D.E. (eds) (1988) Interactive Approaches to Second Language Reading. Cambridge: CUP.

Related Research Articles

Natural language processing (NLP) is an interdisciplinary subfield of computer science and linguistics. It is primarily concerned with giving computers the ability to support and manipulate human language. It involves processing natural language datasets, such as text corpora or speech corpora, using either rule-based or probabilistic machine learning approaches. The goal is a computer capable of "understanding" the contents of documents, including the contextual nuances of the language within them. To this end, natural language processing often borrows ideas from theoretical linguistics. The technology can then extract the information and insights contained in the documents, as well as categorize and organize the documents themselves.

Word-sense disambiguation (WSD) is the process of identifying which sense of a word is meant in a sentence or other segment of context. In human language processing and cognition, it is usually subconscious and automatic, but it can come to conscious attention when ambiguity impairs clarity of communication, given the pervasive polysemy in natural language. In computational linguistics, it is an open problem that affects other tasks, such as discourse analysis, improving the relevance of search engines, anaphora resolution, coherence, and inference.
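
To make the task concrete, the following is a minimal sketch of the simplified Lesk algorithm, one classic approach to word-sense disambiguation: each candidate sense is scored by the word overlap between its dictionary gloss and the context surrounding the target. The tiny sense inventory for "bank" is invented for illustration; a real system would draw glosses from a lexical resource such as WordNet.

    # Simplified Lesk: score each candidate sense by the overlap between
    # its dictionary gloss and the words surrounding the target word.
    # The toy sense inventory for "bank" is invented for illustration.
    TOY_SENSES = {
        "bank": {
            "financial_institution": "an institution that accepts deposits and lends money",
            "river_bank": "the sloping land alongside a river or stream",
        }
    }

    def lesk(word, context):
        """Return the sense of `word` whose gloss shares the most words with `context`."""
        context_words = set(context.lower().split())
        best_sense, best_overlap = None, -1
        for sense, gloss in TOY_SENSES[word].items():
            overlap = len(context_words & set(gloss.lower().split()))
            if overlap > best_overlap:
                best_sense, best_overlap = sense, overlap
        return best_sense

    print(lesk("bank", "he sat on the bank of the river and watched the water"))
    # -> river_bank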

Wiktionary – Multilingual online dictionary

Wiktionary is a multilingual, web-based project to create a free content dictionary of terms in all natural languages and in a number of artificial languages. These entries may contain definitions, images for illustration, pronunciations, etymologies, inflections, usage examples, quotations, related terms, and translations of terms into other languages, among other features. It is collaboratively edited via a wiki. Its name is a portmanteau of the words wiki and dictionary. It is available in 192 languages and in Simple English. Like its sister project Wikipedia, Wiktionary is run by the Wikimedia Foundation, and is written collaboratively by volunteers, dubbed "Wiktionarians". Its wiki software, MediaWiki, allows almost anyone with access to the website to create and edit entries.

Coherence is, in general, a state or situation in which all the parts or ideas fit together well so that they form a united whole.

Stylistics, a branch of applied linguistics, is the study and interpretation of texts of all types, but particularly literary texts, and/or spoken language in regard to their linguistic and tonal style, where style is the particular variety of language used by different individuals and/or in different situations or settings. For example, the vernacular, or everyday language, may be used among casual friends, whereas more formal language, with respect to grammar, pronunciation or accent, and lexicon or choice of words, is often used in a cover letter and résumé and while speaking during a job interview.

Readability is the ease with which a reader can understand a written text. The concept exists both in natural language and in programming languages, though in different forms. In natural language, the readability of text depends on its content and its presentation. In programming, things such as programmer comments, choice of loop structure, and choice of names can determine the ease with which humans can read computer program code.
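
As an illustration of how readability is quantified for natural language, the sketch below computes the Flesch reading-ease score, one widely used formula: 206.835 - 1.015 * (words per sentence) - 84.6 * (syllables per word). The syllable counter here is a crude vowel-group heuristic rather than a dictionary lookup, so the scores are approximate.

    import re

    def count_syllables(word):
        # Count runs of vowels as one syllable each (a crude approximation).
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        # 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

    print(round(flesch_reading_ease("The cat sat on the mat. It was warm."), 1))
    # Higher scores indicate easier text; this toy sentence scores very high.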

In psychology and cognitive science, a schema describes a pattern of thought or behavior that organizes categories of information and the relationships among them. It can also be described as a mental structure of preconceived ideas, a framework representing some aspect of the world, or a system of organizing and perceiving new information, such as a mental schema or conceptual model. Schemata influence attention and the absorption of new knowledge: people are more likely to notice things that fit into their schema, while re-interpreting contradictions to the schema as exceptions or distorting them to fit. Schemata have a tendency to remain unchanged, even in the face of contradictory information. Schemata can help in understanding the world and the rapidly changing environment. People can organize new perceptions into schemata quickly, as most situations do not require complex thought: applying a schema automatically is usually all that is required.

In sociolinguistics, a register is a variety of language used for a particular purpose or particular communicative situation. For example, when speaking officially or in a public setting, an English speaker may be more likely to follow prescriptive norms for formal usage than in a casual setting: pronouncing words ending in -ing with a velar nasal instead of an alveolar nasal, choosing words that are considered more "formal", and refraining from using words considered nonstandard, such as ain't and y'all.

Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore only describe the world as people conceive of it. The implication is that different linguistic communities conceive of simple things and processes in the world differently, not that there is necessarily some difference between a person's conceptual world and the real world.

Text linguistics is a branch of linguistics that deals with texts as communication systems. Its original aims lay in uncovering and describing text grammars. The application of text linguistics has, however, evolved from this approach to one in which the text is viewed in much broader terms that go beyond a mere extension of traditional grammar to an entire text. Text linguistics takes into account the form of a text, but also its setting, i.e., the way in which it is situated in an interactional, communicative context. Both the author of a text and its addressee are taken into consideration in their respective roles in the specific communicative context. In general, it is an application of discourse analysis at the much broader level of text, rather than just a sentence or word.

In philosophy—more specifically, in its sub-fields semantics, semiotics, philosophy of language, metaphysics, and metasemantics—meaning "is a relationship between two sorts of things: signs and the kinds of things they intend, express, or signify".

A lexical chain is a sequence of related words in writing, spanning a narrow or wide context window. A lexical chain is independent of the grammatical structure of the text; in effect, it is a list of words that captures a portion of the cohesive structure of the text. A lexical chain can provide a context for the resolution of an ambiguous term and enable disambiguation of the concepts that the term represents.
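
A naive chaining procedure makes this concrete: scan the candidate words in order and attach each one to the first existing chain that already contains a related word, otherwise start a new chain. In the sketch below, the relatedness table is hand-made for illustration; a real implementation would consult a thesaurus or a lexical resource such as WordNet.

    # Hand-made relatedness table for illustration only; a real system
    # would consult a lexical resource such as WordNet.
    RELATED = [
        frozenset({"car", "wheel", "engine", "drive"}),
        frozenset({"money", "bank", "loan", "deposit"}),
    ]

    def related(a, b):
        return any(a in group and b in group for group in RELATED)

    def lexical_chains(words):
        # Attach each word to the first chain containing a related word,
        # otherwise start a new chain.
        chains = []
        for w in words:
            for chain in chains:
                if any(related(w, other) for other in chain):
                    chain.append(w)
                    break
            else:
                chains.append([w])
        return chains

    content_words = ["car", "drive", "engine", "broken"]
    print(lexical_chains(content_words))
    # -> [['car', 'drive', 'engine'], ['broken']]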

A structure editor, also structured editor or projectional editor, is any document editor that is cognizant of the document's underlying structure. Structure editors can be used to edit hierarchical or marked up text, computer programs, diagrams, chemical formulas, and any other type of content with clear and well-defined structure. In contrast, a text editor is any document editor used for editing plain text files.

Yorick Wilks – British computer scientist (1939–2023)

Yorick Alexander Wilks FBCS was a British computer scientist. He was an emeritus professor of artificial intelligence at the University of Sheffield, visiting professor of artificial intelligence at Gresham College, senior research fellow at the Oxford Internet Institute, senior scientist at the Florida Institute for Human and Machine Cognition, and a member of the Epiphany Philosophers.

A discourse relation is a description of how two segments of discourse are logically and/or structurally connected to one another.

Margaret Masterman was a British linguist and philosopher, best known for her pioneering work in the field of computational linguistics and especially machine translation. She founded the Cambridge Language Research Unit.

In statistics and natural language processing, a topic model is a type of statistical model for discovering the abstract "topics" that occur in a collection of documents. Topic modeling is a frequently used text-mining tool for discovery of hidden semantic structures in a text body. Intuitively, given that a document is about a particular topic, one would expect particular words to appear in the document more or less frequently: "dog" and "bone" will appear more often in documents about dogs, "cat" and "meow" will appear in documents about cats, and "the" and "is" will appear approximately equally in both. A document typically concerns multiple topics in different proportions; thus, in a document that is 10% about cats and 90% about dogs, there would probably be about 9 times more dog words than cat words. The "topics" produced by topic modeling techniques are clusters of similar words. A topic model captures this intuition in a mathematical framework, which allows examining a set of documents and discovering, based on the statistics of the words in each, what the topics might be and what each document's balance of topics is.
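
As a brief illustration, the sketch below fits a two-topic model on a toy corpus using latent Dirichlet allocation (one common topic-modeling algorithm) as implemented in scikit-learn. The four documents are invented, and a real model would need far more text to produce stable topics.

    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    docs = [
        "the dog chased the bone and the dog barked",
        "my dog buried a bone in the garden",
        "the cat said meow and the cat purred",
        "a cat will meow when it wants food",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    counts = vectorizer.fit_transform(docs)  # document-term count matrix

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topics = lda.fit_transform(counts)  # per-document topic proportions

    words = vectorizer.get_feature_names_out()
    for i, topic in enumerate(lda.components_):
        top = [words[j] for j in topic.argsort()[-3:][::-1]]
        print(f"topic {i}: {top}")
    print(doc_topics.round(2))  # each row sums to 1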

Michael Hoey (linguist) – British linguist (1948–2021)

Michael Hoey was a British linguist and Baines Professor of English Language. He lectured in applied linguistics in over 40 countries.

Ruqaiya Hasan was a professor of linguistics who held visiting positions and taught at various universities in England. Her last appointment was at Macquarie University in Sydney, from which she retired as emeritus professor in 1994. Throughout her career she researched and published widely in the areas of verbal art, culture, context and text, text and texture, lexicogrammar and semantic variation. The latter involved the devising of extensive semantic system networks for the analysis of meaning in naturally occurring dialogues.

The term metafunction originates in systemic functional linguistics and is considered to be a property of all languages. Systemic functional linguistics is functional and semantic rather than formal and syntactic in its orientation. As a functional linguistic theory, it claims that both the emergence of grammar and the particular forms that grammars take should be explained "in terms of the functions that language evolved to serve". While languages vary in how and what they do, and what humans do with them in the contexts of human cultural practice, all languages are considered to be shaped and organised in relation to three functions, or metafunctions. Michael Halliday, the founder of systemic functional linguistics, calls these three functions the ideational, interpersonal, and textual. The ideational function is further divided into the experiential and logical.