Coherence in linguistics is what makes a text semantically meaningful. It is especially dealt with in text linguistics. Coherence is achieved through syntactic features such as the use of deictic, anaphoric and cataphoric elements or a logical tense structure, and semantic features such as presuppositions and implications connected to general world knowledge.
Robert De Beaugrande and Wolfgang U. Dressler define coherence as a "continuity of senses" and "the mutual access and relevance within a configuration of concepts and relations".[1] A textual world is thereby created that does not have to conform to the real world. But within this textual world the arguments also have to be connected logically so that the reader/hearer can produce coherence.
"Continuity of senses" implies a link between cohesion and the theory of Schemata initially proposed by F. C. Bartlett in 1932 [2] [3] which creates further implications for the notion of a "text". Schemata, subsequently distinguished into Formal and Content Schemata (in the field of TESOL [4] ) are the ways in which the world is organized in our minds. In other words, they are mental frameworks for the organization of information about the world. It can thus be assumed that a text is not always one because the existence of coherence is not always a given. On the contrary, coherence is relevant because of its dependence upon each individual's content and formal schemata.
A concept is an abstract idea that serves as a foundation for more concrete principles, thoughts, and beliefs. Concepts play an important role in all aspects of cognition. As such, concepts are studied within such disciplines as linguistics, psychology, and philosophy, and these disciplines are interested in the logical and psychological structure of concepts, and how they are put together to form thoughts and sentences. The study of concepts has served as an important flagship of an emerging interdisciplinary approach, cognitive science.
Natural language processing (NLP) is a subfield of computer science and especially artificial intelligence. It is primarily concerned with providing computers with the ability to process data encoded in natural language and is thus closely related to information retrieval, knowledge representation and computational linguistics, a subfield of linguistics. Typically, data is collected in text corpora and processed using rule-based, statistical or neural approaches from machine learning and deep learning.
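As a minimal illustration (a toy sketch in plain Python, not the method of any particular NLP system; the corpus and regular expression are invented for the example), a rule-based step such as regular-expression tokenization can feed a simple statistical step such as estimating word frequencies over a small corpus:

```python
import re
from collections import Counter

# Rule-based step: tokenize with a hand-written regular expression.
def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z]+", text.lower())

# Statistical step: estimate unigram relative frequencies from a toy corpus.
corpus = [
    "The reader builds a coherent model of the text.",
    "Coherence depends on the reader's knowledge of the world.",
]
counts = Counter(token for sentence in corpus for token in tokenize(sentence))
total = sum(counts.values())
unigram_probs = {word: count / total for word, count in counts.items()}

print(unigram_probs["the"])  # relative frequency of "the" in the toy corpus
```

Real systems differ mainly in scale and in the model used for the statistical or neural step, but the division into corpus collection, preprocessing, and modelling follows the same outline.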
Natural language understanding (NLU) or natural language interpretation (NLI) is a subset of natural language processing in artificial intelligence that deals with machine reading comprehension. NLU has been considered an AI-hard problem.
Wolfgang Iser was a German literary scholar.
Readability is the ease with which a reader can understand a written text. The concept exists in both natural language and programming languages, though in different forms. In natural language, the readability of text depends on its content and its presentation. In programming, things such as programmer comments, choice of loop structure, and choice of names can determine the ease with which humans can read computer program code.
In psychology and cognitive science, a schema describes a pattern of thought or behavior that organizes categories of information and the relationships among them. It can also be described as a mental structure of preconceived ideas, a framework representing some aspect of the world, or a system of organizing and perceiving new information, such as a mental schema or conceptual model. Schemata influence attention and the absorption of new knowledge: people are more likely to notice things that fit into their schema, while re-interpreting contradictions to the schema as exceptions or distorting them to fit. Schemata have a tendency to remain unchanged, even in the face of contradictory information. Schemata can help in understanding the world and the rapidly changing environment. People can organize new perceptions into schemata quickly, since most situations do not require complex thought: when a schema applies, automatic thought is all that is required.
Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore only describe the world as people conceive of it. The implicit claim is that different linguistic communities conceive of simple things and processes in the world differently, not necessarily that there is some difference between a person's conceptual world and the real world.
Text linguistics is a branch of linguistics that deals with texts as communication systems. Its original aims lay in uncovering and describing text grammars. The application of text linguistics has, however, evolved from this approach to a point in which text is viewed in much broader terms that go beyond a mere extension of traditional grammar towards an entire text. Text linguistics takes into account not only the form of a text but also its setting, i.e. the way in which it is situated in an interactional, communicative context. Both the author of a text and its addressee are taken into consideration in their respective roles in the specific communicative context. In general, it is an application of discourse analysis at the much broader level of text, rather than just a sentence or word.
In philosophy—more specifically, in its sub-fields semantics, semiotics, philosophy of language, metaphysics, and metasemantics—meaning "is a relationship between two sorts of things: signs and the kinds of things they intend, express, or signify".
A sequence of semantically related words is classified as a lexical chain: a series of related words in writing, spanning a narrow or wide context window. A lexical chain is independent of the grammatical structure of the text; in effect, it is a list of words that captures a portion of the cohesive structure of the text. A lexical chain can provide a context for the resolution of an ambiguous term and enable disambiguation of the concepts that the term represents.
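A minimal sketch of how lexical chains might be built (assuming Python with NLTK's WordNet interface; the similarity threshold and greedy grouping heuristic are illustrative assumptions, not a standard algorithm):

```python
# Illustrative lexical-chain builder over a flat list of content words.
# Assumes NLTK and its WordNet data are installed:
#   pip install nltk && python -m nltk.downloader wordnet
from nltk.corpus import wordnet as wn


def related(word_a: str, word_b: str, threshold: float = 0.2) -> bool:
    """Treat two words as related if any pair of their noun senses
    exceeds a (heuristic) WordNet path-similarity threshold."""
    for syn_a in wn.synsets(word_a, pos=wn.NOUN):
        for syn_b in wn.synsets(word_b, pos=wn.NOUN):
            sim = syn_a.path_similarity(syn_b)
            if sim is not None and sim >= threshold:
                return True
    return False


def lexical_chains(words: list[str]) -> list[list[str]]:
    """Greedily add each word to the first chain containing a related word,
    or start a new chain; grammatical structure is ignored entirely."""
    chains: list[list[str]] = []
    for word in words:
        for chain in chains:
            if any(related(word, member) for member in chain):
                chain.append(word)
                break
        else:
            chains.append([word])
    return chains


if __name__ == "__main__":
    tokens = ["rome", "city", "capital", "banana", "fruit", "apple"]
    print(lexical_chains(tokens))
    # Expected grouping (roughly): place-related words in one chain, fruit words in another.
```

Because each chain groups words by sense relatedness rather than syntax, the chain a word lands in can serve as a rough context for choosing among its possible senses.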
Wolfgang U. Dressler is an Austrian professor of linguistics at the University of Vienna. Dressler is a polyglot and scholar who has contributed to various fields of linguistics, especially phonology, morphology, text linguistics, clinical linguistics and child language development. He is an important representative of the 'naturalness theory'.
A structure editor, also structured editor or projectional editor, is any document editor that is cognizant of the document's underlying structure. Structure editors can be used to edit hierarchical or marked up text, computer programs, diagrams, chemical formulas, and any other type of content with clear and well-defined structure. In contrast, a text editor is any document editor used for editing plain text files.
A discourse relation is a description of how two segments of discourse are logically and/or structurally connected to one another.
Margaret Masterman was a British linguist and philosopher, most known for her pioneering work in the field of computational linguistics and especially machine translation. She founded the Cambridge Language Research Unit.
Michael Hoey was a British linguist and Baines Professor of English Language. He lectured in applied linguistics in over 40 countries.
Ruqaiya Hasan was a professor of linguistics who held visiting positions and taught at various universities in England. Her last appointment was at Macquarie University in Sydney, from which she retired as emeritus professor in 1994. Throughout her career she researched and published widely in the areas of verbal art, culture, context and text, text and texture, lexicogrammar and semantic variation. The latter involved the devising of extensive semantic system networks for the analysis of meaning in naturally occurring dialogues.
The term metafunction originates in systemic functional linguistics and is considered to be a property of all languages. Systemic functional linguistics is functional and semantic rather than formal and syntactic in its orientation. As a functional linguistic theory, it claims that both the emergence of grammar and the particular forms that grammars take should be explained "in terms of the functions that language evolved to serve". While languages vary in how and what they do, and what humans do with them in the contexts of human cultural practice, all languages are considered to be shaped and organised in relation to three functions, or metafunctions. Michael Halliday, the founder of systemic functional linguistics, calls these three functions the ideational, interpersonal, and textual. The ideational function is further divided into the experiential and logical.
The following outline is provided as an overview of and topical guide to natural-language processing:
Danielle S. McNamara is an educational researcher known for her theoretical and empirical work with reading comprehension and the development of game-based literacy technologies. She is professor of psychology and senior research scientist at Arizona State University. She has previously held positions at University of Memphis, Old Dominion University, and University of Colorado, Boulder.
Usage-based linguistics is a linguistics approach within a broader functional/cognitive framework that emerged in the late 1980s and assumes a profound relation between linguistic structure and usage. It challenges the dominant focus, in 20th-century linguistics, on considering language as an isolated system removed from its use in human interaction and human cognition. Rather, usage-based models posit that linguistic information is expressed via context-sensitive mental processing and mental representations, which have the cognitive ability to succinctly account for the complexity of actual language use at all levels. Broadly speaking, a usage-based model of language accounts for language acquisition and processing, synchronic and diachronic patterns, and both low-level and high-level structure in language, by looking at actual language use.