In semantics, contrast is a relationship between two discourse segments.
Contrast is often overtly marked by connectives such as but or however, as in the following examples:

(1) It's raining, but I am not taking an umbrella.

(2) We are giving a party for the new students. However, there will be no drinks.
In example (1), the first clause, "It's raining", implies that the speaker knows the weather situation and so will prepare for it, while the second clause, "I am not taking an umbrella", implies that the speaker will still get wet. Both clauses (or discourse segments) refer to related situations, or themes, yet imply a contradiction. It is this relationship of comparing something similar, yet different, that is believed to be typical of contrastive relations. The same type of relationship is shown in (2), where the first sentence can be interpreted as implying that, by giving a party for the new students, the hosts will serve drinks. This is, of course, a defeasible inference based on world knowledge, which is then contradicted by the following sentence.
Most studies of contrast and contrastive relations in semantics have concentrated on characterizing exactly which semantic relationships can give rise to contrast. The earliest studies in semantics concentrated in particular on identifying what distinguishes clauses joined by and from clauses joined by but.
In discourse theory and computational discourse, contrast is a major discourse relation, on a par with relations such as explanation or narration, and work has concentrated on identifying contrast in naturally produced texts, especially in cases where the contrast is not explicitly marked.
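Much of that work starts from a simple marker-based baseline and then tries to go beyond it. The following Python sketch is only an illustration of such a baseline, under the assumption of a small, hand-picked marker list (the function name and marker inventory are hypothetical, not any particular system's method); by construction it misses exactly the unmarked cases that the research targets.

    # Illustrative baseline: flag contrast that is overtly marked on the second segment.
    # The marker inventory is an assumption, not an exhaustive list.
    CONTRAST_MARKERS = {"but", "however", "yet", "although", "whereas", "nevertheless"}

    def marked_contrast(segment_a: str, segment_b: str) -> bool:
        """Return True if the second discourse segment opens with an overt contrast marker."""
        words = segment_b.strip().lower().split()
        if not words:
            return False
        return words[0].strip(',.;:"') in CONTRAST_MARKERS

    print(marked_contrast("It's raining.", "However, I am not taking an umbrella."))  # True
    print(marked_contrast("It's raining.", "I am not taking an umbrella."))           # False: unmarked contrast is missed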
In morphology, 'contrast' is identified when two linguistic elements occur in the same environment(s) and replacing one with the other creates a difference in meaning.[1] Two elements that contrast in identical environments form a minimal pair; a classic example is English pin versus bin, where substituting one initial sound for the other changes the meaning.
Pragmatics is a subfield of linguistics and semiotics that studies how context contributes to meaning. Pragmatics encompasses speech act theory, conversational implicature, talk in interaction, and other approaches to language behavior in philosophy, sociology, linguistics and anthropology. Unlike semantics, which examines meaning that is conventional or "coded" in a given language, pragmatics studies how the transmission of meaning depends not only on the structural and linguistic knowledge of the speaker and listener but also on the context of the utterance, any pre-existing knowledge about those involved, the inferred intent of the speaker, and other factors. In that respect, pragmatics explains how language users are able to overcome apparent ambiguity, since meaning relies on the manner, place, time, etc. of an utterance.
In linguistics, deixis is the use of general words and phrases to refer to a specific time, place, or person in context, e.g., the words tomorrow, there, and they. Words are deictic if their semantic meaning is fixed but their denoted meaning varies depending on time and/or place. Words or phrases that require contextual information to be fully understood—for example, English pronouns—are deictic. Deixis is closely related to anaphora. Although discussions of deixis deal primarily with spoken language, the concept is sometimes applied to written language, gestures, and communication media as well. In linguistic anthropology, deixis is treated as a particular subclass of the more general semiotic phenomenon of indexicality, a sign "pointing to" some aspect of its context of occurrence.
The term discourse identifies and describes written and spoken communications. In semantics and discourse analysis, a discourse is a conceptual generalization of conversation. In a field of enquiry and social practice, the discourse is the vocabulary for investigation of the subject, e.g. legal discourse, medical discourse, religious discourse, et cetera. In the works of the philosopher Michel Foucault, a discourse is “an entity of sequences, of signs, in that they are enouncements (énoncés).”
In lexical semantics, opposites are words lying in an inherently incompatible binary relationship. For example, something that is long entails that it is not short. It is referred to as a 'binary' relationship because there are two members in a set of opposites. The relationship between opposites is known as opposition. A member of a pair of opposites can generally be determined by the question "What is the opposite of X?"
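The defining entailment can be written schematically (the predicate names below are only illustrative labels):

$\forall x\,(\mathrm{long}(x) \rightarrow \neg\,\mathrm{short}(x))$

Note that for gradable antonyms like long/short the converse does not hold: something that is not short need not be long.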
Counterfactual conditionals are conditional sentences which discuss what would have been true under different circumstances, e.g. "If Peter believed in ghosts, he would be afraid to be here." Counterfactuals are contrasted with indicatives, which are generally restricted to discussing open possibilities. Counterfactuals are characterized grammatically by their use of fake tense morphology, which some languages use in combination with other kinds of morphology including aspect and mood.
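In formal semantics this contrast is often built into the notation itself; one common convention, due to David Lewis, reserves a boxed-arrow connective for the counterfactual. A schematic sketch, with $p$ standing for "Peter believes in ghosts" and $q$ for "Peter is afraid to be here":

Indicative: $p \rightarrow q$

Counterfactual: $p \mathbin{\Box\!\!\rightarrow} q$ (read: if it were the case that $p$, it would be the case that $q$)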
Conditional sentences are sentences that express one thing as contingent on something else, e.g. "If it rains, the picnic will be cancelled". They are so called because the impact of the main clause of the sentence is conditional on the dependent clause. A full conditional thus contains two clauses: the dependent clause expressing the condition, called the antecedent, and the main clause expressing the consequence, called the consequent. In the example above, "If it rains" is the antecedent and "the picnic will be cancelled" is the consequent.
In grammar, an antecedent is an expression that gives its meaning to a proform. A proform takes its meaning from its antecedent; e.g., "John arrived late because traffic held him up." The pronoun him refers to and takes its meaning from John, so John is the antecedent of him. Proforms usually follow their antecedents, but sometimes they precede them, in which case one is, technically, dealing with postcedents instead of antecedents. The prefix ante- means "before" or "in front of", and post- means "after" or "behind". The term antecedent stems from traditional grammar. The linguistic term that is closely related to antecedent and proform is anaphora. Theories of syntax explore the distinction between antecedents and postcedents in terms of binding.
The subject in a simple English sentence such as John runs, John is a teacher, or John was run over by a car, is the person or thing about whom the statement is made, in this case John. Traditionally the subject is the word or phrase which controls the verb in the clause, that is to say with which the verb agrees. If there is no verb, as in John - what an idiot!, or if the verb has a different subject, as in John - I can't stand him!, then 'John' is not considered to be the grammatical subject, but can be described as the topic of the sentence.
In linguistics, a causative is a valency-increasing operation that indicates that a subject either causes someone or something else to do or be something or causes a change in state of a non-volitional event. Normally, it introduces a new argument, A (the causer), into a transitive clause, with the original subject S becoming the object O.
In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure—the number and type of noun phrases—required syntactically by a particular verb. For example, the verb put requires three arguments.
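As an illustration (the grid below is a standard textbook-style rendering, not a claim about any particular framework), the θ-grid of put can be sketched as

$\textit{put}: \langle \text{Agent}, \text{Theme}, \text{Location} \rangle$

as in "John put the book on the table", where John is the Agent, the book the Theme, and on the table the Location.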
Force dynamics is a semantic category that describes the way in which entities interact with reference to force. Force dynamics gained a good deal of attention in cognitive linguistics due to its claims of psychological plausibility and the elegance with which it generalizes ideas not usually considered in the same context. The semantic category of force dynamics pervades language on several levels. Not only does it apply to expressions in the physical domain like leaning on or dragging, but it also plays an important role in expressions involving psychological forces. Furthermore, the concept of force dynamics can be extended to discourse. For example, the situation in which speakers A and B argue, after which speaker A gives in to speaker B, exhibits a force dynamic pattern.
In linguistics, anaphora is the use of an expression whose interpretation depends upon another expression in context. In a narrower sense, anaphora is the use of an expression that depends specifically upon an antecedent expression and thus is contrasted with cataphora, which is the use of an expression that depends upon a postcedent expression. The anaphoric (referring) term is called an anaphor. For example, in the sentence Sally arrived, but nobody saw her, the pronoun her is an anaphor, referring back to the antecedent Sally. In the sentence Before her arrival, nobody saw Sally, the pronoun her refers forward to the postcedent Sally, so her is now a cataphor. Usually, an anaphoric expression is a proform or some other kind of deictic (contextually-dependent) expression. Both anaphora and cataphora are species of endophora, referring to something mentioned elsewhere in a dialog or text.
In linguistics, focus is a grammatical category that conveys which part of the sentence contributes new, non-derivable, or contrastive information. In the English sentence "Mary only insulted BILL", focus is expressed prosodically by a pitch accent on "Bill", which identifies him as the only person Mary insulted. By contrast, in the sentence "Mary only INSULTED Bill", the verb "insult" is focused and thus expresses that Mary performed no other actions towards Bill. Focus is a cross-linguistic phenomenon and a major topic in linguistics. Research on focus spans numerous subfields including phonetics, syntax, semantics, pragmatics, and sociolinguistics.
In the branch of linguistics known as pragmatics, a presupposition is an implicit assumption about the world or background belief relating to an utterance whose truth is taken for granted in discourse. For example, the sentence "Jane no longer writes fiction" presupposes that Jane once wrote fiction, and the question "Have you stopped eating meat?" presupposes that the addressee previously ate meat.
Text linguistics is a branch of linguistics that deals with texts as communication systems. Its original aims lay in uncovering and describing text grammars. The application of text linguistics has, however, evolved from this approach to a point at which text is viewed in much broader terms that go beyond a mere extension of traditional grammar to an entire text. Text linguistics takes into account the form of a text, but also its setting, i.e. the way in which it is situated in an interactional, communicative context. Both the author of a text and its addressee are considered in their respective roles in the specific communicative context. In general, it is an application of discourse analysis at the much broader level of text, rather than just a sentence or word.
In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Tesnière (1959).
Donkey sentences are sentences that contain a pronoun whose meaning is clear but whose syntactic role in the sentence poses challenges to grammarians; the classic example is "Every farmer who owns a donkey beats it". Such sentences defy straightforward attempts to generate their formal-language equivalents. The difficulty lies in understanding how English speakers parse such sentences.
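The formal difficulty can be seen by attempting a first-order translation of the classic example (a standard textbook illustration). Treating "a donkey" as an ordinary existential leaves the pronoun's variable unbound, while the reading speakers actually get requires universal force over both variables:

Naive: $\forall x\,\big((\mathrm{farmer}(x) \land \exists y\,(\mathrm{donkey}(y) \land \mathrm{owns}(x,y))) \rightarrow \mathrm{beats}(x,y)\big)$, where the final $y$ is free

Intended: $\forall x\,\forall y\,\big((\mathrm{farmer}(x) \land \mathrm{donkey}(y) \land \mathrm{owns}(x,y)) \rightarrow \mathrm{beats}(x,y)\big)$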
In linguistics, information structure, also called information packaging, describes the way in which information is formally packaged within a sentence. This generally includes only those aspects of information that “respond to the temporary state of the addressee’s mind”, and excludes other aspects of linguistic information such as references to background (encyclopedic/common) knowledge, choice of style, politeness, and so forth. For example, the difference between an active clause and a corresponding passive is a syntactic difference, but one motivated by information structuring considerations. Other structures motivated by information structure include preposing and inversion.
Logophoricity is a phenomenon of binding relation that may employ a morphologically different set of anaphoric forms, in the context where the referent is an entity whose speech, thoughts, or feelings are being reported. This entity may or may not be distant from the discourse, but the referent must reside in a clause external to the one in which the logophor resides. The specially-formed anaphors that are morphologically distinct from the typical pronouns of a language are known as logophoric pronouns, originally coined by the linguist Claude Hagège. The linguistic importance of logophoricity is its capability to do away with ambiguity as to who is being referred to. A crucial element of logophoricity is the logophoric context, defined as the environment where use of logophoric pronouns is possible. Several syntactic and semantic accounts have been suggested. While some languages may not be purely logophoric, logophoric context may still be found in those languages; in those cases, it is common to find that in the place where logophoric pronouns would typically occur, non-clause-bounded reflexive pronouns appear instead.
In formal semantics, the scope of a semantic operator is the semantic object to which it applies. For instance, in the sentence "Paulina doesn't drink beer but she does drink wine," the proposition that Paulina drinks beer occurs within the scope of negation, but the proposition that Paulina drinks wine does not. Scope can be thought of as the semantic order of operations.
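The example can be made explicit with a schematic logical form (the predicate and constant names are only illustrative):

$\neg\,\mathrm{drink}(\mathrm{paulina},\mathrm{beer}) \land \mathrm{drink}(\mathrm{paulina},\mathrm{wine})$

The first conjunct lies within the scope of the negation operator $\neg$, while the second does not; evaluating the formula from the innermost operators outward is what gives scope its "order of operations" character.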