Behavioral momentum is a theory in quantitative analysis of behavior and is a behavioral metaphor based on physical momentum. It describes the general relation between resistance to change (persistence of behavior) and the rate of reinforcement obtained in a given situation.
Quantitative analysis of behavior is the application of mathematical models, conceptualized from a robust corpus of environment-behavior-consequence interactions in the experimental analysis of behavior, with the aim of describing and/or predicting relations between a dependent variable and all possible levels of an independent variable. The parameters in these models ideally have theoretical meaning beyond merely being used to fit models to data. The field was founded by Richard Herrnstein (1961) when he introduced the matching law to quantify the behavior of organisms working on concurrent schedules of reinforcement. The field has integrated models from economics, zoology, philosophy, and other branches of psychology, especially mathematical psychology, of which it is a branch. The field is represented by the Society for Quantitative Analysis of Behavior. Quantitative analysis of behavior addresses, among other topics: behavioral economics, behavioral momentum, connectionist systems or neural networks, integration, hyperbolic discounting including the delay reduction hypothesis, foraging, hunting, errorless learning, creativity, learning and the Rescorla-Wagner model, the matching law, melioration, scalar expectancy, signal detection, neural hysteresis, and reinforcement control.
A metaphor is a figure of speech that, for rhetorical effect, directly refers to one thing by mentioning another. It may provide clarity or identify hidden similarities between two ideas. Antithesis, hyperbole, metonymy and simile are all types of metaphor. One of the most commonly cited examples of a metaphor in English literature is the "All the world's a stage" monologue from As You Like It.
In Newtonian mechanics, linear momentum, translational momentum, or simply momentum is the product of the mass and velocity of an object. It is a vector quantity, possessing a magnitude and a direction in three-dimensional space. If m is an object's mass and v is its velocity, then the momentum is p = mv.
B.F. Skinner (1938) proposed that all behavior is based on a fundamental unit of behavior called the discriminated operant. The discriminated operant, also known as the three-term contingency, has three components: an antecedent discriminative stimulus, a response, and a reinforcing or punishing consequence. The organism responds in the presence of the stimulus because past responses in the presence of that stimulus have produced reinforcement.
In physiology, a stimulus is a detectable change in the internal or external environment. The ability of an organism or organ to respond to external stimuli is called sensitivity. When a stimulus is applied to a sensory receptor, it normally elicits or influences a reflex via stimulus transduction. These sensory receptors can receive information from outside the body, as in touch receptors found in the skin or light receptors in the eye, as well as from inside the body, as in chemoreceptors and mechanoreceptors. An internal stimulus is often the first component of a homeostatic control system. External stimuli are capable of producing systemic responses throughout the body, as in the fight-or-flight response. In order for a stimulus to be detected with high probability, its level must exceed the absolute threshold; if a signal does reach threshold, the information is transmitted to the central nervous system (CNS), where it is integrated and a decision on how to react is made. Although stimuli commonly cause the body to respond, it is the CNS that finally determines whether a signal causes a reaction or not.
According to behavioral momentum theory, two separable factors independently govern the rate at which a discriminated operant occurs and the persistence of that response in the face of disruptions such as punishment, extinction, or the differential reinforcement of alternative behaviors (see Nevin & Grace, 2000, for a review). First, the positive contingency between the response and a reinforcing consequence (i.e., the response–reinforcer relation) controls response rates by shaping a particular pattern of responding; this is governed by the relative law of effect (i.e., the matching law; Herrnstein, 1970). Second, the Pavlovian relation between the surrounding, or context, stimuli and the rate or magnitude of reinforcement obtained in that context (i.e., the stimulus–reinforcer relation) governs the resistance of the behavior to operations such as extinction. Resistance to change is assessed by measuring responding during operations such as extinction or satiation that tend to disrupt the behavior and comparing these measurements to stable, pre-disruption response rates.
John Anthony Nevin was an American psychologist who was a professor of psychology at the University of New Hampshire.
In operant conditioning, the matching law is a quantitative relationship that holds between the relative rates of response and the relative rates of reinforcement in concurrent schedules of reinforcement. For example, if two response alternatives A and B are offered to an organism, the ratio of response rates to A and B equals the ratio of reinforcements yielded by each response. This law applies fairly well when non-human subjects are exposed to concurrent variable-interval schedules; its applicability in other situations is less clear, depending on the assumptions made and the details of the experimental situation. The generality of the matching law is a subject of current debate.
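In its strict form, this relationship can be written, with B denoting response rate and R denoting obtained reinforcement rate for each alternative:

B_A / B_B = R_A / R_B

so that, for example, an alternative yielding twice the reinforcement of the other attracts twice the responding.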
Richard Julius Herrnstein was an American psychologist and sociologist. He was an active researcher in animal learning in the Skinnerian tradition. He was one of the founders of the Society for Quantitative Analysis of Behavior.
Resistance to disruption has been considered a better measure of response strength than simple response rate (Nevin, 1974). This is because variations in reinforcement contingencies, such as differential-reinforcement-of-high- or low-response-rate schedules, can yield highly variable response rates even though overall reinforcement rates are equal. Thus it is questionable whether these differences in response rates indicate differences in the underlying strength of a response (see Morse, 1966, for a discussion).
According to behavioral momentum theory, the relation between response rate and resistance to change is analogous to the relation between velocity and mass of a moving object, according to Newton's second law of motion (Nevin, Mandell & Atak, 1983). Newton's second law states that the change in velocity of an object when a force is applied is directly related to that force and inversely related to the object's mass. Similarly, behavioral momentum theory states that the change in response rate under conditions of disruption (Bx) relative to baseline response rate (Bo) is directly related to the force or magnitude of disruption (f) and inversely related to the rate of reinforcement in a stimulus context (r):
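One common way of writing this relation, using the quantities just defined (with b a sensitivity parameter discussed below), is:

log(Bx / Bo) = -f / r^b        (Equation 1)

The analogy parallels Newton's second law, a = F/m: just as a given force produces less acceleration in a more massive object, a given disrupter produces a smaller change in responding in a context with a higher rate of reinforcement.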
The velocity of an object is the rate of change of its position with respect to a frame of reference, and is a function of time. Velocity is equivalent to a specification of an object's speed and direction of motion. Velocity is a fundamental concept in kinematics, the branch of classical mechanics that describes the motion of bodies.
Mass is both a property of a physical body and a measure of its resistance to acceleration when a net force is applied. An object's mass also determines the strength of its gravitational attraction to other bodies.
Sir Isaac Newton was an English mathematician, physicist, astronomer, theologian, and author who is widely recognised as one of the most influential scientists of all time, and a key figure in the scientific revolution. His book Philosophiæ Naturalis Principia Mathematica, first published in 1687, laid the foundations of classical mechanics. Newton also made seminal contributions to optics, and shares credit with Gottfried Wilhelm Leibniz for developing the infinitesimal calculus.
The free parameter b indicates the sensitivity of resistance to change to the rate of reinforcement in the stimulus context (i.e., the stimulus–reinforcer relation). Resistance to disruption typically is assessed when two distinctive discriminative stimulus contexts alternate and signal different schedules of reinforcement (i.e., a multiple schedule). Equation 1 can be rewritten to account for resistance to change across two stimulus contexts (Nevin, 1992; Nevin, Grace, & McLean, 2001) when a disrupter is uniformly applied across contexts (i.e., f1 = f2):
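With a common disrupter applied to both contexts (f1 = f2), dividing Equation 1 for one context by Equation 1 for the other gives one standard form of the two-context relation:

log(Bx1 / Bo1) / log(Bx2 / Bo2) = (r2 / r1)^a        (Equation 2)

where a is the sensitivity parameter described below. Because disrupted responding falls below baseline, both logarithms are negative; a richer context (larger r1) therefore yields a proportionally smaller decline, that is, greater resistance to change.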
The subscripts indicate the different stimulus contexts. Thus, Equation 2 states that relative resistance to change is a power function of the relative rate of reinforcement across stimulus contexts, with the a parameter indicating sensitivity to relative reinforcement rate. Consistent with behavioral momentum theory, resistance to disruption often has been found to be greater in stimulus contexts that signal higher rates or magnitudes of reinforcement (see Nevin, 1992, for a review). Studies that add response-independent (i.e., free) reinforcement to one stimulus context strongly support the theory that changes in response strength are determined by stimulus–reinforcer relations and are independent of response–reinforcer relations. For instance, Nevin, Tota, Torquato, and Shull (1990) had pigeons pecking lighted disks on separate variable-interval 60-s schedules of intermittent food reinforcement across two components of a multiple schedule. Additional free reinforcers were presented every 15 or 30 s on average when the disk was red, but not when the disk was green. Thus, the response–reinforcer relation was degraded when the disk was red because each reinforcer was not immediately preceded by a response. Consistent with the matching law, response rates were lower in the red context than in the green context. However, the stimulus–reinforcer relation was enhanced in the red context because the overall rate of food presentation was greater. Consistent with behavioral momentum theory, resistance to presession feeding (satiation) and discontinuing reinforcement in both contexts (extinction) was greater in the red context. Similar results have been found when reinforcers are added to a context by reinforcing an alternative response.
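To put rough numbers on that example: the variable-interval 60-s schedule arranged about 60 response-dependent reinforcers per hour in each context, while the added free reinforcers in the red context contributed roughly another 120 or 240 per hour (one every 30 or 15 s on average). These figures are approximations from the scheduled values rather than obtained rates reported by the authors, but they show why the red context signaled an overall reinforcement rate roughly three to five times that of the green context and hence, by Equation 2, why responding there was more resistant to satiation and extinction despite its weaker response–reinforcer relation and lower baseline response rate.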
The findings of Nevin et al. (1990) have been extended across a number of procedures and species including goldfish (Igaki & Sakagami, 2004), rats (Harper, 1999a, 1999b; Shull, Gaynor & Grimes, 2001), pigeons (Podlesnik & Shahan, 2008), and humans (Ahearn, Clark, Gardenier, Chung & Dube, 2003; Cohen, 1996; Mace et al., 1990). The behavioral momentum framework also has been used to account for the partial-reinforcement extinction effect (Nevin & Grace, 1999), to assess the persistence of drug-maintained behavior (Jimenez-Gomez & Shahan, 2007; Shahan & Burke, 2004), to increase task compliance (e.g., Belfiore, Lee, Scheeler & Klein, 2002), and to understand the effects of social policies on global problems (Nevin, 2005).
Although behavioral momentum theory is a powerful framework for understanding how a context of reinforcement can affect the persistence of discriminated operant behavior, there are a number of findings that are inconsistent with the theory (see Nevin & Grace, 2000, and accompanying commentary). For instance, with equal reinforcement rates across stimulus contexts, resistance to change has been shown to be affected by manipulations to response–reinforcer relations, including schedules that produce different baseline response rates (e.g., Lattal, 1989; Nevin, Grace, Holland & McLean), delays to reinforcement (e.g., Bell, 1999; Grace, Schwendimann & Nevin, 1998; Podlesnik, Jimenez-Gomez, Ward & Shahan, 2006; Podlesnik & Shahan, 2008), and by providing brief stimuli that accompany reinforcement (Reed & Doughty, 2005). Also, it is unclear what factors affect relative resistance to change of responding maintained by conditioned reinforcement (Shahan & Podlesnik, 2005) or two concurrently available responses when different rates of reinforcement are arranged within the same context for those responses (e.g., Bell & Williams, 2002).
As resistance to disruption across stimulus contexts is analogous to the inertial mass of a moving object, behavioral momentum theory also suggests that preference in concurrent-chains procedures for one stimulus context over another is analogous to the gravitational attraction of two bodies (see Nevin & Grace, 2000). In concurrent-chains procedures, responding on the concurrently available initial links provides access to one of two mutually exclusive stimulus contexts called terminal links. As with multiple schedules, independent schedules of reinforcement can function in each terminal-link context. The relative allocation of responding across the two initial links indicates the extent to which an organism prefers one terminal-link context over the other. Moreover, behavioral momentum theory posits that preference provides a measure of the relative conditioned-reinforcing value of the two terminal-link contexts, as described by the contextual-choice model (Grace, 1994).
Grace and Nevin (1997) assessed both relative resistance to change in a multiple schedule and preference in a concurrent-chains procedure with pigeons pecking lighted disks for food reinforcement. When the relative rate of reinforcement was manipulated identically and simultaneously across stimulus contexts in the multiple schedule and the concurrent-chains procedure, both relative resistance to change and preference were greater in the richer context of reinforcement. When Grace, Bedell, and Nevin (2002) summarized all the extant resistance-to-change and preference data, they found that the two measures were related by a structural relation with a slope of 0.29. Therefore, relative resistance to change and preference have both been conceptualized as expressions of an underlying construct termed response strength, conditioned-reinforcement value, or, more generally, the behavioral mass of discriminated operant behavior (see Nevin & Grace, 2000).
Burrhus Frederic Skinner, commonly known as B. F. Skinner, was an American psychologist, behaviorist, author, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.
Operant conditioning is a learning process through which the strength of a behavior is modified by reinforcement or punishment. It is also a procedure that is used to bring about such learning.
In behavioral psychology, reinforcement is a consequence that strengthens an organism's future behavior whenever that behavior is preceded by a specific antecedent stimulus. This strengthening effect may be measured as a higher frequency of behavior, longer duration, greater magnitude, or shorter latency. There are two types of reinforcement, known as positive reinforcement and negative reinforcement: in positive reinforcement a reward is presented following the desired behavior, whereas in negative reinforcement an undesirable element is removed from the person's environment whenever the desired behavior occurs.
The experimental analysis of behavior (EAB) is a school of thought in psychology founded on B. F. Skinner's philosophy of radical behaviorism, and it defines the basic principles used in applied behavior analysis (ABA). A central principle was the inductive, data-driven examination of functional relations, as opposed to the kinds of hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by empirical observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that all behaviors are either reflexes produced by a response to certain stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of inheritance in determining behavior, they focus primarily on environmental factors.
The law of effect is a psychological principle advanced by Edward Thorndike in 1898 on the matter of behavioral conditioning which states that "responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation."
Applied behavior analysis (ABA) is a scientific discipline concerned with applying techniques based upon the principles of learning to change behavior of social significance. It is an applied form of behavior analysis; the other two forms are radical behaviorism and the experimental analysis of behavior.
Behavior therapy is a broad term referring to clinical psychotherapy that uses techniques derived from behaviorism. Those who practice behavior therapy tend to look at specific, learned behaviors and how the environment influences those behaviors. Practitioners of behavior therapy are called behaviorists, or behavior analysts, and they tend to look for treatment outcomes that are objectively measurable. Behavior therapy does not involve one specific method; rather, it encompasses a wide range of techniques that can be used to treat a person's psychological problems.
Shaping is a conditioning paradigm used primarily in the experimental analysis of behavior. The method used is differential reinforcement of successive approximations. It was introduced by B. F. Skinner with pigeons and extended to dogs, dolphins, humans and other species. In shaping, the form of an existing response is gradually changed across successive trials towards a desired target behavior by reinforcing exact segments of behavior. Skinner's explanation of shaping was this:
We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. ... The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. ... The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.
In operant conditioning, punishment is any change in a human or animal's surroundings that occurs after a given behavior or response which reduces the likelihood of that behavior occurring again in the future. As with reinforcement, it is the behavior, not the animal, that is punished. Whether a change is or is not punishing is determined by its effect on the rate that the behavior occurs, not by any "hostile" or aversive features of the change. For example, a painful stimulus which would act as a punisher for most people may actually reinforce some behaviors of masochistic individuals.
In behavioral psychology, stimulus control is a phenomenon that occurs when an organism behaves in one way in the presence of a given stimulus and another way in its absence. Any stimulus that modifies behavior in this manner is referred to as a discriminative stimulus. Stimulus control of behavior occurs when the performance of a particular behavior is controlled by the presence or absence of a discriminative stimulus. For example, the presence of a stop sign at a traffic intersection increases the probability that "braking" behavior will occur.
Behavior management is similar to behavior modification. It is a less intensive version of behavior therapy. In behavior modification the focus is on changing behavior, while in behavior management the focus is on maintaining order. Behavior management skills are of particular importance to teachers in the educational system. Behavior management includes all of the actions and conscious inactions intended to enhance the probability that people, individually and in groups, choose behaviors which are personally fulfilling, productive, and socially acceptable.
The behavioral analysis of child development originates from John B. Watson's behaviorism. Watson studied child development, looking specifically at development through conditioning. He helped bring a natural science perspective to child psychology by introducing objective research methods based on observable and measurable behavior. B.F. Skinner then further extended this model to cover operant conditioning and verbal behavior. Skinner was then able to focus these research methods on feelings and how those emotions can be shaped by a subject’s interaction with the environment. Sidney Bijou (1955) was the first to use this methodological approach extensively with children.
Mand is a term that B.F. Skinner used to describe a verbal operant in which the response is reinforced by a characteristic consequence and is therefore under the functional control of relevant conditions of deprivation or aversive stimulation. One cannot determine, based on form alone, whether a response is a mand; it is necessary to know the kinds of variables controlling a response in order to identify a verbal operant. A mand is sometimes said to "specify its reinforcement" although this is not always the case. Skinner introduced the mand as one of six primary verbal operants in his 1957 work, Verbal Behavior.
Self-administration is, in its medical sense, the process of a subject administering a pharmacological substance to themself. A clinical example of this is the subcutaneous "self-injection" of insulin by a diabetic patient.
James “Jim” A. Dinsmoor was an influential experimental psychologist who published work in the field of the experimental analysis of behavior. He was born October 4, 1921 in Woburn, Massachusetts to Daniel and Jean Dinsmoor. He graduated with his bachelor's degree from Dartmouth College in 1943. Subsequently, he attended Columbia University in New York City, where he received his Master’s and PhD degrees under the mentorship of William N. Schoenfeld and Fred S. Keller. There, he was introduced to the work of B.F. Skinner, whose behavior analytic research inspired Dinsmoor to pursue a lifetime of research in conditioned responding.
The three-term contingency in operant conditioning describes the relationship between a behavior, its consequence, and the environmental context. The three-term contingency was first defined by B. F. Skinner in the early 1950s. It is often used within ABA to alter the frequency of socially significant human behavior.