The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner, who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the [1] examination of functional relations between environment and behavior, as opposed to the hypothetico-deductive learning theory [2] that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by the observation of measurable behavior that could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
In classical or respondent conditioning, a neutral stimulus (conditioned stimulus) is delivered just before a reflex-eliciting stimulus (unconditioned stimulus) such as food or pain. This is typically done by pairing the two stimuli, as in Pavlov's experiments with dogs, where a bell was followed by food delivery. After repeated pairings, the conditioned stimulus comes to elicit the response. [3]
Operant conditioning (also called "instrumental conditioning") is a learning process in which behavior is sensitive to, or controlled by, its consequences. Specifically, behavior followed by certain consequences becomes more frequent (positive reinforcement), behavior followed by other consequences becomes less frequent (punishment), and behavior whose consequences involve the removal or avoidance of an aversive event becomes more frequent (negative reinforcement). For example, in a food-deprived subject, when lever-pressing is followed by food delivery, lever-pressing increases in frequency (positive reinforcement). Likewise, when stepping off a treadmill is followed by delivery of electric shock, stepping off the treadmill becomes less frequent (punishment). And when continuing to press the lever avoids shock, lever-pressing is maintained or increased (negative reinforcement). Many variations and details of this process may be found in the main article.[ citation needed ]
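These contingencies can be caricatured in a short simulation. The following is a hypothetical sketch, not an established model; the class name, the probability update rule, and the step size are all invented for illustration:

```python
import random

random.seed(0)  # deterministic for the example

class OperantLearner:
    """Toy model: the probability of emitting a response rises when the
    response is reinforced and falls when it is punished."""

    def __init__(self, p_response=0.2, step=0.05):
        self.p_response = p_response  # current likelihood of responding
        self.step = step              # how much one consequence shifts it

    def trial(self, consequence):
        """Run one trial; consequence is 'reinforce', 'punish', or None."""
        responded = random.random() < self.p_response
        if responded and consequence == "reinforce":
            self.p_response = min(1.0, self.p_response + self.step)
        elif responded and consequence == "punish":
            self.p_response = max(0.0, self.p_response - self.step)
        return responded

learner = OperantLearner()
for _ in range(200):
    learner.trial("reinforce")  # e.g., each lever press is followed by food
# After repeated reinforcement, responding is far more likely than at the start.
```

Note that only emitted responses can be reinforced or punished, which mirrors the operant principle that consequences act on behavior that actually occurs.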
The most commonly used tool in animal behavioral research is the operant conditioning chamber—also known as a Skinner box. The chamber is an enclosure designed to hold a test animal (often a rodent, pigeon, or primate). The interior of the chamber contains some type of device that serves as a discriminative stimulus, at least one mechanism for measuring the subject's behavior as a rate of response—such as a lever or key-peck switch—and a mechanism for delivering consequences—such as a food-pellet dispenser or a conditioned reinforcer such as an LED light.[ citation needed ]
Of historical interest is the cumulative recorder, an instrument used to record the responses of subjects graphically. Traditionally, its graphing mechanism has consisted of a rotating drum of paper equipped with a marking needle. The needle would start at the bottom of the page and the drum would turn the roll of paper horizontally. Each subject response would result in the marking needle moving vertically along the paper one tick. This makes the rate of response the slope of the graph. For example, a regular rate of response would cause the needle to move vertically at a regular rate, resulting in a straight diagonal line rising towards the right. An accelerating or decelerating rate of response would lead to a quadratic (or similar) curve. For the most part, cumulative records are no longer graphed using rotating drums, but are recorded electronically instead.[ citation needed ]
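The geometry of a cumulative record, in which rate of response appears as slope, can be reproduced in a few lines. This is a hypothetical sketch; the function name and the fixed-interval sampling scheme are invented for illustration:

```python
def cumulative_record(response_times, duration, dt=1.0):
    """Sample a cumulative record: at each tick of the (virtual) drum,
    record how many responses have occurred so far."""
    events = sorted(response_times)
    times, counts = [], []
    n, i, t = 0, 0, 0.0
    while t <= duration:
        while i < len(events) and events[i] <= t:
            n += 1   # each response moves the needle up one tick
            i += 1
        times.append(t)
        counts.append(n)
        t += dt
    return times, counts

# A steady response every 2 seconds yields a straight line of slope 0.5
# responses per second; an accelerating rate would curve upward instead.
times, counts = cumulative_record([2 * k for k in range(1, 31)], duration=60)
```

Plotting `counts` against `times` recovers the familiar straight diagonal line for a constant rate, which is exactly what the rotating-drum instrument drew mechanically.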
Laboratory methods employed in the experimental analysis of behavior are based upon B. F. Skinner's philosophy of radical behaviorism and its premises about the nature and study of behavior.[ citation needed ]
The idea that Skinner's position is anti-theoretical is probably inspired by the arguments he put forth in his article Are Theories of Learning Necessary? [4] However, that article did not argue against the use of theory as such, only against certain theories in certain contexts. Skinner argued that many theories did not explain behavior, but simply offered another layer of structure that itself had to be explained in turn. If an organism is said to have a drive, which causes its behavior, what then causes the drive? Skinner argued that many theories had the effect of halting research or generating useless research.[ citation needed ]
Skinner's work did have a basis in theory, though his theories were different from those that he criticized. Mecca Chiesa notes that Skinner's theories are inductively derived, while those that he attacked were deductively derived. [5] The theories that Skinner opposed often relied on mediating mechanisms and structures—such as a mechanism for memory as a part of the mind—which were not measurable or observable. Skinner's theories form the basis for two of his books: Verbal Behavior and Science and Human Behavior. These two texts represent considerable theoretical extensions of his basic laboratory work into the realms of political science, linguistics, sociology and others.[ citation needed ]
Burrhus Frederic Skinner was an American psychologist, behaviorist, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.
Operant conditioning, also called instrumental conditioning, is a learning process where voluntary behaviors are modified by association with the addition of reward or aversive stimuli. The frequency or duration of the behavior may increase through reinforcement or decrease through punishment or extinction.
An operant conditioning chamber is a laboratory apparatus used to study animal behavior. The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. The chamber can be used to study both operant conditioning and classical conditioning.
Classical conditioning is a behavioral procedure in which a biologically potent stimulus is paired with a neutral stimulus. The term also refers to the process by which the neutral stimulus comes to elicit an automatic, conditioned response.
In behavioral psychology, reinforcement refers to consequences that increase the likelihood of an organism's future behavior, typically in the presence of a particular antecedent stimulus. For example, a rat can be trained to push a lever to receive food whenever a light is turned on. In this example, the light is the antecedent stimulus, the lever pushing is the operant behavior, and the food is the reinforcer. Likewise, a student who receives attention and praise when answering a teacher's question will be more likely to answer future questions in class. The teacher's question is the antecedent, the student's response is the behavior, and the praise and attention are the reinforcers.
Radical behaviorism is a "philosophy of the science of behavior" developed by B. F. Skinner. It refers to the philosophy behind behavior analysis, and is to be distinguished from methodological behaviorism—which has an intense emphasis on observable behaviors—by its inclusion of thinking, feeling, and other private events in the analysis of human and animal psychology. The research in behavior analysis is called the experimental analysis of behavior and the application of the field is called applied behavior analysis (ABA), which was originally termed "behavior modification."
Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events. The cognitive revolution of the late 20th century largely replaced behaviorism as an explanatory theory with cognitive psychology, which unlike behaviorism examines internal mental states.
Verbal Behavior is a 1957 book by psychologist B. F. Skinner, in which he analyzes what he calls verbal behavior, the subject matter traditionally treated by linguistics. Skinner's work describes the controlling elements of verbal behavior with terminology invented for the analysis—echoics, mands, tacts, autoclitics, and others—as well as carefully defined uses of ordinary terms such as audience.
The law of effect, or Thorndike's law, is a psychology principle advanced by Edward Thorndike in 1898 on the matter of behavioral conditioning which states that "responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation."
Shaping is a conditioning paradigm used primarily in the experimental analysis of behavior. The method used is differential reinforcement of successive approximations. It was introduced by B. F. Skinner with pigeons and extended to dogs, dolphins, humans and other species. In shaping, the form of an existing response is gradually changed across successive trials toward a desired target behavior by reinforcing ever-closer approximations of that behavior. Skinner's explanation of shaping was this:
We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. ... The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. ... The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.
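The procedure Skinner describes, reinforcing successive approximations while progressively raising the criterion, can be sketched as a toy search process. This is hypothetical: the one-dimensional "position" standing in for response topography, the function name, and the numeric parameters are all invented for illustration:

```python
import random

random.seed(1)  # deterministic for the example

def shape(target, step=1.0, criterion=0.5, max_trials=1000):
    """Toy shaping: reinforce any response variant that falls closer to
    the target than the best so far, then tighten the criterion."""
    position = 0.0                       # response form the subject emits now
    tolerance = abs(target - position)   # at first, any improvement is reinforced
    for trial in range(max_trials):
        response = position + random.uniform(-step, step)  # natural variability
        if abs(target - response) < tolerance:
            position = response                  # the reinforced variant persists
            tolerance = abs(target - position)   # demand a closer approximation
        if tolerance < criterion:
            return trial + 1, position
    return max_trials, position

trials, final = shape(target=10.0)
# The final response form lies within `criterion` of the target, even though
# a response of that form had essentially zero probability at the outset.
```

The key design point mirrors the quotation above: the target form is never waited for directly; behavior is built up by reinforcing whatever variability happens to move in the right direction.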
Extinction is a behavioral phenomenon observed in both operantly conditioned and classically conditioned behavior, which manifests itself as the fading of a non-reinforced conditioned response over time. When operant behavior that has been previously reinforced no longer produces reinforcing consequences, the behavior gradually stops occurring. In classical conditioning, when a conditioned stimulus is presented alone, so that it no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually stops. For example, after Pavlov's dog was conditioned to salivate at the sound of a metronome, it eventually stopped salivating to the metronome after the metronome had been sounded repeatedly but no food came. Many anxiety disorders such as post-traumatic stress disorder are believed to reflect, at least in part, a failure to extinguish conditioned fear.
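A minimal numeric sketch of an extinction curve follows. It is hypothetical: the geometric decay rate is invented for illustration, and real extinction curves vary with the schedule of prior reinforcement and the species:

```python
def extinguish(p_response, decay=0.9, sessions=30):
    """Return response probability per session once reinforcement is
    withheld: non-reinforced responding fades geometrically in this
    toy model."""
    curve = [p_response]
    for _ in range(sessions):
        p_response *= decay   # each unreinforced session weakens responding
        curve.append(p_response)
    return curve

curve = extinguish(0.95)  # well-trained salivation, then the food stops coming
```

The monotonically falling curve is the signature of extinction; phenomena such as spontaneous recovery, in which responding briefly returns after a rest, are not captured by this simple decay.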
In operant conditioning, punishment is any change in a human or animal's surroundings which, occurring after a given behavior or response, reduces the likelihood of that behavior occurring again in the future. As with reinforcement, it is the behavior, not the human or animal, that is punished. Whether a change is punishing is determined by its effect on the rate at which the behavior occurs, and that effect can itself be altered by motivating operations (MOs), which change the effectiveness of a stimulus: abolishing operations decrease its effectiveness, while establishing operations increase it. For example, a painful stimulus that would act as a punisher for most people may actually reinforce some behaviors of masochistic individuals.
Behavioral momentum is a theory in quantitative analysis of behavior and is a behavioral metaphor based on physical momentum. It describes the general relation between resistance to change and the rate of reinforcement obtained in a given situation.
In behavioral psychology, stimulus control is a phenomenon in operant conditioning that occurs when an organism behaves in one way in the presence of a given stimulus and another way in its absence. A stimulus that modifies behavior in this manner is either a discriminative stimulus or a stimulus delta. For example, the presence of a stop sign at a traffic intersection alerts the driver to stop driving and increases the probability that braking behavior occurs. Stimulus control does not force behavior to occur; it is the result of historical reinforcement contingencies, in contrast to reflexive behavior elicited through classical conditioning.
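The discrimination involved can be sketched as a lookup of reinforcement history. This is a hypothetical illustration: the stimulus names, the history table, and the 0.5 threshold are all invented:

```python
def respond(stimulus, history, threshold=0.5):
    """Toy discrimination: emit the response only for stimuli in whose
    presence responding has usually been reinforced."""
    return history.get(stimulus, 0.0) > threshold

# Pressing during 'light_on' (the discriminative stimulus) was usually
# reinforced; pressing during 'light_off' (the stimulus delta) was not.
history = {"light_on": 0.9, "light_off": 0.1}
```

The point the sketch makes is that the stimulus does not compel the response; it merely predicts, from past contingencies, whether responding will pay off.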
The behavioral analysis of child development originates from John B. Watson's behaviorism.
Mand is a term that B.F. Skinner used to describe a verbal operant in which the response is reinforced by a characteristic consequence and is therefore under the functional control of relevant conditions of deprivation or aversive stimulation. One cannot determine, based on form alone, whether a response is a mand; it is necessary to know the kinds of variables controlling a response in order to identify a verbal operant. A mand is sometimes said to "specify its reinforcement" although this is not always the case. Skinner introduced the mand as one of six primary verbal operants in his 1957 work, Verbal Behavior.
In psychology, a stimulus is any object or event that elicits a sensory or behavioral response in an organism. In this context, a distinction is made between the distal stimulus and the proximal stimulus.
Psychological behaviorism is a form of behaviorism—a major theory within psychology which holds that human behaviors are generally learned—proposed by Arthur W. Staats. The theory is constructed to advance from basic animal learning principles to deal with all types of human behavior, including personality, culture, and human evolution. Behaviorism was first developed by John B. Watson (1912), who coined the term "behaviorism", and then B. F. Skinner, who developed what is known as "radical behaviorism". Watson and Skinner rejected the idea that psychological data could be obtained through introspection or by an attempt to describe consciousness; all psychological data, in their view, were to be derived from the observation of outward behavior. The strategy of these behaviorists was to use principles of animal learning to explain human behavior. Thus, their behaviorisms were based upon research with animals.
James "Jim" A. Dinsmoor was an American experimental psychologist who published work in the field of the experimental analysis of behavior.
Association in psychology refers to a mental connection between concepts, events, or mental states that usually stems from specific experiences. Associations are seen throughout several schools of thought in psychology including behaviorism, associationism, psychoanalysis, social psychology, and structuralism. The idea stems from Plato and Aristotle, especially with regard to the succession of memories, and it was carried on by philosophers such as John Locke, David Hume, David Hartley, and James Mill. It finds its place in modern psychology in such areas as memory, learning, and the study of neural pathways.