External inhibition is the observed decrease in the magnitude of a conditioned response when an external (distracting) stimulus that was not part of the original conditioning is introduced. The effect was first observed in Ivan Pavlov's classical conditioning studies, where dogs salivated less (the conditioned response) to the sound of a tuning fork (the conditioned stimulus) when it was presented in the distracting context of a passing truck (the external stimulus). [1] External inhibition is important for its central principle in classical conditioning: a conditioned response may decrease in magnitude after an external stimulus is introduced. This is especially useful when trying to dissociate a conditioned stimulus from its conditioned response. A practical example: students who become anxious (conditioned response) when standing in front of the class to give a presentation (conditioned stimulus) may feel less anxiety if their friends sit in front of them while they present (external stimulus). The positive association of speaking to friends may distract the student from associating speaking to the entire class with anxiety.
The extent of external inhibition depends on several factors, including the intensity of the external stimulus, the kind of stimulus used, and the point during acquisition or extinction at which it is introduced.
The same external stimulus can also increase the magnitude of a conditioned response, an effect called disinhibition, when it is introduced after experimental extinction (when the conditioned stimulus no longer elicits the conditioned response). [3] During extinction, the conditioned response has been weakened so that the subject no longer shows it when presented with the previously paired conditioned stimulus. [2] An example of disinhibition: a rat conditioned to walk from point A to point B at the sound of a buzzer, and whose response has then been extinguished, will again walk to point B when a different stimulus, such as a blinking light, is introduced. The observed response of walking to point B after the blinking light is relatively greater than the rat's movement after extinction (when the rat was presented with neither the buzzer nor the blinking light).
Wenger's 1936 study examined whether the same external stimulus could be used to demonstrate both external inhibition and disinhibition, and how the intensity of the external stimulus relates to the intensity of the two effects. Wenger conditioned participants to produce an electro-dermal response (raising the foot to avoid the shock) to a red light, using repeated presentations of the red light paired with a shock to the right foot. After the participants were conditioned, an extra stimulus, a tactual vibration to the left hand, was introduced before the red light was shown in the absence of any shock. Following the principles of external inhibition, Wenger hypothesized that the after-effect of the tactual vibration would inhibit the conditioned response to the red light and lead to smaller movements of the foot. Disinhibition was tested after experimental extinction, in which the red light was presented repeatedly without any shock reinforcement. Following the principles of disinhibition, Wenger hypothesized that the tactual vibration would then induce a greater reaction to the light than the reactions observed in the external inhibition test. Both hypotheses were confirmed: Wenger observed that external inhibition and disinhibition could both be produced by the same external stimulus (the tactual vibration). In addition, a more intense external stimulus produced greater magnitudes of external inhibition and disinhibition, although the functional strengths of the externally inhibited and disinhibited responses were not considered decisive. [2]
The 1941 study by Gagné examined the effect of two different external stimuli (a buzzer, and scratching on the back of the starting box), applied during acquisition and extinction, on the strength of a conditioned operant response in rats. Five groups of rats were used; differences in the latent period, if any existed, were hypothesized to appear, relative to a control group that received no external stimulus, under the following conditions: 1) buzzer on the first trial of acquisition, 2) scratch on the first trial of acquisition, 3) buzzer on the fourth trial of acquisition, 4) scratch on the fourth trial of acquisition, 5) buzzer on the fifth trial of extinction, and 6) scratch on the fifth trial of extinction. In each experimental procedure, the buzzer was sounded for four seconds and stopped two seconds before the beginning of the next trial; the scratching continued until the rat turned around to face the back of the starting box. [3]

The buzzer results can be interpreted as extending Pavlov's observations on external inhibition and disinhibition to a conditioned operant response and as supporting B. F. Skinner's hypothesis of an "emotional effect". Skinner describes an emotional effect as arising when a response fails to be reinforced, possibly leading to operant extinction, together with an emotional "reaction commonly spoken of as frustration or rage". [4] On this account, the buzzer had a "depressing effect" on all trials, which decreased the response magnitude during extinction; an observed increase of response magnitude following the depression would then be a "compensatory increase in the number of available responses". [3] Alternatively, the buzzer can be interpreted as an external stimulus that decreases the response magnitude (external inhibition) and produces an increased response magnitude on the next trial (disinhibition) once the effect of inhibition declines. This suggests that the buzzer initially weakens the conditioned response but, when repeatedly encountered, serves to strengthen it, thereby decreasing the latent period.

The scratching stimulus showed external inhibition during acquisition (on the fourth trial of acquisition) and disinhibition during extinction (on the fifth trial of extinction). [3] During acquisition there was a significant increase in the recorded latent period (i.e., the time it took the rat to move out of the starting box and pass a four-inch mark while walking towards the food box), and thus a decline in the magnitude of the conditioned response, since the rat took longer to reach the four-inch mark. The additional learned response of the rat turning around toward the source of the scratching is an effect of external inhibition and may have added to the latent time, but the results do not indicate what portion of the latent time was made up by the turn-around response. A shortened latent period on the fifth trial of extinction indicates an increased response magnitude and represents disinhibition: the rat responds less to the extra stimulus, and the conditioned response increases in magnitude (a faster latent time).
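The latent period used as the dependent measure above is a simple timing quantity: the time from the start of a trial until the rat passes the four-inch mark on its way out of the starting box. The minimal sketch below shows how it could be computed from per-trial timestamps; the record layout and example values are hypothetical, introduced only for illustration, and are not data from Gagné's experiment.

```python
# A minimal sketch of the latent-period measure described above: the time from
# the start of a trial until the rat passes the four-inch mark while leaving
# the starting box. The dictionary keys and example values are hypothetical.

def latent_period(trial):
    """Latent period in seconds for a single trial record."""
    return trial["passed_four_inch_mark_s"] - trial["trial_start_s"]

example_trial = {"trial_start_s": 0.0, "passed_four_inch_mark_s": 2.4}
print(latent_period(example_trial))  # 2.4 seconds on this hypothetical trial
```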
In a study by Pennypacker (1964), a circular red light served as the conditioned stimulus, a puff of dry compressed air as the unconditioned stimulus, and a 1,000-cps tone as the external stimulus, in order to investigate external inhibition in humans at different introduction intervals. Reflexive blinking was measured as both the conditioned and the unconditioned response. Each participant was first presented with two trials of the circular red light without reinforcement and three trials of the air puff alone. Depending on the group, participants then received 15, 30, or 60 paired conditioned stimulus–unconditioned stimulus (CS-UCS) trials, a presentation of the external stimulus, another 15, 30, or 60 CS-UCS trials, a second presentation of the external stimulus, and finally 5 CS-UCS trials.

Pennypacker suggested that following the introduction of any novel stimulus, a period of excitation occurs between the conditioned stimulus (red light) and its conditioned response (blinking), called an induction period. Thus, if the external stimulus was presented earlier in the acquisition phase, the observed decline in blinking would be smaller than if it was presented later. Pennypacker also suggested that an external stimulus introduced too late to affect the conditioned stimulus might instead externally inhibit the unconditioned stimulus. However, the study failed to confirm the presence of induction immediately after the tone (external stimulus) was introduced, and there was no evidence, aside from isolated cases, that the external stimulus had any effect on reflexive blinking when presented halfway through the interval. Pennypacker suggested that the difference between the induction effect observed in a preliminary study and the present one was due to the preliminary study's use of a visual external stimulus presented instead of the conditioned stimulus, compared with the present study's use of an auditory external stimulus presented in addition to the conditioned stimulus. Another suggestion was that the external stimulus was not intense enough to produce an inductive effect. Through this study, Pennypacker confirmed the observation of external inhibition at the human level. External inhibition was observed especially when the tone was introduced during the acquisition phase, in the interval right after the paired CS-UCS trials: the conditioned blinking reflex declined (was inhibited) compared with its rate during conditioning. [5]
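The trial sequence described above can be written out explicitly; the sketch below builds it for one of the three groups. The event labels and the helper function are assumptions made for illustration, not notation from the original report.

```python
# A minimal sketch of the trial sequence described for Pennypacker's groups.
# Event labels and the build_schedule helper are illustrative assumptions.

def build_schedule(n_paired):
    """Build the event list for a group receiving n_paired CS-UCS trials per block."""
    schedule = []
    schedule += ["CS_alone"] * 2        # two red-light trials without reinforcement
    schedule += ["UCS_alone"] * 3       # three air-puff trials alone
    schedule += ["CS_UCS"] * n_paired   # first block of paired trials
    schedule += ["tone"]                # first presentation of the external stimulus
    schedule += ["CS_UCS"] * n_paired   # second block of paired trials
    schedule += ["tone"]                # second presentation of the external stimulus
    schedule += ["CS_UCS"] * 5          # final five paired trials
    return schedule

print(len(build_schedule(15)))  # 42 events for the 15-trial group
```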
Operant conditioning, also called instrumental conditioning, is a learning process in which voluntary behaviors are modified through their association with rewarding or aversive stimuli. The frequency or duration of the behavior may increase through reinforcement or decrease through punishment or extinction.
An operant conditioning chamber is a laboratory apparatus used to study animal behavior. The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. The chamber can be used to study both operant conditioning and classical conditioning.
Classical conditioning is a behavioral procedure in which a biologically potent physiological stimulus is paired with a neutral stimulus. The term classical conditioning refers to the process of an automatic, conditioned response that is paired with a specific stimulus.
In behavioral psychology, reinforcement is any consequence that increases the likelihood of an organism's future behavior whenever that behavior is preceded by a particular antecedent stimulus. For example, a rat can be trained to push a lever to receive food whenever a light is turned on. In this example, the light is the antecedent stimulus, the lever pushing is the behavior, and the food is the reinforcement. Likewise, a student who receives attention and praise when answering a teacher's question will be more likely to answer future questions in class. The teacher's question is the antecedent, the student's response is the behavior, and the praise and attention are the reinforcements.
The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior, as opposed to hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
Eyeblink conditioning (EBC) is a form of classical conditioning that has been used extensively to study neural structures and mechanisms that underlie learning and memory. The procedure is relatively simple and usually consists of pairing an auditory or visual stimulus with an eyeblink-eliciting unconditioned stimulus (US). Naïve organisms initially produce a reflexive, unconditioned response (UR) that follows US onset. After many CS-US pairings, an association is formed such that a learned blink, or conditioned response (CR), occurs and precedes US onset. The magnitude of learning is generally gauged by the percentage of all paired CS-US trials that result in a CR. Under optimal conditions, well-trained animals produce a high percentage of CRs. The conditions necessary for, and the physiological mechanisms that govern, eyeblink CR learning have been studied across many mammalian species, including mice, rats, guinea pigs, rabbits, ferrets, cats, and humans. Historically, rabbits have been the most popular research subjects.
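Since learning in this paradigm is usually gauged as the percentage of paired CS-US trials that produce a CR, a minimal sketch of that calculation is shown below. The trial outcomes are hypothetical placeholders, not data from any study.

```python
# A minimal sketch of the learning measure described above: the percentage of
# paired CS-US trials on which a conditioned response (CR) occurred.
# The outcomes listed here are hypothetical placeholders.

def percent_cr(cr_on_trial):
    """cr_on_trial: list of booleans, True if a CR preceded US onset on that trial."""
    if not cr_on_trial:
        return 0.0
    return 100.0 * sum(cr_on_trial) / len(cr_on_trial)

example_block = [False, False, True, False, True, True, True, True, True, True]
print(percent_cr(example_block))  # 70.0 percent CRs in this hypothetical block
```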
The Rescorla–Wagner model ("R-W") is a model of classical conditioning, in which learning is conceptualized in terms of associations between conditioned (CS) and unconditioned (US) stimuli. A strong CS-US association means that the CS strongly predicts the US. One might say that before conditioning, the subject is surprised by the US, but after conditioning, the subject is no longer surprised, because the CS predicts the coming of the US. The model casts the conditioning processes into discrete trials, during which stimuli may be either present or absent. The strength of prediction of the US on a trial can be represented as the summed associative strengths of all CSs present during the trial. This feature of the model represented a major advance over previous models, and it allowed a straightforward explanation of important experimental phenomena, most notably the blocking effect. Failures of the model have led to modifications, alternative models, and many additional findings. The model has had some impact on neural science in recent years, as studies have suggested that the phasic activity of dopamine neurons in mesostriatal DA projections in the midbrain encodes the type of prediction error detailed in the model.
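A compact way to see how the model's summed-prediction feature yields blocking is to implement the standard update rule, delta V_i = alpha_i * beta * (lambda - sum of V for the CSs present), directly. The sketch below does so; the salience, learning-rate, and trial-count values are illustrative assumptions, not parameters from any particular experiment.

```python
# A minimal sketch of the Rescorla-Wagner update rule,
#   delta V_i = alpha_i * beta * (lambda - sum of V for all CSs present),
# used here to illustrate the blocking effect. All parameter values are
# illustrative assumptions.

def rw_trial(V, present, lam, alpha, beta):
    """Apply one trial's update to the associative strengths in V."""
    prediction = sum(V[cs] for cs in present)   # summed associative strength
    error = lam - prediction                    # prediction error
    for cs in present:
        V[cs] += alpha[cs] * beta * error

alpha = {"A": 0.3, "B": 0.3}   # CS saliences (assumed)
beta, lam = 0.5, 1.0           # US-related learning rate and asymptote (assumed)
V = {"A": 0.0, "B": 0.0}

for _ in range(30):            # Phase 1: CS A alone is paired with the US
    rw_trial(V, ["A"], lam, alpha, beta)
for _ in range(30):            # Phase 2: A and B together are paired with the US
    rw_trial(V, ["A", "B"], lam, alpha, beta)

print(V)  # V["A"] is near 1.0; V["B"] stays small because A already predicts the US (blocking)
```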
Conditioned taste aversion occurs when an animal acquires an aversion to the taste of a food that was paired with aversive stimuli. The Garcia effect explains that the aversion develops more strongly for stimuli that cause nausea than other stimuli. This is considered an adaptive trait or survival mechanism that enables the organism to avoid poisonous substances before they cause harm. The aversion reduces consuming the same substance in the future, thus avoiding poisoning.
Shaping is a conditioning paradigm used primarily in the experimental analysis of behavior. The method used is differential reinforcement of successive approximations. It was introduced by B. F. Skinner with pigeons and extended to dogs, dolphins, humans and other species. In shaping, the form of an existing response is gradually changed across successive trials towards a desired target behavior by reinforcing exact segments of behavior. Skinner's explanation of shaping was this:
We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. ... The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. ... The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.
Extinction is a behavioral phenomenon observed in both operantly conditioned and classically conditioned behavior, which manifests itself as the fading of a non-reinforced conditioned response over time. When operant behavior that has been previously reinforced no longer produces reinforcing consequences, the behavior gradually stops occurring. In classical conditioning, when a conditioned stimulus is presented alone, so that it no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually stops. For example, after Pavlov's dog was conditioned to salivate at the sound of a metronome, it eventually stopped salivating to the metronome after the metronome had been sounded repeatedly but no food came. Many anxiety disorders such as post-traumatic stress disorder are believed to reflect, at least in part, a failure to extinguish conditioned fear.
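Under a simple error-correction account like the Rescorla-Wagner rule sketched above, extinction corresponds to trials on which the US asymptote is zero, so associative strength decays toward zero. The short sketch below illustrates this decay with assumed parameter values.

```python
# A minimal sketch of extinction under a Rescorla-Wagner-style rule: once the
# CS is presented without the US, the asymptote (lambda) is 0 and associative
# strength decays toward zero. Parameter values are illustrative assumptions.

learning_rate = 0.2   # combined alpha * beta (assumed)
V = 0.9               # associative strength at the end of conditioning (assumed)

for trial in range(1, 11):
    V += learning_rate * (0.0 - V)   # lambda = 0 because no US follows the CS
    print(f"extinction trial {trial}: V = {V:.3f}")
```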
An avoidance response is a natural adaptive behavior performed in response to danger. Excessive avoidance has been suggested to contribute to anxiety disorders, leading psychologists and neuroscientists to study how avoidance behaviors are learned using rat or mouse models. Avoidance learning is a type of operant conditioning.
In Kamin's blocking effect, the conditioning of an association between two stimuli, a conditioned stimulus (CS) and an unconditioned stimulus (US), is impaired if, during the conditioning process, the CS is presented together with a second CS that has already been associated with the unconditioned stimulus.
In psychology, a stimulus is any object or event that elicits a sensory or behavioral response in an organism. In this context, a distinction is made between the distal stimulus and the proximal stimulus.
In experimental psychology the term conditioned emotional response refers to a phenomenon that is seen in classical conditioning after a conditioned stimulus (CS) has been paired with an emotion-producing unconditioned stimulus (US) such as electric shock. The conditioned emotional response is usually measured through its effect in suppressing an ongoing response. For example, a rat first learns to press a lever through operant conditioning. Classical conditioning follows: in a series of trials the rat is exposed to a CS, often a light or a noise. Each CS is followed by the US, an electric shock. As an association between the CS and US develops, the rat slows or stops its lever pressing when the CS comes on. The slower the rat presses, the stronger its conditioned emotional response, or "fear."
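The suppression described above is often quantified with a suppression ratio, comparing responding during the CS with responding in an equal period just before it. The text above does not specify a measure, so the ratio used in the sketch below should be read as one common convention rather than the procedure of any particular study.

```python
# A minimal sketch of a commonly used suppression ratio for conditioned
# suppression: responses during the CS divided by responses during the CS plus
# responses in an equal period immediately before it. A value near 0 means
# strong suppression (strong conditioned "fear"); 0.5 means no change. This
# measure is an assumption here; the text above does not define one.

def suppression_ratio(presses_during_cs, presses_before_cs):
    total = presses_during_cs + presses_before_cs
    if total == 0:
        return 0.5  # no responding in either period; treat as no measurable change
    return presses_during_cs / total

print(suppression_ratio(presses_during_cs=2, presses_before_cs=18))  # 0.1, strong suppression
```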
Conditioned place preference (CPP) is a form of Pavlovian conditioning used to measure the motivational effects of objects or experiences. This motivation comes from the pleasurable aspect of the experience, so that the brain can be reminded of the context that surrounded the "encounter". By measuring the amount of time an animal spends in an area that has been associated with a stimulus, researchers can infer the animal's liking for the stimulus. This paradigm can also be used to measure conditioned place aversion with an identical procedure involving aversive stimuli instead. Both procedures usually involve mice or rats as subjects. The procedure can also be used to measure extinction and reinstatement of the conditioned stimulus, and certain drugs are used in this paradigm to measure their reinforcing properties. Two different methods are used to choose the compartments to be conditioned: biased and unbiased. In the biased method, the animal first explores the apparatus; the drug is then administered in the compartment it prefers least and the vehicle is injected in the compartment it prefers most, so the animal's own preferences determine where it receives the drug and the vehicle. In the unbiased method, the animal's preferences play no role; the researcher chooses the compartments in which the drug and the vehicle are given.
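One simple way to express the preference measured in this paradigm is the change in time spent in the stimulus-paired compartment from a pre-conditioning test to a post-conditioning test. The function name and the example times below are illustrative assumptions, not a standardized scoring formula from the text.

```python
# A minimal sketch of a conditioned place preference score: the change in time
# spent in the stimulus-paired compartment between a pre-conditioning test and
# a post-conditioning test. The example times (in seconds) are hypothetical.

def cpp_score(time_paired_post_s, time_paired_pre_s):
    """Positive values suggest a conditioned preference; negative values suggest aversion."""
    return time_paired_post_s - time_paired_pre_s

print(cpp_score(time_paired_post_s=520.0, time_paired_pre_s=410.0))  # +110.0 s shift toward the paired side
```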
The term conditioned emotional response (CER) can refer to a specific learned behavior or a procedure commonly used in classical or Pavlovian conditioning research. It may also be called "conditioned suppression" or "conditioned fear response (CFR)." It is an "emotional response" that results from classical conditioning, usually from the association of a relatively neutral stimulus with a painful or fear-inducing unconditional stimulus. As a result, the formerly neutral stimulus elicits fear. For example, if seeing a dog is paired with the pain of being bitten by the dog, seeing a dog may become a conditioned stimulus that elicits fear.
Spontaneous recovery is a phenomenon of learning and memory that was first named and described by Ivan Pavlov in his studies of classical (Pavlovian) conditioning. In that context, it refers to the re-emergence of a previously extinguished conditioned response after a delay. Such a recovery of "lost" behaviors can be observed within a variety of domains, and the recovery of lost human memories is often of particular interest.
Many experiments have been done to find out how the brain interprets stimuli and how animals develop fear responses. Fear is hard-wired into almost every individual because of its vital role in survival. Researchers have found that fear is established unconsciously and that the amygdala is involved with fear conditioning.
Pavlovian-instrumental transfer (PIT) is a psychological phenomenon that occurs when a conditioned stimulus that has been associated with rewarding or aversive stimuli via classical conditioning alters motivational salience and operant behavior. Two distinct forms of Pavlovian-instrumental transfer have been identified in humans and other animals – specific PIT and general PIT – with unique neural substrates mediating each type. In relation to rewarding stimuli, specific PIT occurs when a CS is associated with a specific rewarding stimulus through classical conditioning and subsequent exposure to the CS enhances an operant response that is directed toward the same reward with which it was paired. General PIT occurs when a CS is paired with one reward and it enhances an operant response that is directed toward a different rewarding stimulus.
Association in psychology refers to a mental connection between concepts, events, or mental states that usually stems from specific experiences. Associations are seen throughout several schools of thought in psychology including behaviorism, associationism, psychoanalysis, social psychology, and structuralism. The idea stems from Plato and Aristotle, especially with regard to the succession of memories, and it was carried on by philosophers such as John Locke, David Hume, David Hartley, and James Mill. It finds its place in modern psychology in such areas as memory, learning, and the study of neural pathways.