Extinction (psychology)

Extinction is a behavioral phenomenon observed in both operantly conditioned and classically conditioned behavior, which manifests as the gradual weakening of a conditioned response when it is no longer reinforced. When operant behavior that has previously been reinforced no longer produces reinforcing consequences, the behavior gradually stops occurring. [1] In classical conditioning, when a conditioned stimulus is presented alone, so that it no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually stops. For example, after Pavlov's dog was conditioned to salivate at the sound of a metronome, it eventually stopped salivating when the metronome was sounded repeatedly but no food came. Many anxiety disorders, such as post-traumatic stress disorder, are believed to reflect, at least in part, a failure to extinguish conditioned fear. [2]

Theories

The dominant accounts of extinction are associative. However, there is debate over whether extinction involves simply "unlearning" the unconditioned stimulus (US)–conditioned stimulus (CS) association (e.g., the Rescorla–Wagner account) or, alternatively, "new learning" of an inhibitory association that masks the original excitatory association (e.g., the accounts of Konorski and of Pearce and Hall). A third class of accounts invokes non-associative mechanisms such as habituation, modulation, and response fatigue. Myers and Davis reviewed fear extinction in rodents and suggested that multiple mechanisms may be at work, depending on the timing and circumstances in which extinction occurs. [3]
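As a rough illustration of the "unlearning" view, the Rescorla–Wagner model updates the associative strength V of the CS on each trial by ΔV = αβ(λ − V), where λ is the maximum strength the US can support and is set to zero on extinction trials. The following Python sketch is only illustrative; the parameter values, trial counts, and function name are assumptions, not taken from the sources cited here.

```python
# Minimal Rescorla–Wagner simulation: acquisition (US present) followed by
# extinction (US omitted). All parameter values here are illustrative assumptions.

def rescorla_wagner(lambdas, v0=0.0, alpha=0.3, beta=1.0):
    """Return the associative strength V after each trial.

    `lambdas` is a sequence of lambda values: 1.0 on trials where the US
    follows the CS (acquisition), 0.0 on CS-alone trials (extinction).
    """
    v = v0
    history = []
    for lam in lambdas:
        v += alpha * beta * (lam - v)   # delta-V = alpha * beta * (lambda - V)
        history.append(v)
    return history

# 20 acquisition trials, then 20 extinction trials.
schedule = [1.0] * 20 + [0.0] * 20
for trial, v in enumerate(rescorla_wagner(schedule), start=1):
    phase = "acquisition" if trial <= 20 else "extinction"
    print(f"trial {trial:2d} ({phase}): V = {v:.3f}")
```

Under this rule V decays back toward zero during extinction, which is why the model is read as "unlearning"; inhibitory "new learning" accounts instead leave the original excitatory association intact and add a competing association, a difference often invoked to explain phenomena such as spontaneous recovery.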

Given the competing accounts and the difficulty of distinguishing among them behaviorally, researchers have turned to investigations at the cellular level (most often in rodents) to tease apart the specific brain mechanisms of extinction, in particular the role of particular brain structures (the amygdala, hippocampus, and prefrontal cortex) and of specific neurotransmitter systems (e.g., GABA, NMDA). [3] A study in rodents by Amano, Unal and Paré published in Nature Neuroscience found that extinction of a conditioned fear response is correlated with synaptic inhibition of the fear output neurons of the central amygdala that project to the periaqueductal gray, which controls freezing behavior. They infer that this inhibition derives from the ventromedial prefrontal cortex and suggest promising cellular-level targets for new treatments of anxiety. [4]

Classical conditioning

Extinction can also occur in a classical conditioning paradigm. In this model, a neutral cue or context comes to elicit a conditioned response when it is paired with an unconditioned stimulus, that is, a stimulus that naturally and automatically triggers a certain behavioral response. A particular stimulus or environment becomes a conditioned cue or a conditioned context, respectively, when paired with an unconditioned stimulus. An example of this process is a fear conditioning paradigm using a mouse: a tone paired with a mild footshock can become a conditioned cue, eliciting a fear response when presented alone in the future. In the same way, the context in which a footshock is received, such as a chamber with particular dimensions and a particular odor, can elicit the same fear response when the mouse is placed back in that chamber in the absence of the footshock.

In this paradigm, extinction occurs when the animal is re-exposed to the conditioned cue or conditioned context in the absence of the unconditioned stimulus. As the animal learns that the cue or context no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually decreases, or extinguishes.

Operant conditioning

In the operant conditioning paradigm, extinction refers to the process of no longer providing the reinforcement that has been maintaining a behavior. Operant extinction differs from forgetting, which refers to a decrease in the strength of a behavior over time when it has not been emitted. [5] For example, a child who climbs under his desk, a response that has been reinforced by attention, is subsequently ignored until the attention-seeking behavior no longer occurs. In his autobiography, B. F. Skinner described how he accidentally discovered the extinction of an operant response when his laboratory equipment malfunctioned:

My first extinction curve showed up by accident. A rat was pressing the lever in an experiment on satiation when the pellet dispenser jammed. I was not there at the time, and when I returned I found a beautiful curve. The rat had gone on pressing although no pellets were received. ... The change was more orderly than the extinction of a salivary reflex in Pavlov's setting, and I was terribly excited. It was a Friday afternoon and there was no one in the laboratory who I could tell. All that weekend I crossed streets with particular care and avoided all unnecessary risks to protect my discovery from loss through my accidental death. [6]

Once extinction of a response has occurred, the discriminative stimulus is known as an extinction stimulus (SΔ or S-delta). When an S-delta is present, the reinforcing consequence that characteristically follows a behavior does not occur. This is the opposite of a discriminative stimulus, which signals that reinforcement will occur. For instance, in an operant chamber, if food pellets are delivered only when a response is emitted in the presence of a green light, the green light is a discriminative stimulus. If food is never delivered when a red light is present, the red light is an extinction stimulus (food here being an example of a reinforcer). However, some distinguish between an extinction stimulus and an S-delta when the behavior has no reinforcement history. For example, if a learner is shown an array of three items (phone, pen, paper) and asked "Which one is the phone?", selecting the pen or the paper produces no response from the teacher; on the first trial this is not technically extinction, because selecting "pen" or "paper" has no reinforcement history, yet the pen and paper are still considered S-deltas.
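A minimal sketch of the operant-chamber contingency just described, with the green light acting as the discriminative stimulus and the red light as the S-delta (the function name and trial structure are hypothetical, added only for illustration):

```python
# Hypothetical sketch of the contingency described above: the same response is
# reinforced under the discriminative stimulus (green light) but not under the
# S-delta (red light). Function and variable names are illustrative only.

def pellet_delivered(light_color: str, response_emitted: bool) -> bool:
    """Return True if a food pellet is delivered for this response."""
    if not response_emitted:
        return False
    return light_color == "green"   # green = discriminative stimulus, red = S-delta

for light in ("green", "red"):
    outcome = "pellet" if pellet_delivered(light, response_emitted=True) else "no pellet"
    print(f"response under {light} light -> {outcome}")
```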

Successful extinction procedures

In order for extinction to work effectively, it must be applied consistently. Extinction is considered successful when responding in the presence of an extinction stimulus (a red light, or a teacher withholding attention from a misbehaving student, for instance) falls to zero. When a behavior reappears after it has gone through extinction, this is called spontaneous recovery. Extinction is complete when the challenging behavior no longer occurs even though reinforcement is withheld; if reinforcement is reintroduced, the problem behavior will return. Extinction can be a long process, so it requires that the facilitator of the procedure be fully invested from beginning to end for the outcome to be successful. [7] The fewer challenging behaviors observed after extinction, the less significant any spontaneous recovery is likely to be. [8] The schedule on which reinforcement has been delivered also matters. Reinforcement may be delivered continuously (after every response) or intermittently, on fixed-ratio, variable-ratio, fixed-interval, or variable-interval schedules; schedules can be fixed or variable, and the number of reinforcements given during each interval can vary (see the sketch below). [9]
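The sketch below illustrates, under assumed parameter values, how the schedules named above decide whether a given response is reinforced; the function names and the specific ratios and intervals are illustrative, not drawn from the cited sources.

```python
# Illustrative sketch of the schedules named above. Each function answers the
# question: is this particular response reinforced? Ratios, intervals, and
# names are assumptions chosen for the example.
import random

def continuous(response_count: int) -> bool:
    return True                          # every response is reinforced

def fixed_ratio(response_count: int, n: int = 5) -> bool:
    return response_count % n == 0       # every nth response is reinforced

def variable_ratio(response_count: int, mean_n: int = 5) -> bool:
    return random.random() < 1.0 / mean_n   # on average, one response in mean_n

def fixed_interval(seconds_since_reinforcer: float, interval: float = 30.0) -> bool:
    return seconds_since_reinforcer >= interval   # first response after the interval

def variable_interval(seconds_since_reinforcer: float, mean_interval: float = 30.0) -> bool:
    # Simplification: the required interval is re-sampled around the mean on
    # each check rather than drawn once per reinforcer.
    return seconds_since_reinforcer >= random.expovariate(1.0 / mean_interval)

print(fixed_ratio(5), fixed_ratio(7))               # True False
print(fixed_interval(45.0), fixed_interval(10.0))   # True False
```

A practical consequence, widely reported in the behavior-analytic literature, is that behavior maintained on intermittent (especially variable-ratio) schedules tends to be more resistant to extinction than behavior maintained on continuous reinforcement, so the schedule in place before extinction begins affects how long the procedure takes.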

Extinction procedures in the classroom

A positive classroom environment yields better learning growth. Therefore, for children to be successful in the classroom, their environment should be free of problem behaviors that cause distractions. [10] The classroom should be a place that offers consistency, structure, and stability, where the student feels empowered, supported, and safe. When problem behaviors occur, learning opportunities decrease. [11] Problem behaviors in the classroom that may benefit from extinction include off-task behavior, blurting, yelling, interrupting, and use of inappropriate language. [12] Extinction has been used primarily when problem behaviors interfere with successful classroom outcomes. [13] Although other methods are often used in conjunction with extinction, positive outcomes are unlikely when extinction is not part of the behavior intervention. [12]

Burst

While extinction, when implemented consistently over time, results in the eventual decrease of the undesired behavior, in the short term the subject may exhibit what is called an extinction burst. An extinction burst often occurs when the extinction procedure has just begun. It usually consists of a sudden and temporary increase in the frequency of the response, followed by the eventual decline and extinction of the behavior targeted for elimination. Novel behavior, emotional responses, or aggressive behavior may also occur. [1]

For example, suppose a pigeon has been reinforced for pecking an electronic button. During its training history, every time the pigeon pecked the button it received a small amount of bird seed as a reinforcer, so whenever the bird is hungry it pecks the button to receive food. If the button is then turned off, the hungry pigeon will first peck the button just as it has in the past. When no food is forthcoming, the bird will likely peck repeatedly. After a period of frantic activity in which its pecking yields no result, the pigeon's pecking will decrease in frequency.

Although not explained by reinforcement theory, the extinction burst can be understood using control theory. In perceptual control theory, the degree of output involved in any action is proportional to the discrepancy between the reference value (the desired rate of reward in the operant paradigm) and the current input. Thus, when reward is removed, the discrepancy increases and output increases, as in the sketch below. In the long term, 'reorganisation', the learning algorithm of control theory, would adapt the control system so that output is reduced.
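A minimal sketch of that proportional relationship, assuming a simple discrete check with an illustrative gain and reward rates (none of these names or values come from the perceptual control theory literature itself):

```python
# Minimal control-theory sketch: output is proportional to the error between a
# reference (desired reward rate) and the current input (observed reward rate).
# The gain and the reward-rate values are illustrative assumptions.

def control_output(reference: float, observed: float, gain: float = 2.0) -> float:
    error = reference - observed
    return max(0.0, gain * error)        # e.g., responses per minute

desired_rate = 5.0   # rewards per minute the controller is "trying" to obtain

# Before extinction the observed rate roughly matches the reference, so output
# is modest; when reward is removed the error jumps and output spikes,
# mirroring the extinction burst.
for observed in (4.5, 0.0):
    print(f"observed reward rate {observed}: output = {control_output(desired_rate, observed)}")
```

In this picture, 'reorganisation' would correspond to gradually changing the controller (for example, lowering the gain or abandoning this control loop) so that output falls, matching the eventual decline of responding during extinction.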

The evolutionary advantage of the extinction burst is clear. In a natural environment, an animal that persists in a learned behavior even when it no longer produces immediate reinforcement may still obtain reinforcing consequences if it tries again. Such an animal would be at an advantage over one that gives up too easily.

Despite the name, however, not every explosive reaction to the withdrawal of reinforcement subsides into extinction; a small minority of individuals persist in their reaction indefinitely.

Extinction-induced variability

Extinction-induced variability serves an adaptive role similar to that of the extinction burst. When extinction begins, subjects can exhibit variations in response topography (the movements involved in the response). Response topography is always somewhat variable owing to differences in environment or idiosyncratic causes, but a subject's history of reinforcement normally keeps slight variations stable by maintaining successful variations over less successful ones. Extinction can increase these variations significantly as the subject attempts to obtain the reinforcement that previous behaviors produced. If a person attempts to open a door by turning the knob but is unsuccessful, they may next try jiggling the knob, pushing on the frame, knocking on the door, or other behaviors to get the door to open. Extinction-induced variability can be used in shaping to reduce problematic behaviors by reinforcing desirable behaviors that the variability produces.

Autism

Children with autism spectrum disorder (ASD) are known to have restricted or repetitive behaviors that can cause problems in day-to-day activities. [14] Extinction is used as an intervention for such problem behaviors. [15] Problem behaviors may include, but are not limited to, self-injurious behavior, aggression, tantrums, sleep problems, and difficulty making choices. [16] Ignoring certain self-injurious behaviors can lead to their extinction in children with ASD. [17] Escape extinction (EE) is commonly used when having to make choices causes problem behavior. [18] An example is a child having to choose between mint- and strawberry-flavored toothpaste when brushing their teeth, with only those two options available. When implementing EE, the interventionist uses physical and verbal prompting to help the subject make a choice. [18]

Neurobiology

Glutamate

Glutamate is a neurotransmitter that has been extensively implicated in the neural basis of learning. [19] D-Cycloserine (DCS) is a partial agonist at the glycine site of the NMDA glutamate receptor and has been trialed as an adjunct to conventional exposure-based treatments based on the principle of cue extinction.

A role for glutamate has also been identified in the extinction of cocaine-associated environmental stimuli in rats. Specifically, the metabotropic glutamate 5 receptor (mGlu5) is important for the extinction of a cocaine-associated context [20] and a cocaine-associated cue. [21]

Dopamine

Dopamine is another neurotransmitter implicated in extinction across both appetitive and aversive domains. [22] Dopamine signaling has been implicated in the extinction of conditioned fear [23] [24] [25] [26] [27] and the extinction of drug-related learning. [28] [29]

Circuitry

The brain region most extensively implicated in learning extinction is the infralimbic cortex (IL) of the medial prefrontal cortex (mPFC). [30] The IL is important for the extinction of reward- and fear-associated behaviors, while the amygdala has been strongly implicated in the extinction of conditioned fear. [3] The posterior cingulate cortex (PCC) and temporoparietal junction (TPJ) have also been identified as regions that may be associated with impaired extinction in adolescents. [31]

Across development

There is a strong body of evidence that extinction changes across development. [32] [33] That is, learning extinction may differ during infancy, childhood, adolescence, and adulthood. During infancy and childhood, extinction is especially persistent, which some have interpreted as erasure of the original CS-US association, [34] [35] [36] but this remains contentious. In contrast, during adolescence and adulthood extinction is less persistent, which is interpreted as new learning of a CS-no US association that exists in tandem with, and in opposition to, the original CS-US memory. [37] [38]

Related Research Articles

Operant conditioning, also called instrumental conditioning, is a learning process where voluntary behaviors are modified by association with the addition of reward or aversive stimuli. The frequency or duration of the behavior may increase through reinforcement or decrease through punishment or extinction.

Classical conditioning is a behavioral procedure in which a biologically potent stimulus is paired with a neutral stimulus. The term classical conditioning refers to the process of an automatic, conditioned response that is paired with a specific stimulus.

<span class="mw-page-title-main">Reinforcement</span> Consequence affecting an organisms future behavior

In behavioral psychology, reinforcement refers to consequences that increase the likelihood of an organism's future behavior, typically in the presence of a particular antecedent stimulus. For example, a rat can be trained to push a lever to receive food whenever a light is turned on. In this example, the light is the antecedent stimulus, the lever pushing is the operant behavior, and the food is the reinforcer. Likewise, a student who receives attention and praise when answering a teacher's question will be more likely to answer future questions in class. The teacher's question is the antecedent, the student's response is the behavior, and the praise and attention are the reinforcement.

The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior, as opposed to hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.

<span class="mw-page-title-main">Fear conditioning</span> Behavioral paradigm in which organisms learn to predict aversive events

Pavlovian fear conditioning is a behavioral paradigm in which organisms learn to predict aversive events. It is a form of learning in which an aversive stimulus is associated with a particular neutral context or neutral stimulus, resulting in the expression of fear responses to the originally neutral stimulus or context. This can be done by pairing the neutral stimulus with an aversive stimulus. Eventually, the neutral stimulus alone can elicit the state of fear. In the vocabulary of classical conditioning, the neutral stimulus or context is the "conditional stimulus" (CS), the aversive stimulus is the "unconditional stimulus" (US), and the fear is the "conditional response" (CR).

Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events. The cognitive revolution of the late 20th century largely replaced behaviorism as an explanatory theory with cognitive psychology, which unlike behaviorism examines internal mental states.

<span class="mw-page-title-main">Nucleus accumbens</span> Region of the basal forebrain

The nucleus accumbens is a region in the basal forebrain rostral to the preoptic area of the hypothalamus. The nucleus accumbens and the olfactory tubercle collectively form the ventral striatum. The ventral striatum and dorsal striatum collectively form the striatum, which is the main component of the basal ganglia. The dopaminergic neurons of the mesolimbic pathway project onto the GABAergic medium spiny neurons of the nucleus accumbens and olfactory tubercle. Each cerebral hemisphere has its own nucleus accumbens, which can be divided into two structures: the nucleus accumbens core and the nucleus accumbens shell. These substructures have different morphology and functions.

<span class="mw-page-title-main">Dopaminergic pathways</span> Projection neurons in the brain that synthesize and release dopamine

Dopaminergic pathways in the human brain are involved in both physiological and behavioral processes including movement, cognition, executive functions, reward, motivation, and neuroendocrine control. Each pathway is a set of projection neurons, consisting of individual dopaminergic neurons.

Motivational salience is a cognitive process and a form of attention that motivates or propels an individual's behavior towards or away from a particular object, perceived event or outcome. Motivational salience regulates the intensity of behaviors that facilitate the attainment of a particular goal, the amount of time and energy that an individual is willing to expend to attain a particular goal, and the amount of risk that an individual is willing to accept while working to attain a particular goal.

In internal medicine, relapse or recidivism is a recurrence of a past condition. For example, multiple sclerosis and malaria often exhibit peaks of activity and sometimes very long periods of dormancy, followed by relapse or recrudescence.

An avoidance response is a natural adaptive behavior performed in response to danger. Excessive avoidance has been suggested to contribute to anxiety disorders, leading psychologists and neuroscientists to study how avoidance behaviors are learned using rat or mouse models. Avoidance learning is a type of operant conditioning.

<span class="mw-page-title-main">Quinpirole</span> Chemical compound

Quinpirole is a psychoactive drug and research chemical which acts as a selective D2 and D3 receptor agonist. It is used in scientific research. Quinpirole has been shown to increase locomotion and sniffing behavior in mice treated with it. At least one study has found that quinpirole induces compulsive behavior symptomatic of obsessive compulsive disorder in rats. Another study in rats shows that quinpirole produces significant THC-like effects when metabolic degradation of anandamide is inhibited, supporting the hypothesis that these effects of quinpirole are mediated by cannabinoid CB1 receptors. Quinpirole may also reduce relapse in adolescent rat models of cocaine addiction.

<span class="mw-page-title-main">Avoidance response</span> Response that prevents an aversive stimulus

An avoidance response is a response that prevents an aversive stimulus from occurring. It is a kind of negative reinforcement. An avoidance response is a behavior based on the concept that animals will avoid performing behaviors that result in an aversive outcome. This can involve learning through operant conditioning when it is used as a training technique. It is a reaction to undesirable sensations or feedback that leads to avoiding the behavior that is followed by this unpleasant or fear-inducing stimulus.

<span class="mw-page-title-main">Reward system</span> Group of neural structures responsible for motivation and desire

The reward system is a group of neural structures responsible for incentive salience, associative learning, and positively-valenced emotions, particularly ones involving pleasure as a core component. Reward is the attractive and motivational property of a stimulus that induces appetitive behavior, also known as approach behavior, and consummatory behavior. A rewarding stimulus has been described as "any stimulus, object, event, activity, or situation that has the potential to make us approach and consume it is by definition a reward". In operant conditioning, rewarding stimuli function as positive reinforcers; however, the converse statement also holds true: positive reinforcers are rewarding.

<span class="mw-page-title-main">Basolateral amygdala</span> The lateral, basal, and accessory-basal nuclei of the amygdala

The basolateral amygdala, or basolateral complex, consists of the lateral, basal, and accessory-basal nuclei of the amygdala. The lateral nuclei receive the majority of sensory information, which arrives directly from temporal lobe structures, including the hippocampus and primary auditory cortex. The basolateral amygdala also receives dense neuromodulatory inputs from the ventral tegmental area (VTA), locus coeruleus (LC), and basal forebrain, whose integrity is important for associative learning. The information is then processed by the basolateral complex and sent as output to the central nucleus of the amygdala. This is how most emotional arousal is formed in mammals.

<span class="mw-page-title-main">Conditioned place preference</span> Pavlovian conditioning

Conditioned place preference (CPP) is a form of Pavlovian conditioning used to measure the motivational effects of objects or experiences. This motivation comes from the pleasurable aspect of the experience, such that the brain is reminded of the context surrounding the "encounter". By measuring the amount of time an animal spends in an area that has been associated with a stimulus, researchers can infer the animal's liking for the stimulus. The same procedure, with aversive stimuli instead, can be used to measure conditioned place aversion. Both procedures usually involve mice or rats as subjects, and the paradigm can be used to measure extinction and reinstatement of the conditioned stimulus. Certain drugs are used in this paradigm to measure their reinforcing properties. Two methods are used to choose the compartments to be conditioned: biased and unbiased. In the biased method, the animal first explores the apparatus; the drug is then administered in the compartment it prefers least and the vehicle in the compartment it prefers most, so the animal's own preference determines the pairing. In the unbiased method, the researcher, rather than the animal, chooses which compartment is paired with the drug and which with the vehicle.

Self-administration is, in its medical sense, the process of a subject administering a pharmacological substance to themself. A clinical example of this is the subcutaneous "self-injection" of insulin by a diabetic patient.

Spontaneous recovery is a phenomenon of learning and memory that was first named and described by Ivan Pavlov in his studies of classical (Pavlovian) conditioning. In that context, it refers to the re-emergence of a previously extinguished conditioned response after a delay. Such a recovery of "lost" behaviors can be observed within a variety of domains, and the recovery of lost human memories is often of particular interest. For a mathematical model for spontaneous recovery see Further Reading.

Many experiments have been done to find out how the brain interprets stimuli and how animals develop fear responses. The emotion, fear, has been hard-wired into almost every individual, due to its vital role in the survival of the individual. Researchers have found that fear is established unconsciously and that the amygdala is involved with fear conditioning.

Pavlovian-instrumental transfer (PIT) is a psychological phenomenon that occurs when a conditioned stimulus that has been associated with rewarding or aversive stimuli via classical conditioning alters motivational salience and operant behavior. Two distinct forms of Pavlovian-instrumental transfer have been identified in humans and other animals – specific PIT and general PIT – with unique neural substrates mediating each type. In relation to rewarding stimuli, specific PIT occurs when a CS is associated with a specific rewarding stimulus through classical conditioning and subsequent exposure to the CS enhances an operant response that is directed toward the same reward with which it was paired. General PIT occurs when a CS is paired with one reward and it enhances an operant response that is directed toward a different rewarding stimulus.

References

  1. Miltenberger, R. (2012). Behavior modification, principles and procedures. (5th ed., pp. 87–99). Wadsworth Publishing Company.
  2. VanElzakker, M. B.; Dahlgren, M. K.; Davis, F. C.; Dubois, S.; Shin, L. M. (2014). "From Pavlov to PTSD: The extinction of conditioned fear in rodents, humans, and anxiety disorders". Neurobiology of Learning and Memory. 113: 3–18. doi:10.1016/j.nlm.2013.11.014. PMC   4156287 . PMID   24321650.
  3. Myers; Davis (2007). "Mechanisms of Fear Extinction". Molecular Psychiatry. 12 (2): 120–150. doi:10.1038/sj.mp.4001939. PMID   17160066.
  4. Amano, T; Unal, CT; Paré, D (2010). "Synaptic correlates of fear extinction in the amygdala". Nature Neuroscience. 13 (4): 489–494. doi:10.1038/nn.2499. PMC   2847017 . PMID   20208529.
  5. Vargas, Julie S. (2013). Behavior Analysis for effective Teaching. New York: Routledge. p. 52.
  6. B.F. Skinner (1979). The Shaping of a Behaviorist: Part Two of an Autobiography, p. 95.
  7. Wheeler, John J. (2019). Behavior management: principles and practices of positive behavioral interventions and supports. David Dean Richey (4th ed.). New York, NY. ISBN   978-0-13-479218-7. OCLC   1008776079.
  8. Thrailkill, Eric A.; Kimball, Ryan T.; Kelley, Michael E.; Craig, Andrew R.; Podlesnik, Christopher A. (2018). "Greater reinforcement rate during training increases spontaneous recovery: Spontaneous Recovery". Journal of the Experimental Analysis of Behavior. 109 (1): 238–252. doi:10.1002/jeab.307. PMID   29314021.
  9. Applied behavior analysis for everyone: principles and practices explained by applied researchers who use them. Robert C. Pennington. Shawnee, KS: AAPC. 2019. p. 120. ISBN   978-1-942197-45-4. OCLC   1103852953.
  10. Crawley, Daisy; Zhang, Lei; Jones, Emily J. H.; Ahmad, Jumana; Oakley, Bethany; San José Cáceres, Antonia; Charman, Tony; Buitelaar, Jan K.; Murphy, Declan G. M.; Chatham, Christopher; den Ouden, Hanneke (2020-10-27). "Modeling flexible behavior in childhood to adulthood shows age-dependent learning mechanisms and less optimal learning in autism in each age group". PLOS Biology. 18 (10): e3000908. doi: 10.1371/journal.pbio.3000908 . ISSN   1545-7885. PMC   7591042 . PMID   33108370.
  11. Rivers, Susan E.; Brackett, Marc A.; Reyes, Maria R.; Elbertson, Nicole A.; Salovey, Peter (2012-11-28). "Improving the Social and Emotional Climate of Classrooms: A Clustered Randomized Controlled Trial Testing the RULER Approach". Prevention Science. 14 (1): 77–87. doi:10.1007/s11121-012-0305-2. ISSN   1389-4986. PMID   23188089. S2CID   5616258.
  12. Janney, Donna M.; Umbreit, John; Ferro, Jolenea B.; Liaupsin, Carl J.; Lane, Kathleen L. (2013). "The Effect of the Extinction Procedure in Function-Based Intervention". Journal of Positive Behavior Interventions. 15 (2): 113–123. doi:10.1177/1098300712441973. ISSN   1098-3007. S2CID   144675039.
  13. Rajaraman, Adithyan; Hanley, Gregory P.; Gover, Holly C.; Staubitz, Johanna L.; Staubitz, John E.; Simcoe, Kathleen M.; Metras, Rachel (2022). "Minimizing Escalation by Treating Dangerous Problem Behavior Within an Enhanced Choice Model". Behavior Analysis in Practice. 15 (1): 219–242. doi:10.1007/s40617-020-00548-2. ISSN   1998-1929. PMC   8854458 . PMID   35340377.
  14. Rispoli, Mandy; Camargo, Siglia; Machalicek, Wendy; Lang, Russell; Sigafoos, Jeff (2014). "Functional communication training in the treatment of problem behavior maintained by access to rituals". Journal of Applied Behavior Analysis. 27 (47): 3. doi:10.1002/jaba.1994.27.issue-1. ISSN   0021-8855.
  15. Falcomata, Terry S.; Hoffman, Katherine J.; Gainey, Summer; Muething, Colin S.; Fienup, Daniel M. (2013-07-10). "A Preliminary Evaluation of Reinstatement of Destructive Behavior Displayed by Individuals With Autism". The Psychological Record. 63 (3): 453–466. doi:10.11133/j.tpr.2013.63.3.004. S2CID   147662158.
  16. Hanley, Gregory P.; Jin, C. Sandy; Vanselow, Nicholas R.; Hanratty, Laura A. (2014). "Producing meaningful improvements in problem behavior of children with autism via synthesized analyses and treatments: Severe Problem Behavior". Journal of Applied Behavior Analysis. 47 (1): 16–36. doi:10.1002/jaba.106. PMID   24615474.
  17. Banda, Devender R.; McAfee, James K.; Hart, Stephanie L. (2009-06-03). "Decreasing Self-Injurious Behavior in a Student with Autism and Tourette Syndrome through Positive Attention and Extinction". Child & Family Behavior Therapy. 31 (2): 144–156. doi:10.1080/07317100902910604. ISSN   0731-7107. S2CID   144329726.
  18. Allison, Janelle; Wilder, David A; Chong, Ivy; Lugo, Ashley; Pike, Jessica; Rudy, Nikki (2012). "A Comparison of Differential Reinforcement and Noncontingent Reinforcement to Treat Food Selectivity in a Child With Autism". Journal of Applied Behavior Analysis. 45 (3): 613–617. doi:10.1901/jaba.2012.45-613. PMC   3469290 . PMID   23060675.
  19. Riedel, Gernot; Platt, Bettina; Micheau, Jacques (2003-03-18). "Glutamate receptor function in learning and memory". Behavioural Brain Research. 140 (1–2): 1–47. doi:10.1016/s0166-4328(02)00272-3. ISSN   0166-4328. PMID   12644276. S2CID   41221872.
  20. Kim, Jee Hyun; Perry, Christina; Luikinga, Sophia; Zbukvic, Isabel; Brown, Robyn M.; Lawrence, Andrew J. (2015-05-01). "Extinction of a cocaine-taking context that protects against drug-primed reinstatement is dependent on the metabotropic glutamate 5 receptor". Addiction Biology. 20 (3): 482–489. doi:10.1111/adb.12142. ISSN   1369-1600. PMID   24712397. S2CID   11626810.
  21. Perry, Christina J; Reed, Felicia; Zbukvic, Isabel C; Kim, Jee Hyun; Lawrence, Andrew J (2016-01-01). "The metabotropic glutamate 5 receptor is necessary for extinction of cocaine associated cues". British Journal of Pharmacology. 173 (6): 1085–1094. doi:10.1111/bph.13437. ISSN   1476-5381. PMC   5341241 . PMID   26784278.
  22. Abraham, Antony D.; Neve, Kim A.; Lattal, K. Matthew (2014-02-01). "Dopamine and extinction: A convergence of theory with fear and reward circuitry". Neurobiology of Learning and Memory. 108: 65–77. doi:10.1016/j.nlm.2013.11.007. ISSN   1074-7427. PMC   3927738 . PMID   24269353.
  23. Haaker, Jan; Lonsdorf, Tina B.; Kalisch, Raffael (2015-10-01). "Effects of post-extinction l-DOPA administration on the spontaneous recovery and reinstatement of fear in a human fMRI study". European Neuropsychopharmacology. 25 (10): 1544–1555. doi:10.1016/j.euroneuro.2015.07.016. ISSN   1873-7862. PMID   26238968. S2CID   6242752.
  24. Haaker, Jan; Gaburro, Stefano; Sah, Anupam; Gartmann, Nina; Lonsdorf, Tina B.; Meier, Kolja; Singewald, Nicolas; Pape, Hans-Christian; Morellini, Fabio (2013-06-25). "Single dose of L-dopa makes extinction memories context-independent and prevents the return of fear". Proceedings of the National Academy of Sciences of the United States of America. 110 (26): E2428–2436. Bibcode:2013PNAS..110E2428H. doi: 10.1073/pnas.1303061110 . ISSN   1091-6490. PMC   3696794 . PMID   23754384.
  25. Ponnusamy, Ravikumar; Nissim, Helen A.; Barad, Mark (2005-07-01). "Systemic blockade of D2-like dopamine receptors facilitates extinction of conditioned fear in mice". Learning & Memory. 12 (4): 399–406. doi:10.1101/lm.96605. ISSN   1072-0502. PMC   1183258 . PMID   16077018.
  26. Zbukvic, Isabel C.; Ganella, Despina E.; Perry, Christina J.; Madsen, Heather B.; Bye, Christopher R.; Lawrence, Andrew J.; Kim, Jee Hyun (2016-03-05). "Role of Dopamine 2 Receptor in Impaired Drug-Cue Extinction in Adolescent Rats". Cerebral Cortex. 26 (6): 2895–904. doi:10.1093/cercor/bhw051. ISSN   1047-3211. PMC   4869820 . PMID   26946126.
  27. Madsen, Heather B.; Guerin, Alexandre A.; Kim, Jee Hyun (2017). "Investigating the role of dopamine receptor- and parvalbumin-expressing cells in extinction of conditioned fear". Neurobiology of Learning and Memory. 145: 7–17. doi:10.1016/j.nlm.2017.08.009. PMID   28842281. S2CID   26875742.
  28. Abraham, Antony D.; Neve, Kim A.; Lattal, K. Matthew (2016-07-01). "Activation of D1/5 Dopamine Receptors: A Common Mechanism for Enhancing Extinction of Fear and Reward-Seeking Behaviors". Neuropsychopharmacology. 41 (8): 2072–2081. doi:10.1038/npp.2016.5. PMC   4908654 . PMID   26763483.
  29. Zbukvic, Isabel C.; Ganella, Despina E.; Perry, Christina J.; Madsen, Heather B.; Bye, Christopher R.; Lawrence, Andrew J.; Kim, Jee Hyun (2016-03-05). "Role of Dopamine 2 Receptor in Impaired Drug-Cue Extinction in Adolescent Rats". Cerebral Cortex. 26 (6): 2895–2904. doi:10.1093/cercor/bhw051. ISSN   1047-3211. PMC   4869820 . PMID   26946126.
  30. Do-Monte, Fabricio H.; Manzano-Nieves, Gabriela; Quiñones-Laracuente, Kelvin; Ramos-Medina, Liorimar; Quirk, Gregory J. (2015-02-25). "Revisiting the Role of Infralimbic Cortex in Fear Extinction with Optogenetics". The Journal of Neuroscience. 35 (8): 3607–3615. doi:10.1523/JNEUROSCI.3137-14.2015. ISSN   0270-6474. PMC   4339362 . PMID   25716859.
  31. Ganella, Despina E.; Drummond, Katherine D.; Ganella, Eleni P.; Whittle, Sarah; Kim, Jee Hyun (2018). "Extinction of Conditioned Fear in Adolescents and Adults: A Human fMRI Study". Frontiers in Human Neuroscience. 11: 647. doi: 10.3389/fnhum.2017.00647 . ISSN   1662-5161. PMC   5766664 . PMID   29358913.
  32. Yap, C.S.; Richardson, R. (2007). "Extinction in the developing rat: an examination of renewal effects". Developmental Psychobiology. 49 (6): 565–575. CiteSeerX   10.1.1.583.1720 . doi:10.1002/dev.20244. PMID   17680605.
  33. Ganella, Despina E; Kim, Jee Hyun (2014-10-01). "Developmental rodent models of fear and anxiety: from neurobiology to pharmacology". British Journal of Pharmacology. 171 (20): 4556–4574. doi:10.1111/bph.12643. ISSN   1476-5381. PMC   4209932 . PMID   24527726.
  34. Kim, Jee Hyun; Richardson, Rick (2008-02-06). "The Effect of Temporary Amygdala Inactivation on Extinction and Reextinction of Fear in the Developing Rat: Unlearning as a Potential Mechanism for Extinction Early in Development". The Journal of Neuroscience. 28 (6): 1282–1290. doi: 10.1523/JNEUROSCI.4736-07.2008 . ISSN   0270-6474. PMC   6671587 . PMID   18256248.
  35. Kim, Jee Hyun; Richardson, Rick (2007). "A developmental dissociation in reinstatement of an extinguished fear response in rats". Neurobiology of Learning and Memory. 88 (1): 48–57. doi:10.1016/j.nlm.2007.03.004. PMID   17459734. S2CID   19611691.
  36. Kim, Jee Hyun; Hamlin, Adam S.; Richardson, Rick (2009-09-02). "Fear Extinction across Development: The Involvement of the Medial Prefrontal Cortex as Assessed by Temporary Inactivation and Immunohistochemistry" (PDF). The Journal of Neuroscience. 29 (35): 10802–10808. doi: 10.1523/JNEUROSCI.0596-09.2009 . ISSN   0270-6474. PMC   6665532 . PMID   19726637.
  37. Kim, Jee Hyun; Li, Stella; Richardson, Rick (2010-06-24). "Immunohistochemical Analyses of Long-Term Extinction of Conditioned Fear in Adolescent Rats". Cerebral Cortex. 21 (3): 530–8. doi: 10.1093/cercor/bhq116 . hdl: 10536/DRO/DU:30144599 . ISSN   1047-3211. PMID   20576926.
  38. Kim, Jee Hyun; Ganella, Despina E. (2015). "A Review of Preclinical Studies to Understand Fear During Adolescence". Australian Psychologist. 50 (1): 25–31. doi:10.1111/ap.12066. S2CID   142760996.