Discrimination learning

Discrimination learning is defined in psychology as the ability to respond differently to different stimuli. This type of learning is studied in research on both operant and classical conditioning. Operant conditioning involves the modification of a behavior by means of reinforcement or punishment; here, a discriminative stimulus acts as an indicator of when a behavior will be reinforced and when it will not. Classical conditioning involves learning through association when two stimuli are repeatedly paired; here, discrimination is demonstrated when responses come to be made to reinforced stimuli but not to similar stimuli that are never reinforced. The phenomenon is considered more advanced than generalization, yet it also acts as a basic unit of learning as a whole. The complex yet fundamental nature of discrimination learning allows psychologists and researchers to perform more in-depth research in support of psychological advances. Research on the basic principles underlying this form of learning has its roots in neuropsychological sub-processes.

Historical information

Karl Lashley, a psychologist who studied under John B. Watson, focused mainly on learning and discrimination. He published Brain Mechanisms and Intelligence in 1929. Lashley's research on two-alternative forced choice provided a foundation for psychologists such as Kenneth Spence, who expanded what was known about two-choice discrimination learning. Spence made two major publications on the subject, "The Nature of Discrimination Learning in Animals" (1936) and "Continuous Versus Non-continuous Interpretations of Discrimination Learning" (1940). He proposed that stimuli acquire both excitation and inhibition during training, and that the likelihood of responding to a stimulus is determined by its net excitatory strength (excitation minus inhibition). [1]
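Spence's net-excitation idea can be illustrated with a short simulation. The sketch below is only a rough illustration of the "excitation minus inhibition" account summarized above, assuming a simple incremental update rule and arbitrary learning rates; it is not Spence's own formulation.

```python
# Minimal sketch of Spence's net-excitation account of two-choice
# discrimination learning. The update rule and parameter values are
# illustrative assumptions, not Spence's exact formulation.

def train(trials=50, alpha=0.10, beta=0.05):
    # Associative strengths for the reinforced (S+) and non-reinforced (S-) stimuli.
    excitation = {"S+": 0.0, "S-": 0.0}
    inhibition = {"S+": 0.0, "S-": 0.0}

    for _ in range(trials):
        # Responses to S+ are reinforced, so excitation to S+ grows.
        excitation["S+"] += alpha * (1.0 - excitation["S+"])
        # Responses to S- are never reinforced, so inhibition to S- grows.
        inhibition["S-"] += beta * (1.0 - inhibition["S-"])

    # Net excitatory strength (excitation minus inhibition) determines
    # which stimulus the animal is more likely to respond to.
    return {s: excitation[s] - inhibition[s] for s in excitation}

if __name__ == "__main__":
    print(train())  # S+ ends with high positive net strength, S- with negative
```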

Ivan Pavlov was highly influential in the study of discrimination learning. His studies with salivating dogs demonstrated the dogs' ability to differentiate between a stimulus that signaled a reward and a stimulus that did not. This can be contrasted with the Little Albert study, in which Albert's lack of discrimination between animals exhibited generalization, the phenomenon that is discrimination learning's polar opposite. [2]

Sutherland and Mackintosh's book Mechanisms of Animal Discrimination examined the behaviors and discriminatory abilities of animals. [3]

Examples

Discrimination learning can be studied in both humans and other animals. Animals can use discrimination learning to help them survive, to be trained to assist humans in tasks, and more. A dog might be trained to detect differences between complex odor compounds so that it can sniff out different drugs and assist police. Predators can also use discrimination learning to distinguish between two camouflaged prey types. [4] Discrimination learning also reveals the extent to which other animals are capable of conceptual thought. Humans use discrimination learning to detect danger, learn about differences, and more. One example of discrimination learning in humans is a baby who reacts differently to their mother's voice than to a stranger's voice. [5]

Dog being trained for drug detection in the US Navy

Discrimination learning can also be used to discover the abilities of humans or other animals who are unable to communicate, by observing which differences between stimuli they respond to. For example, since we cannot have general two-way communication with dogs, we can show a dog two stimuli that are identical in every way except one, such as color, and then use discrimination training to determine which colors the dog can tell apart. [6]
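One common way to evaluate such a test is to reward choices of one stimulus and score the proportion of correct choices across trials. The sketch below simulates that scoring step; the trial counts, the 75%-correct criterion, and the function names are illustrative assumptions rather than a procedure taken from the cited studies.

```python
# Minimal sketch of scoring a two-alternative forced-choice color
# discrimination test. Trial counts and the 75%-correct criterion
# are illustrative assumptions.
import random

def run_session(p_correct, n_trials=40):
    """Simulate one session; p_correct is the subject's true probability
    of choosing the rewarded (S+) color on each trial."""
    return [random.random() < p_correct for _ in range(n_trials)]

def discriminates(trials, criterion=0.75):
    """Judge discrimination by whether the proportion of correct choices
    meets a pre-set criterion."""
    return sum(trials) / len(trials) >= criterion

if __name__ == "__main__":
    easy_pair = run_session(p_correct=0.90)  # colors the dog can tell apart
    hard_pair = run_session(p_correct=0.50)  # indistinguishable colors: chance
    print(discriminates(easy_pair), discriminates(hard_pair))
```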

Famous studies

Some famous studies using discrimination learning include Herrnstein, Loveland, and Cable's demonstration that pigeons can learn natural concepts from photographs [7] and Watanabe, Sakamoto, and Wakita's finding that pigeons can discriminate paintings by Monet from paintings by Picasso. [8]

Limitations

Discrimination learning has limitations. One limitation is the relative-validity effect: organisms learn more about a stimulus that is a relatively better predictor of the outcome than the other stimuli presented alongside it, and pay less attention to the less predictive stimuli. [9] Another limitation is the blocking effect. Blocking can occur if one stimulus, such as the sound of a bell, is first presented by itself and followed by a reinforcer, such as food for a cat, until the cat salivates whenever the bell rings. If a second stimulus, such as a flash of light, is then presented together with the bell and followed by the same reinforcer, the cat may show little or no response to the second stimulus on its own.
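Blocking is often illustrated with a simple error-correction learning rule. The sketch below uses the Rescorla-Wagner rule, which is not named in this article, to show why the light in the example above gains almost no associative strength; the learning rate, asymptote, and trial counts are arbitrary assumptions chosen only to make the effect visible.

```python
# Illustrative simulation of blocking with the Rescorla-Wagner
# error-correction rule. Parameter values are arbitrary assumptions.

ALPHA = 0.2   # learning rate applied to each stimulus present on a trial
LAMBDA = 1.0  # asymptotic associative strength supported by the reinforcer

def update(strengths, present, reinforced):
    """Apply one Rescorla-Wagner trial to the stimuli that are present."""
    total = sum(strengths[s] for s in present)
    error = (LAMBDA if reinforced else 0.0) - total
    for s in present:
        strengths[s] += ALPHA * error

strengths = {"bell": 0.0, "light": 0.0}

# Phase 1: bell alone is followed by food until it is well learned.
for _ in range(30):
    update(strengths, ["bell"], reinforced=True)

# Phase 2: bell and light are presented together and followed by food.
for _ in range(30):
    update(strengths, ["bell", "light"], reinforced=True)

# The bell already predicts the food, so little prediction error remains
# and the light acquires almost no associative strength (blocking).
print(strengths)  # roughly {'bell': ~1.0, 'light': ~0.0}
```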

Applications

Discrimination learning is used in almost every subfield of psychology, as it is a basic form of learning at the core of human intelligence. Examples include, but are not limited to, cognitive psychology, personality psychology, and developmental psychology. [10]

It was a classic topic in the psychology of learning from the 1920s to the 1970s.

Discrimination learning can become an almost unconscious process for many people, integrated into daily routines. Examples in everyday life include telling similar stimuli apart while grocery shopping, such as deciding between types of bread or fruit, and differentiating between the parts of a piece of music, such as picking out the different notes and chords being played. [11]

While interest in the learning of discriminations has continued in many fields, from about 1980 onwards the phrase "discrimination learning" was used less often as the main description either of individual studies or of a field of investigation. Instead, investigations of the learning of discriminations have tended to be described in other terms such as pattern recognition or concept discrimination. This change partly reflects the increasing diversity of studies of discrimination, and partly the general expansion of the topic of cognition within psychology, so that learning is not now the central organizing topic that it was in the mid-20th century.

Related Research Articles

B. F. Skinner: American psychologist and social philosopher (1904–1990)

Burrhus Frederic Skinner was an American psychologist, behaviorist, author, inventor, and social philosopher. He was a professor of psychology at Harvard University from 1958 until his retirement in 1974.

Operant conditioning, also called instrumental conditioning, is a learning process where behaviors are modified through the association of stimuli with reinforcement or punishment. In it, operants—behaviors that affect one's environment—are conditioned to occur or not occur depending on the environmental consequences of the behavior.

Operant conditioning chamber: laboratory apparatus used to study animal behavior

An operant conditioning chamber is a laboratory apparatus used to study animal behavior. The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. The chamber can be used to study both operant conditioning and classical conditioning.

Classical conditioning is a behavioral procedure in which a biologically potent stimulus is repeatedly paired with a neutral stimulus. The term also refers to the learning that results from this pairing, through which the neutral stimulus comes to elicit the response originally produced only by the potent stimulus.

Radical behaviorism is a "philosophy of the science of behavior" developed by B. F. Skinner. It refers to the philosophy behind behavior analysis, and is to be distinguished from methodological behaviorism—which has an intense emphasis on observable behaviors—by its inclusion of thinking, feeling, and other private events in the analysis of human and animal psychology. The research in behavior analysis is called the experimental analysis of behavior and the application of the field is called applied behavior analysis (ABA), which was originally termed "behavior modification."

The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior, as opposed to hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.

Animal cognition: intelligence of non-human animals

Animal cognition encompasses the mental capacities of non-human animals including insect cognition. The study of animal conditioning and learning used in this field was developed from comparative psychology. It has also been strongly influenced by research in ethology, behavioral ecology, and evolutionary psychology; the alternative name cognitive ethology is sometimes used. Many behaviors associated with the term animal intelligence are also subsumed within animal cognition.

Behaviorism: systematic approach to understanding the behavior of humans and other animals

Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events.

The law of effect is a psychology principle advanced by Edward Thorndike in 1898 on the matter of behavioral conditioning which states that "responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation."

Animal training: teaching animals specific responses to specific conditions or stimuli

Animal training is the act of teaching animals specific responses to specific conditions or stimuli. Training may be for purposes such as companionship, detection, protection, and entertainment. The type of training an animal receives will vary depending on the training method used, and the purpose for training the animal. For example, a seeing eye dog will be trained to achieve a different goal than a wild animal in a circus.

Shaping (psychology): psychological paradigm for behavior analysis

Shaping is a conditioning paradigm used primarily in the experimental analysis of behavior. The method used is differential reinforcement of successive approximations. It was introduced by B. F. Skinner with pigeons and extended to dogs, dolphins, humans and other species. In shaping, the form of an existing response is gradually changed across successive trials towards a desired target behavior by reinforcing exact segments of behavior. Skinner's explanation of shaping was this:

We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. ... The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. ... The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.

Comparative cognition is the comparative study of the mechanisms and origins of cognition in various species, and is sometimes seen as more general than, or similar to, comparative psychology. From a biological point of view, work is being done on the brains of fruit flies that should yield techniques precise enough to allow an understanding of the workings of the human brain on a scale appreciative of individual groups of neurons rather than the more regional scale previously used. Similarly, gene activity in the human brain is better understood through examination of the brains of mice by the Seattle-based Allen Institute for Brain Science, yielding the freely available Allen Brain Atlas. This type of study is related to comparative cognition, but better classified as one of comparative genomics. Increasing emphasis in psychology and ethology on the biological aspects of perception and behavior is bridging the gap between genomics and behavioral analysis.

Errorless learning was an instructional design introduced by psychologist Charles Ferster in the 1950s as part of his studies on what would make the most effective learning environment. B. F. Skinner was also influential in developing the technique, noting that,

...errors are not necessary for learning to occur. Errors are not a function of learning or vice versa nor are they blamed on the learner. Errors are a function of poor analysis of behavior, a poorly designed shaping program, moving too fast from step to step in the program, and the lack of the prerequisite behavior necessary for success in the program.

In behavioral psychology, stimulus control is a phenomenon in operant conditioning that occurs when an organism behaves in one way in the presence of a given stimulus and another way in its absence. A stimulus that modifies behavior in this manner is either a discriminative stimulus (Sd) or a stimulus delta (S-delta). Stimulus-based control of behavior occurs when the presence or absence of an Sd or S-delta controls the performance of a particular behavior. For example, the presence of a stop sign (an Sd) at a traffic intersection signals the driver to stop and increases the probability that "braking" behavior will occur. Such behavior is said to be emitted rather than elicited: the stimulus does not force the behavior to occur, because stimulus control is the result of historical reinforcement contingencies, in contrast to reflexive behavior, which is elicited through respondent conditioning.

Psychological behaviorism is a form of behaviorism — a major theory within psychology which holds that generally human behaviors are learned — proposed by Arthur W. Staats. The theory is constructed to advance from basic animal learning principles to deal with all types of human behavior, including personality, culture, and human evolution. Behaviorism was first developed by John B. Watson (1912), who coined the term "behaviorism," and then B. F. Skinner who developed what is known as "radical behaviorism." Watson and Skinner rejected the idea that psychological data could be obtained through introspection or by an attempt to describe consciousness; all psychological data, in their view, was to be derived from the observation of outward behavior. The strategy of these behaviorists was that the animal learning principles should then be used to explain human behavior. Thus, their behaviorisms were based upon research with animals.

James “Jim” A. Dinsmoor was an influential experimental psychologist who published work in the field of the experimental analysis of behavior. He was born October 4, 1921, in Woburn, Massachusetts to Daniel and Jean Dinsmoor. He graduated with his bachelor's degree from Dartmouth College in 1943. Subsequently, he attended Columbia University in New York City, where he received his Master's and Ph.D. degrees under the mentorship of William N. Schoenfeld and Fred S. Keller. There, he was introduced to the work of B.F. Skinner, whose behavior analytic research inspired Dinsmoor to pursue a lifetime of research in conditioned responding.

An antecedent is a stimulus that cues an organism to perform a learned behavior. When an organism perceives an antecedent stimulus, it behaves in a way that maximizes reinforcing consequences and minimizes punishing consequences. This might be part of complex, interpersonal communication.

The differential outcomes effect (DOE) is a theory in behaviorism, a branch of psychology, that shows that a positive effect on accuracy occurs in discrimination learning between different stimuli when unique rewards are paired with each individual stimulus. The DOE was first demonstrated in 1970 by Milton Trapold on an experiment with rats. Rats were trained to discriminate between a clicker and a tone by pressing the left and right levers. Half of the rats were trained using the differential outcomes procedure (DOP), where the clicker was paired with sucrose and tone with food pellets. The remaining rats were trained with only sucrose or only food pellets. The rats trained with the DOP were significantly more accurate than those trained with only one type of reinforcement. Since then it has been established through a myriad of experiments that the DOE exists in most species capable of learning.


Human contingency learning (HCL) is the observation that people tend to acquire knowledge based on whichever outcome has the highest probability of occurring from particular stimuli. In other words, individuals gather associations between a certain behaviour and a specific consequence. It is a form of learning for many organisms.

Association in psychology refers to a mental connection between concepts, events, or mental states that usually stems from specific experiences. Associations are seen throughout several schools of thought in psychology including behaviorism, associationism, psychoanalysis, social psychology, and structuralism. The idea stems from Plato and Aristotle, especially with regard to the succession of memories, and it was carried on by philosophers such as John Locke, David Hume, David Hartley, and James Mill. It finds its place in modern psychology in such areas as memory, learning, and the study of neural pathways.

References

  1. Amsel A (1995). "Kenneth Wartinbee Spence". Biographical Memoirs. 66: 335–351.
  2. Plaud JJ (November 2003). "Pavlov and the foundation of behavior therapy". The Spanish Journal of Psychology. 6 (2): 147–54. doi:10.1017/s1138741600005291. PMID 14628701. S2CID 1354206.
  3. Sutherland NS, Mackintosh NJ (1971). Mechanisms of Animal Discrimination. New York, New York: Academic Press. pp. ix. ISBN 9781483258249.
  4. Skelhorn J, Rowe C (February 2016). "Cognition and the evolution of camouflage". Proceedings. Biological Sciences. 283 (1825): 20152890. doi:10.1098/rspb.2015.2890. PMC 4810834. PMID 26911959.
  5. Spence KW (1936). "The nature of discrimination learning in animals". Psychological Review. 43 (5): 427–449. doi:10.1037/h0056975.
  6. Frey P, Colliver J (1973). Sensitivity and responsivity measures for discrimination learning.
  7. Herrnstein RJ, Loveland DH, Cable C (1976). "Natural concepts in pigeons". Journal of Experimental Psychology: Animal Behavior Processes. 2 (4): 285–302. doi:10.1037/0097-7403.2.4.285. PMID 978139.
  8. Watanabe S, Sakamoto J, Wakita M (March 1995). "Pigeons' discrimination of paintings by Monet and Picasso". Journal of the Experimental Analysis of Behavior. 63 (2): 165–74. doi:10.1901/jeab.1995.63-165. PMC 1334394. PMID 16812755.
  9. "Relative Validity; Blocking". University of Iowa.
  10. Spiker C, Cantor J (1982). "Cognitive Strategies in the Discrimination Learning of Young Children". In Donald K. Routh (ed.). Learning, Speech, and the Complex Effects of Punishment. Springer, Boston, MA. pp. 21–69.
  11. Dewey R. "05: Conditioning". Psychology: An Introduction.