Errorless learning

Errorless learning is an instructional design technique introduced by psychologist Charles Ferster in the 1950s as part of his studies on what would make the most effective learning environment. B. F. Skinner was also influential in developing the technique, noting that:

...errors are not necessary for learning to occur. Errors are not a function of learning or vice versa nor are they blamed on the learner. Errors are a function of poor analysis of behavior, a poorly designed shaping program, moving too fast from step to step in the program, and the lack of the prerequisite behavior necessary for success in the program.

Errorless learning can also be understood at a synaptic level, using the principle of Hebbian learning ("Neurons that fire together wire together").
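
On this view, errorless training arranges conditions so that only the correct stimulus–response pairing is ever co-active, and therefore only the corresponding connection is strengthened. The sketch below is a minimal, hypothetical illustration of a Hebbian weight update (weight change proportional to the product of pre- and post-synaptic activity); it is not a model from the errorless-learning literature, and the learning rate and network size are arbitrary.

```python
import numpy as np

def hebbian_update(weights, pre, post, learning_rate=0.1):
    """Hebbian rule: each weight grows in proportion to the
    co-activity of its pre- and post-synaptic units."""
    return weights + learning_rate * np.outer(post, pre)

# Hypothetical network: two input ("stimulus") units and two output
# ("response") units.  Under error-free training only the correct
# stimulus-response pair is ever co-active, so only that single
# connection is strengthened.
weights = np.zeros((2, 2))
s_plus = np.array([1.0, 0.0])            # the correct stimulus is active
correct_response = np.array([1.0, 0.0])  # the correct response is emitted

for _ in range(10):                      # repeated error-free trials
    weights = hebbian_update(weights, s_plus, correct_response)

print(weights)  # only weights[0, 0] has grown; no erroneous pairing was reinforced
```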

Many of Skinner's other students and followers continued to test the idea. In 1963, Herbert Terrace wrote a paper describing an experiment with pigeons in which discrimination learning occurred with few or even no responses to the negative stimulus (abbreviated S−). A negative stimulus is a stimulus associated with undesirable consequences (e.g., absence of reinforcement). In discrimination learning, an error is a response to the S−, and according to Terrace errors are not required for successful discrimination performance.

Principles

A simple discrimination learning procedure is one in which a subject learns to associate one stimulus, S+ (positive stimulus), with reinforcement (e.g., food) and another, S− (negative stimulus), with extinction (e.g., absence of food). For example, a pigeon can learn to peck a red key (S+) and to avoid a green key (S−). Using traditional procedures, the pigeon would first be trained to peck a red key (S+). Once it was responding consistently to the red key (S+), a green key (S−) would be introduced. At first the pigeon would also respond to the green key (S−), but responses to this key would gradually decrease, because they were not followed by food, until they occurred only rarely or not at all.
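
The following sketch is only a rough, hypothetical illustration of this conventional procedure, not a model used by Terrace: a simulated subject starts with a high tendency to peck any lit key, pecks to the S+ are reinforced, and pecks to the S− are extinguished, so errors (responses to the S−) decline only gradually. All parameter values are arbitrary.

```python
import random

def simulate_conventional_discrimination(trials=200, lr=0.05, seed=0):
    """Toy simulation of conventional discrimination training.

    The subject begins with a strong tendency to peck either key
    (having first been trained on the S+ alone).  Pecks to the S+ are
    reinforced and pecks to the S- are extinguished, so responses to
    the S- (errors) fade only gradually over many trials.
    """
    random.seed(seed)
    p_peck = {"S+": 0.9, "S-": 0.9}          # initial response tendencies
    errors = 0
    for _ in range(trials):
        stimulus = random.choice(["S+", "S-"])
        pecked = random.random() < p_peck[stimulus]
        if stimulus == "S+" and pecked:       # food delivered: reinforcement
            p_peck["S+"] = min(1.0, p_peck["S+"] + lr)
        elif stimulus == "S-" and pecked:     # error: no food, extinction
            errors += 1
            p_peck["S-"] = max(0.0, p_peck["S-"] - lr)
    return errors, p_peck

print(simulate_conventional_discrimination())
```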

Terrace (1963) found that discrimination learning could occur without errors when training begins early in operant conditioning and the visual stimuli (S+ and S−) differ along several dimensions, such as brightness, duration, and wavelength. He used a fading procedure in which the brightness and duration differences between the S+ and the S− were decreased progressively, leaving only the difference in wavelength. In other words, the S+ and S− were initially presented with different brightness and duration: the S+ would appear fully red for 5 s, while the S− would appear dark for 0.5 s. Over successive presentations, the duration and brightness of the S− were gradually increased until the key light was fully green for 5 s.
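
As a minimal, hypothetical sketch of such a fading schedule (the number of steps and the increments are illustrative only; Terrace's actual schedule depended on the birds' performance), the S− can be started brief and dark and then raised in small increments until it matches the S+ in duration and brightness:

```python
def fading_schedule(steps=10, full_duration=5.0, full_brightness=1.0,
                    start_duration=0.5, start_brightness=0.0):
    """Illustrative fading schedule for the S- key.

    Early presentations are brief and dark, so the subject is unlikely
    to respond to them; duration and brightness are then raised in
    small increments until the S- is a fully bright key shown for the
    same 5 s as the S+.
    """
    schedule = []
    for i in range(steps + 1):
        frac = i / steps
        duration = start_duration + frac * (full_duration - start_duration)
        brightness = start_brightness + frac * (full_brightness - start_brightness)
        schedule.append((round(duration, 2), round(brightness, 2)))
    return schedule

for duration, brightness in fading_schedule():
    print(f"S- shown for {duration} s at brightness {brightness}")
```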

Studies of implicit memory and implicit learning from cognitive psychology and cognitive neuropsychology have provided additional theoretical support for errorless learning methods (e.g., Brooks and Baddeley, 1976; Tulving and Schacter, 1990). Implicit memory is known to be poor at eliminating errors, but can be used to compensate when explicit memory function is impaired. In experiments on amnesiac patients, errorless implicit learning was more effective because it reduced the possibility of errors "sticking" in amnesiacs' memories.[1]

Effects

The errorless learning procedure is highly effective in reducing the number of responses to the S− during training. In Terrace's (1963) experiment, subjects trained with the conventional discrimination procedure averaged over 3,000 responses to the S− (errors) during 28 sessions of training, whereas subjects trained with the errorless procedure averaged only 25 such responses in the same number of sessions.

Later, Terrace (1972) claimed not only that the errorless learning procedure improves long-term discrimination performance, but also that: 1) the S− does not become aversive and so does not elicit "aggressive" behaviors, as it often does with conventional training; 2) the S− does not develop inhibitory properties; and 3) positive behavioral contrast to the S+ does not occur. In other words, Terrace claimed that the "by-products" of conventional discrimination learning do not occur with the errorless procedure.

Limits

However, some evidence suggests that errorless learning may not be as qualitatively different from conventional training as Terrace initially claimed. For example, Rilling (1977) demonstrated in a series of experiments that these "by-products" can occur after errorless learning, but that their effects may not be as large as in the conventional procedure; and Marsh and Johnson (1968) found that subjects given errorless training were very slow to make a discrimination reversal.

Applications

Interest in basic research on errorless learning declined among psychologists after the 1970s. However, errorless learning attracted the interest of researchers in applied psychology, and studies have been conducted with both children (e.g., in educational settings) and adults (e.g., patients with Parkinson's disease). Errorless learning continues to be of practical interest to animal trainers, particularly dog trainers.[2]

Errorless learning has been found to help memory-impaired people learn more effectively.[3] The method is effective because, while people with intact memory function can remember their mistakes and learn from them, people with memory impairment may have difficulty remembering which responses were correct and may even strengthen incorrect responses over correct ones, for example through emotional stimuli. See also the reference by Brown to its application in teaching mathematics to undergraduates.

See also

B. F. Skinner
Operant conditioning
Learning
Operant conditioning chamber
Classical conditioning
Reinforcement
Experimental analysis of behavior
Animal cognition
Behaviorism
Law of effect
Animal training
Applied behavior analysis
Shaping (psychology)
Extinction (psychology)
William Kaye Estes
Stimulus control
Charles Ferster
Stimulus (psychology)
Herbert S. Terrace
Association (psychology)

References

  1. Baddeley, A. D. and Wilson, B. A. (1994). When implicit learning fails: Amnesia and the problem of error elimination. Neuropsychologia, 32(1), 53–68.
  2. Teaching Dogs the Clicker Way (PDF): http://stalecheerios.com/blog/wp-content/uploads/2011/07/Teaching-Dogs-the-Clicker-Way-JRR.pdf
  3. Wilson, B. (2009). Memory Rehabilitation: Integrating Theory and Practice. The Guilford Press. 284 pages.