Law of effect


The law of effect is a principle of psychology advanced by Edward Thorndike in 1898 on the matter of behavioral conditioning (not then formulated as such), which states that "responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation." [1]


This notion is very similar to evolutionary theory: if a certain trait provides an advantage for reproduction, that trait will persist. [2] The terms "satisfying" and "dissatisfying" appearing in the definition of the law of effect were eventually replaced by the terms "reinforcing" and "punishing" when operant conditioning became known. "Satisfying" and "dissatisfying" conditions are determined behaviorally, and they cannot be accurately predicted, because what counts as satisfying or dissatisfying differs from one animal to another. The new terms, "reinforcing" and "punishing," are used differently in psychology than they are colloquially: something that reinforces a behavior makes it more likely that that behavior will occur again, and something that punishes a behavior makes it less likely that behavior will occur again. [3]

Thorndike's law of effect refutes the ideas presented in George Romanes' book Animal Intelligence, on the grounds that anecdotal evidence is weak and typically not useful. The book claimed that animals, like humans, think things through when dealing with a new environment or situation. Thorndike instead hypothesized that animals, in order to understand their physical environment, must physically interact with it by trial and error until a successful result is obtained. This is illustrated in his cat experiment, in which a cat placed in a puzzle box eventually learns, by interacting with the environment of the box, how to escape. [4]

History

This principle, discussed early on by Lloyd Morgan, is usually associated with the connectionism of Edward Thorndike, who said that if an association is followed by a "satisfying state of affairs" it will be strengthened and if it is followed by an "annoying state of affairs" it will be weakened. [5] [6]

The modern version of the law of effect is conveyed by the notion of reinforcement as it is found in operant conditioning. The essential idea is that behavior can be modified by its consequences, as Thorndike found in his famous experiments with hungry cats in puzzle boxes. The cat was placed in a box that could be opened if the cat pressed a lever or pulled a loop. Thorndike noted the amount of time it took the cat to free itself on successive trials in the box. He discovered that during the first few trials the cat would respond in many ineffective ways, such as scratching at the door or the ceiling, finally freeing itself with the press or pull by trial-and-error. With each successive trial, it took the cat, on average, less and less time to escape. Thus, in modern terminology, the correct response was reinforced by its consequence, release from the box. [7]
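The escape-time trend Thorndike observed can be sketched with a small, purely illustrative simulation (the parameter values and the 1/p expected-attempts model below are assumptions chosen for illustration, not Thorndike's data): the satisfying consequence of escape strengthens the correct response, so the expected number of attempts per trial falls across successive trials.

```python
def simulate_puzzle_box(n_trials=20, n_responses=5, boost=1.0):
    """Illustrative law-of-effect model: one of n_responses opens the box.
    A response is chosen with probability proportional to its strength, so
    the expected number of attempts on a trial is 1/p, where p is the
    correct response's share of total strength. Each escape adds `boost`
    to the correct response (reinforcement), shortening later escapes."""
    strengths = [1.0] * n_responses   # initially, all responses equally likely
    correct = 0                       # index of the lever-press response
    expected_attempts = []
    for _ in range(n_trials):
        p = strengths[correct] / sum(strengths)
        expected_attempts.append(1.0 / p)
        strengths[correct] += boost   # the satisfying effect stamps in the response
    return expected_attempts

times = simulate_puzzle_box()
# times[0] is 5.0 (five equally likely responses); by the 20th trial it is 1.2.
```

Each trial strictly lowers the expected escape time, reproducing the decreasing learning curve described above without modeling any "insight" on the cat's part.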

Definition

Initially, the cat's responses were largely instinctual, but over time, the lever-pressing response was strengthened while the others were weakened.

The law of effect is the principle that a pleasing after-effect strengthens the action that produced it. [8]

The law of effect was published by Edward Thorndike in 1905 and states that when an S-R association is established in instrumental conditioning between the instrumental response and the contextual stimuli that are present, the response is reinforced, and the S-R association holds sole responsibility for the occurrence of that behavior. Simply put, once the stimulus and response are associated, the response is likely to occur whenever that stimulus is present, without the reinforcer needing to be present. The law holds that responses which produce a satisfying or pleasant state of affairs in a particular situation are more likely to occur again in a similar situation. Conversely, responses that produce a discomforting, annoying or unpleasant effect are less likely to occur again in the situation.

Psychologists have been interested in the factors that are important in behavior change and control since psychology emerged as a discipline. One of the first principles associated with learning and behavior was the Law of Effect, which states that behaviors that lead to satisfying outcomes are likely to be repeated, whereas behaviors that lead to undesired outcomes are less likely to recur. [9]

Thorndike's puzzle box. The graph demonstrates the general decreasing trend of the cat's response times with each successive trial.

Thorndike emphasized the importance of the situation in eliciting a response; the cat would not go about making the lever-pressing movement if it was not in the puzzle box but was merely in a place where the response had never been reinforced. The situation involves not just the cat's location but also the stimuli it is exposed to, for example hunger and the desire for freedom. The cat recognizes the inside of the box, the bars, and the lever and remembers what it needs to do to produce the correct response. This shows that learning and the law of effect are context-specific.

In an influential paper, R. J. Herrnstein (1970) [10] proposed a quantitative relationship between response rate (B) and reinforcement rate (Rf):

B = kRf / (Rf0 + Rf)

where k and Rf0 are constants. Herrnstein proposed that this formula, which he derived from the matching law he had observed in studies of concurrent schedules of reinforcement, should be regarded as a quantification of the law of effect. While the qualitative law of effect may be a tautology, this quantitative version is not.
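Herrnstein's hyperbola is straightforward to compute. In the sketch below the values of k and Rf0 are arbitrary illustrative constants, not estimates fitted to any experiment:

```python
def herrnstein_response_rate(rf, k=100.0, rf0=20.0):
    """Herrnstein's hyperbola: B = k*Rf / (Rf0 + Rf).
    k is the asymptotic response rate; Rf0 is the rate of "background"
    reinforcement from unmeasured sources. Both defaults are
    illustrative values, not fitted parameters."""
    return k * rf / (rf0 + rf)

# Response rate climbs steeply at low reinforcement rates, then
# saturates toward k: enriching an already-rich schedule adds little.
for rf in (10, 40, 160, 640):
    print(rf, round(herrnstein_response_rate(rf), 1))
```

Because Rf0 sits in the denominator alongside Rf, background reinforcement drags the response rate well below the asymptote k whenever the scheduled reinforcement rate is low.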

Example

An example is often seen in drug addiction. When a person uses a substance for the first time and receives a positive outcome, they are likely to repeat the behavior because of the reinforcing consequence. Over time, the person's nervous system also develops a tolerance to the drug, so that only an increased dose will provide the same satisfaction, making the drug dangerous for the user. [11]

Thorndike's law of effect can be compared to Darwin's theory of natural selection, in which successful organisms are more likely to prosper and survive to pass on their genes to the next generation, while the weaker, unsuccessful organisms are gradually replaced and "stamped out". It can be said that the environment selects the "fittest" behavior for a situation, stamping out any unsuccessful behaviors, in the same way it selects the "fittest" individuals of a species. In an experiment Thorndike conducted, he placed a hungry cat inside a "puzzle box", where the animal could escape and reach the food only by operating the latch of the door. At first the cats would scratch and claw in order to find a way out; then, by chance, the cat would activate the latch to open the door. On successive trials the behavior of the animal would become more habitual, to the point where the animal would operate the latch without hesitation. The occurrence of the favorable outcome, reaching the food source, strengthens the response that produces it.

Colwill and Rescorla, for example, trained rats to obtain both food pellets and liquid sucrose in separate sessions on identical variable-interval schedules. [12]

Influence

The law of effect provided the framework on which psychologist B. F. Skinner built, almost half a century later, the principles of operant conditioning, "a learning process by which the effect, or consequence, of a response influences the future rate of production of that response." [1] Skinner used an updated version of Thorndike's puzzle box, called the operant chamber or Skinner box, which has contributed immensely to our understanding of the law of effect and how it relates to operant conditioning. It allows researchers to study the behavior of small organisms in a controlled environment.

An example of Thorndike’s Law of Effect in a child’s behavior could be the child receiving praise and a star sticker for tidying up their toys. The positive reinforcement (praise and sticker) encourages the repetition of the behavior (cleaning up), illustrating the Law of Effect in action.

Related Research Articles

Edward Thorndike (1874–1949)

Edward Lee Thorndike was an American psychologist who spent nearly his entire career at Teachers College, Columbia University. His work on comparative psychology and the learning process led to the theory of connectionism and helped lay the scientific foundation for educational psychology. He also worked on solving industrial problems, such as employee exams and testing.

Operant conditioning, also called instrumental conditioning, is a learning process where behaviors are modified through the association of stimuli with reinforcement or punishment. In it, operants—behaviors that affect one's environment—are conditioned to occur or not occur depending on the environmental consequences of the behavior.

Operant conditioning chamber

An operant conditioning chamber is a laboratory apparatus used to study animal behavior. The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. The chamber can be used to study both operant conditioning and classical conditioning.

In reinforcement theory, it is argued that human behavior is a result of "contingent consequences" of human actions. The theory advances the idea that "you get what you reinforce": given the right types of reinforcers, behavior can be changed for the better, and negative behavior can be reinforced away.

Social learning is a theory of learning and social behavior which proposes that new behaviors can be acquired by observing and imitating others. It states that learning is a cognitive process that takes place in a social context and can occur purely through observation or direct instruction, even in the absence of motor reproduction or direct reinforcement. In addition to the observation of behavior, learning also occurs through the observation of rewards and punishments, a process known as vicarious reinforcement. When a particular behavior is rewarded regularly, it will most likely persist; conversely, if a particular behavior is constantly punished, it will most likely desist. The theory expands on traditional behavioral theories, in which behavior is governed solely by reinforcements, by placing emphasis on the important roles of various internal processes in the learning individual.

The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior, as opposed to hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.

Behaviorism

Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events.

Motivational salience is a cognitive process and a form of attention that motivates or propels an individual's behavior towards or away from a particular object, perceived event or outcome. Motivational salience regulates the intensity of behaviors that facilitate the attainment of a particular goal, the amount of time and energy that an individual is willing to expend to attain a particular goal, and the amount of risk that an individual is willing to accept while working to attain a particular goal.

Animal training

Animal training is the act of teaching animals specific responses to specific conditions or stimuli. Training may be for purposes such as companionship, detection, protection, and entertainment. The type of training an animal receives will vary depending on the training method used, and the purpose for training the animal. For example, a seeing eye dog will be trained to achieve a different goal than a wild animal in a circus.

Shaping (psychology)

Shaping is a conditioning paradigm used primarily in the experimental analysis of behavior. The method used is differential reinforcement of successive approximations. It was introduced by B. F. Skinner with pigeons and extended to dogs, dolphins, humans and other species. In shaping, the form of an existing response is gradually changed across successive trials towards a desired target behavior by reinforcing exact segments of behavior. Skinner's explanation of shaping was this:

We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. ... The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. ... The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.

Extinction is a behavioral phenomenon observed in both operantly conditioned and classically conditioned behavior, which manifests itself as the fading of a non-reinforced conditioned response over time. When operant behavior that has previously been reinforced no longer produces reinforcing consequences, the behavior gradually stops occurring. In classical conditioning, when a conditioned stimulus is presented alone, so that it no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually stops. For example, after Pavlov's dog was conditioned to salivate at the sound of a metronome, it eventually stopped salivating to the metronome after the metronome had been sounded repeatedly but no food came. Many anxiety disorders, such as post-traumatic stress disorder, are believed to reflect, at least in part, a failure to extinguish conditioned fear.

Avoidance response

An avoidance response is a response that prevents an aversive stimulus from occurring; it is a kind of negative reinforcement. The concept is that animals will avoid performing behaviors that result in an aversive outcome: a reaction to undesirable sensations or feedback leads the animal to avoid the behavior that is followed by the unpleasant or fear-inducing stimulus. Avoidance can involve learning through operant conditioning when it is used as a training technique.

In operant conditioning, the matching law is a quantitative relationship that holds between the relative rates of response and the relative rates of reinforcement in concurrent schedules of reinforcement. For example, if two response alternatives A and B are offered to an organism, the ratio of response rates to A and B equals the ratio of reinforcements yielded by each response. This law applies fairly well when non-human subjects are exposed to concurrent variable-interval schedules; its applicability in other situations is less clear, depending on the assumptions made and the details of the experimental situation. The generality of the matching law is a subject of current debate.
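The matching relation itself is a one-line computation; the reinforcement rates below are made-up numbers chosen only to illustrate the prediction:

```python
def matching_share(rf_a, rf_b):
    """Matching law for two alternatives: the proportion of responses
    allocated to A equals A's proportion of obtained reinforcement,
    Ba / (Ba + Bb) = Ra / (Ra + Rb)."""
    return rf_a / (rf_a + rf_b)

# If schedule A yields 60 reinforcers per hour and schedule B yields 20,
# matching predicts that 75% of responses will go to alternative A.
share_a = matching_share(60, 20)
```

This is the relation from which Herrnstein derived the quantitative law of effect given above: absolute response rate follows the hyperbola once background reinforcement is treated as a competing alternative.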

In operant conditioning, punishment is any change in a human or animal's surroundings which, occurring after a given behavior or response, reduces the likelihood of that behavior occurring again in the future. As with reinforcement, it is the behavior, not the human or animal, that is punished. Whether a change is punishing is determined by its effect on the rate at which the behavior occurs, and that effect can be altered by motivating operations (MOs), which change the effectiveness of a stimulus: abolishing operations decrease the effectiveness of a stimulus, while establishing operations increase it. For example, a painful stimulus that would act as a punisher for most people may actually reinforce some behaviors of masochistic individuals.

Comparative cognition is the comparative study of the mechanisms and origins of cognition in various species, and is sometimes seen as more general than, or similar to, comparative psychology. From a biological point of view, work is being done on the brains of fruit flies that should yield techniques precise enough to allow an understanding of the workings of the human brain at the scale of individual groups of neurons rather than the more regional scale previously used. Similarly, gene activity in the human brain is better understood through examination of the brains of mice by the Seattle-based Allen Institute for Brain Science, yielding the freely available Allen Brain Atlas. This type of study is related to comparative cognition, but better classified as one of comparative genomics. Increasing emphasis in psychology and ethology on the biological aspects of perception and behavior is bridging the gap between genomics and behavioral analysis.

Behavioral momentum is a theory in quantitative analysis of behavior and is a behavioral metaphor based on physical momentum. It describes the general relation between resistance to change and the rate of reinforcement obtained in a given situation.

In behavioral psychology, stimulus control is a phenomenon in operant conditioning that occurs when an organism behaves in one way in the presence of a given stimulus and another way in its absence. A stimulus that modifies behavior in this manner is either a discriminative stimulus (Sd) or a stimulus delta (S-delta). Stimulus-based control of behavior occurs when the presence or absence of an Sd or S-delta controls the performance of a particular behavior. For example, the presence of a stop sign (an Sd) at a traffic intersection alerts the driver to stop driving and increases the probability that "braking" behavior will occur. Such behavior is said to be emitted rather than elicited: the stimulus does not force the behavior to occur, since stimulus control is a direct result of historical reinforcement contingencies, as opposed to reflexive behavior, which is elicited through respondent conditioning.

Discrimination learning is defined in psychology as the ability to respond differently to different stimuli. This type of learning is used in studies regarding operant and classical conditioning. Operant conditioning involves the modification of a behavior by means of reinforcement or punishment; in this way, a discriminative stimulus acts as an indicator of when a behavior will persist and when it will not. Classical conditioning involves learning through association when two stimuli are paired together repeatedly; this conditioning demonstrates discrimination through specific micro-instances of reinforcement and non-reinforcement. This phenomenon is considered more advanced than generalization, yet it simultaneously acts as a basic unit of learning as a whole. The complex and fundamental nature of discrimination learning allows psychologists and researchers to perform more in-depth research that supports psychological advancements. Research on the basic principles underlying this learning style has its roots in neuropsychological sub-processes.

External inhibition is the observed decrease of the response of a conditioned reaction when an external (distracting) stimulus that was not part of the original conditioned response set is introduced. This effect was first observed in Ivan Pavlov's classical conditioning studies, where the dogs would salivate less when presented with the sound of the tuning fork in the distracting context of a passing truck. External inhibition is important for its main principle in classical conditioning, where a conditioned response may decrease in magnitude after the external stimulus is introduced. This is especially advantageous when trying to dissociate conditioned stimuli and responses. A practical example is where students who become anxious upon standing in front of the class to give a presentation may feel less anxiety if their friends were sitting in front of the presenting student. The positive association of speaking to friends may distract the student from associating speaking to the entire class with anxiety.

Association in psychology refers to a mental connection between concepts, events, or mental states that usually stems from specific experiences. Associations are seen throughout several schools of thought in psychology including behaviorism, associationism, psychoanalysis, social psychology, and structuralism. The idea stems from Plato and Aristotle, especially with regard to the succession of memories, and it was carried on by philosophers such as John Locke, David Hume, David Hartley, and James Mill. It finds its place in modern psychology in such areas as memory, learning, and the study of neural pathways.

References

  1. Gray, Peter. Psychology (6th ed.). New York: Worth. pp. 108–109.
  2. Schacter, Gilbert & Wegner (2011). Psychology (2nd ed.). New York: Worth Publishers.
  3. Mazur, J. E. (2013). "Basic Principles of Operant Conditioning". Learning and Behavior (7th ed.), pp. 101–126. Pearson.
  4. Mazur, J. E. (2013). "Basic Principles of Operant Conditioning". Learning and Behavior (7th ed.), pp. 101–126. Pearson.
  5. Thorndike, E. L. (1898, 1911). Animal Intelligence: An Experimental Study of the Associative Processes in Animals. Psychological Monographs #8.
  6. Catania, A. Charles. "Thorndike's Legacy: Learning, Selection, and the Law of Effect", pp. 425–426. University of Maryland Baltimore County.
  7. Thorndike, Edward. Connectionism. Retrieved December 10, 2010.
  8. Boring, Edwin. Science 77 (1). New York: American Association for the Advancement of Science, 2005. p. 307. Web.
  9. "Law of Effect". eNotes.com. Retrieved 2012-08-02.
  10. Herrnstein, R. J. (1970). "On the law of effect". Journal of the Experimental Analysis of Behavior, 13, 243–266.
  11. Carlson, Neil; et al. (2007). Psychology: The Science of Behaviour. New Jersey: Pearson Education Canada. p. 516.
  12. Nevin, John (1999). "Analyzing Thorndike's Law of Effect: The Question of Stimulus–Response Bonds". Journal of the Experimental Analysis of Behavior, p. 448.