The law of effect, or Thorndike's law, is a psychological principle advanced by Edward Thorndike in 1898 on the matter of behavioral conditioning (not then formulated as such). It states that "responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation." [1]
This notion is very similar to evolutionary theory: if a certain trait provides an advantage for reproduction, that trait will persist. [2] The terms "satisfying" and "dissatisfying" in the original definition of the law of effect were eventually replaced by "reinforcing" and "punishing" once operant conditioning became known. "Satisfying" and "dissatisfying" conditions are determined behaviorally, and they cannot be accurately predicted, because what counts as satisfying or dissatisfying differs from one animal to another. The newer terms, "reinforcing" and "punishing," are used differently in psychology than they are colloquially: something that reinforces a behavior makes it more likely that the behavior will occur again, and something that punishes a behavior makes it less likely to occur again. [3]
Thorndike's law of effect refutes the ideas of George Romanes' book Animal Intelligence, on the grounds that anecdotal evidence is weak and typically not useful. The book claimed that animals, like humans, think things through when dealing with a new environment or situation. Thorndike hypothesized instead that animals, to understand their physical environment, must physically interact with it by trial and error, until a successful result is obtained. This is illustrated in his cat experiment, in which a cat is placed in a puzzle box and eventually learns, by interacting with the environment of the box, how to escape. [4]
This principle, discussed early on by Lloyd Morgan, is usually associated with the connectionism of Edward Thorndike, who said that if an association is followed by a "satisfying state of affairs" it will be strengthened and if it is followed by an "annoying state of affairs" it will be weakened. [5] [6]
The modern version of the law of effect is conveyed by the notion of reinforcement as it is found in operant conditioning. The essential idea is that behavior can be modified by its consequences, as Thorndike found in his famous experiments with hungry cats in puzzle boxes. The cat was placed in a box that could be opened if the cat pressed a lever or pulled a loop. Thorndike noted the amount of time it took the cat to free itself on successive trials in the box. He discovered that during the first few trials the cat would respond in many ineffective ways, such as scratching at the door or the ceiling, finally freeing itself with the press or pull by trial-and-error. With each successive trial, it took the cat, on average, less and less time to escape. Thus, in modern terminology, the correct response was reinforced by its consequence, release from the box. [7]
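The trial-and-error learning described above can be sketched as a toy simulation. This is illustrative only; the number of candidate responses, the reinforcement increment, and the weighting scheme are arbitrary assumptions, not Thorndike's data or procedure:

```python
import random

def simulate_puzzle_box(trials=50, n_responses=5, increment=0.2, seed=1):
    """Toy model of a cat in a puzzle box: one of n_responses opens the
    door. On each trial, responses are sampled at random (weighted by
    their current strength) until the correct one occurs; the correct
    response is then reinforced by increasing its weight, per the law
    of effect."""
    rng = random.Random(seed)
    weights = [1.0] * n_responses  # equal initial tendency for each response
    correct = 0                    # index of the response that opens the door
    attempts_per_trial = []
    for _ in range(trials):
        attempts = 0
        while True:
            attempts += 1
            choice = rng.choices(range(n_responses), weights=weights)[0]
            if choice == correct:
                break
        weights[correct] += increment  # satisfying effect strengthens the response
        attempts_per_trial.append(attempts)
    return attempts_per_trial, weights
```

Over successive trials the correct response comes to dominate the weight distribution, so the number of attempts needed to escape tends to fall, mirroring the shrinking escape times Thorndike recorded.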
The law of effect is the belief that a pleasing after-effect strengthens the action that produced it. [8]
The law of effect was published by Edward Thorndike in 1905 and states that when an S-R association is established in instrumental conditioning between the instrumental response and the contextual stimuli that are present, the response is reinforced and the S-R association holds the sole responsibility for the occurrence of that behavior. Simply put, this means that once the stimulus and response are associated, the response is likely to occur without the stimulus being present. It holds that responses that produce a satisfying or pleasant state of affairs in a particular situation are more likely to occur again in a similar situation. Conversely, responses that produce a discomforting, annoying or unpleasant effect are less likely to occur again in the situation.
Psychologists have been interested in the factors that are important in behavior change and control since psychology emerged as a discipline. One of the first principles associated with learning and behavior was the Law of Effect, which states that behaviors that lead to satisfying outcomes are likely to be repeated, whereas behaviors that lead to undesired outcomes are less likely to recur. [9]
Thorndike emphasized the importance of the situation in eliciting a response; the cat would not go about making the lever-pressing movement if it was not in the puzzle box but was merely in a place where the response had never been reinforced. The situation involves not just the cat's location but also the stimuli it is exposed to, for example, the hunger and the desire for freedom. The cat recognizes the inside of the box, the bars, and the lever and remembers what it needs to do to produce the correct response. This shows that learning and the law of effect are context-specific.
In an influential paper, R. J. Herrnstein (1970) [10] proposed a quantitative relationship between response rate (B) and reinforcement rate (Rf):
B = kRf / (Rf0 + Rf)
where k and Rf0 are constants. Herrnstein proposed that this formula, which he derived from the matching law he had observed in studies of concurrent schedules of reinforcement, should be regarded as a quantification of the law of effect. While the qualitative law of effect may be a tautology, this quantitative version is not.
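Herrnstein's hyperbola is straightforward to compute. In the sketch below, the values of k and Rf0 are arbitrary illustrative constants, not fitted estimates from any experiment:

```python
def herrnstein_rate(rf, k=100.0, rf0=20.0):
    """Herrnstein's quantitative law of effect: B = k*Rf / (Rf0 + Rf).
    k is the asymptotic response rate (approached when all behavior is
    devoted to this response); rf0 is the reinforcement rate at which
    responding reaches half of k."""
    return k * rf / (rf0 + rf)

# Response rate rises hyperbolically with reinforcement rate and
# saturates near k; e.g. with k=100 and rf0=20, rf=20 gives B=50.
rates = [herrnstein_rate(rf) for rf in (0, 20, 100, 1000)]
```

The hyperbolic form captures the diminishing returns of added reinforcement: each additional reinforcer produces a smaller increase in responding as B approaches the asymptote k.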
An example is often seen in drug addiction. When a person uses a substance for the first time and receives a positive outcome, they are likely to repeat the behavior because of the reinforcing consequence. Over time, the person's nervous system develops a tolerance to the drug, so that only an increased dosage will provide the same satisfaction, making the drug dangerous for the user. [11]
Thorndike's Law of Effect can be compared to Darwin's theory of natural selection, in which successful organisms are more likely to prosper and survive to pass on their genes to the next generation, while weaker, unsuccessful organisms are gradually replaced and "stamped out". It can be said that the environment selects the "fittest" behavior for a situation, stamping out any unsuccessful behaviors, in the same way it selects the "fittest" individuals of a species. In an experiment that Thorndike conducted, he placed a hungry cat inside a "puzzle box", where the animal could escape and reach the food only once it could operate the latch of the door. At first the cat would scratch and claw in order to find a way out; then, by chance, it would activate the latch and open the door. On successive trials the behavior would become more habitual, to the point where the animal would operate the latch without hesitation. The occurrence of the favorable outcome, reaching the food source, strengthens the response that produced it.
For example, Colwill and Rescorla had all rats obtain food pellets and liquid sucrose in consistent sessions on identical variable-interval schedules. [12]
The law of effect provided a framework for the work of psychologist B. F. Skinner almost half a century later on the principles of operant conditioning, "a learning process by which the effect, or consequence, of a response influences the future rate of production of that response." [1] Skinner would later use an updated version of Thorndike's puzzle box, called the operant chamber, or Skinner box, which has contributed immensely to our understanding of the law of effect and how it relates to operant conditioning. It allows a researcher to study the behavior of small organisms in a controlled environment.
An example of Thorndike’s Law of Effect in a child’s behavior could be the child receiving praise and a star sticker for tidying up their toys. The positive reinforcement (praise and sticker) encourages the repetition of the behavior (cleaning up), illustrating the Law of Effect in action.
Burrhus Frederic Skinner was an American psychologist, behaviorist, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.
Edward Lee Thorndike was an American psychologist who spent nearly his entire career at Teachers College, Columbia University. His work on comparative psychology and the learning process led to the theory of connectionism and helped lay the scientific foundation for educational psychology. He also worked on solving industrial problems, such as employee exams and testing.
Operant conditioning, also called instrumental conditioning, is a learning process where voluntary behaviors are modified by association with the addition of reward or aversive stimuli. The frequency or duration of the behavior may increase through reinforcement or decrease through punishment or extinction.
An operant conditioning chamber is a laboratory apparatus used to study animal behavior. The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. The chamber can be used to study both operant conditioning and classical conditioning.
Classical conditioning is a behavioral procedure in which a biologically potent stimulus is paired with a neutral stimulus. The term also refers to the learning process that results from this pairing, through which the neutral stimulus comes to elicit an automatic, conditioned response.
In behavioral psychology, reinforcement refers to consequences that increase the likelihood of an organism's future behavior, typically in the presence of a particular antecedent stimulus. For example, a rat can be trained to push a lever to receive food whenever a light is turned on. In this example, the light is the antecedent stimulus, the lever pushing is the operant behavior, and the food is the reinforcer. Likewise, a student who receives attention and praise when answering a teacher's question will be more likely to answer future questions in class. The teacher's question is the antecedent, the student's response is the behavior, and the praise and attention are the reinforcements.
Radical behaviorism is a "philosophy of the science of behavior" developed by B. F. Skinner. It refers to the philosophy behind behavior analysis, and is to be distinguished from methodological behaviorism—which has an intense emphasis on observable behaviors—by its inclusion of thinking, feeling, and other private events in the analysis of human and animal psychology. The research in behavior analysis is called the experimental analysis of behavior and the application of the field is called applied behavior analysis (ABA), which was originally termed "behavior modification."
The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior, as opposed to hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events. The cognitive revolution of the late 20th century largely replaced behaviorism as an explanatory theory with cognitive psychology, which unlike behaviorism examines internal mental states.
Animal training is the act of teaching animals specific responses to specific conditions or stimuli. Training may be for purposes such as companionship, detection, protection, and entertainment. The type of training an animal receives will vary depending on the training method used, and the purpose for training the animal. For example, a seeing eye dog will be trained to achieve a different goal than a wild animal in a circus.
Shaping is a conditioning paradigm used primarily in the experimental analysis of behavior. The method used is differential reinforcement of successive approximations. It was introduced by B. F. Skinner with pigeons and extended to dogs, dolphins, humans and other species. In shaping, the form of an existing response is gradually changed across successive trials towards a desired target behavior by reinforcing exact segments of behavior. Skinner's explanation of shaping was this:
We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. ... The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. ... The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.
Extinction is a behavioral phenomenon observed in both operantly conditioned and classically conditioned behavior, which manifests itself in the fading of a non-reinforced conditioned response over time. When operant behavior that has been previously reinforced no longer produces reinforcing consequences, the behavior gradually stops occurring. In classical conditioning, when a conditioned stimulus is presented alone, so that it no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually stops. For example, after Pavlov's dog was conditioned to salivate at the sound of a metronome, it eventually stopped salivating to the metronome after the metronome had been sounded repeatedly but no food came. Many anxiety disorders, such as post-traumatic stress disorder, are believed to reflect, at least in part, a failure to extinguish conditioned fear.
An avoidance response is a response that prevents an aversive stimulus from occurring. It is a kind of negative reinforcement. An avoidance response is a behavior based on the concept that animals will avoid performing behaviors that result in an aversive outcome. This can involve learning through operant conditioning when it is used as a training technique. It is a reaction to undesirable sensations or feedback that leads to avoiding the behavior that is followed by this unpleasant or fear-inducing stimulus.
In operant conditioning, the matching law is a quantitative relationship that holds between the relative rates of response and the relative rates of reinforcement in concurrent schedules of reinforcement. For example, if two response alternatives A and B are offered to an organism, the ratio of response rates to A and B equals the ratio of reinforcements yielded by each response. This law applies fairly well when non-human subjects are exposed to concurrent variable interval schedules; its applicability in other situations is less clear, depending on the assumptions made and the details of the experimental situation. The generality of applicability of the matching law is subject of current debate.
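Strict matching for two alternatives can be expressed directly. The sketch below uses illustrative reinforcement rates (assumed values, not data from any study) to compute the predicted share of responding:

```python
def matching_proportion(r_a, r_b):
    """Strict matching law for two concurrent alternatives A and B:
    B_a / (B_a + B_b) = R_a / (R_a + R_b), where B is response rate
    and R is the obtained reinforcement rate. Returns the predicted
    proportion of responses allocated to A."""
    return r_a / (r_a + r_b)

# If A yields 30 reinforcers per hour and B yields 10, strict matching
# predicts 75% of responses go to A:
share_a = matching_proportion(30, 10)  # 0.75
```

Note the connection to Herrnstein's hyperbola: treating all reinforcement other than Rf as a constant background rate Rf0 turns the two-alternative matching ratio into the single-response equation B = kRf / (Rf0 + Rf).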
In operant conditioning, punishment is any change in a human or animal's surroundings which, occurring after a given behavior or response, reduces the likelihood of that behavior occurring again in the future. As with reinforcement, it is the behavior, not the human or animal, that is punished. Whether a change is punishing is determined by its effect on the rate at which the behavior occurs, which in turn depends on motivating operations (MO): events that alter the effectiveness of a stimulus. MOs can be categorized as abolishing operations, which decrease the effectiveness of a stimulus, and establishing operations, which increase it. For example, a painful stimulus that would act as a punisher for most people may actually reinforce some behaviors of masochistic individuals.
Comparative cognition is the comparative study of the mechanisms and origins of cognition in various species, and is sometimes seen as more general than, or similar to, comparative psychology. From a biological point of view, work is being done on the brains of fruit flies that should yield techniques precise enough to allow an understanding of the workings of the human brain on a scale appreciative of individual groups of neurons rather than the more regional scale previously used. Similarly, gene activity in the human brain is better understood through examination of the brains of mice by the Seattle-based Allen Institute for Brain Science, yielding the freely available Allen Brain Atlas. This type of study is related to comparative cognition, but better classified as one of comparative genomics. Increasing emphasis in psychology and ethology on the biological aspects of perception and behavior is bridging the gap between genomics and behavioral analysis.
Behavioral momentum is a theory in quantitative analysis of behavior and is a behavioral metaphor based on physical momentum. It describes the general relation between resistance to change and the rate of reinforcement obtained in a given situation.
In behavioral psychology, stimulus control is a phenomenon in operant conditioning that occurs when an organism behaves in one way in the presence of a given stimulus and another way in its absence. A stimulus that modifies behavior in this manner is either a discriminative stimulus or stimulus delta. For example, the presence of a stop sign at a traffic intersection alerts the driver to stop driving and increases the probability that braking behavior occurs. Stimulus control does not force behavior to occur, as it is a direct result of historical reinforcement contingencies, as opposed to reflexive behavior elicited through classical conditioning.
Discrimination learning is defined in psychology as the ability to respond differently to different stimuli. This type of learning is used in studies of operant and classical conditioning. Operant conditioning involves the modification of a behavior by means of reinforcement or punishment; a discriminative stimulus acts as an indicator of when a behavior will be reinforced and when it will not. Classical conditioning involves learning through association when two stimuli are paired together repeatedly; it demonstrates discrimination through specific instances of reinforcement and non-reinforcement. This phenomenon is considered more advanced than learning styles such as generalization, yet it simultaneously acts as a basic unit of learning as a whole. The complex and fundamental nature of discrimination learning allows psychologists and researchers to perform more in-depth research that supports psychological advancements. Research on the basic principles underlying this learning style has its roots in neuropsychology sub-processes.
Association in psychology refers to a mental connection between concepts, events, or mental states that usually stems from specific experiences. Associations are seen throughout several schools of thought in psychology including behaviorism, associationism, psychoanalysis, social psychology, and structuralism. The idea stems from Plato and Aristotle, especially with regard to the succession of memories, and it was carried on by philosophers such as John Locke, David Hume, David Hartley, and James Mill. It finds its place in modern psychology in such areas as memory, learning, and the study of neural pathways.