In psychology, a social trap is a situation in which a group of people act to obtain short-term individual gains, which in the long run leads to a loss for the group as a whole. Examples of social traps include overfishing, energy "brownout" and "blackout" power outages during periods of extreme temperatures, the overgrazing of cattle in the Sahel, and the destruction of the rainforest by logging interests and agriculture.
The term social trap was first introduced to the scientific community by John Platt's 1973 paper in American Psychologist, and in a book developed from an interdisciplinary symposium held at the University of Michigan. Building upon the concept of the "tragedy of the commons" in Garrett Hardin's pivotal article in Science (1968), Platt and others in the symposium applied behavioral psychology concepts to the actions of people operating in social traps. By applying the findings of basic research on "schedules of operant reinforcement" (B. F. Skinner 1938, 1948, 1953, 1957; Keller and Schoenfeld, 1950), Platt recognized that individuals operating for short-term positive gain ("reinforcement") have a tendency to over-exploit a resource, which leads to a long-term overall loss to society.
The application of behavioral psychology terms to behaviors in the tragedy of the commons led to the realization that the same short-term/long-term cause-effect relationship also applied to other human traps, in addition to the exploitation of commonly held resources. Platt et al. also introduced the terms social fence and individual trap. A social fence refers to a short-term avoidance behavior by individuals that leads to a long-term loss to the entire group. An example is the anecdote of a mattress that falls from a vehicle on a two-lane highway. Motorists tend to back up in a traffic jam behind the mattress, waiting for a break in the oncoming traffic to pass around it. Each individual motorist avoids the opportunity to exit their stopped car and pull the mattress to the side of the road. The long-term consequence of this avoidance behavior is that all of the motorists (except perhaps one) arrive at their destinations later than they would have if an individual had removed the mattress barrier.
An individual trap is similar to a social trap except that it involves the behavior of only a single person rather than a group of people. The basic concept is that an individual's behavior for short-term reinforcers leads to a long-term loss for that individual. Examples of individual traps include tobacco smoking leading to lung cancer and alcohol ingestion leading to cirrhosis of the liver.
The first empirical test of the concept of social traps was by Brechner at Arizona State University, who operationalized the concepts underlying Platt et al.'s theoretical analysis. By creating a laboratory game, Brechner had groups of college students play a game in which they could accumulate points by pressing buttons for the individual short-term positive reward of experimental credit in their introductory psychology classes. Players could see a lighted display that indicated the total quantity of points available at any given time in the experiment. Players were told that if they completely drained the pool of points, the game was over and they could not accumulate more points. By responding for points at a moderate rate, all the players in the group could accumulate enough points to fulfill their entire semester's experimental requirements. But if one or more players took points for themselves at too fast a rate, the pool would be drained of points and none of the players would achieve the maximum potential experimental credit.
In building the laboratory analogy of social traps, Brechner introduced the concept of "superimposed schedules of reinforcement". Ferster and Skinner (1957) had demonstrated that reinforcers could be delivered on schedules (schedules of reinforcement), and further that organisms behaved differently under different schedules. Rather than a reinforcer, such as food or water, being delivered every time as a consequence of some behavior, a reinforcer could be delivered after more than one instance of the behavior. For example, a pigeon may be required to peck a button switch five times before food is made available to it. This is called a "ratio schedule". Alternatively, a reinforcer could be delivered after an interval of time has passed following a target behavior; for example, a rat is given a food pellet one minute after it pressed a lever. This is called an "interval schedule". In addition, ratio schedules can deliver reinforcement following a fixed or variable number of behaviors by the individual organism. Likewise, interval schedules can deliver reinforcement following fixed or variable intervals of time following a single response by the organism. Individual behaviors tend to generate response rates that differ based upon how the reinforcement schedule is created. Much subsequent research in many labs examined the effects of scheduling reinforcers on behavior.
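The distinction between ratio and interval schedules can be sketched in a few lines of code. This is an illustrative sketch only, with hypothetical helper names; it is not drawn from any of the cited studies.

```python
# Minimal sketch of two basic reinforcement schedules.
# Variable-ratio and variable-interval schedules would randomize the
# requirement instead of fixing it.

def fixed_ratio(n):
    """Reinforce every n-th response (e.g. a pigeon pecking 5 times)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count >= n:          # n responses accumulated
            count = 0
            return True         # deliver the reinforcer
        return False
    return respond

def fixed_interval(seconds):
    """Reinforce the first response after `seconds` have elapsed."""
    last = 0.0
    def respond(now):
        nonlocal last
        if now - last >= seconds:
            last = now
            return True
        return False
    return respond

# A pigeon on a fixed-ratio 5 schedule: only every 5th peck pays off.
fr5 = fixed_ratio(5)
outcomes = [fr5() for _ in range(10)]
# outcomes -> [False, False, False, False, True,
#              False, False, False, False, True]
```

The differing patterns of responding that each schedule generates (e.g. the post-reinforcement pause under fixed-ratio schedules) are what Ferster and Skinner documented experimentally.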
When an organism is offered the opportunity to choose among two or more simple schedules of reinforcement at the same time, the reinforcement structures are called "concurrent schedules of reinforcement". In creating the laboratory analogy of social traps, Brechner created a situation where simple reinforcement schedules were superimposed upon each other. In other words, a single response or group of responses by an organism led to multiple consequences. Concurrent schedules of reinforcement can be thought of as "or" schedules, and superimposed schedules of reinforcement can be thought of as "and" schedules.
To simulate social traps, a short-term positive reward is superimposed upon a long-term negative consequence. In the specific experiment, the short-term positive reinforcer was earning points that applied to class credits. The long-term negative consequence was that each point earned by a player also drained the pool of available points. Responding too rapidly for short-term gains led to the long-term loss of draining the resource pool. What makes the trap social is that the long-term consequence of any individual's responding also comes to bear on the other individuals in the environment.
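The superimposed-schedule structure described above can be sketched as a simple simulation: each response yields an individual point (the short-term reinforcer) and simultaneously drains the shared pool (the long-term consequence). The parameters below are illustrative, not those of Brechner's experiment.

```python
# Hypothetical sketch of a social trap as a superimposed schedule:
# one response produces both an individual gain AND a collective loss.

def play(pool, players, rate, replenish, rounds):
    """Each round, every player takes `rate` points from the pool;
    the pool replenishes by `replenish` points unless it is drained."""
    scores = [0] * players
    for _ in range(rounds):
        for i in range(players):
            take = min(rate, pool)
            scores[i] += take      # short-term individual gain
            pool -= take           # long-term collective loss
        if pool == 0:
            break                  # trap sprung: game over for everyone
        pool += replenish
    return scores, pool

# Moderate responding sustains the pool; greedy responding drains it,
# leaving every player with less in total.
moderate, pool_m = play(pool=100, players=4, rate=2, replenish=8, rounds=20)
greedy, pool_g = play(pool=100, players=4, rate=10, replenish=8, rounds=20)
```

Under the moderate rate the group harvests indefinitely; under the greedy rate the pool hits zero after three rounds and total earnings are lower for everyone, which is the defining signature of the trap.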
Superimposed schedules of reinforcement have many real-world applications in addition to generating social traps (Brechner and Linder, 1981; Brechner, 1987; Brechner, 2010). Many different human individual and social situations can be created by superimposing simple reinforcement schedules. For example, a human being could have simultaneous tobacco and alcohol addictions. Even more complex situations can be created or simulated by superimposing two or more concurrent schedules. For example, a high school senior could have a choice between going to Stanford University or UCLA, and at the same time have the choice of going into the Army or the Air Force, and simultaneously the choice of taking a job with an internet company or a job with a software company. That would be a reinforcement structure of three superimposed concurrent schedules of reinforcement. An example of the use of superimposed schedules as a tool in the analysis of the contingencies of rent control can be found online on the website "Economic and Game Theory Forum" (Brechner, 2003).
Subsequent empirical studies by other researchers explored aspects of social traps other than the underlying reinforcement structure. Studies tended to concentrate on manipulating social and cognitive variables. Cass and Edney (1978) created a simpler game using a bowl of nuts to simulate a commonly held resource. The Nuts Game, as they called it, had some distinct advantages over Brechner's electronically wired laboratory simulation. The Nuts Game could be transported easily to any environment in or out of the laboratory. It was simple and required no electronics. The reinforcers used were primary food rewards rather than the secondary conditioned reinforcers of class credit used in the earlier study.
From Platt's and others' initial concept, social trap research has spread to laboratories all over the world and has expanded into the fields of sociology, economics, institutional design, and the nuclear arms race. Summaries of the many other diverse studies of social traps can be found in Messick and McClelland (1983), Costanza (1984), Komorita and Parks (1996), and Rothstein (2005).
Social trap research continues to be an active area. Urlacher (2008) devised an iterated version of the prisoner's dilemma game using groups of people, or "agents", pitted against other groups of agents, in a variation he termed a "two-level social trap". He reported that when using a democratic decision rule, larger groups behaved more cooperatively than smaller groups. Chuang, Rivoire, and Leibler (2009) constructed a non-mammalian commons dilemma using colonies of the bacterium Escherichia coli, composed of strains of producer and nonproducer microbes that contribute (or do not contribute) to the common resource, in an examination of the statistical concept of Simpson's paradox.
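Simpson's paradox in this producer/nonproducer setting can be shown with a small numeric sketch. The numbers below are hypothetical, chosen only to illustrate the structure (they are not data from Chuang et al.): the producer fraction falls within every colony, because nonproducers outgrow producers locally, yet rises in the pooled population, because producer-rich colonies grow more.

```python
from fractions import Fraction

# Hypothetical counts: (initial producers, initial nonproducers,
#                       final producers, final nonproducers)
colony_a = (9, 1, 85, 15)   # 90% producers -> colony grows 10-fold
colony_b = (1, 9, 1, 19)    # 10% producers -> colony grows 2-fold

def frac(producers, nonproducers):
    """Exact producer fraction of a (sub)population."""
    return Fraction(producers, producers + nonproducers)

# Within each colony the producer fraction drops...
assert frac(9, 1) > frac(85, 15)    # 9/10 -> 85/100
assert frac(1, 9) > frac(1, 19)     # 1/10 -> 1/20
# ...but the pooled producer fraction rises: Simpson's paradox.
assert frac(9 + 1, 1 + 9) < frac(85 + 1, 15 + 19)   # 1/2 -> 86/120
```

The asymmetric growth of the two colonies is what reverses the trend on aggregation, mirroring how producers of a common good can increase overall despite losing ground in every subpopulation.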
In 2010, Shaimaa Lazem and Denis Gračanin, in the Department of Computer Science at Virginia Tech, took social traps to a new level: into cyberspace. They performed a replication of the original social trap experiment, but created the trap in the internet virtual world known as Second Life (Lazem and Gračanin, 2010). They constructed a virtual experimental laboratory with the subjects responding through avatars. The findings mirrored those of the original study: the ability to communicate led to greater replenishment of the common resource.
Burrhus Frederic Skinner was an American psychologist, behaviorist, author, inventor, and social philosopher. He was a professor of psychology at Harvard University from 1958 until his retirement in 1974.
Operant conditioning is a type of associative learning process through which the strength of a behavior is modified by reinforcement or punishment. It is also a procedure that is used to bring about such learning.
In behavioral psychology, reinforcement is a consequence that strengthens an organism's future behavior whenever that behavior is preceded by a specific antecedent stimulus. This strengthening effect may be measured as a higher frequency of behavior, longer duration, greater magnitude, or shorter latency. There are two types of reinforcement: positive reinforcement, whereby a rewarding stimulus is presented following the desired behavior, and negative reinforcement, whereby an undesirable element in the person's environment is taken away whenever the desired behavior occurs. Rewarding stimuli, which are associated with "wanting" and "liking" and appetitive behavior, function as positive reinforcers; the converse is also true: positive reinforcers provide a desirable stimulus. Reinforcement does not require an individual to consciously perceive an effect elicited by the stimulus; it occurs only if there is an observable strengthening in behavior. For example, changing from a labourer's job to an office position might serve as a negative reinforcer for someone who suffers from back problems.
Radical behaviorism was pioneered by B. F. Skinner and is his "philosophy of the science of behavior." It refers to the philosophy behind behavior analysis, and is to be distinguished from methodological behaviorism—which has an intense emphasis on observable behaviors—by its inclusion of thinking, feeling, and other private events in the analysis of human and animal psychology. The research in behavior analysis is called the experimental analysis of behavior and the application of this field is called applied behavior analysis (ABA), which was originally termed "behavior modification."
The experimental analysis of behavior is a school of thought in psychology founded on B. F. Skinner's philosophy of radical behaviorism, and it defines the basic principles used in applied behavior analysis. A central principle was the inductive, data-driven examination of functional relations, as opposed to the kinds of hypothetico-deductive learning theory that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior which could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
Behaviorism is a systematic approach to understanding the behavior of humans and other animals. It assumes that behavior is either a reflex evoked by the pairing of certain antecedent stimuli in the environment, or a consequence of that individual's history, including especially reinforcement and punishment contingencies, together with the individual's current motivational state and controlling stimuli. Although behaviorists generally accept the important role of heredity in determining behavior, they focus primarily on environmental events.
Applied behavior analysis (ABA), also called behavioral engineering, is a scientific technique concerned with applying empirical approaches based upon the principles of respondent and operant conditioning to change behavior of social significance. It is the applied form of behavior analysis; the other two forms are radical behaviorism and the experimental analysis of behavior.
Behavior modification refers to behavior-change procedures that were employed during the 1970s and early 1980s. Based on methodological behaviorism, overt behavior was modified with presumed consequences, including artificial positive and negative reinforcement contingencies to increase desirable behavior, or administering positive and negative punishment and/or extinction to reduce problematic behavior. For the treatment of phobias, habituation and punishment were the basic principles used in flooding, a subcategory of desensitization.
Countercontrol is a term first used by B. F. Skinner in 1953 as a functional class in the analysis of social behavior. Opposition or resistance to intervention defines countercontrol; however, little systematic research has been conducted to document its occurrence. Skinner also distinguished it from the literature of freedom, which he said did not provide effective countercontrol strategies. The concept was identified as a mechanism to oppose control, such as escape from the controller or waging an attack in order to weaken or destroy the controlling power. For this purpose, Skinner stressed the role of the individual as an instrument of countercontrol, emphasizing the notion of vigilance along with the concepts of freedom and dignity.
A token economy is a system of contingency management based on the systematic reinforcement of target behavior. The reinforcers are symbols or tokens that can be exchanged for other reinforcers. A token economy is based on the principles of operant conditioning and behavioral economics and can be situated within applied behavior analysis. In applied settings token economies are used with children and adults; however, they have been successfully modeled with pigeons in lab settings.
In operant conditioning, the matching law is a quantitative relationship that holds between the relative rates of response and the relative rates of reinforcement in concurrent schedules of reinforcement. For example, if two response alternatives A and B are offered to an organism, the ratio of response rates to A and B equals the ratio of reinforcements yielded by each response. This law applies fairly well when non-human subjects are exposed to concurrent variable interval schedules; its applicability in other situations is less clear, depending on the assumptions made and the details of the experimental situation. The generality of the matching law is a subject of current debate.
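The relationship can be written as R1/(R1+R2) = r1/(r1+r2), where R denotes response rates and r reinforcement rates. A minimal sketch, with illustrative values and a hypothetical helper name, makes the allocation concrete:

```python
# Strict matching (no bias or sensitivity parameters): responses are
# allocated between alternatives A and B in proportion to the
# reinforcement rates r1 and r2 each alternative yields.

def predicted_responses(total_responses, r1, r2):
    """Split `total_responses` between A and B by relative
    reinforcement rate, per the strict matching law."""
    share_a = r1 / (r1 + r2)
    return total_responses * share_a, total_responses * (1 - share_a)

# If A yields 30 reinforcers/hour and B yields 10, strict matching
# predicts 3/4 of responding goes to A.
a, b = predicted_responses(1000, r1=30, r2=10)
# a / b == 3.0, matching the 30:10 reinforcement ratio
```

Empirically, deviations from this strict form (undermatching, bias) are common, which is why the generalized matching law adds fitted exponent and bias parameters.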
Behavioral momentum is a theory in quantitative analysis of behavior and is a behavioral metaphor based on physical momentum. It describes the general relation between resistance to change and the rate of reinforcement obtained in a given situation.
Melioration theory in psychology is a theoretical algorithm that predicts the matching law. Melioration theory is used as an explanation for why an organism makes choices based on the rewards or reinforcers it receives. The principle of melioration states that animals will invest increasing amounts of time and/or effort into whichever alternative is better. To meliorate essentially means to "make better".
In behavioral psychology, stimulus control is a phenomenon in operant conditioning that occurs when an organism behaves in one way in the presence of a given stimulus and another way in its absence. A stimulus that modifies behavior in this manner is either a discriminative stimulus (Sd) or a stimulus delta (S-delta). Stimulus-based control of behavior occurs when the presence or absence of an Sd or S-delta controls the performance of a particular behavior. For example, the presence of a stop sign (Sd) at a traffic intersection alerts the driver to stop driving and increases the probability that "braking" behavior will occur. Such behavior is said to be emitted because the stimulus does not force the behavior to occur; stimulus control is a direct result of historical reinforcement contingencies, as opposed to reflexive behavior, which is said to be elicited through respondent conditioning.
Charles Bohris Ferster was an American behavioral psychologist. A pioneer of applied behavior analysis, he developed errorless learning and was a colleague of B.F. Skinner's at Harvard University, co-authoring the book Schedules of Reinforcement (1957).
Fred Simmons Keller was an American psychologist and a pioneer in experimental psychology. He taught at Columbia University for 26 years and gave his name to the Keller Plan, also known as the Personalized System of Instruction, an individually paced, mastery-oriented teaching method that has had a significant impact on college-level science education. He died at home, at age 97, on February 2, 1996, in Chapel Hill, North Carolina.
Behavior management is similar to behavior modification. It is a less intensive version of behavior therapy. In behavior modification, the focus is on changing behavior, while in behavior management the focus is on maintaining order. Behavior management skills are of particular importance to teachers in the educational system. Behavior management includes all of the actions and conscious inactions to enhance the probability people, individually and in groups, choose behaviors which are personally fulfilling, productive, and socially acceptable. Behavior management can be accomplished through modeling, rewards or punishment.
The behavioral analysis of child development originates from John B. Watson's behaviorism.
Motivating operation (MO) is a behavioristic concept introduced by Jack Michael in 1982. It is used to explain variations in the effectiveness of the consequences of behavior. Most importantly, an MO affects how strongly the person is reinforced or punished by the consequences of their behavior. For example, food deprivation is a motivating operation; if a person is hungry, food is strongly reinforcing, but if a person is satiated, food is less reinforcing. In 2003, Laraway suggested subdividing MOs into those that increase the reinforcing or punishing effects of a stimulus, which are termed establishing operations, and those that decrease the reinforcing or punishing effects of a stimulus, which are termed abolishing operations.
James “Jim” A. Dinsmoor was an influential experimental psychologist who published work in the field of the experimental analysis of behavior. He was born October 4, 1921, in Woburn, Massachusetts to Daniel and Jean Dinsmoor. He graduated with his bachelor's degree from Dartmouth College in 1943. Subsequently, he attended Columbia University in New York City, where he received his Master's and Ph.D. degrees under the mentorship of William N. Schoenfeld and Fred S. Keller. There, he was introduced to the work of B.F. Skinner, whose behavior analytic research inspired Dinsmoor to pursue a lifetime of research in conditioned responding.