Hidden profile

A hidden profile is a paradigm used in the study of group decision making. It describes a situation in which some information is shared among group members (i.e., all members possess it prior to discussion), while other pieces of information are unshared (i.e., known to only one member prior to discussion).[1] Shared and unshared information have different decisional implications, and the alternative implied by the unshared information is the correct one given all of the information available to the group.[2] However, no group member can identify this best alternative on the basis of his or her individual information prior to discussion; it can be found only by pooling the unshared information during group discussion.[3] Hidden profiles are one of many topics studied in social psychology.

History

In 1981, researchers Garold Stasser and William Titus set out to challenge strongly held beliefs about group decision making. They applied a number of formal models to identify what happens when group members are not fully informed.[1]:304 One of these was persuasive arguments theory (PAT), which holds that polarization of judgments results from culturally pooled arguments: discussion bolsters popularly held beliefs and arguments while neglecting unshared information. Novel arguments, on the other hand, do produce change, but PAT assumes they are consistent with the prevailing norm. Stasser and Titus posited that this is not always the case. By providing group members with selective, unique information, they felt they could separate informational influence from normative influence. Though their results ultimately pointed the other way, they had devised the hidden profile paradigm.[1]:306

Causes

Hidden profiles occur when some information is shared among group members while other pieces of information remain unshared. The phenomenon tends to co-occur and interact with the shared information bias to produce poor decisions. Although each person in a group holds unique knowledge, members are inclined to discuss information that is already shared, motivated by the desire to reach consensus. This, however, does not yield the optimal choice: initial decision preferences differ from what is ideal, and only by integrating the unique knowledge of individual members can the optimal decision be reached.[3]

Stewart and Stasser (1998) state that the shared information bias, and in turn hidden profiles, occur more often on judgment-based tasks because the ultimate goal is to reach consensus.[4] Consistent with this, a meta-analysis of information sharing in teams by Mesmer-Magnus and DeChurch (2009) found that information sharing most affected intellective tasks.[5] These findings supported Laughlin's (1980, 1996) claim that intellective tasks elicit more information sharing. That is, tasks featuring a hidden profile were less affected by the shared information bias when they were intellective (i.e., aiming at a correct answer) rather than judgmental (i.e., aiming at consensus).

One issue with hidden profiles is how discussion time is allocated. Shared and unshared information differ markedly in the amount of time each receives during group discussion:[5] shared information is discussed far more often than novel, unshared information. This unbalanced approach to group decision making does not yield optimal decisions, and the effect is even more pronounced for judgmental tasks, as discussed above.[5] Consequently, although an optimal decision is attainable, it will likely not be realized; in hidden profile situations, groups rarely discover the alternative, likely optimal, decision.[6]

Affect may also play a role. The unbalanced time devoted to shared information may stem from the discomfort of contributing novel, unique information in a group setting: discussing shared information can enhance others' evaluations of a person, while contributing unique information may impair them.[7] Affect also influences how knowledge is received. A series of studies found that knowledge transfer of novel ideas is greater for receivers with a positive affect than for those with a negative affect; this effect lies with the receiver of the information, not the sender. However, knowledge transfer is greater in affect-congruent pairs (negative–negative, positive–positive) than in affect-incongruent pairs.[8]

References

  1. Stasser, G.; Titus, W. (2003). "Hidden profiles: A brief history". Psychological Inquiry. 14 (3–4): 304–313. doi:10.1207/s15327965pli1403&4_21.
  2. Schulz-Hardt, S.; Brodbeck, F.; Mojzisch, A.; Kerschreiter, R.; Frey, D. (2006). "Group decision making in hidden profile situations: Dissent as a facilitator for decision quality". Journal of Personality and Social Psychology. 91 (6): 1080–1093. doi:10.1037/0022-3514.91.6.1080. PMID 17144766.
  3. Stasser, G.; Titus, W. (1985). "Pooling of unshared information in group decision making: Biased information sampling during discussion". Journal of Personality and Social Psychology. 48 (6): 1467–1478. doi:10.1037/0022-3514.48.6.1467.
  4. Stewart, D. D.; Stasser, G. (1998). "The sampling of critical, unshared information in decision-making groups: The role of an informed minority". European Journal of Social Psychology. 28 (1): 95–113. doi:10.1002/(sici)1099-0992(199801/02)28:1<95::aid-ejsp847>3.0.co;2-0.
  5. Mesmer-Magnus, J.; DeChurch, L. (2009). "Information sharing and team performance: A meta-analysis". Journal of Applied Psychology. 94 (2): 535–546. doi:10.1037/a0013773. PMID 19271807.
  6. Fiedler, K.; Juslin, P. (Eds.). (2006). Information Sampling and Adaptive Cognition. New York, NY: Cambridge University Press.
  7. Wittenbaum, G. M.; Hubbell, A. P.; Zuckerman, C. (1999). "Mutual enhancement: Toward an understanding of the collective preference for shared information". Journal of Personality and Social Psychology. 77 (5): 967–978. doi:10.1037/0022-3514.77.5.967.
  8. Levin, D.; Kurtzberg, T.; Phillips, K.; Lount Jr., R. (2010). "The role of affect in knowledge transfer". Group Dynamics: Theory, Research, and Practice. 14 (2): 123–142. doi:10.1037/a0017317.