Problem solving

Problem solving is the process of achieving a goal by overcoming obstacles, a frequent part of most activities. Problems in need of solutions range from simple personal tasks (e.g. how to turn on an appliance) to complex issues in business and technical fields. The former is an example of simple problem solving (SPS) addressing one issue, whereas the latter is complex problem solving (CPS) with multiple interrelated obstacles. [1] Another classification of problem-solving tasks is into well-defined problems with specific obstacles and goals, and ill-defined problems in which the current situation is troublesome but it is not clear what kind of resolution to aim for. [2] Similarly, one may distinguish formal or fact-based problems requiring psychometric intelligence, versus socio-emotional problems which depend on the changeable emotions of individuals or groups, such as tactful behavior, fashion, or gift choices. [3]

Solutions require sufficient resources and knowledge to attain the goal. Professionals such as lawyers, doctors, programmers, and consultants are largely problem solvers for issues that require technical skills and knowledge beyond general competence. Many businesses have found profitable markets by recognizing a problem and creating a solution: the more widespread and inconvenient the problem, the greater the opportunity to develop a scalable solution.

There are many specialized problem-solving techniques and methods in fields such as engineering, business, medicine, mathematics, computer science, philosophy, and social organization. The mental techniques to identify, analyze, and solve problems are studied in psychology and cognitive sciences. Also widely researched are the mental obstacles that prevent people from finding solutions; problem-solving impediments include confirmation bias, mental set, and functional fixedness.

Definition

The term problem solving has a slightly different meaning depending on the discipline. For instance, it is a mental process in psychology and a computerized process in computer science. There are two different types of problems: ill-defined and well-defined; different approaches are used for each. Well-defined problems have specific end goals and clearly expected solutions, while ill-defined problems do not. Well-defined problems allow for more initial planning than ill-defined problems. [2] Solving problems sometimes involves dealing with pragmatics (the way that context contributes to meaning) and semantics (the interpretation of the problem). The ability to understand what the end goal of the problem is, and what rules could be applied, represents the key to solving the problem. Sometimes a problem requires abstract thinking or coming up with a creative solution.

Problem solving has two major domains: mathematical problem solving and personal problem solving. Each concerns some difficulty or barrier that is encountered. [4]

Psychology

Problem solving in psychology refers to the process of finding solutions to problems encountered in life. [5] Solutions to these problems are usually situation- or context-specific. The process starts with problem finding and problem shaping, in which the problem is discovered and simplified. The next step is to generate possible solutions and evaluate them. Finally a solution is selected to be implemented and verified. Problems have an end goal to be reached; how you get there depends upon problem orientation (problem-solving coping style and skills) and systematic analysis. [6]

Mental health professionals study the human problem-solving processes using methods such as introspection, behaviorism, simulation, computer modeling, and experiment. Social psychologists look into the person-environment relationship aspect of the problem and independent and interdependent problem-solving methods. [7] Problem solving has been defined as a higher-order cognitive process and intellectual function that requires the modulation and control of more routine or fundamental skills. [8]

Empirical research shows many different strategies and factors influence everyday problem solving. [9] Rehabilitation psychologists studying people with frontal lobe injuries have found that deficits in emotional control and reasoning can be remediated with effective rehabilitation, improving the capacity of injured persons to resolve everyday problems. [10] Interpersonal everyday problem solving depends upon personal motivational and contextual components. One such component is the emotional valence of "real-world" problems, which can either impede or aid problem-solving performance. Researchers have focused on the role of emotions in problem solving, [11] demonstrating that poor emotional control can disrupt focus on the target task, impede problem resolution, and lead to negative outcomes such as fatigue, depression, and inertia. [12] In one conceptualization, human problem solving consists of two related processes: problem orientation (the motivational, attitudinal, and affective approach to problematic situations) and problem-solving skills. People's strategies cohere with their goals [13] and stem from the process of comparing oneself with others.

Cognitive sciences

Among the first experimental psychologists to study problem solving were the Gestaltists in Germany, such as Karl Duncker in The Psychology of Productive Thinking (1935). [14] Perhaps best known is the work of Allen Newell and Herbert A. Simon. [15]

Experiments in the 1960s and early 1970s asked participants to solve relatively simple, well-defined, but not previously seen laboratory tasks. [16] [17] These simple problems, such as the Tower of Hanoi, admitted optimal solutions that could be found quickly, allowing researchers to observe the full problem-solving process. Researchers assumed that these model problems would elicit the characteristic cognitive processes by which more complex "real world" problems are solved.

A prominent problem-solving technique identified by this research is the principle of decomposition. [18]
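As an illustration of decomposition on one of the laboratory tasks mentioned above, the following sketch (in Python, offered only as an illustration and not drawn from the cited studies) solves the Tower of Hanoi by reducing the n-disk puzzle to two (n - 1)-disk subproblems plus a single move:

```python
def hanoi(n, source, target, spare):
    """Decomposition: move n-1 disks aside, move the largest disk,
    then move the n-1 disks back on top of it."""
    if n == 0:
        return []
    return (hanoi(n - 1, source, spare, target)     # subproblem 1
            + [(source, target)]                    # single move of the largest disk
            + hanoi(n - 1, spare, target, source))  # subproblem 2

moves = hanoi(3, "A", "C", "B")
print(len(moves), moves)  # 7 moves for the classic 3-disk puzzle (2**3 - 1)
```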

Computer science

Much of computer science and artificial intelligence involves designing automated systems to solve a specified type of problem: to accept input data and calculate a correct or adequate response, reasonably quickly. Algorithms are recipes or instructions that direct such systems, written into computer programs.
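A minimal example of an algorithm in this sense (a Python sketch, chosen purely for illustration) is Euclid's recipe for the greatest common divisor: it accepts two integers as input and computes a correct response in a bounded number of steps.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # 21
```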

Steps for designing such systems include problem determination, heuristics, root cause analysis, de-duplication, analysis, diagnosis, and repair. Analytic techniques include linear and nonlinear programming, queuing systems, and simulation. [19] A large, perennial obstacle is to find and fix errors in computer programs: debugging.

Logic

Formal logic concerns issues like validity, truth, inference, argumentation, and proof. In a problem-solving context, it can be used to formally represent a problem as a theorem to be proved, and to represent the knowledge needed to solve the problem as the premises to be used in a proof that the problem has a solution.
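A minimal sketch of this representation (hypothetical facts and rules, written in Python rather than a logic notation): the knowledge needed to solve the problem is encoded as premises, and showing that the problem has a solution amounts to deriving the goal from them.

```python
# Premises: known facts, plus if-then rules of the form (body -> head).
facts = {"has_ingredients", "has_oven"}
rules = [({"has_ingredients", "has_oven"}, "can_bake_bread"),
         ({"can_bake_bread"}, "goal_reached")]

def provable(facts, rules, goal):
    """Forward chaining: apply rules until the goal is derived or nothing new follows."""
    known, changed = set(facts), True
    while changed:
        changed = False
        for body, head in rules:
            if body <= known and head not in known:
                known.add(head)
                changed = True
    return goal in known

print(provable(facts, rules, "goal_reached"))  # True: the "theorem" follows from the premises
```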

The use of computers to prove mathematical theorems using formal logic emerged as the field of automated theorem proving in the 1950s. It included the use of heuristic methods designed to simulate human problem solving, as in the Logic Theory Machine, developed by Allen Newell, Herbert A. Simon and J. C. Shaw, as well as algorithmic methods such as the resolution principle developed by John Alan Robinson.

In addition to its use for finding proofs of mathematical theorems, automated theorem-proving has also been used for program verification in computer science. In 1958, John McCarthy proposed the advice taker, to represent information in formal logic and to derive answers to questions using automated theorem-proving. An important step in this direction was made by Cordell Green in 1969, who used a resolution theorem prover for question-answering and for such other applications in artificial intelligence as robot planning.

The resolution theorem-prover used by Cordell Green bore little resemblance to human problem solving methods. In response to criticism of that approach from researchers at MIT, Robert Kowalski developed logic programming and SLD resolution, [20] which solves problems by problem decomposition. He has advocated logic for both computer and human problem solving [21] and computational logic to improve human thinking. [22]
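The goal-directed style that SLD resolution gives logic programming can be imitated in a few lines (illustrative rules only; a Python sketch rather than Prolog): a goal is decomposed into the subgoals given by the body of a matching rule, and solving succeeds when every subgoal bottoms out in a known fact.

```python
facts = {"b", "c"}
rules = {"a": [["b", "c"]],    # a holds if b and c hold
         "d": [["a"], ["e"]]}  # d holds if a holds, or if e holds

def solve(goal):
    """Backward chaining: reduce the goal to subgoals until only facts remain."""
    if goal in facts:
        return True
    return any(all(solve(sub) for sub in body)
               for body in rules.get(goal, []))

print(solve("d"))  # True: d is reduced to a, which is reduced to the facts b and c
```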

Engineering

When products or processes fail, problem solving techniques can be used to develop corrective actions that can be taken to prevent further failures. Such techniques can also be applied to a product or process prior to an actual failure event—to predict, analyze, and mitigate a potential problem in advance. Techniques such as failure mode and effects analysis can proactively reduce the likelihood of problems.

In either the reactive or the proactive case, it is necessary to build a causal explanation through a process of diagnosis. In deriving an explanation of effects in terms of causes, abduction generates new ideas or hypotheses (asking "how?"); deduction evaluates and refines hypotheses based on other plausible premises (asking "why?"); and induction justifies a hypothesis with empirical data (asking "how much?"). [23] The objective of abduction is to determine which hypothesis or proposition to test, not which one to adopt or assert. [24] In the Peircean logical system, the logic of abduction and deduction contribute to our conceptual understanding of a phenomenon, while the logic of induction adds quantitative details (empirical substantiation) to our conceptual knowledge. [25]
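A schematic sketch of that diagnostic cycle (hypothetical failure data, in Python): abduction proposes candidate causes that could explain an observed symptom, deduction spells out what else each candidate would entail, and induction scores the candidates against everything actually observed.

```python
# Hypothetical causal model for a failed machine: cause -> effects it would produce.
causal_model = {
    "worn_bearing":  {"vibration", "noise"},
    "loose_bolt":    {"vibration"},
    "bad_lubricant": {"noise", "overheating"},
}
observed = {"vibration", "noise"}

# Abduction: hypotheses that would explain at least one observed effect.
candidates = [c for c, effects in causal_model.items() if effects & observed]

# Deduction + induction: predict each candidate's effects and score the match
# against the observations (explained effects count for, unseen predictions against).
def score(cause):
    predicted = causal_model[cause]
    return len(predicted & observed) - len(predicted - observed)

print(max(candidates, key=score))  # worn_bearing
```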

Forensic engineering is an important technique of failure analysis that involves tracing product defects and flaws. Corrective action can then be taken to prevent further failures.

Reverse engineering attempts to discover the original problem-solving logic used in developing a product by disassembling the product and developing a plausible pathway to creating and assembling its parts. [26]

Military science

In military science, problem solving is linked to the concept of "end-states", the conditions or situations which are the aims of the strategy. [27] :xiii,E-2 Ability to solve problems is important at any military rank, but is essential at the command and control level. It results from deep qualitative and quantitative understanding of possible scenarios. Effectiveness in this context is an evaluation of results: to what extent the end states were accomplished. [27] :IV-24 Planning is the process of determining how to effect those end states. [27] :IV-1

Processes

Some models of problem solving involve identifying a goal and then a sequence of subgoals towards achieving this goal. Anderson, who introduced the ACT-R model of cognition, modelled this collection of goals and subgoals as a goal stack: the mind contains a stack of goals and subgoals to be completed, with a single task being carried out at any time. [28] :51
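A toy sketch of the goal-stack idea (not ACT-R itself; the goals are hypothetical, and the code is Python): the task currently being carried out is whatever sits on top of the stack, subgoals are pushed above the goal that spawned them, and a goal resumes once its subgoals have been popped.

```python
# Goal stack: the top entry is the single task currently being carried out.
stack = ["write report"]
stack.append("gather data")        # subgoal pushed above the main goal
stack.append("find data source")   # sub-subgoal pushed above that

while stack:
    print("working on:", stack.pop())
# working on: find data source
# working on: gather data
# working on: write report
```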

Knowledge of how to solve one problem can be applied to another problem, in a process known as transfer. [28] :56

Problem-solving strategies

Problem-solving strategies are steps to overcoming the obstacles to achieving a goal. The iteration of such strategies over the course of solving a problem is the "problem-solving cycle". [29]

Common steps in this cycle include recognizing the problem, defining it, developing a strategy to fix it, organizing knowledge and resources available, monitoring progress, and evaluating the effectiveness of the solution. Once a solution is achieved, another problem usually arises, and the cycle starts again.

Insight is the sudden aha! solution to a problem, the birth of a new idea to simplify a complex situation. Solutions found through insight are often more incisive than those from step-by-step analysis. A quick solution process requires insight to select productive moves at different stages of the problem-solving cycle. Unlike Newell and Simon's formal definition of a move problem, there is no consensus definition of an insight problem. [30]

Some problem-solving strategies include: [31]

Abstraction
solving the problem in a tractable model system to gain insight into the real system
Analogy
adapting the solution to a previous problem which has similar features or mechanisms
Brainstorming
(especially among groups of people) suggesting a large number of solutions or ideas and combining and developing them until an optimum solution is found
Critical thinking
analysis of available evidence and arguments to form a judgement via rational, skeptical, and unbiased evaluation
Divide and conquer
breaking down a large, complex problem into smaller, solvable problems
Help-seeking
obtaining external assistance to deal with obstacles
Hypothesis testing
assuming a possible explanation to the problem and trying to prove (or, in some contexts, disprove) the assumption
Lateral thinking
approaching solutions indirectly and creatively
Means-ends analysis
choosing an action at each step to move closer to the goal (a minimal sketch appears after this list)
Morphological analysis
assessing the output and interactions of an entire system

Observation
in the natural sciences, an act or instance of noticing or perceiving, and the acquisition of information from a primary source
Questioning
making an utterance which serves as a request for information
Proof of impossibility
trying to prove that the problem cannot be solved; the point where the proof fails can become the starting point for solving it
Reduction
transforming the problem into another problem for which solutions exist
Research
employing existing ideas or adapting existing solutions to similar problems
Root cause analysis
identifying the cause of a problem
Trial-and-error
testing possible solutions until the right one is found
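As a sketch of the means-ends analysis entry above (a toy numeric state and hypothetical operators, in Python): at each step the solver applies whichever action most reduces the remaining difference between the current state and the goal.

```python
# Toy means-ends analysis: reach the goal value from the start value by
# greedily choosing the operator that most reduces the distance to the goal.
operators = {"+10": lambda x: x + 10, "+1": lambda x: x + 1, "-1": lambda x: x - 1}

def means_ends(start, goal, max_steps=50):
    state, plan = start, []
    for _ in range(max_steps):
        if state == goal:
            return plan
        name, op = min(operators.items(),
                       key=lambda item: abs(goal - item[1](state)))
        state = op(state)
        plan.append(name)
    return None  # greedy difference reduction failed within the step budget

print(means_ends(3, 25))  # ['+10', '+10', '+1', '+1']
```

Greedy difference reduction of this kind can stall on problems that require temporarily moving away from the goal, which is a classic limitation of the heuristic.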

Problem-solving methods

Common barriers

Common barriers to problem solving include mental constructs that impede an efficient search for solutions. Five of the most common identified by researchers are: confirmation bias, mental set, functional fixedness, unnecessary constraints, and irrelevant information.

Confirmation bias

Confirmation bias is an unintentional tendency to collect and use data which favors preconceived notions. Such notions may be incidental rather than motivated by important personal beliefs: the desire to be right may be sufficient motivation. [32]

Scientific and technical professionals also experience confirmation bias. One online experiment, for example, suggested that professionals within the field of psychological research are likely to view scientific studies that agree with their preconceived notions more favorably than clashing studies. [33] According to Raymond Nickerson, one can see the consequences of confirmation bias in real-life situations, which range in severity from inefficient government policies to genocide. Nickerson argued that those who killed people accused of witchcraft demonstrated confirmation bias with motivation.[ citation needed ] Researcher Michael Allen found evidence for confirmation bias with motivation in school children who worked to manipulate their science experiments to produce favorable results. [34]

However, confirmation bias does not necessarily require motivation. In 1960, Peter Cathcart Wason conducted an experiment in which participants first viewed three numbers and then created a hypothesis in the form of a rule that could have been used to create that triplet of numbers. When testing their hypotheses, participants tended to only create additional triplets of numbers that would confirm their hypotheses, and tended not to create triplets that would negate or disprove their hypotheses. [35]
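The logic of the task can be simulated in a few lines (Python; Wason's actual rule was "any ascending triple", while the over-specific hypothesis and the test triples below are illustrative): triples chosen to confirm the hypothesis cannot separate it from the true rule, whereas a triple that the hypothesis forbids can.

```python
true_rule = lambda t: t[0] < t[1] < t[2]                       # any ascending triple
hypothesis = lambda t: t[1] == 2 * t[0] and t[2] == 3 * t[0]   # over-specific guess from seeing (2, 4, 6)

for triple in [(2, 4, 6), (5, 10, 15), (10, 20, 30)]:          # confirming tests
    print(triple, true_rule(triple), hypothesis(triple))        # True True: the guess survives every time

print((1, 2, 4), true_rule((1, 2, 4)), hypothesis((1, 2, 4)))   # True False: only a disconfirming
                                                                # test exposes the difference
```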

Mental set

Mental set is the inclination to re-use a previously successful solution, rather than search for new and better solutions. It is a reliance on habit.

It was first articulated by Abraham S. Luchins in the 1940s with his well-known water jug experiments. [36] Participants were asked to fill one jug with a specific amount of water by using other jugs with different maximum capacities. After Luchins gave a set of jug problems that could all be solved by a single technique, he introduced a problem that could be solved by the same technique but also by a novel and simpler method. His participants tended to use the accustomed technique, oblivious to the simpler alternative. [37] A related effect had been demonstrated in Norman Maier's 1931 experiment, which challenged participants to solve a problem by using a familiar tool (pliers) in an unconventional manner. Participants were often unable to view the object in a way that strayed from its typical use, a type of mental set known as functional fixedness (see the following section).
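The jug problems themselves are small, well-defined search problems. The sketch below (illustrative capacities, in Python) finds a shortest sequence of fill, empty, and pour moves by breadth-first search; unlike a solver locked into a habitual formula, exhaustive search returns the short solution whenever one exists.

```python
from collections import deque

def solve_jugs(capacities, target):
    """Breadth-first search over jug states; returns a shortest list of moves."""
    start = (0,) * len(capacities)
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, moves = queue.popleft()
        if target in state:
            return moves
        successors = []
        for i, cap in enumerate(capacities):
            filled = list(state); filled[i] = cap
            successors.append((tuple(filled), f"fill {i}"))
            emptied = list(state); emptied[i] = 0
            successors.append((tuple(emptied), f"empty {i}"))
            for j, cap_j in enumerate(capacities):
                if i != j:
                    poured = list(state)
                    amount = min(poured[i], cap_j - poured[j])
                    poured[i] -= amount; poured[j] += amount
                    successors.append((tuple(poured), f"pour {i}->{j}"))
        for nxt, move in successors:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, moves + [move]))
    return None

print(solve_jugs((3, 5), 4))  # a shortest solution: 6 moves ending with 4 units in the 5-unit jug
```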

Rigidly clinging to a mental set is called fixation, which can deepen to an obsession or preoccupation with attempted strategies that are repeatedly unsuccessful. [38] In the late 1990s, researcher Jennifer Wiley found that professional expertise in a field can create a mental set, perhaps leading to fixation. [38]

Groupthink, in which each individual takes on the mindset of the rest of the group, can produce and exacerbate mental set. [39] Social pressure leads to everybody thinking the same thing and reaching the same conclusions.

Functional fixedness

Functional fixedness is the tendency to view an object as having only one function, and to be unable to conceive of any novel use, as in the Maier pliers experiment described above. Functional fixedness is a specific form of mental set, and is one of the most common forms of cognitive bias in daily life.

As an example, imagine a man wants to kill a bug in his house, but the only thing at hand is a can of air freshener. He may start searching for something to kill the bug instead of squashing it with the can, thinking only of its main function of deodorizing.

Tim German and Clark Barrett describe this barrier: "subjects become 'fixed' on the design function of the objects, and problem solving suffers relative to control conditions in which the object's function is not demonstrated." [40] Their research found that young children's limited knowledge of an object's intended function reduces this barrier. [41] Research has also discovered functional fixedness in educational contexts, as an obstacle to understanding: "functional fixedness may be found in learning concepts as well as in solving chemistry problems." [42]

There are several hypotheses regarding how functional fixedness relates to problem solving. [43] It may waste time, delaying or entirely preventing the correct use of a tool.

Unnecessary constraints

Unnecessary constraints are arbitrary boundaries imposed unconsciously on the task at hand, which foreclose a productive avenue of solution. The solver may become fixated on only one type of solution, as if it were an inevitable requirement of the problem. Typically, this combines with mental set—clinging to a previously successful method. [44] [ page needed ]

Visual problems can also produce mentally invented constraints. [45] [ page needed ] A famous example is the dot problem: nine dots arranged in a three-by-three grid pattern must be connected by drawing four straight line segments, without lifting pen from paper or backtracking along a line. The subject typically assumes the pen must stay within the outer square of dots, but the solution requires lines continuing beyond this frame, and researchers have found a 0% solution rate within a brief allotted time. [46]

This problem has produced the expression "think outside the box". [47] [ page needed ] Such problems are typically solved via a sudden insight which leaps over the mental barriers, often after long toil against them. [48] This can be difficult depending on how the subject has structured the problem in their mind, how they draw on past experiences, and how well they juggle this information in their working memory. In the example, envisioning the dots connected outside the framing square requires visualizing an unconventional arrangement, which is a strain on working memory. [47]

Irrelevant information

Irrelevant information is a specification or data presented in a problem that is unrelated to the solution. [44] If the solver assumes that all information presented needs to be used, this often derails the problem solving process, making relatively simple problems much harder. [49]

For example: "Fifteen percent of the people in Topeka have unlisted telephone numbers. You select 200 names at random from the Topeka phone book. How many of these people have unlisted phone numbers?" [47] [ page needed ] The "obvious" answer is 15% (30 people), but in fact the answer is none: people with unlisted numbers do not appear in the phone book, so none of the 200 names selected can have unlisted numbers. This kind of "trick question" is often used in aptitude tests or cognitive evaluations. [50] Though not inherently difficult, such questions require independent thinking that is not necessarily common. Mathematical word problems often include irrelevant qualitative or numerical information as an extra challenge.

Avoiding barriers by changing problem representation

The disruption caused by the above cognitive biases can depend on how the information is represented: [50] visually, verbally, or mathematically. A classic example is the Buddhist monk problem:

A Buddhist monk begins at dawn one day walking up a mountain, reaches the top at sunset, meditates at the top for several days until one dawn when he begins to walk back to the foot of the mountain, which he reaches at sunset. Making no assumptions about his starting or stopping or about his pace during the trips, prove that there is a place on the path which he occupies at the same hour of the day on the two separate journeys.

The problem is difficult to solve in a verbal representation, by trying to describe the monk's progress on each day. It becomes much easier when the problem is represented mathematically by a function: one visualizes a graph whose horizontal axis is time of day and whose vertical axis shows the monk's position (or altitude) on the path at each time. Superimposing the two journey curves, which traverse opposite diagonals of a rectangle, one sees they must cross each other somewhere. The visual representation by graphing has resolved the difficulty.
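The graphing argument can be written out as a short derivation (a sketch; the continuity assumption simply formalizes the fact that the monk's path is unbroken):

```latex
Let $u(t)$ and $d(t)$ denote the monk's distance from the foot of the mountain at time of day
$t \in [t_{\text{dawn}}, t_{\text{sunset}}]$ on the ascending and descending days, respectively,
and let $L$ be the length of the path. Then
\[
  u(t_{\text{dawn}}) = 0,\quad d(t_{\text{dawn}}) = L,\qquad
  u(t_{\text{sunset}}) = L,\quad d(t_{\text{sunset}}) = 0 .
\]
The difference $h(t) = u(t) - d(t)$ is continuous, with $h(t_{\text{dawn}}) = -L < 0$ and
$h(t_{\text{sunset}}) = L > 0$, so by the intermediate value theorem there is some $t^{*}$ with
$h(t^{*}) = 0$: at that hour of the day the monk occupies the same spot on both journeys.
```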

Similar strategies can often improve problem solving on tests. [44] [51]

Other barriers for individuals

People who are engaged in problem solving tend to overlook subtractive changes, even when removing elements would be a critical part of an efficient solution (for example, improving a design by taking a component away rather than adding another). This tendency to solve by first, only, or mostly creating or adding elements, rather than by subtracting elements or processes, has been shown to intensify under higher cognitive loads such as information overload. [52]

Dreaming: problem solving without waking consciousness

People can also solve problems while they are asleep. There are many reports of scientists and engineers who solved problems in their dreams. For example, Elias Howe, inventor of the sewing machine, figured out the structure of the bobbin from a dream. [53]

The chemist August Kekulé was considering how benzene arranged its six carbon and six hydrogen atoms. Thinking about the problem, he dozed off and dreamt of dancing atoms that fell into a snakelike pattern, which led him to discover the benzene ring. As Kekulé wrote in his diary,

One of the snakes seized hold of its own tail, and the form whirled mockingly before my eyes. As if by a flash of lightning I awoke; and this time also I spent the rest of the night in working out the consequences of the hypothesis. [54]

There also are empirical studies of how people can think consciously about a problem before going to sleep, and then solve the problem with a dream image. Dream researcher William C. Dement told his undergraduate class of 500 students that he wanted them to think about an infinite series, whose first elements were OTTFF, to see if they could deduce the principle behind it and to say what the next elements of the series would be. [55] [ page needed ] He asked them to think about this problem every night for 15 minutes before going to sleep and to write down any dreams that they then had. They were instructed to think about the problem again for 15 minutes when they awakened in the morning.

The sequence OTTFF is the first letters of the numbers: one, two, three, four, five. The next five elements of the series are SSENT (six, seven, eight, nine, ten). Some of the students solved the puzzle by reflecting on their dreams. One example was a student who reported the following dream: [55] [ page needed ]

I was standing in an art gallery, looking at the paintings on the wall. As I walked down the hall, I began to count the paintings: one, two, three, four, five. As I came to the sixth and seventh, the paintings had been ripped from their frames. I stared at the empty frames with a peculiar feeling that some mystery was about to be solved. Suddenly I realized that the sixth and seventh spaces were the solution to the problem!

With more than 500 undergraduate students, 87 dreams were judged to be related to the problems students were assigned (53 directly related and 34 indirectly related). Yet of the people who had dreams that apparently solved the problem, only seven were actually able to consciously know the solution. The rest (46 out of 53) thought they did not know the solution.

Mark Blechner conducted this experiment and obtained results similar to Dement's. [56] [ page needed ] He found that while trying to solve the problem, people had dreams in which the solution appeared to be obvious from the dream, but it was rare for the dreamers to realize how their dreams had solved the puzzle. Coaxing or hints did not get them to realize it, although once they heard the solution, they recognized how their dream had solved it. For example, one person in that OTTFF experiment dreamed: [56] [ page needed ]

There is a big clock. You can see the movement. The big hand of the clock was on the number six. You could see it move up, number by number, six, seven, eight, nine, ten, eleven, twelve. The dream focused on the small parts of the machinery. You could see the gears inside.

In the dream, the person counted out the next elements of the series—six, seven, eight, nine, ten, eleven, twelve—yet he did not realize that this was the solution of the problem. His sleeping "mindbrain" (Blechner's term) solved the problem, but his waking mindbrain was not aware how.

Albert Einstein believed that much problem solving goes on unconsciously, and the person must then figure out and formulate consciously what the mindbrain has already solved. He believed this was his process in formulating the theory of relativity: "The creator of the problem possesses the solution." [57] Einstein said that he did his problem solving without words, mostly in images. "The words or the language, as they are written or spoken, do not seem to play any role in my mechanism of thought. The psychical entities which seem to serve as elements in thought are certain signs and more or less clear images which can be 'voluntarily' reproduced and combined." [58]

Cognitive sciences: two schools

Problem-solving processes differ across knowledge domains and across levels of expertise. [59] For this reason, findings obtained in the laboratory cannot necessarily be generalized to problem-solving situations outside the laboratory, which has led to an emphasis on real-world problem solving in research since the 1990s. This emphasis has been expressed quite differently in North America and Europe, however. Whereas North American research has typically concentrated on studying problem solving in separate, natural knowledge domains, much of the European research has focused on novel, complex problems and has been performed with computerized scenarios. [60]

Europe

In Europe, two main approaches have surfaced, one initiated by Donald Broadbent [61] in the United Kingdom and the other one by Dietrich Dörner [62] in Germany. The two approaches share an emphasis on relatively complex, semantically rich, computerized laboratory tasks, constructed to resemble real-life problems. The approaches differ somewhat in their theoretical goals and methodology. The tradition initiated by Broadbent emphasizes the distinction between cognitive problem-solving processes that operate under awareness versus outside of awareness, and typically employs mathematically well-defined computerized systems. The tradition initiated by Dörner, on the other hand, has an interest in the interplay of the cognitive, motivational, and social components of problem solving, and utilizes very complex computerized scenarios that contain up to 2,000 highly interconnected variables. [63]

North America

In North America, initiated by the work of Herbert A. Simon on "learning by doing" in semantically rich domains, [64] researchers began to investigate problem solving separately in different natural knowledge domains—such as physics, writing, or chess playing—rather than attempt to extract a global theory of problem solving. [65] These researchers have focused on the development of problem solving within certain domains, that is on the development of expertise. [66]

Areas that have attracted rather intensive attention in North America include reading, writing, calculation, game playing, computer interaction, lawyers' reasoning, managerial and personal problem solving, mechanical problem solving, problem solving in electronics, and problem solving in international relations.

Characteristics of complex problems

Complex problem solving (CPS) is distinguishable from simple problem solving (SPS). In SPS there is a singular and simple obstacle; in CPS there may be multiple simultaneous obstacles. For example, a surgeon at work faces far more complex problems than an individual deciding what shoes to wear. As elucidated by Dietrich Dörner, and later expanded upon by Joachim Funke, complex problems have some typical characteristics, which include complexity (a large number of interrelated variables), dynamics (the situation changes over time), intransparency (relevant information is not fully available), and the pursuit of multiple, possibly conflicting goals (polytely). [1]

Collective problem solving

People solve problems on many different levels—from the individual to the civilizational. Collective problem solving refers to problem solving performed collectively. Social issues and global issues can typically only be solved collectively.

The complexity of contemporary problems exceeds the cognitive capacity of any individual and requires different but complementary varieties of expertise and collective problem solving ability. [81]

Collective intelligence is shared or group intelligence that emerges from the collaboration, collective efforts, and competition of many individuals.

In collaborative problem solving people work together to solve real-world problems. Members of problem-solving groups share a common concern, a similar passion, and/or a commitment to their work. Members can ask questions, wonder, and try to understand common issues. They share expertise, experiences, tools, and methods. [82] Groups may be fluid based on need, may only occur temporarily to finish an assigned task, or may be more permanent depending on the nature of the problems.

For example, in the educational context, members of a group may all have input into the decision-making process and a role in the learning process. Members may be responsible for the thinking, teaching, and monitoring of all members in the group. Group work may be coordinated among members so that each member makes an equal contribution to the whole work. Members can identify and build on their individual strengths so that everyone can make a significant contribution to the task. [83] Collaborative group work has the ability to promote critical thinking skills, problem solving skills, social skills, and self-esteem. By using collaboration and communication, members often learn from one another and construct meaningful knowledge that often leads to better learning outcomes than individual work. [84]

Collaborative groups require joint intellectual efforts between the members and involve social interactions to solve problems together. The knowledge shared during these interactions is acquired during communication, negotiation, and production of materials. [85] Members actively seek information from others by asking questions. The capacity to use questions to acquire new information increases understanding and the ability to solve problems. [86]

In a 1962 research report, Douglas Engelbart linked collective intelligence to organizational effectiveness, and predicted that proactively "augmenting human intellect" would yield a multiplier effect in group problem solving: "Three people working together in this augmented mode [would] seem to be more than three times as effective in solving a complex problem as is one augmented person working alone". [87]

Henry Jenkins, a theorist of new media and media convergence, draws on the theory that collective intelligence can be attributed to media convergence and participatory culture. [88] He criticizes contemporary education for failing to incorporate online trends of collective problem solving into the classroom, stating "whereas a collective intelligence community encourages ownership of work as a group, schools grade individuals". Jenkins argues that interaction within a knowledge community builds vital skills for young people, and teamwork through collective intelligence communities contributes to the development of such skills. [89]

Collective impact is the commitment of a group of actors from different sectors to a common agenda for solving a specific social problem, using a structured form of collaboration.

After World War II the UN, the Bretton Woods institutions, and (later) the WTO were created. Collective problem solving at the international level crystallized around these three types of organization from the 1980s onward. As these global institutions remain state-like or state-centric, it is unsurprising that they perpetuate state-like or state-centric approaches to collective problem solving rather than alternative ones. [90]

Crowdsourcing is a process of accumulating ideas, thoughts, or information from many independent participants, with the aim of finding the best solution for a given challenge. Modern information technologies allow many people to be involved and facilitate managing their suggestions in ways that provide good results. [91] The Internet allows for a new capacity of collective (including planetary-scale) problem solving. [92]

Notes

  1. Frensch, Peter A.; Funke, Joachim, eds. (2014-04-04). Complex Problem Solving. doi:10.4324/9781315806723. ISBN 978-1-315-80672-3.
  2. Schacter, D.L.; Gilbert, D.T.; Wegner, D.M. (2011). Psychology (2nd ed.). New York: Worth Publishers. p. 376.
  3. Blanchard-Fields, F. (2007). "Everyday problem solving and emotion: An adult developmental perspective". Current Directions in Psychological Science. 16 (1): 26–31. doi:10.1111/j.1467-8721.2007.00469.x. S2CID   145645352.
  4. Zimmermann, Bernd (2004). On mathematical problem-solving processes and history of mathematics. ICME 10. Copenhagen.
  5. Granvold, Donald K. (1997). "Cognitive-Behavioral Therapy with Adults". In Brandell, Jerrold R. (ed.). Theory and Practice in Clinical Social Work. Simon and Schuster. pp.  189. ISBN   978-0-684-82765-0.
  6. Robertson, S. Ian (2001). "Introduction to the study of problem solving". Problem Solving. Psychology Press. ISBN   0-415-20300-7.
  7. Rubin, M.; Watt, S. E.; Ramelli, M. (2012). "Immigrants' social integration as a function of approach-avoidance orientation and problem-solving style". International Journal of Intercultural Relations. 36 (4): 498–505. doi:10.1016/j.ijintrel.2011.12.009. hdl: 1959.13/931119 .
  8. Goldstein F. C.; Levin H. S. (1987). "Disorders of reasoning and problem-solving ability". In M. Meier; A. Benton; L. Diller (eds.). Neuropsychological rehabilitation. London: Taylor & Francis Group.
  9. Rath, Joseph F.; Simon, Dvorah; Langenbahn, Donna M.; Sherr, Rose Lynn; Diller, Leonard (2003). "Group treatment of problem-solving deficits in outpatients with traumatic brain injury: A randomised outcome study". Neuropsychological Rehabilitation. 13 (4): 461–488. doi:10.1080/09602010343000039. S2CID   143165070.
  10.
    • D'Zurilla, T. J.; Goldfried, M. R. (1971). "Problem solving and behavior modification". Journal of Abnormal Psychology. 78 (1): 107–126. doi:10.1037/h0031360. PMID   4938262.
    • D'Zurilla, T. J.; Nezu, A. M. (1982). "Social problem solving in adults". In P. C. Kendall (ed.). Advances in cognitive-behavioral research and therapy. Vol. 1. New York: Academic Press. pp. 201–274.
  11. Rath, J. F.; Langenbahn, D. M.; Simon, D; Sherr, R. L.; Fletcher, J.; Diller, L. (2004). "The construct of problem solving in higher level neuropsychological assessment and rehabilitation*1". Archives of Clinical Neuropsychology. 19 (5): 613–635. doi: 10.1016/j.acn.2003.08.006 . PMID   15271407.
  12. Hoppmann, Christiane A.; Blanchard-Fields, Fredda (2010). "Goals and everyday problem solving: Manipulating goal preferences in young and older adults". Developmental Psychology. 46 (6): 1433–1443. doi:10.1037/a0020676. PMID   20873926.
  13. Duncker, Karl (1935). Zur Psychologie des produktiven Denkens[The psychology of productive thinking] (in German). Berlin: Julius Springer.
  14. Newell, Allen; Simon, Herbert A. (1972). Human problem solving. Englewood Cliffs, N.J.: Prentice-Hall.
  15. For example:
  16. Mayer, R. E. (1992). Thinking, problem solving, cognition (Second ed.). New York: W. H. Freeman and Company.
  17. Armstrong, J. Scott; Denniston, William B. Jr.; Gordon, Matt M. (1975). "The Use of the Decomposition Principle in Making Judgments" (PDF). Organizational Behavior and Human Performance. 14 (2): 257–263. doi:10.1016/0030-5073(75)90028-8. S2CID   122659209. Archived from the original (PDF) on 2010-06-20.
  18. Malakooti, Behnam (2013). Operations and Production Systems with Multiple Objectives. John Wiley & Sons. ISBN   978-1-118-58537-5.
  19. Kowalski, Robert (1974). "Predicate Logic as a Programming Language" (PDF). Information Processing. 74.
  20. Kowalski, Robert (1979). Logic for Problem Solving (PDF). Artificial Intelligence Series. Vol. 7. Elsevier Science Publishing. ISBN   0-444-00368-1.
  21. Kowalski, Robert (2011). Computational Logic and Human Thinking: How to be Artificially Intelligent (PDF). Cambridge University Press.
  22. Staat, Wim (1993). "On abduction, deduction, induction and the categories". Transactions of the Charles S. Peirce Society. 29 (2): 225–237.
  23. Sullivan, Patrick F. (1991). "On Falsificationist Interpretations of Peirce". Transactions of the Charles S. Peirce Society. 27 (2): 197–219.
  24. Ho, Yu Chong (1994). Abduction? Deduction? Induction? Is There a Logic of Exploratory Data Analysis? (PDF). Annual Meeting of the American Educational Research Association. New Orleans, La.
  25. "Einstein's Secret to Amazing Problem Solving (and 10 Specific Ways You Can Use It)". Litemind. 2008-11-04. Archived from the original on 2017-06-21. Retrieved 2017-06-11.
  26. "Commander's Handbook for Strategic Communication and Communication Strategy" (PDF). United States Joint Forces Command, Joint Warfighting Center, Suffolk, Va. 27 October 2009. Archived from the original (PDF) on April 29, 2011. Retrieved 10 October 2016.
  27. Robertson, S. Ian (2017). Problem solving: perspectives from cognition and neuroscience (2nd ed.). London: Taylor & Francis. ISBN 978-1-317-49601-4. OCLC 962750529.
  28. Bransford, J. D.; Stein, B. S (1993). The ideal problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman.
  29. Wang, Y.; Chiew, V. (2010). "On the cognitive process of human problem solving" (PDF). Cognitive Systems Research. Elsevier BV. 11 (1): 81–92. doi:10.1016/j.cogsys.2008.08.003. ISSN   1389-0417. S2CID   16238486.
  30. Nickerson, Raymond S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2 (2): 176. doi:10.1037/1089-2680.2.2.175. S2CID   8508954.
  31. Hergovich, Andreas; Schott, Reinhard; Burger, Christoph (2010). "Biased Evaluation of Abstracts Depending on Topic and Conclusion: Further Evidence of a Confirmation Bias Within Scientific Psychology". Current Psychology. Springer Science and Business Media LLC. 29 (3): 188–209. doi:10.1007/s12144-010-9087-5. ISSN   1046-1310. S2CID   145497196.
  32. Allen, Michael (2011). "Theory-led confirmation bias and experimental persona". Research in Science & Technological Education. Informa UK Limited. 29 (1): 107–127. Bibcode:2011RSTEd..29..107A. doi:10.1080/02635143.2010.539973. ISSN   0263-5143. S2CID   145706148.
  33. Wason, P. C. (1960). "On the failure to eliminate hypotheses in a conceptual task". Quarterly Journal of Experimental Psychology. 12 (3): 129–140. doi:10.1080/17470216008416717. S2CID   19237642.
  34. Luchins, Abraham S. (1942). "Mechanization in problem solving: The effect of Einstellung". Psychological Monographs. 54 (248): i-95. doi:10.1037/h0093502.
  35. Öllinger, Michael; Jones, Gary; Knoblich, Günther (2008). "Investigating the Effect of Mental Set on Insight Problem Solving" (PDF). Experimental Psychology. Hogrefe Publishing Group. 55 (4): 269–282. doi:10.1027/1618-3169.55.4.269. ISSN   1618-3169. PMID   18683624.
  36. Wiley, Jennifer (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition. 24 (4): 716–730. doi:10.3758/bf03211392. PMID 9701964.
  37. Cottam, Martha L.; Dietz-Uhler, Beth; Mastors, Elena; Preston, Thomas (2010). Introduction to Political Psychology (2nd ed.). New York: Psychology Press.
  38. German, Tim P.; Barrett, H. Clark (2005). "Functional Fixedness in a Technologically Sparse Culture". Psychological Science. SAGE Publications. 16 (1): 1–5. doi:10.1111/j.0956-7976.2005.00771.x. ISSN   0956-7976. PMID   15660843. S2CID   1833823.
  39. German, Tim P.; Defeyter, Margaret A. (2000). "Immunity to functional fixedness in young children". Psychonomic Bulletin and Review. 7 (4): 707–712. doi: 10.3758/BF03213010 . PMID   11206213.
  40. Furio, C.; Calatayud, M. L.; Baracenas, S.; Padilla, O. (2000). "Functional fixedness and functional reduction as common sense reasonings in chemical equilibrium and in geometry and polarity of molecules". Science Education. 84 (5): 545–565. doi:10.1002/1098-237X(200009)84:5<545::AID-SCE1>3.0.CO;2-1.
  41. Adamson, Robert E (1952). "Functional fixedness as related to problem solving: A repetition of three experiments". Journal of Experimental Psychology. 44 (4): 288–291. doi:10.1037/h0062487. PMID   13000071.
  42. Kellogg, R. T. (2003). Cognitive psychology (2nd ed.). California: Sage Publications, Inc.
  43. Meloy, J. R. (1998). The Psychology of Stalking, Clinical and Forensic Perspectives (2nd ed.). London, England: Academic Press.
  44. MacGregor, J.N.; Ormerod, T.C.; Chronicle, E.P. (2001). "Information-processing and insight: A process model of performance on the nine-dot and related problems". Journal of Experimental Psychology: Learning, Memory, and Cognition. 27 (1): 176–201. doi:10.1037/0278-7393.27.1.176. PMID   11204097.
  45. Weiten, Wayne (2011). Psychology: themes and variations (8th ed.). California: Wadsworth.
  46. Novick, L. R.; Bassok, M. (2005). "Problem solving". In Holyoak, K. J.; Morrison, R. G. (eds.). Cambridge handbook of thinking and reasoning. New York, N.Y.: Cambridge University Press. pp. 321–349.
  47. Walinga, Jennifer (2010). "From walls to windows: Using barriers as pathways to insightful solutions". The Journal of Creative Behavior. 44 (3): 143–167. doi:10.1002/j.2162-6057.2010.tb01331.x.
  48. Walinga, Jennifer; Cunningham, J. Barton; MacGregor, James N. (2011). "Training insight problem solving through focus on barriers and assumptions". The Journal of Creative Behavior. 45: 47–58. doi:10.1002/j.2162-6057.2011.tb01084.x.
  49. Vlamings, Petra H. J. M.; Hare, Brian; Call, Joseph (2009). "Reaching around barriers: The performance of great apes and 3–5-year-old children". Animal Cognition. 13 (2): 273–285. doi:10.1007/s10071-009-0265-5. PMC   2822225 . PMID   19653018.
  50. Kaempffert, Waldemar B. (1924). A Popular History of American Invention. Vol. 2. New York: Charles Scribner's Sons. p.  385.
    • Kekulé, August (1890). "Benzolfest-Rede". Berichte der Deutschen Chemischen Gesellschaft. 23: 1302–1311.
    • Benfey, O. (1958). "Kekulé and the birth of the structural theory of organic chemistry in 1858". Journal of Chemical Education. 35 (1): 21–23. Bibcode:1958JChEd..35...21B. doi:10.1021/ed035p21.
  51. Dement, W.C. (1972). Some Must Watch While Some Just Sleep. New York: Freeman.
  52. Blechner, Mark J. (2018). The Mindbrain and Dreams: An Exploration of Dreaming, Thinking, and Artistic Creation. New York: Routledge.
  53. Fromm, Erika O. (1998). "Lost and found half a century later: Letters by Freud and Einstein". American Psychologist. 53 (11): 1195–1198. doi:10.1037/0003-066x.53.11.1195.
  54. Einstein, Albert (1954). "A Mathematician's Mind". Ideas and Opinions. New York: Bonanza Books. p. 25.
  55. Sternberg, R. J. (1995). "Conceptions of expertise in complex problem solving: A comparison of alternative conceptions". In Frensch, P. A.; Funke, J. (eds.). Complex problem solving: The European Perspective. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 295–321.
  56. Funke, J. (1991). "Solving complex problems: Human identification and control of complex systems". In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 185–222. ISBN   0-8058-0650-4. OCLC   23254443.
    • Dörner, Dietrich (1975). "Wie Menschen eine Welt verbessern wollten" [How people wanted to improve the world]. Bild der Wissenschaft (in German). 12: 48–53.
    • Dörner, Dietrich (1985). "Verhalten, Denken und Emotionen" [Behavior, thinking, and emotions]. In Eckensberger, L. H.; Lantermann, E. D. (eds.). Emotion und Reflexivität (in German). München, Germany: Urban & Schwarzenberg. pp. 157–181.
    • Dörner, Dietrich; Wearing, Alex J. (1995). "Complex problem solving: Toward a (computer-simulated) theory". In Frensch, P.A.; Funke, J. (eds.). Complex problem solving: The European Perspective. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 65–99.
    • Buchner, A. (1995). "Theories of complex problem solving". In Frensch, P.A.; Funke, J. (eds.). Complex problem solving: The European Perspective. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 27–63.
    • Dörner, D.; Kreuzig, H. W.; Reither, F.; Stäudel, T., eds. (1983). Lohhausen. Vom Umgang mit Unbestimmtheit und Komplexität[Lohhausen. On dealing with uncertainty and complexity] (in German). Bern, Switzerland: Hans Huber.
    • Ringelband, O. J.; Misiak, C.; Kluwe, R. H. (1990). "Mental models and strategies in the control of a complex system". In Ackermann, D.; Tauber, M. J. (eds.). Mental models and human-computer interaction. Vol. 1. Amsterdam: Elsevier Science Publishers. pp. 151–164.
  57. e.g., Sternberg, R. J.; Frensch, P. A., eds. (1991). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. ISBN   0-8058-0650-4. OCLC   23254443.
  58. Sokol, S. M.; McCloskey, M. (1991). "Cognitive mechanisms in calculation" . In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 85–116. ISBN   0-8058-0650-4. OCLC   23254443.
  59. Kay, D. S. (1991). "Computer interaction: Debugging the problems" . In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 317–340. ISBN   0-8058-0650-4. OCLC   23254443.
  60. Frensch, P. A.; Sternberg, R. J. (1991). "Skill-related differences in game playing" . In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J .: Lawrence Erlbaum Associates. pp. 343–381. ISBN   0-8058-0650-4. OCLC   23254443.
  61. Amsel, E.; Langer, R.; Loutzenhiser, L. (1991). "Do lawyers reason differently from psychologists? A comparative design for studying expertise". In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 223–250. ISBN   0-8058-0650-4. OCLC   23254443.
  62. Wagner, R. K. (1991). "Managerial problem solving". In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 159–183. PsycNET: 1991-98396-005.
  63. Hegarty, M. (1991). "Knowledge and processes in mechanical problem solving" . In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 253–285. ISBN   0-8058-0650-4. OCLC   23254443.
  64. Heppner, P. P.; Krauskopf, C. J. (1987). "An information-processing approach to personal problem solving". The Counseling Psychologist. 15 (3): 371–447. doi:10.1177/0011000087153001. S2CID   146180007.
  65. Voss, J. F.; Wolfe, C. R.; Lawrence, J. A.; Engle, R. A. (1991). "From representation to decision: An analysis of problem solving in international relations". In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 119–158. ISBN   0-8058-0650-4. OCLC   23254443. PsycNET: 1991-98396-004.
  66. Lesgold, A.; Lajoie, S. (1991). "Complex problem solving in electronics" . In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 287–316. ISBN   0-8058-0650-4. OCLC   23254443.
  67. Altshuller, Genrich (1994). And Suddenly the Inventor Appeared. Translated by Lev Shulyak. Worcester, Mass.: Technical Innovation Center. ISBN   978-0-9640740-1-9.
  68. Stanovich, K. E.; Cunningham, A. E. (1991). "Reading as constrained reasoning" . In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 3–60. ISBN   0-8058-0650-4. OCLC   23254443.
  69. Bryson, M.; Bereiter, C.; Scardamalia, M.; Joram, E. (1991). "Going beyond the problem as given: Problem solving in expert and novice writers". In Sternberg, R. J.; Frensch, P. A. (eds.). Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 61–84. ISBN   0-8058-0650-4. OCLC   23254443.
  70. Sternberg, R. J.; Frensch, P. A., eds. (1991). Complex problem solving: Principles and mechanisms. Hillsdale, NJ: Lawrence Erlbaum Associates. ISBN   0-8058-0650-4. OCLC   23254443.
  71. Hung, Woei (2013). "Team-based complex problem solving: a collective cognition perspective". Educational Technology Research and Development. 61 (3): 365–384. doi:10.1007/s11423-013-9296-3. S2CID   62663840.
  72. Jewett, Pamela; MacPhee, Deborah (2012). "Adding Collaborative Peer Coaching to Our Teaching Identities". The Reading Teacher. 66 (2): 105–110. doi:10.1002/TRTR.01089.
  73. Wang, Qiyun (2009). "Design and Evaluation of a Collaborative Learning Environment". Computers and Education. 53 (4): 1138–1146. doi:10.1016/j.compedu.2009.05.023.
  74. Wang, Qiyan (2010). "Using online shared workspaces to support group collaborative learning". Computers and Education. 55 (3): 1270–1276. doi:10.1016/j.compedu.2010.05.023.
  75. Kai-Wai Chu, Samuel; Kennedy, David M. (2011). "Using Online Collaborative tools for groups to Co-Construct Knowledge". Online Information Review. 35 (4): 581–597. doi:10.1108/14684521111161945. ISSN   1468-4527. S2CID   206388086.
  76. Legare, Cristine; Mills, Candice; Souza, Andre; Plummer, Leigh; Yasskin, Rebecca (2013). "The use of questions as problem-solving strategies during early childhood". Journal of Experimental Child Psychology. 114 (1): 63–7. doi:10.1016/j.jecp.2012.07.002. PMID   23044374.
  77. Engelbart, Douglas (1962). "Team Cooperation". Augmenting Human Intellect: A Conceptual Framework. Vol. AFOSR-3223. Stanford Research Institute.
  78. Flew, Terry (2008). New Media: an introduction. Melbourne: Oxford University Press.
  79. Henry, Jenkins. "Interactive audiences? The 'collective intelligence' of media fans" (PDF). Archived from the original (PDF) on April 26, 2018. Retrieved December 11, 2016.
  80. Finger, Matthias (2008-03-27). "Which governance for sustainable development? An organizational and institutional perspective". In Park, Jacob; Conca, Ken; Finger, Matthias (eds.). The Crisis of Global Environmental Governance: Towards a New Political Economy of Sustainability. Routledge. p.  48. ISBN   978-1-134-05982-9.
  81. Stefanovitch, Nicolas; Alshamsi, Aamena; Cebrian, Manuel; Rahwan, Iyad (30 September 2014). "Error and attack tolerance of collective problem solving: The DARPA Shredder Challenge". EPJ Data Science. 3 (1). doi: 10.1140/epjds/s13688-014-0013-1 . hdl: 21.11116/0000-0002-D39F-D .
