Problem solving

Problem solving consists of using generic or ad hoc methods in an orderly manner to find solutions to difficulties.


Some of the problem-solving techniques developed and used in philosophy, medicine, mathematics, engineering, computer science, artificial intelligence, and society in general are related to mental problem-solving techniques studied in psychology and the cognitive sciences.


The term problem solving has a slightly different meaning depending on the discipline. For instance, it is a mental process in psychology and a computerized process in computer science. There are two different types of problems: ill-defined and well-defined; different approaches are used for each. Well-defined problems have specific end goals and clearly expected solutions, while ill-defined problems do not. Well-defined problems allow for more initial planning than ill-defined problems. [1] Solving problems sometimes involves dealing with pragmatics, the way that context contributes to meaning, and semantics, the interpretation of the problem. The ability to understand the end goal of the problem, and the rules that can be applied, is the key to solving it. Sometimes the problem requires abstract thinking or a creative solution.


Problem solving in psychology refers to the process of finding solutions to problems encountered in life. [2] Solutions to these problems are usually situation- or context-specific. The process starts with problem finding and problem shaping, where the problem is discovered and simplified. The next step is to generate possible solutions and evaluate them. Finally, a solution is selected to be implemented and verified. Problems have an end goal to be reached, and how one gets there depends upon problem orientation (problem-solving coping style and skills) and systematic analysis. [3] Mental health professionals study the human problem-solving process using methods such as introspection, behaviorism, simulation, computer modeling, and experiment. Social psychologists look into the person-environment relationship aspect of the problem and independent and interdependent problem-solving methods. [4] Problem solving has been defined as a higher-order cognitive process and intellectual function that requires the modulation and control of more routine or fundamental skills. [5]

Problem solving has two major domains: mathematical problem solving and personal problem solving. Both are seen in terms of some difficulty or barrier that is encountered. [6] Empirical research shows many different strategies and factors influence everyday problem solving. [7] [8] [9] Rehabilitation psychologists studying individuals with frontal lobe injuries have found that deficits in emotional control and reasoning can be remediated with effective rehabilitation and could improve the capacity of injured persons to resolve everyday problems. [10] Interpersonal everyday problem solving depends upon individual motivational and contextual components. One such component is the emotional valence of "real-world" problems, which can either impede or aid problem-solving performance. Researchers have focused on the role of emotions in problem solving, [11] [12] demonstrating that poor emotional control can disrupt focus on the target task, impede problem resolution, and likely lead to negative outcomes such as fatigue, depression, and inertia. [13] In conceptualization, human problem solving consists of two related processes: problem orientation (the motivational/attitudinal/affective approach to problematic situations) and problem-solving skills. Studies conclude that people's strategies cohere with their goals [14] and stem from the natural process of comparing oneself with others.

Cognitive sciences

Problem-solving research began with the early experimental work of the Gestaltists in Germany (e.g., Karl Duncker in 1935 with his book The psychology of productive thinking [15] ). This experimental work continued through the 1960s and early 1970s with research conducted on relatively simple (but novel for participants) laboratory tasks of problem solving. [16] [17] Simple, novel tasks were used because they had clearly defined optimal solutions and could be solved quickly, which made it possible for researchers to trace participants' steps in the problem-solving process. The researchers' underlying assumption was that simple tasks such as the Tower of Hanoi share the main properties of "real world" problems, and thus that the cognitive processes characterizing participants' attempts to solve simple problems are the same for "real world" problems too; simple problems were used for reasons of convenience and with the expectation that findings would generalize to more complex problems. Perhaps the best-known and most impressive example of this line of research is the work by Allen Newell and Herbert A. Simon. [18] Other experts have shown that the principle of decomposition improves the ability of the problem solver to make good judgments. [19]
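The Tower of Hanoi task mentioned above illustrates why such puzzles appealed to researchers: the optimal solution is clearly defined and short. A minimal recursive solver is sketched below in Python; the peg names and the returned move format are illustrative choices, not part of the original studies.

```python
def hanoi(n, source, target, spare, moves=None):
    """Return the list of moves that transfers n disks from source to target."""
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, source, spare, target, moves)   # clear the n-1 smaller disks
    moves.append((source, target))               # move the largest disk
    hanoi(n - 1, spare, target, source, moves)   # restack the smaller disks
    return moves

moves = hanoi(3, "A", "C", "B")
print(len(moves))  # 7, the minimum 2**n - 1 for n = 3
```

Because the minimum number of moves for n disks is known to be 2**n - 1, participants' deviations from the optimal path are easy to measure, which is exactly the property that made such tasks attractive in the laboratory.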

Computer science

In computer science and in the part of artificial intelligence that deals with algorithms, problem solving includes techniques of algorithms, heuristics and root cause analysis. The amount of resources (e.g. time, memory, energy) required to solve problems is described by computational complexity theory. In more general terms, problem solving is part of a larger process that encompasses problem determination, de-duplication, analysis, diagnosis, repair, and other steps.
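As a small illustration of how computational complexity theory describes resource use, the two functions below compute the same Fibonacci value but consume wildly different amounts of time, measured here by counting function calls. The call-counting dictionary is our own sketch, not a standard benchmark.

```python
from functools import lru_cache

calls = {"naive": 0, "memo": 0}

def fib_naive(n):
    """Exponential time: recomputes the same subproblems over and over."""
    calls["naive"] += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Linear time: each subproblem is computed once and cached."""
    calls["memo"] += 1
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

assert fib_naive(20) == fib_memo(20) == 6765
print(calls)  # {'naive': 21891, 'memo': 21}
```

The answer is identical either way; complexity theory is concerned with the gap between 21891 calls and 21, and how that gap grows with the input.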

Other problem solving tools are linear and nonlinear programming, queuing systems, and simulation. [20]

Much of computer science involves designing completely automatic systems that will later solve some specific problem: systems that accept input data and, in a reasonable amount of time, calculate the correct response or a correct-enough approximation.
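A minimal sketch of such a "correct-enough approximation" is Newton's method for square roots, which stops once the answer is within a chosen tolerance rather than seeking an exact result. The tolerance value below is an arbitrary assumption.

```python
def approx_sqrt(x, tolerance=1e-10):
    """Iteratively refine a guess until it is close enough to sqrt(x)."""
    guess = x if x > 1 else 1.0
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2  # Newton's update step
    return guess

print(approx_sqrt(2))  # ~1.41421356..., accurate to within the tolerance
```

Tightening the tolerance buys accuracy at the cost of more iterations, a direct trade between correctness and time of the kind described above.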

In addition, computer scientists spend a surprisingly large amount of time finding and fixing problems in their programs: debugging.


Formal logic is concerned with such issues as validity, truth, inference, argumentation and proof. In a problem-solving context, it can be used to formally represent a problem as a theorem to be proved, and to represent the knowledge needed to solve the problem as the premises to be used in a proof that the problem has a solution. The use of computers to prove mathematical theorems using formal logic emerged as the field of automated theorem proving in the 1950s. It included the use of heuristic methods designed to simulate human problem solving, as in the Logic Theory Machine, developed by Allen Newell, Herbert A. Simon and J. C. Shaw, as well as algorithmic methods, such as the resolution principle developed by John Alan Robinson.
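The resolution principle can be sketched for the propositional case: to prove a conclusion from premises, add the negated conclusion to the clause set and search for the empty clause, which signals a contradiction. The encoding below (signed integers as literals, frozensets as clauses) is our own illustrative choice, not Robinson's original formulation.

```python
def resolve(c1, c2):
    """Return all resolvents of two clauses (sets of signed-integer literals)."""
    out = []
    for lit in c1:
        if -lit in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {-lit})))
    return out

def refutes(clauses):
    """Saturate under resolution; True if the empty clause is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in list(clauses):
            for b in list(clauses):
                if a != b:
                    for r in resolve(a, b):
                        if not r:
                            return True  # empty clause: contradiction found
                        new.add(r)
        if new <= clauses:
            return False  # saturated with no contradiction
        clauses |= new

# Premises: P and P -> Q (written as the clause ~P v Q); negated goal: ~Q.
P, Q = 1, 2
print(refutes([frozenset({P}), frozenset({-P, Q}), frozenset({-Q})]))  # True
```

The proof that the problem "has a solution" is exactly the derivation of the empty clause from the premises plus the negated goal, mirroring the theorem-proving framing described above.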

In addition to its use for finding proofs of mathematical theorems, automated theorem-proving has also been used for program verification in computer science. As early as 1958, however, John McCarthy had proposed the advice taker, which would represent information in formal logic and derive answers to questions using automated theorem-proving. An important step in this direction was made by Cordell Green in 1969, who used a resolution theorem prover for question-answering and for other applications in artificial intelligence such as robot planning.

The resolution theorem-prover used by Cordell Green bore little resemblance to human problem-solving methods. In response to criticism of his approach from researchers at MIT, Robert Kowalski developed logic programming and SLD resolution, [21] which solves problems by problem decomposition. He has advocated logic for both computer and human problem solving [22] and computational logic to improve human thinking. [23]


Problem solving is used when products or processes fail, so corrective action can be taken to prevent further failures. It can also be applied to a product or process prior to an actual failure event—when a potential problem can be predicted and analyzed, and mitigation applied so the problem never occurs. Techniques such as failure mode and effects analysis can be used to proactively reduce the likelihood of problems occurring.

Forensic engineering is an important technique of failure analysis that involves tracing product defects and flaws. Corrective action can then be taken to prevent further failures.

Reverse engineering attempts to discover the original problem-solving logic used in developing a product by taking it apart. [24]

Military science

In military science, problem solving is linked to the concept of "end-states", the desired condition or situation that strategists wish to generate. [25] :xiii,E-2 The ability to solve problems is important at any military rank, but is highly critical at the command and control level, where it is closely tied to a deep understanding of qualitative and quantitative scenarios. Effectiveness of problem solving is used to measure the result of problem solving, tied to accomplishing the goal. [25] :IV-24 Planning for problem solving is the process of determining how to achieve the goal. [25] :IV-1

Problem-solving strategies

Problem-solving strategies are the steps one uses to find and overcome the obstacles that stand in the way of reaching one's goal. Some refer to this as the "problem-solving cycle". [26]

In this cycle one will acknowledge and recognize the problem, define it, develop a strategy to fix it, organize one's knowledge of the problem, figure out the resources at one's disposal, monitor progress, and evaluate the solution for accuracy. It is called a cycle because once one problem is solved, another usually appears.

Insight is the sudden solution to a long-vexing problem, a sudden recognition of a new idea, or a sudden understanding of a complex situation: an Aha! moment. Solutions found through insight are often more accurate than those found through step-by-step analysis. To solve more problems at a faster rate, insight is necessary for selecting productive moves at different stages of the problem-solving cycle. This problem-solving strategy pertains specifically to problems referred to as insight problems. Unlike Newell and Simon's formal definition of move problems, there has not been a generally agreed-upon definition of an insight problem (Ash, Jee, and Wiley, 2012; [27] Chronicle, MacGregor, and Ormerod, 2004; [28] Chu and MacGregor, 2011). [29]

Blanchard-Fields [30] looks at problem solving from one of two facets. The first concerns problems that have only one solution (such as mathematical problems or fact-based questions), which are grounded in psychometric intelligence. The other is socioemotional in nature, with answers that change constantly (such as one's favorite color or what to get someone for Christmas).

The following techniques are usually called problem-solving strategies. [31]

Problem-solving methods

Common barriers

Common barriers to problem solving are mental constructs that impede our ability to correctly solve problems. These barriers prevent people from solving problems in the most efficient manner possible. Five of the most common processes and factors that researchers have identified as barriers to problem solving are confirmation bias, mental set, functional fixedness, unnecessary constraints, and irrelevant information.

Confirmation bias

Confirmation bias is an unintentional bias caused by the collection and use of data in a way that favors a preconceived notion. The beliefs affected by confirmation bias do not need to have motivation, the desire to defend or find substantiation for beliefs that are important to that person. [32] Research has found that professionals within scientific fields of study also experience confirmation bias. Andreas Hergovich, Reinhard Schott, and Christoph Burger's experiment conducted online, for instance, suggested that professionals within the field of psychological research are likely to view scientific studies that agree with their preconceived notions more favorably than studies that clash with their established beliefs. [33] According to Raymond Nickerson, one can see the consequences of confirmation bias in real-life situations, which range in severity from inefficient government policies to genocide. Nickerson argued that those who killed people accused of witchcraft demonstrated confirmation bias with motivation. Researcher Michael Allen found evidence for confirmation bias with motivation in school children who worked to manipulate their science experiments in such a way that would produce favorable results. [34] However, confirmation bias does not necessarily require motivation. In 1960, Peter Cathcart Wason conducted an experiment in which participants first viewed three numbers and then created a hypothesis that proposed a rule that could have been used to create that triplet of numbers. When testing their hypotheses, participants tended to only create additional triplets of numbers that would confirm their hypotheses, and tended not to create triplets that would negate or disprove their hypotheses. Thus, research also shows that people can and do work to confirm theories or ideas that do not support or engage personally significant beliefs. [35]
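Wason's task, often run with the seed triplet 2-4-6, can be simulated to show why confirmation-only testing fails. Assuming, as in typical versions of the experiment, that the hidden rule is "any strictly ascending triplet" and the participant's hypothesis is "each number increases by 2", every hypothesis-confirming triplet also satisfies the rule, so only disconfirming tests can expose the difference.

```python
def hidden_rule(t):
    a, b, c = t
    return a < b < c  # the experimenter's actual rule: strictly ascending

def hypothesis(t):
    a, b, c = t
    return b - a == 2 and c - b == 2  # participant's guess: "add 2 each time"

confirming = [(1, 3, 5), (10, 12, 14), (0, 2, 4)]  # triplets fitting the guess
disconfirming = [(1, 2, 3), (5, 10, 100)]          # triplets violating it

# Confirming tests can never distinguish the hypothesis from the rule...
assert all(hidden_rule(t) and hypothesis(t) for t in confirming)
# ...while disconfirming tests reveal that the rule is broader than the guess.
assert all(hidden_rule(t) and not hypothesis(t) for t in disconfirming)
print("only disconfirming triplets expose the difference")
```

A participant who tests only confirming triplets receives a "yes" every time and never learns that the hypothesized rule is too narrow, which is precisely the behavior Wason observed.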

Mental set

Mental set was first articulated by Abraham Luchins in the 1940s and demonstrated in his well-known water jug experiments. [36] In these experiments, participants were asked to fill one jug with a specific amount of water using only other jugs (typically three) with different maximum capacities as tools. After Luchins gave his participants a set of water jug problems that could all be solved by employing a single technique, he would then give them a problem that could either be solved using that same technique or a novel and simpler method. Luchins discovered that his participants tended to use the same technique that they had become accustomed to despite the possibility of using a simpler alternative. [37] Thus mental set describes one's inclination to attempt to solve problems in a way that has proved successful in previous experiences. However, as Luchins' work revealed, methods that have worked in the past may not be adequate or optimal for certain new but similar problems. Therefore, it is often necessary for people to move beyond their mental sets in order to find solutions. This was again demonstrated in Norman Maier's 1931 experiment, which challenged participants to solve a problem by using a household object (pliers) in an unconventional manner. Maier observed that participants were often unable to view the object in a way that strayed from its typical use, a phenomenon regarded as a particular form of mental set (more specifically known as functional fixedness, the topic of the following section). When people cling rigidly to their mental sets, they are said to be experiencing fixation, a seeming obsession or preoccupation with attempted strategies that are repeatedly unsuccessful. [38] In the late 1990s, researcher Jennifer Wiley showed that expertise can create a mental set in people considered to be experts in their fields, and she found evidence that the mental set created by expertise could lead to the development of fixation. [38]
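Luchins' jug problems can be solved mechanically by a breadth-first search over jug states, which, unlike a solver with an entrenched mental set, always finds a shortest sequence of pours. The capacities and target below follow the pattern of Luchins' later problems (solvable by the entrenched B - A - 2C recipe or by a simpler two-step method); the state encoding and move names are our own sketch.

```python
from collections import deque

def solve_jugs(capacities, target):
    """Shortest sequence of fill/empty/pour moves leaving `target` in a jug."""
    n = len(capacities)
    start = (0,) * n
    parent = {start: None}          # state -> (previous state, move taken)
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if target in state:          # goal reached: reconstruct the move list
            moves = []
            while parent[state]:
                state, move = parent[state]
                moves.append(move)
            return moves[::-1]
        successors = []
        for i in range(n):
            filled = list(state); filled[i] = capacities[i]
            emptied = list(state); emptied[i] = 0
            successors.append((tuple(filled), f"fill {i}"))
            successors.append((tuple(emptied), f"empty {i}"))
            for k in range(n):
                if k != i:
                    amount = min(state[i], capacities[k] - state[k])
                    poured = list(state)
                    poured[i] -= amount
                    poured[k] += amount
                    successors.append((tuple(poured), f"pour {i}->{k}"))
        for s, move in successors:
            if s not in parent:
                parent[s] = (state, move)
                queue.append(s)
    return None

# Jugs of 23, 49, and 3 units, target 20. The entrenched recipe from the
# earlier problems is B - A - 2C; the search finds the shorter A - C route.
print(solve_jugs((23, 49, 3), 20))  # ['fill 0', 'pour 0->2']
```

A fixated participant applies the familiar four-step recipe; the exhaustive search has no such history and returns the two-move solution, which is exactly the contrast Luchins' experiments were designed to expose.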

Functional fixedness

Functional fixedness is a specific form of mental set and fixation, alluded to earlier in the Maier experiment, and it is another way in which cognitive bias can be seen in daily life. Tim German and Clark Barrett describe this barrier as the fixed design of an object hindering the individual's ability to see it serving other functions. In more technical terms, these researchers explained that "[s]ubjects become 'fixed' on the design function of the objects, and problem solving suffers relative to control conditions in which the object's function is not demonstrated." [39] In other words, functional fixedness is the tendency to see only the primary function of an object, which hinders one's ability to use it for any other purpose. In research examining why young children are immune to functional fixedness, it was stated that "functional fixedness...[is when] subjects are hindered in reaching the solution to a problem by their knowledge of an object's conventional function." [40] Functional fixedness also arises readily in commonplace situations. Imagine a man who sees a bug on the floor that he wants to kill, but the only thing in his hand is a can of air freshener. If he starts looking around the house for something to kill the bug with instead of realizing that the can of air freshener could be used for that purpose, he is experiencing functional fixedness: his knowledge of the can as purely an air freshener hindered his ability to realize that it could serve another purpose, in this instance as an instrument to kill the bug. Functional fixedness can occur on many occasions and is a source of cognitive bias: if people see an object as serving only one primary function, they fail to realize that it can be used in ways other than its intended purpose, which in turn causes many problem-solving difficulties.

Functional fixedness limits people's ability to solve problems accurately by inducing a very narrow way of thinking. It can be seen in other types of learning behavior as well. For instance, research has found functional fixedness in many educational contexts. Researchers Furio, Calatayud, Baracenas, and Padilla stated that "... functional fixedness may be found in learning concepts as well as in solving chemistry problems," [41] an emphasis that has been echoed in chemistry and other subjects.

There are several hypotheses regarding how functional fixedness relates to problem solving. [42] There are also many ways in which thinking of an object as having only one function can cause problems. If a person habitually thinks of something in one way rather than several, this constrains how they think about that object. This can be seen as narrow-minded thinking, in which one is unable to see or accept certain ideas in a particular context, and functional fixedness is closely related to it. It can happen intentionally or unintentionally, but for the most part it enters problem solving unintentionally.

Functional fixedness affects problem solvers in at least two ways. First, it causes people to use more time than necessary to solve a given problem. Second, it often causes them to make more attempts at a solution than they would have made without this cognitive barrier. In the worst case, functional fixedness can completely prevent a person from seeing a solution. It is a commonplace occurrence that affects the lives of many people.

Unnecessary constraints

Unnecessary constraints are another very common barrier that people face while attempting to solve problems. This phenomenon occurs when the solver subconsciously places boundaries on the task at hand, which in turn forces him or her to strain to be innovative in their thinking. The solver hits a barrier when they become fixated on only one way to solve the problem, and it becomes increasingly difficult to see anything but the method they have chosen. Typically, this happens when the solver attempts to use a method that has brought success in the past and cannot help but try to make it work in the present circumstances as well, even when they see that it is counterproductive. [43]

Groupthink, or taking on the mindset of the rest of the group's members, can also act as an unnecessary constraint while trying to solve problems. [44] With everybody thinking the same thing and stopping at the same conclusions, group members inhibit themselves from thinking beyond them. A well-known example of this barrier is the nine-dot problem, in which nine dots are arranged in a three-by-three grid. The solver is asked to draw no more than four lines, without lifting pen or pencil from the paper, that together connect all of the dots. What typically happens is that the solver assumes the dots must be connected without letting the pen or pencil go outside the square of dots. Standardized procedures like this can often bring mentally invented constraints of this kind, [45] and researchers have found a 0% correct solution rate in the time allotted for the task. [46] The imposed constraint prevents the solver from thinking beyond the bounds of the dots. It is from this phenomenon that the expression "think outside the box" is derived. [47]

This problem can be quickly solved with a dawning of realization, or insight. A few minutes of struggling over a problem can bring these sudden insights, in which the solver quickly sees the solution clearly. Problems such as this are most typically solved via insight and can be very difficult for the subject depending on how they have structured the problem in their minds, how they draw on their past experiences, and how much they juggle this information in their working memories. [47] In the case of the nine-dot example, the problem has already been structured incorrectly in the solver's mind because of the constraint placed upon the solution. People also struggle when they compare the problem to their prior knowledge and conclude that they must keep their lines within the dots, because trying to envision the dots connected outside the basic square puts a strain on working memory. [47]

The solution becomes obvious as insight occurs following incremental movements made toward the solution. These tiny movements happen without the solver's awareness. Then, when the insight is fully realized, the "aha" moment happens for the subject. [48] These moments of insight can take a long time to manifest, or arrive quickly, but the way the solution is reached after toiling over these barriers stays the same.

Irrelevant information

Irrelevant information is information presented within a problem that is unrelated or unimportant to the specific problem. [43] Within the specific context of the problem, irrelevant information would serve no purpose in helping solve that particular problem. Often irrelevant information is detrimental to the problem solving process. It is a common barrier that many people have trouble getting through, especially if they are not aware of it. Irrelevant information makes solving otherwise relatively simple problems much harder. [49]

For example: "Fifteen percent of the people in Topeka have unlisted telephone numbers. You select 200 names at random from the Topeka phone book. How many of these people have unlisted phone numbers?" [50]

People with unlisted numbers would not be listed in the phone book, so none of the 200 selected names can have an unlisted number: the answer is zero. People looking at this task naturally want to use the 15% figure given in the problem; they see that information is present and immediately think it must be used, which is not true. Questions like these are often used to test students taking aptitude tests or cognitive evaluations. [51] They are not meant to be difficult, but they require thinking that is not necessarily common. Irrelevant information is commonly included in math problems, word problems in particular, where numerical information is inserted for the purpose of challenging the individual.

One reason irrelevant information is so effective at keeping a person off topic and away from the relevant information is how it is represented. [51] The way information is represented can make a vast difference in how difficult a problem is to overcome. Whether a problem is represented visually, verbally, spatially, or mathematically, irrelevant information can have a profound effect on how long a problem takes to solve, or whether it can be solved at all. The Buddhist monk problem is a classic example of irrelevant information and how it can be represented in different ways:

A Buddhist monk begins at dawn one day walking up a mountain, reaches the top at sunset, meditates at the top for several days until one dawn when he begins to walk back to the foot of the mountain, which he reaches at sunset. Making no assumptions about his starting or stopping or about his pace during the trips, prove that there is a place on the path which he occupies at the same hour of the day on the two separate journeys.

This problem is nearly impossible to solve in this form because of how the information is represented. Because it is written out verbally, it leads us to try to create a mental image of the paragraph, which is very difficult given all the irrelevant information in the question. If the same problem were accompanied by a corresponding graph of the two journeys, it would be far easier to answer; irrelevant information would no longer serve as a roadblock. Representing the problem visually removes the difficult words and scenarios to imagine, and with them much of the difficulty of solving it.
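The graphical insight (both journeys plotted against the same time-of-day axis must cross) can also be checked numerically: the gap between the ascending and descending positions changes sign between dawn and sunset, so some time of day has equal positions regardless of pace. The pace profiles below are arbitrary illustrations; any monotone profiles would do.

```python
def ascent(t):
    """Position on the path at time-of-day t in [0, 1]; 0 = foot, 1 = summit."""
    return t ** 2            # slow start; any monotone profile works

def descent(t):
    """Position on the descent day at the same time-of-day t."""
    return 1 - t ** 0.5      # starts at the summit, ends at the foot

# ascent(0) - descent(0) < 0 and ascent(1) - descent(1) > 0, so by the
# intermediate value theorem the two position curves cross. Bisection finds it:
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if ascent(mid) < descent(mid):
        lo = mid
    else:
        hi = mid
t = (lo + hi) / 2
print(abs(ascent(t) - descent(t)) < 1e-9)  # True: same spot at the same hour
```

The monk's actual pace, rest stops, and days of meditation never enter the argument, which is exactly what marks them as irrelevant information.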

These types of representations are often used to make difficult problems easier. [52] They can be used on tests as a strategy to remove irrelevant information, one of the most common barriers in problem solving. [43] Identifying the crucial information presented in a problem and correctly judging its usefulness is essential. Being aware of irrelevant information is the first step in overcoming this common barrier.

Other barriers for individuals

Individuals engaged in problem solving tend to overlook subtractive changes, even those that are critical elements of efficient solutions. This tendency to solve problems by creating or adding elements first, only, or mostly, rather than by subtracting elements or processes, has been shown to intensify under higher cognitive load such as information overload. [53] [54]

Dreaming: problem-solving without waking consciousness

Problem solving can also occur without waking consciousness. There are many reports of scientists and engineers who solved problems in their dreams. Elias Howe, inventor of the sewing machine, figured out the structure of the bobbin from a dream. [55]

The chemist August Kekulé was considering how benzene arranged its six carbon and hydrogen atoms. Thinking about the problem, he dozed off, and dreamt of dancing atoms that fell into a snakelike pattern, which led him to discover the benzene ring. As Kekulé wrote in his diary,

One of the snakes seized hold of its own tail, and the form whirled mockingly before my eyes. As if by a flash of lightning I awoke; and this time also I spent the rest of the night in working out the consequences of the hypothesis. [56]

There also are empirical studies of how people can think consciously about a problem before going to sleep, and then solve the problem with a dream image. Dream researcher William C. Dement told his undergraduate class of 500 students that he wanted them to think about an infinite series, whose first elements were OTTFF, to see if they could deduce the principle behind it and to say what the next elements of the series would be. [57] He asked them to think about this problem every night for 15 minutes before going to sleep and to write down any dreams that they then had. They were instructed to think about the problem again for 15 minutes when they awakened in the morning.

The sequence OTTFF is the first letters of the numbers: one, two, three, four, five. The next five elements of the series are SSENT (six, seven, eight, nine, ten). Some of the students solved the puzzle by reflecting on their dreams. One example was a student who reported the following dream: [57]
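The hidden rule can be made explicit by generating the series directly from the English number words:

```python
words = ["one", "two", "three", "four", "five",
         "six", "seven", "eight", "nine", "ten"]
initials = "".join(w[0].upper() for w in words)
print(initials[:5])   # OTTFF, the series as given to the students
print(initials[5:10]) # SSENT, the next five elements
```

Seen this way the puzzle is trivial; the difficulty lies entirely in noticing that the letters are initials at all, which is the representational shift some students' dreams supplied.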

I was standing in an art gallery, looking at the paintings on the wall. As I walked down the hall, I began to count the paintings: one, two, three, four, five. As I came to the sixth and seventh, the paintings had been ripped from their frames. I stared at the empty frames with a peculiar feeling that some mystery was about to be solved. Suddenly I realized that the sixth and seventh spaces were the solution to the problem!

With more than 500 undergraduate students, 87 dreams were judged to be related to the problems students were assigned (53 directly related and 34 indirectly related). Yet of the people who had dreams that apparently solved the problem, only seven were actually able to consciously know the solution. The rest (46 out of 53) thought they did not know the solution.

Mark Blechner conducted this experiment and obtained results similar to Dement's. [58] He found that while trying to solve the problem, people had dreams in which the solution appeared to be obvious from the dream, but it was rare for the dreamers to realize how their dreams had solved the puzzle. Coaxing or hints did not get them to realize it, although once they heard the solution, they recognized how their dream had solved it. For example, one person in that OTTFF experiment dreamed: [58]

There is a big clock. You can see the movement. The big hand of the clock was on the number six. You could see it move up, number by number, six, seven, eight, nine, ten, eleven, twelve. The dream focused on the small parts of the machinery. You could see the gears inside.

In the dream, the person counted out the next elements of the series (six, seven, eight, nine, ten, eleven, twelve), yet he did not realize that this was the solution of the problem. His sleeping mindbrain solved the problem, but his waking mindbrain was not aware how.

Albert Einstein believed that much problem solving goes on unconsciously, and the person must then figure out and formulate consciously what the mindbrain has already solved. He believed this was his process in formulating the theory of relativity: "The creator of the problem possesses the solution." [59] Einstein said that he did his problem-solving without words, mostly in images. "The words or the language, as they are written or spoken, do not seem to play any role in my mechanism of thought. The psychical entities which seem to serve as elements in thought are certain signs and more or less clear images which can be 'voluntarily' reproduced and combined." [60]

Cognitive sciences: two schools

In cognitive sciences, researchers' realization that problem-solving processes differ across knowledge domains and across levels of expertise (e.g. Sternberg, 1995) and that, consequently, findings obtained in the laboratory cannot necessarily generalize to problem-solving situations outside the laboratory, has led to an emphasis on real-world problem solving since the 1990s. This emphasis has been expressed quite differently in North America and Europe, however. Whereas North American research has typically concentrated on studying problem solving in separate, natural knowledge domains, much of the European research has focused on novel, complex problems, and has been performed with computerized scenarios (see Funke, 1991, for an overview).


In Europe, two main approaches have surfaced, one initiated by Donald Broadbent (1977; see Berry & Broadbent, 1995) in the United Kingdom and the other one by Dietrich Dörner (1975, 1985; see Dörner & Wearing, 1995) in Germany. The two approaches share an emphasis on relatively complex, semantically rich, computerized laboratory tasks, constructed to resemble real-life problems. The approaches differ somewhat in their theoretical goals and methodology, however. The tradition initiated by Broadbent emphasizes the distinction between cognitive problem-solving processes that operate under awareness versus outside of awareness, and typically employs mathematically well-defined computerized systems. The tradition initiated by Dörner, on the other hand, has an interest in the interplay of the cognitive, motivational, and social components of problem solving, and utilizes very complex computerized scenarios that contain up to 2,000 highly interconnected variables (e.g., Dörner, Kreuzig, Reither & Stäudel's 1983 LOHHAUSEN project; Ringelband, Misiak & Kluwe, 1990). Buchner (1995) describes the two traditions in detail.

North America

In North America, following Herbert A. Simon's work on "learning by doing" in semantically rich domains, [61] [62] researchers began to investigate problem solving separately in different natural knowledge domains, such as physics, writing, or chess playing, thus relinquishing their attempts to extract a global theory of problem solving (e.g. Sternberg & Frensch, 1991). Instead, these researchers have frequently focused on the development of problem solving within a certain domain, that is, on the development of expertise (Chase & Simon, 1973; Chi, Feltovich & Glaser, 1981). [63]

Areas that have attracted rather intensive attention in North America include:

Characteristics of complex problems

Complex problem solving (CPS) is distinguishable from simple problem solving (SPS). In SPS a single, straightforward obstacle stands in the way; in CPS, multiple obstacles may have to be overcome at the same time. As a real-life example, a surgeon at work faces far more complex problems than an individual deciding what shoes to wear. As elucidated by Dietrich Dörner, and later expanded upon by Joachim Funke, complex problems have some typical characteristics as follows: [67]

Collective problem solving

Problem solving is applied on many different levels, from the individual to the civilizational. Collective problem solving refers to problem solving performed collectively.

Social issues and global issues can typically only be solved collectively.

It has been noted that the complexity of contemporary problems has exceeded the cognitive capacity of any individual and requires different but complementary expertise and collective problem solving ability. [69]

Collective intelligence is shared or group intelligence that emerges from the collaboration, collective efforts, and competition of many individuals.

Collaborative problem solving is about people working together, face-to-face or in online workspaces, with a focus on solving real-world problems. These groups are made up of members who share a common concern, a similar passion, and/or a commitment to their work. Members are willing to ask questions, wonder, and try to understand common issues. They share expertise, experiences, tools, and methods. [70] Such groups may be assigned by instructors or may be student-regulated based on individual student needs. The groups, or group members, may be fluid based on need, may come together only temporarily to finish an assigned task, or may be more permanent in nature depending on the needs of the learners. All members of the group must have some input into the decision-making process and a role in the learning process. Group members are responsible for the thinking, teaching, and monitoring of all members in the group. Group work must be coordinated so that each member makes an equal contribution to the whole, and members must identify and build on their individual strengths so that everyone can contribute significantly to the task. [71]

Collaborative groups require joint intellectual effort and social interaction to solve problems together. The knowledge shared during these interactions is acquired through communication, negotiation, and the production of materials. [72] Members actively seek information from others by asking questions; the capacity to use questions to acquire new information increases understanding and the ability to solve problems. [73] Collaborative group work can promote critical thinking, problem-solving skills, social skills, and self-esteem. Through collaboration and communication, members often learn from one another and construct meaningful knowledge, which often leads to better learning outcomes than individual work. [74]

In a 1962 research report, Douglas Engelbart linked collective intelligence to organizational effectiveness, and predicted that pro-actively 'augmenting human intellect' would yield a multiplier effect in group problem solving: "Three people working together in this augmented mode [would] seem to be more than three times as effective in solving a complex problem as is one augmented person working alone". [75]

Henry Jenkins, a key theorist of new media and media convergence, draws on the theory that collective intelligence can be attributed to media convergence and participatory culture. [76] He criticizes contemporary education for failing to incorporate online trends of collective problem solving into the classroom, stating "whereas a collective intelligence community encourages ownership of work as a group, schools grade individuals". Jenkins argues that interaction within a knowledge community builds vital skills for young people, and that teamwork through collective intelligence communities contributes to the development of such skills. [77]

Collective impact is the commitment of a group of actors from different sectors to a common agenda for solving a specific social problem, using a structured form of collaboration.

After World War II, the UN, the Bretton Woods organizations, and the WTO were created, and from the 1980s onward collective problem solving at the international level crystallized around these three types of organizations. As these global institutions remain state-like or state-centric, it has been called unsurprising that they continue to pursue state-like or state-centric approaches to collective problem solving rather than alternative ones. [78]

Crowdsourcing is a process of accumulating ideas, thoughts, or information from many independent participants, with the aim of finding the best solution for a given challenge. Modern information technologies allow massive numbers of participants to be involved, along with systems for managing these suggestions so that they yield good results. [79] The Internet has created a new capacity for collective, even planetary-scale, problem solving. [80]
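The crowdsourcing process described above (collect candidate solutions from independent participants, manage the suggestions, select the best) can be sketched minimally. This is an illustrative toy, not any real platform's API; the function name and the one-vote-per-participant rule are assumptions made for the example:

```python
from collections import Counter

def aggregate_crowd_solutions(submissions, votes):
    """Pick the best-supported candidate solution from a crowd.

    submissions: list of candidate solution strings from participants.
    votes: list of (voter_id, candidate) pairs.
    Each voter counts once; votes for unknown candidates are discarded.
    Returns the candidate backed by the most distinct voters.
    """
    seen_voters = set()
    tally = Counter({s: 0 for s in submissions})
    for voter, candidate in votes:
        if voter in seen_voters or candidate not in tally:
            continue  # enforce independence: one vote per voter, known candidates only
        seen_voters.add(voter)
        tally[candidate] += 1
    # most_common(1) returns the highest-count candidate (ties broken by insertion order)
    return tally.most_common(1)[0][0]
```

Real crowdsourcing systems add deduplication of near-identical ideas, weighting by contributor track record, and iterative refinement rounds; the core step of aggregating many independent judgments into one choice is the same.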

References


  1. Schacter, D.L. et al. (2009). Psychology, Second Edition. New York: Worth Publishers. pp. 376
  2. Jerrold R. Brandell (1997). Theory and Practice in Clinical Social Work. Simon and Schuster. p. 189. ISBN   978-0-684-82765-0.
  3. What is a problem? in S. Ian Robertson, Problem solving, Psychology Press, 2001.
  4. Rubin, M.; Watt, S. E.; Ramelli, M. (2012). "Immigrants' social integration as a function of approach-avoidance orientation and problem-solving style". International Journal of Intercultural Relations. 36 (4): 498–505. doi:10.1016/j.ijintrel.2011.12.009. hdl: 1959.13/931119 .
  5. Goldstein F. C., & Levin H. S. (1987). Disorders of reasoning and problem-solving ability. In M. Meier, A. Benton, & L. Diller (Eds.), Neuropsychological rehabilitation. London: Taylor & Francis Group.
  6. Bernd Zimmermann, On mathematical problem solving processes and history of mathematics, University of Jena.
  7. Vallacher, Robin; M. Wegner, Daniel (2012). Action Identification Theory. Handbook of Theories of Social Psychology. pp. 327–348. doi:10.4135/9781446249215.n17. ISBN   9780857029607.
  8. Margrett, J. A; Marsiske, M (2002). "Gender differences in older adults' everyday cognitive collaboration". International Journal of Behavioral Development. 26 (1): 45–59. doi:10.1080/01650250143000319. PMC   2909137 . PMID   20657668.
  9. Antonucci, T. C; Ajrouch, K. J; Birditt, K. S (2013). "The Convoy Model: Explaining Social Relations From a Multidisciplinary Perspective". The Gerontologist. 54 (1): 82–92. doi:10.1093/geront/gnt118. PMC   3894851 . PMID   24142914.
  10. Rath, Joseph F.; Simon, Dvorah; Langenbahn, Donna M.; Sherr, Rose Lynn; Diller, Leonard (September 2003). "Group treatment of problem‐solving deficits in outpatients with traumatic brain injury: A randomised outcome study". Neuropsychological Rehabilitation. 13 (4): 461–488. doi:10.1080/09602010343000039. S2CID   143165070.
  11. D'Zurilla, T. J.; Goldfried, M. R. (1971). "Problem solving and behavior modification". Journal of Abnormal Psychology. 78 (1): 107–126. doi:10.1037/h0031360. PMID 4938262.
  12. D'Zurilla, T. J., & Nezu, A. M. (1982). Social problem solving in adults. In P. C. Kendall (Ed.), Advances in cognitive-behavioral research and therapy (Vol. 1, pp. 201–274). New York: Academic Press.
  13. RATH, J (August 2004). "The construct of problem solving in higher level neuropsychological assessment and rehabilitation*1". Archives of Clinical Neuropsychology. 19 (5): 613–635. doi: 10.1016/j.acn.2003.08.006 . PMID   15271407.
  14. Hoppmann, Christiane A.; Blanchard-Fields, Fredda (November 2010). "Goals and everyday problem solving: Manipulating goal preferences in young and older adults". Developmental Psychology. 46 (6): 1433–1443. doi:10.1037/a0020676. PMID   20873926.
  15. Duncker, K. (1935). Zur Psychologie des produktiven Denkens [The psychology of productive thinking]. Berlin: Julius Springer.
  16. For example Duncker's "X-ray" problem; Ewert & Lambert's "disk" problem in 1932, later known as Tower of Hanoi.
  17. Mayer, R. E. (1992). Thinking, problem solving, cognition. Second edition. New York: W. H. Freeman and Company.
  18. Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  19. J. Scott Armstrong, William B. Denniston Jr. and Matt M. Gordon (1975). "The Use of the Decomposition Principle in Making Judgments" (PDF). Organizational Behavior and Human Performance. 14 (2): 257–263. doi:10.1016/0030-5073(75)90028-8. Archived from the original (PDF) on 2010-06-20.
  20. Malakooti, Behnam (2013). Operations and Production Systems with Multiple Objectives. John Wiley & Sons. ISBN   978-1-118-58537-5.
  21. Kowalski, R. Predicate Logic as a Programming Language Memo 70, Department of Artificial Intelligence, Edinburgh University. 1973. Also in Proceedings IFIP Congress, Stockholm, North Holland Publishing Co., 1974, pp. 569–574.
  22. Kowalski, R., Logic for Problem Solving, North Holland, Elsevier, 1979
  23. Kowalski, R., Computational Logic and Human Thinking: How to be Artificially Intelligent, Cambridge University Press, 2011.
  24. "Einstein's Secret to Amazing Problem Solving (and 10 Specific Ways You Can Use It) - Litemind". 2008-11-04. Retrieved 2017-06-11.
  25. "Commander's Handbook for Strategic Communication and Communication Strategy" (PDF). United States Joint Forces Command, Joint Warfighting Center, Suffolk, VA. 24 June 2010. Archived from the original (PDF) on April 29, 2011. Retrieved 10 October 2016.
  26. Bransford, J. D.; Stein, B. S (1993). The ideal problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman.
  27. Ash, Ivan K.; Jee, Benjamin D.; Wiley, Jennifer (2012-05-11). "Investigating Insight as Sudden Learning". The Journal of Problem Solving. 4 (2). doi: 10.7771/1932-6246.1123 . ISSN   1932-6246.
  28. Chronicle, Edward P.; MacGregor, James N.; Ormerod, Thomas C. (2004). "What Makes an Insight Problem? The Roles of Heuristics, Goal Conception, and Solution Recoding in Knowledge-Lean Problems". Journal of Experimental Psychology: Learning, Memory, and Cognition. 30 (1): 14–27. doi:10.1037/0278-7393.30.1.14. ISSN   1939-1285. PMID   14736293. S2CID   15631498.
  29. Chu, Yun; MacGregor, James N. (2011-02-07). "Human Performance on Insight Problem Solving: A Review". The Journal of Problem Solving. 3 (2). doi: 10.7771/1932-6246.1094 . ISSN   1932-6246.
  30. Blanchard-Fields, F. (2007). "Everyday problem solving and emotion: An adult developmental perspective". Current Directions in Psychological Science. 16 (1): 26–31. doi:10.1111/j.1467-8721.2007.00469.x. S2CID   145645352.
  31. Wang, Y., & Chiew, V. (2010). On the cognitive process of human problem solving. Cognitive Systems Research, 11(1), 81-92.
  32. Nickerson, R. S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2 (2): 176. doi:10.1037/1089-2680.2.2.175. S2CID   8508954.
  33. Hergovich, Schott; Burger (2010). "Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology". Current Psychology. 29 (3): 188–209. doi:10.1007/s12144-010-9087-5. S2CID   145497196.
  34. Allen (2011). "Theory-led confirmation bias and experimental persona". Research in Science & Technological Education. 29 (1): 107–127. Bibcode:2011RSTEd..29..107A. doi:10.1080/02635143.2010.539973. S2CID   145706148.
  35. Wason, P. C. (1960). "On the failure to eliminate hypotheses in a conceptual task". Quarterly Journal of Experimental Psychology. 12 (3): 129–140. doi:10.1080/17470216008416717. S2CID   19237642.
  36. Luchins, A. S. (1942). Mechanization in problem solving: The effect of Einstellung. Psychological Monographs, 54 (Whole No. 248).
  37. Öllinger, Jones, & Knoblich (2008). Investigating the effect of mental set on insight problem solving. Experimental Psychology',' 55(4), 269–270.
  38. Wiley, J (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition. 24 (4): 716–730. doi:10.3758/bf03211392. PMID 9701964.
  39. German, Tim P.; Barrett, H. Clark (2005). "Functional fixedness in a technologically sparse culture". Psychological Science. 16 (1): 1–5.
  40. German, Tim P.; Defeyter, Margaret A. (2000). "Immunity to functional fixedness in young children". Psychonomic Bulletin and Review. 7 (4): 707–712. doi: 10.3758/BF03213010 . PMID   11206213.
  41. Furio, C.; Calatayud, M. L.; Baracenas, S; Padilla, O (2000). "Functional fixedness and functional reduction as common sense reasonings in chemical equilibrium and in geometry and polarity of molecules. Valencia, Spain". Science Education. 84 (5): 545–565. doi:10.1002/1098-237X(200009)84:5<545::AID-SCE1>3.0.CO;2-1.
  42. Adamson, Robert E. (1952). "Functional fixedness as related to problem solving: A repetition of three experiments". Journal of Experimental Psychology. 44 (4). doi:10.1037/h0062487.
  43. Kellogg, R. T. (2003). Cognitive psychology (2nd ed.). California: Sage Publications, Inc.
  44. Cottam, Martha L., Dietz-Uhler, Beth, Mastors, Elena, & Preston, & Thomas. (2010). Introduction to Political Psychology (2nd ed.). New York: Psychology Press.
  45. Meloy, J. R. (1998). The Psychology of Stalking, Clinical and Forensic Perspectives (2nd ed.). London, England: Academic Press.
  46. MacGregor, J.N.; Ormerod, T.C.; Chronicle, E.P. (2001). "Information-processing and insight: A process model of performance on the nine-dot and related problems". Journal of Experimental Psychology: Learning, Memory, and Cognition. 27 (1): 176–201. doi:10.1037/0278-7393.27.1.176. PMID   11204097.
  47. Weiten, Wayne. (2011). Psychology: themes and variations (8th ed.). California: Wadsworth.
  48. Novick, L. R., & Bassok, M. (2005). Problem solving. In K. J. Holyoak & R. G. Morrison (Eds.), Cambridge handbook of thinking and reasoning (Ch. 14, pp. 321-349). New York, NY: Cambridge University Press.
  49. Walinga, Jennifer (2010). "From walls to windows: Using barriers as pathways to insightful solutions". The Journal of Creative Behavior. 44 (3): 143–167. doi:10.1002/j.2162-6057.2010.tb01331.x.
  50. Weiten, Wayne. (2011). Psychology: themes and variations (8th ed.) California: Wadsworth.
  51. Walinga, Jennifer, Cunningham, J. Barton, & MacGregor, James N. (2011). Training insight problem solving through focus on barriers and assumptions. The Journal of Creative Behavior.
  52. Vlamings, Petra H. J. M.; Hare, Brian; Call, Joseph (2009). "Reaching around barriers: The performance of great apes and 3-5-year-old children". Animal Cognition. 13 (2): 273–285. doi:10.1007/s10071-009-0265-5. PMC   2822225 . PMID   19653018.
  53. "People add by default even when subtraction makes more sense". Science News. 7 April 2021. Retrieved 10 May 2021.
  54. Adams, Gabrielle S.; Converse, Benjamin A.; Hales, Andrew H.; Klotz, Leidy E. (April 2021). "People systematically overlook subtractive changes". Nature. 592 (7853): 258–261. Bibcode:2021Natur.592..258A. doi:10.1038/s41586-021-03380-y. ISSN   1476-4687. PMID   33828317 . Retrieved 10 May 2021.
  55. Kaempffert, W. (1924) A Popular History of American Invention. New York: Scribners.
  56. Kekulé, A (1890). "Benzolfest-Rede". Berichte der Deutschen Chemischen Gesellschaft. 23: 1302–1311. Trans. Benfey, O. (1958). "Kekulé and the birth of the structural theory of organic chemistry in 1858". Journal of Chemical Education. 35: 21–23. doi:10.1021/ed035p21.
  57. Dement, W.C. (1972). Some Must Watch While Some Just Sleep. New York: Freeman.
  58. Blechner, M. J. (2018). The Mindbrain and Dreams: An Exploration of Dreaming, Thinking, and Artistic Creation. New York: Routledge.
  59. Fromm, Erika O (1998). "Lost and found half a century later: Letters by Freud and Einstein". American Psychologist. 53 (11): 1195–1198. doi:10.1037/0003-066x.53.11.1195.
  60. Einstein, A. (1994) Ideas and Opinions. New York: Modern Library.
  61. Anzai, Y.; Simon, H. A. (1979). "The theory of learning by doing". Psychological Review. 86 (2): 124–140. doi:10.1037/0033-295X.86.2.124. PMID 493441.
  62. Bhaskar, R., & Simon, H. A. (1977). Problem solving in semantically rich domains: An example from engineering thermodynamics. Cognitive Science, 1, 193-215.
  63. Anderson, J. R.; Boyle, C. B.; Reiser, B. J. (1985). "Intelligent tutoring systems" (PDF). Science. 228 (4698): 456–462. Bibcode:1985Sci...228..456A. doi:10.1126/science.228.4698.456. PMID   17746875. S2CID   62403455.
  64. Wagner, R. K. (1991). Managerial problem solving. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 159-183). Hillsdale, NJ: Lawrence Erlbaum Associates.
  65. Amsel, E., Langer, R., & Loutzenhiser, L. (1991). Do lawyers reason differently from psychologists? A comparative design for studying expertise. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 223-250). Hillsdale, NJ: Lawrence Erlbaum Associates. ISBN   978-0-8058-1783-6
  66. Altshuller, Genrich (1994). And Suddenly the Inventor Appeared. Translated by Lev Shulyak. Worcester, MA: Technical Innovation Center. ISBN   978-0-9640740-1-9.
  67. Frensch, Peter A.; Funke, Joachim, eds. (2014-04-04). Complex Problem Solving. doi:10.4324/9781315806723. ISBN   9781315806723.
  68. Sternberg, Robert J.; Frensch, Peter A., eds. (1991). Complex Problem Solving: Principles and Mechanisms. Hillsdale, NJ: L. Erlbaum Associates. ISBN 0-8058-0650-4. OCLC 23254443.
  69. Hung, Woei (24 April 2013). "Team-based complex problem solving: a collective cognition perspective". Educational Technology Research and Development. 61 (3): 365–384. doi:10.1007/s11423-013-9296-3. S2CID   62663840.
  70. Jewett, Pamela; Deborah MacPhee (October 2012). "Adding Collaborative Peer Coaching to Our Teaching Identities". The Reading Teacher. 66 (2): 105–110. doi:10.1002/TRTR.01089.
  71. Wang, Qiyun (2009). "Design and Evaluation of a Collaborative Learning Environment". Computers and Education. 53 (4): 1138–1146. doi:10.1016/j.compedu.2009.05.023.
  72. Kai-Wai Chu, Samual; David Kennedy (2011). "Using Online Collaborative tools for groups to Co-Construct Knowledge". Online Information Review. 35 (4): 581–597. doi:10.1108/14684521111161945.
  73. Legare, Cristine; Candice Mills; Andre Souza; Leigh Plummer; Rebecca Yasskin (2013). "The use of questions as problem-solving strategies during early childhood". Journal of Experimental Child Psychology. 114 (1): 63–7. doi:10.1016/j.jecp.2012.07.002. PMID   23044374.
  74. Wang, Qiyan (2010). "Using online shared workspaces to support group collaborative learning". Computers and Education. 55 (3): 1270–1276. doi:10.1016/j.compedu.2010.05.023.
  75. Engelbart, Douglas (1962) Augmenting Human Intellect: A Conceptual Framework - section on Team Cooperation
  76. Flew, Terry (2008). New Media: an introduction. Melbourne: Oxford University Press.
  77. Jenkins, Henry. "Interactive Audiences? The 'Collective Intelligence' of Media Fans" (PDF). Archived from the original (PDF) on April 26, 2018. Retrieved December 11, 2016.
  78. Park, Jacob; Conca, Ken; Finger, Matthias (2008-03-27). The Crisis of Global Environmental Governance: Towards a New Political Economy of Sustainability. Routledge. ISBN 9781134059829. Retrieved 29 January 2017.
  79. Guazzini, Andrea; Vilone, Daniele; Donati, Camillo; Nardi, Annalisa; Levnajić, Zoran (10 November 2015). "Modeling crowdsourcing as collective problem solving". Scientific Reports. 5: 16557. arXiv: 1506.09155 . Bibcode:2015NatSR...516557G. doi:10.1038/srep16557. PMC   4639727 . PMID   26552943.
  80. Stefanovitch, Nicolas; Alshamsi, Aamena; Cebrian, Manuel; Rahwan, Iyad (30 September 2014). "Error and attack tolerance of collective problem solving: The DARPA Shredder Challenge". EPJ Data Science. 3 (1). doi: 10.1140/epjds/s13688-014-0013-1 .
