Problem solving

From HandWiki

Problem solving is the process of achieving a goal by overcoming obstacles, a frequent part of most activities. Problems in need of solutions range from simple personal tasks (e.g. how to turn on an appliance) to complex issues in business and technical fields. The former is an example of simple problem solving (SPS) addressing one issue, whereas the latter is complex problem solving (CPS) with multiple interrelated obstacles.[1] Another classification of problem-solving tasks is into well-defined problems with specific obstacles and goals, and ill-defined problems in which the current situation is troublesome but it is not clear what kind of resolution to aim for.[2] Similarly, one may distinguish formal or fact-based problems requiring psychometric intelligence, versus socio-emotional problems which depend on the changeable emotions of individuals or groups, such as tactful behavior, fashion, or gift choices.[3]

Solutions require sufficient resources and knowledge to attain the goal. Professionals such as lawyers, doctors, programmers, and consultants are largely problem solvers for issues that require technical skills and knowledge beyond general competence. Many businesses have found profitable markets by recognizing a problem and creating a solution: the more widespread and inconvenient the problem, the greater the opportunity to develop a scalable solution.

There are many specialized problem-solving techniques and methods in fields such as engineering, business, medicine, mathematics, computer science, philosophy, and social organization. The mental techniques to identify, analyze, and solve problems are studied in psychology and cognitive sciences. Also widely researched are the mental obstacles that prevent people from finding solutions; problem-solving impediments include confirmation bias, mental set, and functional fixedness.

Definition

The term problem solving has a slightly different meaning depending on the discipline. For instance, it is a mental process in psychology and a computerized process in computer science. There are two different types of problems: ill-defined and well-defined; different approaches are used for each. Well-defined problems have specific end goals and clearly expected solutions, while ill-defined problems do not. Well-defined problems allow for more initial planning than ill-defined problems.[2] Solving problems sometimes involves dealing with pragmatics (the way that context contributes to meaning) and semantics (the interpretation of the problem). The ability to understand what the end goal of the problem is, and what rules could be applied, represents the key to solving the problem. Sometimes a problem requires abstract thinking or coming up with a creative solution.

Problem solving has two major domains: mathematical problem solving and personal problem solving. Each concerns some difficulty or barrier that is encountered.[4]

Psychology

Problem solving in psychology refers to the process of finding solutions to problems encountered in life.[5] Solutions to these problems are usually situation- or context-specific. The process starts with problem finding and problem shaping, in which the problem is discovered and simplified. The next step is to generate possible solutions and evaluate them. Finally, a solution is selected to be implemented and verified. Problems have an end goal to be reached; how it is reached depends upon problem orientation (problem-solving coping style and skills) and systematic analysis.[6]

Mental health professionals study human problem-solving processes using methods such as introspection, behaviorism, simulation, computer modeling, and experiment. Social psychologists examine the person-environment relationship aspect of the problem and independent and interdependent problem-solving methods.[7] Problem solving has been defined as a higher-order cognitive process and intellectual function that requires the modulation and control of more routine or fundamental skills.[8]

Empirical research shows many different strategies and factors influence everyday problem solving.[9] Rehabilitation psychologists studying people with frontal lobe injuries have found that deficits in emotional control and reasoning can be remediated with effective rehabilitation, improving the capacity of injured persons to resolve everyday problems.[10] Interpersonal everyday problem solving depends upon personal motivational and contextual components. One such component is the emotional valence of "real-world" problems, which can either impede or aid problem-solving performance. Researchers have focused on the role of emotions in problem solving,[11] demonstrating that poor emotional control can disrupt focus on the target task, impede problem resolution, and lead to negative outcomes such as fatigue, depression, and inertia.[12] In one conceptualization, human problem solving consists of two related processes: problem orientation (the motivational, attitudinal, and affective approach to problematic situations) and problem-solving skills. People's strategies cohere with their goals[13] and stem from the process of comparing oneself with others.

Cognitive sciences

Among the first experimental psychologists to study problem solving were the Gestaltists in Germany, such as Karl Duncker in The Psychology of Productive Thinking (1935).[14] Perhaps best known is the work of Allen Newell and Herbert A. Simon.[15]

Experiments in the 1960s and early 1970s asked participants to solve relatively simple, well-defined, but not previously seen laboratory tasks.[16][17] These simple problems, such as the Tower of Hanoi, admitted optimal solutions that could be found quickly, allowing researchers to observe the full problem-solving process. Researchers assumed that these model problems would elicit the characteristic cognitive processes by which more complex "real world" problems are solved.
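The Tower of Hanoi mentioned above has a well-known recursive solution; a minimal sketch (the function and jug names are illustrative, not drawn from the studies cited):

```python
def hanoi(n, source, target, spare, moves):
    """Move n disks from source to target, recording each move."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)   # clear the way
    moves.append((source, target))               # move the largest disk
    hanoi(n - 1, spare, target, source, moves)   # restack on top of it

moves = []
hanoi(3, "A", "C", "B", moves)
# A 3-disk puzzle is solved in the optimal 2**3 - 1 = 7 moves.
```

The short, fully checkable move sequence is precisely what made such tasks attractive for observing the full problem-solving process.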

A notable problem-solving technique identified by this research is the principle of decomposition.[18]

Computer science

Much of computer science and artificial intelligence involves designing automated systems to solve a specified type of problem: to accept input data and calculate a correct or adequate response, reasonably quickly. Algorithms are recipes or instructions that direct such systems, written into computer programs.

Steps for designing such systems include problem determination, heuristics, root cause analysis, de-duplication, analysis, diagnosis, and repair. Analytic techniques include linear and nonlinear programming, queuing systems, and simulation.[19] A large, perennial obstacle is to find and fix errors in computer programs: debugging.

Logic

Formal logic concerns issues like validity, truth, inference, argumentation, and proof. In a problem-solving context, it can be used to formally represent a problem as a theorem to be proved, and to represent the knowledge needed to solve the problem as the premises to be used in a proof that the problem has a solution.

The use of computers to prove mathematical theorems using formal logic emerged as the field of automated theorem proving in the 1950s. It included the use of heuristic methods designed to simulate human problem solving, as in the Logic Theory Machine, developed by Allen Newell, Herbert A. Simon and J. C. Shaw, as well as algorithmic methods such as the resolution principle developed by John Alan Robinson.

In addition to its use for finding proofs of mathematical theorems, automated theorem-proving has also been used for program verification in computer science. In 1958, John McCarthy proposed the advice taker, to represent information in formal logic and to derive answers to questions using automated theorem-proving. An important step in this direction was made by Cordell Green in 1969, who used a resolution theorem prover for question-answering and for such other applications in artificial intelligence as robot planning.

The resolution theorem-prover used by Cordell Green bore little resemblance to human problem solving methods. In response to criticism of that approach from researchers at MIT, Robert Kowalski developed logic programming and SLD resolution,[20] which solves problems by problem decomposition. He has advocated logic for both computer and human problem solving[21] and computational logic to improve human thinking.[22]
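Kowalski's actual systems are not reproduced here, but the flavor of solving by problem decomposition can be sketched with a toy backward-chaining interpreter over propositional Horn clauses (the rule base is invented for the example):

```python
# Toy backward chaining: a goal is decomposed into the subgoals of some
# rule whose head matches it, in the spirit of SLD resolution (greatly
# simplified; real logic programming handles variables and unification).
rules = {
    "light_on": ["power", "bulb_ok"],   # light_on :- power, bulb_ok.
    "power": ["breaker_closed"],        # power :- breaker_closed.
    "breaker_closed": [],               # facts have empty bodies
    "bulb_ok": [],
}

def prove(goal):
    """A goal succeeds if all subgoals of its rule succeed."""
    if goal not in rules:
        return False
    return all(prove(subgoal) for subgoal in rules[goal])

prove("light_on")  # → True
```

The recursion mirrors the decomposition described above: the top-level problem is reduced to subproblems until only directly known facts remain.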

Engineering

When products or processes fail, problem solving techniques can be used to develop corrective actions that can be taken to prevent further failures. Such techniques can also be applied to a product or process prior to an actual failure event—to predict, analyze, and mitigate a potential problem in advance. Techniques such as failure mode and effects analysis can proactively reduce the likelihood of problems.

In either the reactive or the proactive case, it is necessary to build a causal explanation through a process of diagnosis. In deriving an explanation of effects in terms of causes, abduction generates new ideas or hypotheses (asking "how?"); deduction evaluates and refines hypotheses based on other plausible premises (asking "why?"); and induction justifies a hypothesis with empirical data (asking "how much?").[23] The objective of abduction is to determine which hypothesis or proposition to test, not which one to adopt or assert.[24] In the Peircean logical system, the logic of abduction and deduction contribute to our conceptual understanding of a phenomenon, while the logic of induction adds quantitative details (empirical substantiation) to our conceptual knowledge.[25]

Forensic engineering is an important technique of failure analysis that involves tracing product defects and flaws. Corrective action can then be taken to prevent further failures.

Reverse engineering attempts to discover the original problem-solving logic used in developing a product by disassembling the product and developing a plausible pathway to creating and assembling its parts.[26]

Military science

In military science, problem solving is linked to the concept of "end-states", the conditions or situations which are the aims of the strategy.[27]:xiii, E-2 Ability to solve problems is important at any military rank, but is essential at the command and control level. It results from deep qualitative and quantitative understanding of possible scenarios. Effectiveness in this context is an evaluation of results: to what extent the end states were accomplished.[27]:IV-24 Planning is the process of determining how to effect those end states.[27]:IV-1

Processes

Some models of problem solving involve identifying a goal and then a sequence of subgoals towards achieving this goal. Anderson, who introduced the ACT-R model of cognition, modelled this collection of goals and subgoals as a goal stack: the mind contains a stack of goals and subgoals to be completed, and a single task is carried out at any time.[28]:51

Knowledge of how to solve one problem can be applied to another problem, in a process known as transfer.[28]:56

Problem-solving strategies

Problem-solving strategies are steps to overcoming the obstacles to achieving a goal. The iteration of such strategies over the course of solving a problem is the "problem-solving cycle".[29]

Common steps in this cycle include recognizing the problem, defining it, developing a strategy to fix it, organizing knowledge and resources available, monitoring progress, and evaluating the effectiveness of the solution. Once a solution is achieved, another problem usually arises, and the cycle starts again.

Insight is the sudden aha! solution to a problem, the birth of a new idea to simplify a complex situation. Solutions found through insight are often more incisive than those from step-by-step analysis. A quick solution process requires insight to select productive moves at different stages of the problem-solving cycle. Unlike Newell and Simon's formal definition of a move problem, there is no consensus definition of an insight problem.[30]

Some problem-solving strategies include:[31]

Abstraction
solving the problem in a tractable model system to gain insight into the real system
Analogy
adapting the solution to a previous problem which has similar features or mechanisms
Brainstorming
(especially among groups of people) suggesting a large number of solutions or ideas and combining and developing them until an optimum solution is found
Critical thinking
analysis of available evidence and arguments to form a judgement via rational, skeptical, and unbiased evaluation
Divide and conquer
breaking down a large, complex problem into smaller, solvable problems
Help-seeking
obtaining external assistance to deal with obstacles
Hypothesis testing
assuming a possible explanation to the problem and trying to prove (or, in some contexts, disprove) the assumption
Lateral thinking
approaching solutions indirectly and creatively
Means-ends analysis
choosing an action at each step to move closer to the goal
Morphological analysis
assessing the output and interactions of an entire system
Observation
noticing or perceiving, as in the natural sciences: acquiring information from a primary source
Question
posing an utterance that serves as a request for information
Proof of impossibility
attempting to prove that the problem cannot be solved; the point where the proof fails can become the starting point for solving it
Reduction
transforming the problem into another problem for which solutions exist
Research
employing existing ideas or adapting existing solutions to similar problems
Root cause analysis
identifying the cause of a problem
Trial-and-error
testing possible solutions until the right one is found
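Several of the strategies above lend themselves to direct illustration. Divide and conquer, for instance, is the organizing principle of merge sort (a standard example, sketched here rather than a method from the sources cited):

```python
def merge_sort(items):
    """Divide and conquer: split, solve each half, combine."""
    if len(items) <= 1:
        return items                     # a trivially solved subproblem
    mid = len(items) // 2
    left = merge_sort(items[:mid])       # solve the smaller problems
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0              # combine the two solved halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

merge_sort([5, 1, 4, 2, 3])  # → [1, 2, 3, 4, 5]
```

The large problem (sorting the whole list) is never attacked directly; only small, solvable problems are, and their solutions are merged.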

Common barriers

Common barriers to problem solving include mental constructs that impede an efficient search for solutions. Five of the most common identified by researchers are: confirmation bias, mental set, functional fixedness, unnecessary constraints, and irrelevant information.

Confirmation bias

Main page: Confirmation bias

Confirmation bias is an unintentional tendency to collect and use data which favors preconceived notions. Such notions may be incidental rather than motivated by important personal beliefs: the desire to be right may be sufficient motivation.[32]

Scientific and technical professionals also experience confirmation bias. One online experiment, for example, suggested that professionals within the field of psychological research are likely to view scientific studies that agree with their preconceived notions more favorably than conflicting studies.[33] According to Raymond Nickerson, one can see the consequences of confirmation bias in real-life situations, which range in severity from inefficient government policies to genocide. Nickerson argued that those who killed people accused of witchcraft demonstrated confirmation bias with motivation. Researcher Michael Allen found evidence for confirmation bias with motivation in school children who worked to manipulate their science experiments to produce favorable results.[34]

However, confirmation bias does not necessarily require motivation. In 1960, Peter Cathcart Wason conducted an experiment in which participants first viewed three numbers and then created a hypothesis in the form of a rule that could have been used to create that triplet of numbers. When testing their hypotheses, participants tended to only create additional triplets of numbers that would confirm their hypotheses, and tended not to create triplets that would negate or disprove their hypotheses.[35]
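Wason's task can be simulated in a few lines (the triples below are invented for the example): a solver who hypothesizes "each number is two greater than the last" and tests only confirming triples never discovers that the hidden rule is merely "strictly increasing":

```python
# The experimenter's hidden rule accepts any strictly increasing triple.
def hidden_rule(a, b, c):
    return a < b < c

# Triples chosen to CONFIRM the "+2 each time" hypothesis all pass,
# so the hypothesis looks correct no matter how many are tried.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
all(hidden_rule(*t) for t in confirming_tests)  # → True: seemingly confirmed

# Only a triple that VIOLATES the hypothesis is informative: it also
# passes, proving the hypothesis was too narrow.
disconfirming_test = (1, 2, 9)
hidden_rule(*disconfirming_test)                # → True: hypothesis refuted
```

The point is methodological: evidence that fits a hypothesis cannot distinguish it from broader rules that also fit, which is why confirmatory testing alone fails.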

Mental set

Mental set is the inclination to re-use a previously successful solution, rather than search for new and better solutions. It is a reliance on habit.

It was first articulated by Abraham S. Luchins in the 1940s with his well-known water jug experiments.[36] Participants were asked to fill one jug with a specific amount of water by using other jugs with different maximum capacities. After Luchins gave a set of jug problems that could all be solved by a single technique, he then introduced a problem that could be solved by the same technique, but also by a novel and simpler method. His participants tended to use the accustomed technique, oblivious to the simpler alternative.[37] A related effect had been demonstrated in Norman Maier's 1931 experiment, which challenged participants to solve a problem by using a familiar tool (pliers) in an unconventional manner. Participants were often unable to view the object in a way that strayed from its typical use, a type of mental set known as functional fixedness (see the following section).

Rigidly clinging to a mental set is called fixation, which can deepen to an obsession or preoccupation with attempted strategies that are repeatedly unsuccessful.[38] In the late 1990s, researcher Jennifer Wiley found that professional expertise in a field can create a mental set, perhaps leading to fixation.[38]

Groupthink, in which each individual takes on the mindset of the rest of the group, can produce and exacerbate mental set.[39] Social pressure leads to everybody thinking the same thing and reaching the same conclusions.

Functional fixedness

Main page: Functional fixedness

Functional fixedness is the tendency to view an object as having only one function, and to be unable to conceive of any novel use, as in the Maier pliers experiment described above. Functional fixedness is a specific form of mental set, and is one of the most common forms of cognitive bias in daily life.

As an example, imagine a man wants to kill a bug in his house, but the only thing at hand is a can of air freshener. He may start searching for something to kill the bug instead of squashing it with the can, thinking only of its main function of deodorizing.

Tim German and Clark Barrett describe this barrier: "subjects become 'fixed' on the design function of the objects, and problem solving suffers relative to control conditions in which the object's function is not demonstrated."[40] Their research found that young children's limited knowledge of an object's intended function reduces this barrier.[41] Research has also discovered functional fixedness in educational contexts, as an obstacle to understanding: "functional fixedness may be found in learning concepts as well as in solving chemistry problems."[42]

There are several hypotheses regarding how functional fixedness relates to problem solving.[43] It may waste time, delaying or entirely preventing the correct use of a tool.

Unnecessary constraints

Unnecessary constraints are arbitrary boundaries imposed unconsciously on the task at hand, which foreclose a productive avenue of solution. The solver may become fixated on only one type of solution, as if it were an inevitable requirement of the problem. Typically, this combines with mental set—clinging to a previously successful method.[44][page needed]

Visual problems can also produce mentally invented constraints.[45][page needed] A famous example is the dot problem: nine dots arranged in a three-by-three grid pattern must be connected by drawing four straight line segments, without lifting pen from paper or backtracking along a line. The subject typically assumes the pen must stay within the outer square of dots, but the solution requires lines continuing beyond this frame, and researchers have found a 0% solution rate within a brief allotted time.[46]

This problem has produced the expression "think outside the box".[47][page needed] Such problems are typically solved via a sudden insight which leaps over the mental barriers, often after long toil against them.[48] This can be difficult depending on how the subject has structured the problem in their mind, how they draw on past experiences, and how well they juggle this information in their working memory. In the example, envisioning the dots connected outside the framing square requires visualizing an unconventional arrangement, which is a strain on working memory.[47]

Irrelevant information

Irrelevant information is a specification or data presented in a problem that is unrelated to the solution.[44] If the solver assumes that all information presented needs to be used, this often derails the problem solving process, making relatively simple problems much harder.[49]

For example: "Fifteen percent of the people in Topeka have unlisted telephone numbers. You select 200 names at random from the Topeka phone book. How many of these people have unlisted phone numbers?"[47][page needed] The "obvious" answer is 15%, but in fact none of the unlisted people would be listed among the 200. This kind of "trick question" is often used in aptitude tests or cognitive evaluations.[50] Though not inherently difficult, they require independent thinking that is not necessarily common. Mathematical word problems often include irrelevant qualitative or numerical information as an extra challenge.

Avoiding barriers by changing problem representation

The disruption caused by the above cognitive biases can depend on how the information is represented:[50] visually, verbally, or mathematically. A classic example is the Buddhist monk problem:

A Buddhist monk begins at dawn one day walking up a mountain, reaches the top at sunset, meditates at the top for several days until one dawn when he begins to walk back to the foot of the mountain, which he reaches at sunset. Making no assumptions about his starting or stopping or about his pace during the trips, prove that there is a place on the path which he occupies at the same hour of the day on the two separate journeys.

The problem is difficult to solve in a verbal frame, by trying to describe the monk's progress on each day. It becomes much easier when the paragraph is represented mathematically by a function: one visualizes a graph whose horizontal axis is time of day and whose vertical axis shows the monk's position (or altitude) on the path at each time. Superimposing the two journey curves, which traverse opposite diagonals of a rectangle, one sees that they must cross each other somewhere. The visual representation by graphing has resolved the difficulty.

Similar strategies can often improve problem solving on tests.[44][51]

Other barriers for individuals

People who are engaged in problem solving tend to overlook subtractive changes, even those that are critical elements of efficient solutions. This tendency to solve by first, only, or mostly creating or adding elements, rather than by subtracting elements or processes, has been shown to intensify with higher cognitive loads such as information overload.[52]

Dreaming: problem solving without waking consciousness

People can also solve problems while they are asleep. There are many reports of scientists and engineers who solved problems in their dreams. For example, Elias Howe, inventor of the sewing machine, figured out the structure of the bobbin from a dream.[53]

The chemist August Kekulé was considering how benzene arranged its six carbon and six hydrogen atoms. Thinking about the problem, he dozed off and dreamt of dancing atoms that fell into a snakelike pattern, which led him to discover the benzene ring. As Kekulé wrote in his diary,

One of the snakes seized hold of its own tail, and the form whirled mockingly before my eyes. As if by a flash of lightning I awoke; and this time also I spent the rest of the night in working out the consequences of the hypothesis.[54]

There also are empirical studies of how people can think consciously about a problem before going to sleep, and then solve the problem with a dream image. Dream researcher William C. Dement told his undergraduate class of 500 students that he wanted them to think about an infinite series, whose first elements were OTTFF, to see if they could deduce the principle behind it and to say what the next elements of the series would be.[55][page needed] He asked them to think about this problem every night for 15 minutes before going to sleep and to write down any dreams that they then had. They were instructed to think about the problem again for 15 minutes when they awakened in the morning.

The sequence OTTFF is the first letters of the numbers: one, two, three, four, five. The next five elements of the series are SSENT (six, seven, eight, nine, ten). Some of the students solved the puzzle by reflecting on their dreams. One example was a student who reported the following dream:[55][page needed]

I was standing in an art gallery, looking at the paintings on the wall. As I walked down the hall, I began to count the paintings: one, two, three, four, five. As I came to the sixth and seventh, the paintings had been ripped from their frames. I stared at the empty frames with a peculiar feeling that some mystery was about to be solved. Suddenly I realized that the sixth and seventh spaces were the solution to the problem!

Of the more than 500 undergraduate students, 87 reported dreams that were judged to be related to the assigned problem (53 directly related and 34 indirectly related). Yet of the people whose dreams apparently solved the problem, only seven were consciously aware of the solution; the rest (46 out of 53) thought they did not know it.

Mark Blechner conducted this experiment and obtained results similar to Dement's.[56][page needed] He found that while trying to solve the problem, people had dreams in which the solution appeared to be obvious from the dream, but it was rare for the dreamers to realize how their dreams had solved the puzzle. Coaxing or hints did not get them to realize it, although once they heard the solution, they recognized how their dream had solved it. For example, one person in that OTTFF experiment dreamed:[56][page needed]

There is a big clock. You can see the movement. The big hand of the clock was on the number six. You could see it move up, number by number, six, seven, eight, nine, ten, eleven, twelve. The dream focused on the small parts of the machinery. You could see the gears inside.

In the dream, the person counted out the next elements of the series—six, seven, eight, nine, ten, eleven, twelve—yet he did not realize that this was the solution of the problem. His sleeping mind solved the problem, but his waking mind was not aware how.

Albert Einstein believed that much problem solving goes on unconsciously, and the person must then figure out and formulate consciously what the mind has already solved. He believed this was his process in formulating the theory of relativity: "The creator of the problem possesses the solution."[57] Einstein said that he did his problem solving without words, mostly in images. "The words or the language, as they are written or spoken, do not seem to play any role in my mechanism of thought. The psychical entities which seem to serve as elements in thought are certain signs and more or less clear images which can be 'voluntarily' reproduced and combined."[58]

Cognitive sciences: two schools

Problem-solving processes differ across knowledge domains and across levels of expertise.[59] For this reason, cognitive science findings obtained in the laboratory cannot necessarily generalize to problem-solving situations outside the laboratory. Since the 1990s, this has led to a research emphasis on real-world problem solving. This emphasis has been expressed quite differently in North America and Europe, however. Whereas North American research has typically concentrated on studying problem solving in separate, natural knowledge domains, much of the European research has focused on novel, complex problems, and has been performed with computerized scenarios.[60]

Europe

In Europe, two main approaches have surfaced, one initiated by Donald Broadbent[61] in the United Kingdom and the other one by Dietrich Dörner[62] in Germany. The two approaches share an emphasis on relatively complex, semantically rich, computerized laboratory tasks, constructed to resemble real-life problems. The approaches differ somewhat in their theoretical goals and methodology. The tradition initiated by Broadbent emphasizes the distinction between cognitive problem-solving processes that operate under awareness versus outside of awareness, and typically employs mathematically well-defined computerized systems. The tradition initiated by Dörner, on the other hand, has an interest in the interplay of the cognitive, motivational, and social components of problem solving, and utilizes very complex computerized scenarios that contain up to 2,000 highly interconnected variables.[63]

North America

In North America, initiated by the work of Herbert A. Simon on "learning by doing" in semantically rich domains,[64] researchers began to investigate problem solving separately in different natural knowledge domains—such as physics, writing, or chess playing—rather than attempt to extract a global theory of problem solving.[65] These researchers have focused on the development of problem solving within certain domains, that is on the development of expertise.[66]

Areas that have attracted rather intensive attention in North America include:

  • calculation[67]
  • computer skills[68]
  • game playing[69]
  • lawyers' reasoning[70]
  • managerial problem solving[71]
  • mathematical problem solving[72]
  • mechanical problem solving[73]
  • personal problem solving[74]
  • political decision making[75]
  • problem solving in electronics[76]
  • problem solving for innovations and inventions: TRIZ[77]
  • reading[78]
  • social problem solving[11]
  • writing[79]

Characteristics of complex problems

Complex problem solving (CPS) is distinguishable from simple problem solving (SPS). In SPS there is a singular and simple obstacle; in CPS there may be multiple simultaneous obstacles. For example, a surgeon at work faces far more complex problems than an individual deciding what shoes to wear. As elucidated by Dietrich Dörner, and later expanded upon by Joachim Funke, complex problems have some typical characteristics, such as the large number and connectivity of the variables involved, their dynamic development over time, and the intransparency and multiplicity of the goals at stake.[1]

Collective problem solving

People solve problems on many different levels—from the individual to the civilizational. Collective problem solving refers to problem solving performed collectively. Social issues and global issues can typically only be solved collectively.

The complexity of contemporary problems exceeds the cognitive capacity of any individual and requires different but complementary varieties of expertise and collective problem solving ability.[81]

Collective intelligence is shared or group intelligence that emerges from the collaboration, collective efforts, and competition of many individuals.

In collaborative problem solving people work together to solve real-world problems. Members of problem-solving groups share a common concern, a similar passion, and/or a commitment to their work. Members can ask questions, wonder, and try to understand common issues. They share expertise, experiences, tools, and methods.[82] Groups may be fluid based on need, may only occur temporarily to finish an assigned task, or may be more permanent depending on the nature of the problems.

For example, in the educational context, members of a group may all have input into the decision-making process and a role in the learning process. Members may be responsible for the thinking, teaching, and monitoring of all members in the group. Group work may be coordinated among members so that each member makes an equal contribution to the whole work. Members can identify and build on their individual strengths so that everyone can make a significant contribution to the task.[83] Collaborative group work has the ability to promote critical thinking skills, problem solving skills, social skills, and self-esteem. By using collaboration and communication, members often learn from one another and construct meaningful knowledge that often leads to better learning outcomes than individual work.[84]

Collaborative groups require joint intellectual efforts between the members and involve social interactions to solve problems together. The knowledge shared during these interactions is acquired during communication, negotiation, and production of materials.[85] Members actively seek information from others by asking questions. The capacity to use questions to acquire new information increases understanding and the ability to solve problems.[86]

In a 1962 research report, Douglas Engelbart linked collective intelligence to organizational effectiveness, and predicted that proactively "augmenting human intellect" would yield a multiplier effect in group problem solving: "Three people working together in this augmented mode [would] seem to be more than three times as effective in solving a complex problem as is one augmented person working alone".[87]

Henry Jenkins, a theorist of new media and media convergence, links collective intelligence to media convergence and participatory culture.[88] He criticizes contemporary education for failing to incorporate online trends of collective problem solving into the classroom, observing that "whereas a collective intelligence community encourages ownership of work as a group, schools grade individuals". Jenkins argues that interaction within a knowledge community builds vital skills for young people, and that teamwork through collective intelligence communities contributes to the development of such skills.[89]

Collective impact is the commitment of a group of actors from different sectors to a common agenda for solving a specific social problem, using a structured form of collaboration.

After World War II, the United Nations, the Bretton Woods institutions, and the GATT (succeeded in 1995 by the WTO) were created. From the 1980s onward, collective problem solving on the international level crystallized around these three types of organization. Because these global institutions remain state-like or state-centric, it is unsurprising that they perpetuate state-like or state-centric approaches to collective problem solving rather than alternative ones.[90]

Crowdsourcing is the process of accumulating ideas, thoughts, or information from many independent participants, with the aim of finding the best solution for a given challenge. Modern information technologies allow many people to be involved and facilitate the management of their suggestions in ways that yield good results.[91] The Internet enables a new capacity for collective (including planetary-scale) problem solving.[92]

Notes

  1. Frensch, Peter A.; Funke, Joachim, eds. (2014). Complex Problem Solving. doi:10.4324/9781315806723. ISBN 978-1-315-80672-3. 
  2. Schacter, D.L.; Gilbert, D.T.; Wegner, D.M. (2011). Psychology (2nd ed.). New York: Worth Publishers. p. 376. 
  3. Blanchard-Fields, F. (2007). "Everyday problem solving and emotion: An adult developmental perspective". Current Directions in Psychological Science 16 (1): 26–31. doi:10.1111/j.1467-8721.2007.00469.x. 
  4. Zimmermann, Bernd (2004). "On mathematical problem-solving processes and history of mathematics". ICME 10. Copenhagen. https://www.researchgate.net/publication/238733375. 
  5. Granvold, Donald K. (1997). "Cognitive-Behavioral Therapy with Adults". in Brandell, Jerrold R.. Theory and Practice in Clinical Social Work. Simon and Schuster. pp. 189. ISBN 978-0-684-82765-0. 
  6. Robertson, S. Ian (2001). "Introduction to the study of problem solving". Problem Solving. Psychology Press. ISBN 0-415-20300-7. 
  7. Rubin, M.; Watt, S. E.; Ramelli, M. (2012). "Immigrants' social integration as a function of approach-avoidance orientation and problem-solving style". International Journal of Intercultural Relations 36 (4): 498–505. doi:10.1016/j.ijintrel.2011.12.009. 
  8. Goldstein F. C.; Levin H. S. (1987). "Disorders of reasoning and problem-solving ability". Neuropsychological rehabilitation. London: Taylor & Francis Group. 
  9. Rath, Joseph F.; Simon, Dvorah; Langenbahn, Donna M.; Sherr, Rose Lynn; Diller, Leonard (2003). "Group treatment of problem-solving deficits in outpatients with traumatic brain injury: A randomised outcome study". Neuropsychological Rehabilitation 13 (4): 461–488. doi:10.1080/09602010343000039. https://www.researchgate.net/publication/247514323. 
  10.
    • D'Zurilla, T. J.; Goldfried, M. R. (1971). "Problem solving and behavior modification". Journal of Abnormal Psychology 78 (1): 107–126. doi:10.1037/h0031360. PMID 4938262. 
    • D'Zurilla, T. J.; Nezu, A. M. (1982). "Social problem solving in adults". in P. C. Kendall. Advances in cognitive-behavioral research and therapy. 1. New York: Academic Press. pp. 201–274. 
  11. RATH, J (2004). "The construct of problem solving in higher level neuropsychological assessment and rehabilitation*1". Archives of Clinical Neuropsychology 19 (5): 613–635. doi:10.1016/j.acn.2003.08.006. PMID 15271407. 
  12. Hoppmann, Christiane A.; Blanchard-Fields, Fredda (2010). "Goals and everyday problem solving: Manipulating goal preferences in young and older adults". Developmental Psychology 46 (6): 1433–1443. doi:10.1037/a0020676. PMID 20873926. 
  13. Duncker, Karl (1935) (in German). Zur Psychologie des produktiven Denkens. Berlin: Julius Springer. 
  14. Newell, Allen; Simon, Herbert A. (1972). Human problem solving. Englewood Cliffs, N.J.: Prentice-Hall. 
  15. For example:
  16. Mayer, R. E. (1992). Thinking, problem solving, cognition (Second ed.). New York: W. H. Freeman and Company. 
  17. Armstrong, J. Scott; Denniston, William B. Jr.; Gordon, Matt M. (1975). "The Use of the Decomposition Principle in Making Judgments". Organizational Behavior and Human Performance 14 (2): 257–263. doi:10.1016/0030-5073(75)90028-8. http://marketing.wharton.upenn.edu/ideas/pdf/armstrong2/DecompositionPrinciple.pdf. 
  18. Malakooti, Behnam (2013). Operations and Production Systems with Multiple Objectives. John Wiley & Sons. ISBN 978-1-118-58537-5. 
  19. Kowalski, Robert (1974). "Predicate Logic as a Programming Language". Information Processing 74. https://www.doc.ic.ac.uk/~rak/papers/IFIP%2074.pdf. 
  20. Kowalski, Robert (1979). Logic for Problem Solving. Artificial Intelligence Series. 7. Elsevier Science Publishing. ISBN 0-444-00368-1. https://www.doc.ic.ac.uk/~rak/papers/LogicForProblemSolving.pdf. 
  21. Kowalski, Robert (2011). Computational Logic and Human Thinking: How to be Artificially Intelligent. Cambridge University Press. https://www.doc.ic.ac.uk/~rak/papers/newbook.pdf. 
  22. Staat, Wim (1993). "On abduction, deduction, induction and the categories". Transactions of the Charles S. Peirce Society 29 (2): 225–237. 
  23. Sullivan, Patrick F. (1991). "On Falsificationist Interpretations of Peirce". Transactions of the Charles S. Peirce Society 27 (2): 197–219. 
  24. Ho, Yu Chong (1994). "Abduction? Deduction? Induction? Is There a Logic of Exploratory Data Analysis?". Annual Meeting of the American Educational Research Association. New Orleans, La.. https://files.eric.ed.gov/fulltext/ED376173.pdf. 
  25. "Einstein's Secret to Amazing Problem Solving (and 10 Specific Ways You Can Use It)" (in en-US). 2008-11-04. https://litemind.com/problem-definition/. 
  26. "Commander's Handbook for Strategic Communication and Communication Strategy". United States Joint Forces Command, Joint Warfighting Center, Suffolk, Va. 27 October 2009. http://www.au.af.mil/au/awc/awcgate/jfcom/cc_handbook_sc_27oct2009.pdf. 
  27. Robertson, S. Ian (2017). Problem solving: perspectives from cognition and neuroscience (2nd ed.). London: Taylor & Francis. ISBN 978-1-317-49601-4. OCLC 962750529. 
  28. Bransford, J. D.; Stein, B. S. (1993). The ideal problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman. 
    • Ash, Ivan K.; Jee, Benjamin D.; Wiley, Jennifer (2012). "Investigating Insight as Sudden Learning". The Journal of Problem Solving 4 (2). doi:10.7771/1932-6246.1123. ISSN 1932-6246. 
    • Chronicle, Edward P.; MacGregor, James N.; Ormerod, Thomas C. (2004). "What Makes an Insight Problem? The Roles of Heuristics, Goal Conception, and Solution Recoding in Knowledge-Lean Problems.". Journal of Experimental Psychology: Learning, Memory, and Cognition 30 (1): 14–27. doi:10.1037/0278-7393.30.1.14. ISSN 1939-1285. PMID 14736293. 
    • Chu, Yun; MacGregor, James N. (2011). "Human Performance on Insight Problem Solving: A Review". The Journal of Problem Solving 3 (2). doi:10.7771/1932-6246.1094. ISSN 1932-6246. 
  29. Wang, Y.; Chiew, V. (2010). "On the cognitive process of human problem solving". Cognitive Systems Research (Elsevier BV) 11 (1): 81–92. doi:10.1016/j.cogsys.2008.08.003. ISSN 1389-0417. https://www.researchgate.net/profile/Patricia_Ryser-Welch/post/Do_Machines_learn/attachment/59d6235b79197b8077981b28/AS:306908018216960@1450183981555/download/61-Elsevier-CogSys-ProblemSolving.pdf. 
  30. Nickerson, Raymond S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology 2 (2): 176. doi:10.1037/1089-2680.2.2.175. 
  31. Hergovich, Andreas; Schott, Reinhard; Burger, Christoph (2010). "Biased Evaluation of Abstracts Depending on Topic and Conclusion: Further Evidence of a Confirmation Bias Within Scientific Psychology". Current Psychology (Springer Science and Business Media LLC) 29 (3): 188–209. doi:10.1007/s12144-010-9087-5. ISSN 1046-1310. 
  32. Allen, Michael (2011). "Theory-led confirmation bias and experimental persona". Research in Science & Technological Education (Informa UK Limited) 29 (1): 107–127. doi:10.1080/02635143.2010.539973. ISSN 0263-5143. Bibcode2011RSTEd..29..107A. 
  33. Wason, P. C. (1960). "On the failure to eliminate hypotheses in a conceptual task". Quarterly Journal of Experimental Psychology 12 (3): 129–140. doi:10.1080/17470216008416717. 
  34. Luchins, Abraham S. (1942). "Mechanization in problem solving: The effect of Einstellung". Psychological Monographs 54 (248): i-95. doi:10.1037/h0093502. 
  35. Öllinger, Michael; Jones, Gary; Knoblich, Günther (2008). "Investigating the Effect of Mental Set on Insight Problem Solving". Experimental Psychology (Hogrefe Publishing Group) 55 (4): 269–282. doi:10.1027/1618-3169.55.4.269. ISSN 1618-3169. PMID 18683624. http://irep.ntu.ac.uk/id/eprint/23048/1/193183_1563%20Jones%20Postprint.pdf. 
  36. Wiley, Jennifer (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition 24 (4): 716–730. doi:10.3758/bf03211392. PMID 9701964. 
  37. Cottam, Martha L.; Dietz-Uhler, Beth; Mastors, Elena; Preston, Thomas (2010). Introduction to Political Psychology (2nd ed.). New York: Psychology Press. 
  38. German, Tim P.; Barrett, H. Clark (2005). "Functional Fixedness in a Technologically Sparse Culture". Psychological Science (SAGE Publications) 16 (1): 1–5. doi:10.1111/j.0956-7976.2005.00771.x. ISSN 0956-7976. PMID 15660843. 
  39. German, Tim P.; Defeyter, Margaret A. (2000). "Immunity to functional fixedness in young children". Psychonomic Bulletin and Review 7 (4): 707–712. doi:10.3758/BF03213010. PMID 11206213. 
  40. Furio, C.; Calatayud, M. L.; Baracenas, S.; Padilla, O. (2000). "Functional fixedness and functional reduction as common sense reasonings in chemical equilibrium and in geometry and polarity of molecules". Science Education 84 (5): 545–565. doi:10.1002/1098-237X(200009)84:5<545::AID-SCE1>3.0.CO;2-1. 
  41. Adamson, Robert E (1952). "Functional fixedness as related to problem solving: A repetition of three experiments". Journal of Experimental Psychology 44 (4): 288–291. doi:10.1037/h0062487. PMID 13000071. 
  42. Kellogg, R. T. (2003). Cognitive psychology (2nd ed.). California: Sage Publications, Inc. 
  43. Meloy, J. R. (1998). The Psychology of Stalking, Clinical and Forensic Perspectives (2nd ed.). London, England: Academic Press. 
  44. MacGregor, J.N.; Ormerod, T.C.; Chronicle, E.P. (2001). "Information-processing and insight: A process model of performance on the nine-dot and related problems". Journal of Experimental Psychology: Learning, Memory, and Cognition 27 (1): 176–201. doi:10.1037/0278-7393.27.1.176. PMID 11204097. 
  45. Weiten, Wayne (2011). Psychology: themes and variations (8th ed.). California: Wadsworth. 
  46. Novick, L. R.; Bassok, M. (2005). "Problem solving". Cambridge handbook of thinking and reasoning. New York, N.Y.: Cambridge University Press. pp. 321–349. 
  47. Walinga, Jennifer (2010). "From walls to windows: Using barriers as pathways to insightful solutions". The Journal of Creative Behavior 44 (3): 143–167. doi:10.1002/j.2162-6057.2010.tb01331.x. 
  48. Walinga, Jennifer; Cunningham, J. Barton; MacGregor, James N. (2011). "Training insight problem solving through focus on barriers and assumptions". The Journal of Creative Behavior 45: 47–58. doi:10.1002/j.2162-6057.2011.tb01084.x. 
  49. Vlamings, Petra H. J. M.; Hare, Brian; Call, Joseph (2009). "Reaching around barriers: The performance of great apes and 3–5-year-old children". Animal Cognition 13 (2): 273–285. doi:10.1007/s10071-009-0265-5. PMID 19653018. 
  50. Kaempffert, Waldemar B. (1924). A Popular History of American Invention. 2. New York: Charles Scribner's Sons. p. 385. 
    • Kekulé, August (1890). "Benzolfest-Rede.". Berichte der Deutschen Chemischen Gesellschaft 23: 1302–1311. 
    • Benfey, O. (1958). "Kekulé and the birth of the structural theory of organic chemistry in 1858". Journal of Chemical Education 35 (1): 21–23. doi:10.1021/ed035p21. Bibcode1958JChEd..35...21B. 
  51. Dement, W.C. (1972). Some Must Watch While Some Just Sleep. New York: Freeman. 
  52. Blechner, Mark J. (2018). The Mindbrain and Dreams: An Exploration of Dreaming, Thinking, and Artistic Creation. New York: Routledge. 
  53. Fromm, Erika O. (1998). "Lost and found half a century later: Letters by Freud and Einstein". American Psychologist 53 (11): 1195–1198. doi:10.1037/0003-066x.53.11.1195. 
  54. Einstein, Albert (1954). "A Mathematician's Mind". Ideas and Opinions. New York: Bonanza Books. p. 25. 
  55. Sternberg, R. J. (1995). "Conceptions of expertise in complex problem solving: A comparison of alternative conceptions". Complex problem solving: The European Perspective. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 295–321. 
  56. Funke, J. (1991). "Solving complex problems: Human identification and control of complex systems". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 185–222. ISBN 0-8058-0650-4. OCLC 23254443. 
    • Buchner, A. (1995). "Theories of complex problem solving". Complex problem solving: The European Perspective. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 27–63. 
    • Lohhausen. Vom Umgang mit Unbestimmtheit und Komplexität. Bern, Switzerland: Hans Huber. 1983. 
    • Ringelband, O. J.; Misiak, C.; Kluwe, R. H. (1990). "Mental models and strategies in the control of a complex system". Mental models and human-computer interaction. 1. Amsterdam: Elsevier Science Publishers. pp. 151–164. 
  57. e.g., Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. 1991. ISBN 0-8058-0650-4. OCLC 23254443. 
  58. Sokol, S. M.; McCloskey, M. (1991). "Cognitive mechanisms in calculation". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 85–116. ISBN 0-8058-0650-4. OCLC 23254443. https://books.google.com/books?id=ZECYAgAAQBAJ&pg=PA85. 
  59. Kay, D. S. (1991). "Computer interaction: Debugging the problems". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 317–340. ISBN 0-8058-0650-4. OCLC 23254443. https://books.google.com/books?id=ZECYAgAAQBAJ&pg=PA317. 
  60. Frensch, P. A.; Sternberg, R. J. (1991). "Skill-related differences in game playing". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 343–381. ISBN 0-8058-0650-4. OCLC 23254443. https://books.google.com/books?id=ZECYAgAAQBAJ&pg=PA343. 
  61. Amsel, E.; Langer, R.; Loutzenhiser, L. (1991). "Do lawyers reason differently from psychologists? A comparative design for studying expertise". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 223–250. ISBN 0-8058-0650-4. OCLC 23254443. 
  62. Wagner, R. K. (1991). "Managerial problem solving". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 159–183. PsycNET: 1991-98396-005. 
  63. Hegarty, M. (1991). "Knowledge and processes in mechanical problem solving". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 253–285. ISBN 0-8058-0650-4. OCLC 23254443. https://books.google.com/books?id=ZECYAgAAQBAJ&pg=PA253. 
  64. Heppner, P. P.; Krauskopf, C. J. (1987). "An information-processing approach to personal problem solving". The Counseling Psychologist 15 (3): 371–447. doi:10.1177/0011000087153001. 
  65. Voss, J. F.; Wolfe, C. R.; Lawrence, J. A.; Engle, R. A. (1991). "From representation to decision: An analysis of problem solving in international relations". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 119–158. PsycNET: 1991-98396-004. ISBN 0-8058-0650-4. OCLC 23254443. 
  66. Lesgold, A.; Lajoie, S. (1991). "Complex problem solving in electronics". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 287–316. ISBN 0-8058-0650-4. OCLC 23254443. https://books.google.com/books?id=ZECYAgAAQBAJ&pg=PA287. 
  67. Altshuller, Genrich (1994). And Suddenly the Inventor Appeared. Worcester, Mass.: Technical Innovation Center. ISBN 978-0-9640740-1-9. 
  68. Stanovich, K. E.; Cunningham, A. E. (1991). "Reading as constrained reasoning". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 3–60. ISBN 0-8058-0650-4. OCLC 23254443. https://books.google.com/books?id=ZECYAgAAQBAJ&pg=PA3. 
  69. Bryson, M.; Bereiter, C.; Scardamalia, M.; Joram, E. (1991). "Going beyond the problem as given: Problem solving in expert and novice writers". Complex problem solving: Principles and mechanisms. Hillsdale, N.J.: Lawrence Erlbaum Associates. pp. 61–84. ISBN 0-8058-0650-4. OCLC 23254443. 
  70. Complex problem solving: Principles and mechanisms. Hillsdale, NJ: Lawrence Erlbaum Associates. 1991. ISBN 0-8058-0650-4. OCLC 23254443. 
  71. Hung, Woei (2013). "Team-based complex problem solving: a collective cognition perspective". Educational Technology Research and Development 61 (3): 365–384. doi:10.1007/s11423-013-9296-3. 
  72. Jewett, Pamela; MacPhee, Deborah (2012). "Adding Collaborative Peer Coaching to Our Teaching Identities". The Reading Teacher 66 (2): 105–110. doi:10.1002/TRTR.01089. 
  73. Wang, Qiyun (2009). "Design and Evaluation of a Collaborative Learning Environment". Computers and Education 53 (4): 1138–1146. doi:10.1016/j.compedu.2009.05.023. 
  74. Wang, Qiyan (2010). "Using online shared workspaces to support group collaborative learning". Computers and Education 55 (3): 1270–1276. doi:10.1016/j.compedu.2010.05.023. 
  75. Kai-Wai Chu, Samuel; Kennedy, David M. (2011). "Using Online Collaborative tools for groups to Co-Construct Knowledge". Online Information Review 35 (4): 581–597. doi:10.1108/14684521111161945. ISSN 1468-4527. 
  76. Legare, Cristine; Mills, Candice; Souza, Andre; Plummer, Leigh; Yasskin, Rebecca (2013). "The use of questions as problem-solving strategies during early childhood". Journal of Experimental Child Psychology 114 (1): 63–7. doi:10.1016/j.jecp.2012.07.002. PMID 23044374. 
  77. Engelbart, Douglas (1962). "Team Cooperation". Augmenting Human Intellect: A Conceptual Framework. AFOSR-3223. Stanford Research Institute. https://www.dougengelbart.org/pubs/augment-3906.html#3b9. 
  78. Flew, Terry (2008). New Media: an introduction. Melbourne: Oxford University Press. 
  79. Jenkins, Henry. "Interactive audiences? The 'collective intelligence' of media fans". http://labweb.education.wisc.edu/curric606/readings/Jenkins2002.pdf. 
  80. Finger, Matthias (2008-03-27). "Which governance for sustainable development? An organizational and institutional perspective" (in en). The Crisis of Global Environmental Governance: Towards a New Political Economy of Sustainability. Routledge. p. 48. ISBN 978-1-134-05982-9. 
  81. Stefanovitch, Nicolas; Alshamsi, Aamena; Cebrian, Manuel; Rahwan, Iyad (30 September 2014). "Error and attack tolerance of collective problem solving: The DARPA Shredder Challenge". EPJ Data Science 3 (1). doi:10.1140/epjds/s13688-014-0013-1. 
