[Image: photo of a woman's head in profile, overlaid with an artistic rendering of a brain and a network]

Factors That Influence How We Think

“The most erroneous stories are those we think we know best – and therefore
never scrutinize or question.”

Stephen Jay Gould

Research in the cognitive sciences has identified many ways in which our thinking is shaped by a combination of factors in our environment (such as the influence of our culture generally and our peer groups specifically) and the way our brains give priority to certain kinds of information and stimuli. Cognitive biases and logical fallacies lead us to adopt beliefs that make our worldview and social dynamics internally consistent. Nearly everyone holds as true some things that are clearly and demonstrably false, and precisely because we hold those beliefs, we cannot see that we are wrong. This tendency to cling to beliefs that are demonstrably false falls under the umbrella of identity-protective cognition. Familiarity with some common biases is helpful when considering and teaching about controversial issues.

Video: "Cognitive Bias | Ethics Defined" by McCombs School of Business, University of Texas at Austin (YouTube)

The common phenomenon, discussed above, in which additional evidence alone is insufficient to change understandings and associated beliefs can be a precondition for the backfire effect. Note that scientists often prefer the phrase “accept climate change” to “believe in climate change,” to distinguish evidence-based conclusions (accepting evidence) from faith-based conclusions (belief). In this chapter we generally follow this convention, but we do use the terms “believe” and “beliefs” where the intended meaning is unlikely to be misinterpreted.

The backfire effect causes beliefs to become stronger when they are challenged with conflicting evidence. People may be more willing to consider another position when they do not feel challenged or threatened (see Eric Horowitz, “Want to Win a Political Debate? Try Making a Weaker Argument,” 2013). The backfire effect is in part a response to identity-protective cognition, the status quo bias, and allegiance to community norms, each of which is a powerful force resisting change. The status quo bias is an emotional preference for the current state of affairs: the current baseline (or status quo) is taken as a reference point, and any change from that baseline is perceived as a threat or loss.

Video: "Adam Ruins Everything - Why Proving Someone Wrong Often Backfires" by truTV (YouTube)

Myside bias, or confirmation bias, is the tendency to seek out information that agrees with one’s prior opinions and to ignore established evidence that might conflict with them. A common manifestation of this bias is reading only media likely to align with one’s existing views. Subtler, subconscious confirmation biases can occur even in research, however, through selective choice of data to analyze and literature to reference. One function of the peer-review process in scientific publication is to ensure that researchers have taken into account all available credible evidence for and against their hypotheses.

Video: "The Most Common Cognitive Bias" by Veritasium (YouTube)

The sunk-cost fallacy, spending good money (or time, or other resources) after bad, is the tendency to pour resources into a system partly to justify the resources already spent. Such reasoning makes it difficult to abandon existing infrastructure or long-held practices. If, however, putting resources into new solutions is more likely to lead to better outcomes than maintaining an existing system, the amount already invested in the existing system logically should not factor into the decision.

Video: "Seinfeld - Sunk Cost Fallacy" by Professor Ross (YouTube)

Solution aversion refers to the rejection of claims (such as the existence of human-caused climate change) because accepting those claims would mean accepting solutions that require sweeping, and therefore challenging, changes to the systems and cultures in which one lives and works.

The availability heuristic pushes us to rely on immediate examples rather than on information grounded in extensive data or research. Understanding cause and effect within complex systems over long intervals is thus challenging, in part because of delays in feedback. Examples include citing individual weather observations as evidence for climate change (an extreme weather event) or against it (a cold and snowy day), even though climate change by definition refers to long-term averages.

Objections to climate change also commonly take the form of narratives of good and evil. This is addressed in the chapter titled Perspective. [Online version of that chapter coming soon.]