The Power Of Groupthink & Lessons For Climate Communications

Sean Illing of Vox.com conducted a fascinating interview with Steven Sloman, a professor of cognitive science at Brown University, about how we arrive at the conclusions we do. In short, neither the process nor the outcomes are pretty, as Dr. Sloman relates:

I really do believe that our attitudes are shaped much more by our social groups than they are by facts on the ground. We are not great reasoners. Most people don’t like to think at all, or like to think as little as possible. And by most, I mean roughly 70 percent of the population. Even the rest seem to devote a lot of their resources to justifying beliefs that they want to hold, as opposed to forming credible beliefs based only on fact.

Think about if you were to utter a fact that contradicted the opinions of the majority of those in your social group. You pay a price for that. If I said I voted for Trump, most of my academic colleagues would think I’m crazy. They wouldn’t want to talk to me. That’s how social pressure influences our epistemological commitments, and it often does it in imperceptible ways.

He concludes that if the people around us are wrong about something, there’s a good chance we will be too. The effect cuts both ways: proximity to people who are right pulls us toward the truth. And the phenomenon isn’t a partisan problem; it’s a human problem that appears on all sides of political debates.

In some ways, it’s understandable how this dynamic arose in our species. No single brain can master every topic, so we have to depend on other people to do some of our thinking for us. That is a perfectly rational response to our condition. It may also explain why traditional societies often relied on a few religious leaders to make many of the key decisions, leaving everyone else to focus on solving problems in their own immediate lives rather than wrestling with broader societal questions. The trouble comes when our beliefs support ideas or policies that are totally unjustified.

So are we doomed to a fate of groupthink, with all the unsupportable beliefs that come with it? Dr. Sloman doesn’t think so, noting that some professions train people not to fall into this trap:

People who are more reflective are less susceptible to the illusion. There are some simple questions you can use to measure reflectivity. They tend to have this form: How many animals of each kind did Moses load onto the ark? Most people say two, but more reflective people say zero. (It was Noah, not Moses, who built the ark.)

The trick is to not only come to a conclusion, but to verify that conclusion. There are many communities that encourage verification (e.g., scientific, forensic, medical, judicial communities). You just need one person to say, “Are you sure?” and for everyone else to care about the justification. There’s no reason that every community could not adopt these kinds of norms. The problem of course is that there’s a strong compulsion to make people feel good by telling them what they want to hear, and for everyone to agree. That’s largely what gives us a sense of identity. There’s a strong tension here.

He’s also pioneering research on ways to reframe political conversations from a focus on what people value to one about actual consequences. As he notes, “when you talk about actual consequences, you’re forced into the weeds of what’s actually happening, which is a diversion from our normal focus on our feelings and what’s going on in our heads.”

This work could contribute to a better understanding of public perceptions of climate change. For example, denial of basic climate science can certainly be attributed to groupthink. But as Sloman posits, reframing the messaging from the science itself to the outcomes of climate mitigation (such as a cleaner world and less dependence on extractive industries for fuel) might open more people in the middle to taking action. We could also focus on training the next generation to be more open-minded about evidence and arguments, as the scientific, medical, and judicial fields already do.

But simply being aware of how we process information and form beliefs is a good start toward correcting those processes when they take us in the wrong direction.
