On Oct. 15, Yale professor of law and psychology Dan Kahan wrote in a blog post that he had found a small positive correlation between identifying as liberal and science comprehension. But to caution readers against jumping to hasty conclusions, Kahan also wrote that he found a tiny positive correlation between identifying with the Tea Party and understanding scientific concepts.
Kahan said he intended to demonstrate that both liberals and conservatives are prone to making decisions on ideological, rather than scientific, grounds. But Kahan said that a faulty interpretation of his data spurred articles on his findings in publications ranging from Politico to the Washington Times, making the coverage itself an example of what Kahan researches: the misinterpretation of empirical evidence due to ideological or cultural stakes. Kahan heads the Cultural Cognition Project at Yale, a consortium of researchers affiliated with various universities that studies how group identities shape individuals’ perceptions of societal risks and policy-related facts, such as climate change and gun control. In an interview with the News, Kahan spoke about his research and the challenges that cultural cognition and the misinterpretation of science pose for society.
Q: What have you found in your research that explains the phenomenon of group identity influencing policy beliefs?
A: We did tests and found that people who are more science literate and better able to make sense of scientific data tended to be more polarized along cultural lines on issues like climate change or guns or nuclear power, not less. That’s not what you would expect if the problem were that people had a deficit in rationality — in that case, the most science-comprehending people in those different groups would be converging on views consistent with the best evidence. On something like climate change, when positions are really viewed as badges of membership and loyalty to groups, the stake people have in forming the group-consistent belief is probably going to be a lot bigger than the one they have in forming the belief that’s consistent with the best available evidence. Now that climate change has this kind of significance as a marker of whether you belong to a group and can be trusted within it, making a mistake relative to what your group believes is really consequential. You could end up being shunned or denied [opportunities] within the group.
Q: How would you explain the media’s misinterpretation of your findings on scientific comprehension and political leaning?
A: We’ve done studies showing that people will in fact construe evidence of whether others are processing information in an open-minded fashion about certain issues, like climate change, in an ideologically biased way. If I show that people who believe in climate change do better, then people who are Republicans [or] conservatives will say there are problems with the test. So they’re fitting their evidence of open-mindedness to their ideological disposition. It turns out the people who scored the highest in terms of open-mindedness did [this] the most, because they had the skills of critical reflection that allowed them to manipulate what they’d seen.
Q: How did you react to the media interpreting your finding as an indicator of differences in intelligence between the political left and right?
A: Every time we do a study, our evidence of how that happens gets misunderstood because of [an] ideological or political stake that somebody has in the conflict. Mainly I just felt sick to my stomach, because I think that it was an instance of the problem, but it also magnifies it. When people are sitting around saying the other side believes what it does because [they are] stupid, they’re just raising the cost to those people of engaging evidence that challenges their beliefs, because the consequence of changing their minds would be to admit that they and everyone they trust are stupid. Why would anybody want to do that, especially when the truth is nobody is stupid, but everyone is in jeopardy because of the kinds of conditions that dull our critical sensibilities? If we could recognize that [we have this] in common, we could fix it.
Q: What are some ways that we can decrease cognitive bias in scientific interpretation?
A: We can show with lab models that when people are trying to assess information about something like the HPV vaccine, they’ll be very sensitive both to what they perceive the cultural values of the communicator to be and, even more so, to the mere existence of cues that point toward conflict among different groups. When people are used to information about a problem being presented to them in a way that seems to focus on one solution that’s antagonistic to what they value — either out of dissonance avoidance or maybe distrust of people who only seem to think of one solution — they’re more inclined to shut down. When you show them there’s more going on, that some of the things they value are actually part of the solution … then we can show that they are actually engaging the information in a more open-minded and reflective way.
Q: Where is your research heading from here?
A: I’m interested in looking at how people reason and engage with scientific information outside of politics. Places like museums are really a tremendous resource, because when people are curious, these kinds of biases don’t happen. The person who is curious wants to know something, and that person has an appetite to be surprised by something he or she didn’t know. That’s the opposite of what we’re finding in people, which is a kind of motivated resistance to having what they believe confounded. How can we leverage that curiosity? I’m pretty sure that some of the research I’ll be doing in the next few years will be along those lines.