The way humans think about cause and effect is not completely logical, a new Yale study has found.

The researchers performed a series of experiments showing that the way humans store knowledge about how events are related can lead to seemingly inconsistent behavior in tasks ranging from memory recall to sentence formation. While it may seem obvious that if A causes B and B causes C, then A causes C, it turns out there are cases where people do not intuitively draw this connection. The researchers were able to show that the mind stores information in “causal islands” rather than in a single “causal network.” In other words, the mind treats some pockets of knowledge in isolation from others.

“We were interested in people’s ‘mental representations’ of causal knowledge,” said Yale psychology student and the study’s lead author Samuel Johnson GRD ’17. “Cognitive scientists think of the mind as being like a computer, and just like how computers can use different formats to structure information, the mind also can use different formats to store different kinds of information.”

There are two reasonable explanations for how the mind stores information about cause and effect. The first is that all causal information sits in one large, interconnected network. The second is that it is divided into causal islands, or subnetworks, that are unconnected to one another. The findings from this study indicate that the latter is more likely.

To obtain their results, the researchers conducted a series of surveys in which they asked participants to rate how well causal statements were connected. The results showed that causal information is often misremembered in a way consistent with the island model, and that the way people choose to form sentences and tell stories depends on whether the causal link between the events is intuitive.

For example, participants interpreted the statement “exercising causes one to drink water” as intuitive, but did not interpret “sex causes nausea” as intuitive. In the first statement, participants thought it was unnecessary to explain that exercising causes someone to drink water because exercise makes the person dehydrated; the intermediate step seemed obvious. But in the second, they felt that mentioning pregnancy was necessary to explain the relationship between sex and nausea — even though they understood the strong connections both between sex and pregnancy and between pregnancy and nausea.

“If people store causal relations in a network, all causal chains should be transitive,” Johnson said. “But people seem to store the relationship between sex and pregnancy in one schema, and the relationship between pregnancy and nausea in a separate schema. Not only are some causal chains intransitive, but we can also predict which chains will be intransitive.”
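The distinction Johnson describes can be made concrete with a toy sketch. The Python snippet below is an illustration only, not the researchers’ model: the graph representation, the function names and the two hypothetical “schemas” are assumptions introduced just to show how the inference from sex to nausea goes through when all the links sit in one connected network, but is never drawn when the same links are split across unconnected islands.

```python
# Minimal sketch (assumed representation, not the study's actual model) of the
# two hypotheses: one connected causal network versus separate causal islands.
from collections import defaultdict

def reaches(graph, cause, effect, seen=None):
    """Return True if `effect` can be reached from `cause` by following causal links."""
    seen = seen if seen is not None else set()
    if cause == effect:
        return True
    seen.add(cause)
    return any(reaches(graph, nxt, effect, seen)
               for nxt in graph[cause] if nxt not in seen)

# Hypothesis 1: one large network. Transitivity holds, so "sex causes nausea"
# follows automatically from sex -> pregnancy -> nausea.
network = defaultdict(list)
network["sex"].append("pregnancy")
network["pregnancy"].append("nausea")
print(reaches(network, "sex", "nausea"))   # True

# Hypothesis 2: causal islands. The same links are stored in separate schemas,
# so no single schema supports the cross-boundary inference.
schemas = [
    {"sex": ["pregnancy"]},        # hypothetical "reproduction" schema
    {"pregnancy": ["nausea"]},     # hypothetical "symptoms" schema
]
print(any(reaches(defaultdict(list, s), "sex", "nausea") for s in schemas))  # False
```

In the island version each schema is internally consistent; what fails is only the inference that crosses the boundary between them, which matches the pattern of intransitive judgments the participants showed.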

According to Christian Luhmann, a professor of psychology at Stony Brook University who was not affiliated with the study, the paper is part of a larger trend in cognitive science of testing human minds against normative models. While philosophers have been asking questions about cognition since antiquity, that field of thought has recently been taken over by computer scientists and applied mathematicians, he said.

This research marks the first time anyone has examined the cognitive source of the apparent inconsistencies in the way people think about causation.

“There has been a big trend in cognitive science to represent causal knowledge in terms of a network of things with all arrows connected,” said Woo-Kyoung Ahn, professor of psychology at Yale and senior author of the paper. “If that’s all we have in our head in terms of causal relations, there’s no way they can distinguish between these kinds of chains. Why are some chains intransitive and some chains transitive?”

The paper was accepted to the journal Cognitive Science in September 2014.

GEORGE SAUSSY