Jul 16 2018
Motivated Reasoning vs Lazy Thinking
A new study takes another look at partisan motivated reasoning, with (sort of) surprising results. If nothing else, it is a reminder that, as interested critical thinkers, we need to keep up with the psychological research on critical thinking itself.
First some background – motivated reasoning refers to the tendency to rationalize a defense of a position in which we have some emotional investment, and to reject counter-evidence. If a certain belief is part of our tribal identity, or has emotional significance, we react differently to relevant facts than we do when a belief is emotionally neutral. For neutral beliefs, we happily update what we believe when credible new information is presented to us. I don’t really care whether Thomas Edison invented the light bulb or stole part of the design from Joseph Swan (he did borrow from Swan, but he also made important improvements) – whatever the historical evidence says, I will happily believe. But if someone claimed that George Washington really wanted to be made king of America but was forced to accept a lesser role (he didn’t, I just made that up), I might be motivated to push back purely out of patriotism.
Psychologists have been studying this phenomenon for years, and are finding that it is a real thing, but it’s complicated (that should be no surprise). Psychological studies face a general challenge: human behavior is complex and opaque, so researchers resort to constructs and markers to reveal specific phenomena. How do you test motivated reasoning? First you have to separate people into groups based on some feature that should affect their motivation, such as ideology, religion, or political affiliation. Then you challenge their beliefs and see how they respond.
Many studies have shown that when you do this, ideology matters. There may even be a backfire effect, where motivated believers dig in their heels when confronted with counter-evidence, but this effect is controversial and may be very small.
We have also gone beyond such behavioral constructs – we can see (with functional imaging) that brains react differently to claims in line with our politics than to claims opposed to them. More of the brain is engaged, including the emotional centers, when motivated reasoning is triggered.
So far, so good – but there is still the problem that psychological markers are subject to confounding factors. Remember the marshmallow test: for years psychologists used a child’s ability to defer eating a marshmallow, when promised even more marshmallows for waiting, as a measure of self-control. More recent studies, however, suggest the behavior may also have to do with trust (will those adults really come back with two marshmallows?). Learned behavior (manners) may also play a role.
Are there any confounding factors with motivated reasoning research? It would be foolish to assume there weren’t.
This all brings us to the current research – psychologists used fake news headlines to test subjects’ motivated reasoning. They tested subjects’ ability to tell fake from real news headlines, and the headlines were either in line with their political ideology or against it. But they added a new element: they also tested the subjects’ cognitive style. This adds yet another construct, and therefore another layer of possible confounding factors, but they used a fairly well-established test of “cognitive reflection.”
In cognitive reflection tests (CRTs) subjects are given problems that have an immediate gut answer that is wrong, but a more thorough and analytical approach should reveal the real answer. Here is a classic test: “A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?” The immediate intuitive answer is $0.10, but this is wrong. The ball costs $0.05, which means the bat costs $1.05 ($1.00 more), for a total of $1.10.
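If you want the arithmetic spelled out, here is a quick check (purely illustrative, in Python – nothing from the study itself):

```python
# Bat-and-ball check: let the ball cost x, so the bat costs x + 1.00.
# Then x + (x + 1.00) = 1.10, which gives 2x = 0.10, so x = 0.05.
ball = 0.10 / 2
bat = ball + 1.00
assert abs((ball + bat) - 1.10) < 1e-9   # they total $1.10
assert abs((bat - ball) - 1.00) < 1e-9   # the bat costs exactly $1.00 more
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```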
The researchers used similar CRTs to measure subjects’ intuitive (which they called “lazy” thinking in their title) vs analytical thinking. Their question was – how well does a subject’s CRT score predict their ability to sniff out fake headlines, versus how much does alleged motivated reasoning affect that ability? Here are the results:
We find that CRT performance is negatively correlated with the perceived accuracy of fake news, and positively correlated with the ability to discern fake news from real news – even for headlines that align with individuals’ political ideology. Moreover, overall discernment was actually better for ideologically aligned headlines than for misaligned headlines. Finally, a headline-level analysis finds that CRT is negatively correlated with perceived accuracy of relatively implausible (primarily fake) headlines, and positively correlated with perceived accuracy of relatively plausible (primarily real) headlines. In contrast, the correlation between CRT and perceived accuracy is unrelated to how closely the headline aligns with the participant’s ideology.
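To make the structure of that analysis a little more concrete, here is a minimal sketch in Python. The numbers are invented purely for illustration (they are not the study’s data), and the variable names and scoring scheme are my own assumptions:

```python
# Illustrative sketch only -- the numbers below are invented, not the study's data.
# Each hypothetical subject answers a few CRT items and rates the accuracy of
# real and fake headlines; discernment = mean rating of real minus mean rating of fake.
from scipy.stats import pearsonr  # Pearson correlation coefficient

subjects = [
    # (CRT score out of 3, mean accuracy rating of real headlines, of fake headlines)
    (0, 2.9, 2.6),
    (1, 3.0, 2.2),
    (2, 3.1, 1.9),
    (3, 3.2, 1.5),
    (1, 2.8, 2.4),
    (3, 3.0, 1.6),
]

crt = [s[0] for s in subjects]
fake_belief = [s[2] for s in subjects]                     # perceived accuracy of fake news
discernment = [real - fake for _, real, fake in subjects]  # ability to tell real from fake

# The pattern the study reports, in toy form:
r_fake, _ = pearsonr(crt, fake_belief)   # expected to be negative
r_disc, _ = pearsonr(crt, discernment)   # expected to be positive

print(f"CRT vs perceived accuracy of fake news: r = {r_fake:.2f}")
print(f"CRT vs discernment (real minus fake):   r = {r_disc:.2f}")
```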
So in this construct, “lazy” (intuitive) thinking was a better predictor of falling for fake news than biased partisan thinking was. That is pretty surprising given all the prior research showing that partisan thinking is a strong predictor.
More research is needed to confirm these results and to resolve the apparent conflict with other research using different constructs to examine the same phenomenon. It’s possible that when people are challenged they are more on the defensive, and may in fact be more alert when told something in line with their ideology, because that is exactly how someone would try to fool them. No one will try to fool me by telling me something I don’t believe or want to believe anyway. They will fool me by telling me something they think I want to hear.
The precise details of how the study was conducted, how alert the subjects were to the real purpose of the study, and how clever the fake headlines were all matter. Therefore I don’t think we can generalize from this one study, and independent replication is critical. Researchers also have to think carefully about the construct, and try to break it by identifying confounding factors.
However this turns out, the story of motivated reasoning is a fascinating and important one. Hopefully it will continue to receive attention from researchers.