While some groups are more prone to it, everybody does it to some extent.

People tend to get highly emotional about issues they regard as matters of morality, and they generally attempt to avoid or even punish individuals they regard as immoral. The heated response to moral issues is the exact opposite of what many people consider rational behavior.

Or so many of us would like to think. As it turns out, a new study indicates that people regard rationality itself as a matter of moral behavior. While the study identifies a group of people who take a strong and persistent moral stand on rationality, it also shows that even the control populations tend to do this. The results could go a long way toward explaining why people have self-segregated over ideological issues and respond so heatedly to policy disputes.

The study

The study comes courtesy of a team of three researchers (Tomas Ståhl, Maarten Zaal, and Linda J. Skitka), who were motivated in part by people like the New Atheists and organized groups of skeptics. These individuals, in the researchers' view, have engaged in something akin to a crusade, trying to get everyone to abandon faith and adopt a science-focused world view. The researchers "suggest that advocates of science are frequently anything but value-neutral or amoral in their convictions about the superiority of beliefs based on rationality and scientific evidence," and they then set out to gather some evidence.

The evidence came from a series of surveys conducted using Mechanical Turk. The researchers first devised questions that probed how people viewed rational behavior, both personally and in general. Subjects were asked to rate their agreement with statements like "It is morally wrong to trust your intuitions without rationally examining them" and "It is important to me personally to critically examine my long-held beliefs."

The results showed that there was a population within those surveyed who placed great importance on rationality in general, as distinct from valuing it in their personal behavior. Follow-up surveys showed that this tendency was stable for at least several months. It also correlated with placing a high value on scientific evidence and with a tendency to dismiss things like paranormal and religious beliefs. And the people who moralized rationality tended to treat it the way everyone else treats moral issues: they viewed irrational individuals as best avoided and were more likely to see their actions as deserving some form of sanction.

One obvious explanation for viewing rationality as a moral imperative is that rational evaluation tends to lead to better results. Rationality could thus take on moral weight if people feel that rational evaluation produces a better-functioning society. To get at this issue, the authors checked whether moralizing rationality was associated with a generally utilitarian outlook. It wasn't. It doesn't seem to be the case that people who view rationality as a moral issue do so simply because they think irrationality is generally harmful.

There are a number of issues researchers have previously identified as fundamental to human morality, such as care vs. harm and fairness vs. cheating. The authors of the study set out to determine whether valuing rationality was distinct from these. Unfortunately, this part of the work involved a lot of statistical tests, and it suffered from having the smallest survey population of any of the experiments. While the researchers' results suggest that valuing rationality is a distinct form of moral behavior, this is really something that needs to be replicated. (Valuing rationality as a personal trait, in contrast, was associated with a number of previously defined moral issues.)

Everybody does it

So it appears that there may be a segment of the population that views rational behavior as a moral imperative. The surveys weren't large enough to give us any real sense of how big that segment is, so on its own, this study wouldn't tell us much about the finding's general relevance. But there are some hints that the conclusions might apply to many people.

For one, the basic idea behind the whole work—that moralized rationality might explain the New Atheists—didn't really fare too well. In each individual test, there were only small differences between atheists and those who identified as religious; the authors needed to pool all their tests to find an association between moralized rationality and atheism. To a large extent, it seems that this tendency is present across society.

That idea is also apparent in the portions of the research that ask people to evaluate an individual's behavior in various scenarios. In one example, people were asked to evaluate a doctor who recommended that a patient pray, either because of expectations of a placebo effect (the rational condition) or because of a belief in the power of prayer.

In these scenarios, all groups of people were very likely to assign a moral judgement to rational actions; the only difference between the groups came when they were asked to assign one to irrational actions. The same was largely true when people were asked to assign blame. Here, everybody, whether they moralized rationality or not, was more likely to avoid assigning blame when the person in the scenario acted rationally. (The differences tended to be in how people responded to irrational behavior.)

So while the study indicates that there may be a population that's distinct in terms of the degree to which it assigns a moral value to rationality, there's a tendency to do so even among people who don't belong to that population.

How does this relate to current events? To evaluate that question, we have to step beyond anything in the paper. There's a well-known tendency for people to view their own beliefs as rational and arrived at through careful consideration. People who don't accept the science of evolution or climate change, for example, will often argue that their doubts are based on a careful evaluation of the evidence.

The corollary of this tendency is that, to those individuals, anyone who believes the opposite must not have performed a careful, rational evaluation of the evidence. Based on this study, that may also mean the opposition's reasoning is seen as morally suspect, with all the emotional weight that such a judgement brings with it. Behavior like this may help explain why people are self-segregating based on ideological beliefs, and why political controversies have so frequently resulted in the demonization of the opposition.