by Kenneth W. Krause.
Kenneth W. Krause is a contributing editor and “Science Watch” columnist for the Skeptical Inquirer. Formerly a contributing editor and books columnist for the Humanist, Kenneth contributes regularly to Skeptic as well. He may be contacted at email@example.com.
“Religion, then, as a saving resource for the human species cannot gain much mileage from the origins of morality. But the second part of the claim, that relating to maintenance of social morality, remains more persuasive.”—Alexander Saxton, Religion and the Human Prospect.
No, the human slate hasn’t been anything close to blank since the 1970s. As evolutionary psychologist Marc Hauser reiterated in his 2006 book, Moral Minds, all humans are born with a universal ethical predisposition, or “moral grammar.” That’s quite fortunate, to be sure. But for those of us concerned with the practical nuts and bolts of moral maintenance, it’s really just a serendipitous start. How should modern moral minds go about fashioning cohesive other-regarding societies, both large and small? Can we learn a thing or two from our religious counterparts?
Relative to religionists, humanists appear almost mesmerized by questions surrounding the origin and development of moral behavior. We possess few, if any, prepackaged answers, after all, because as a general rule we spurn “belief” and conformity. Our skepticism tends to complicate ethical issues, to say the least. But even if we could, we should never renounce our insistence on credibility. Intellectual suicide is the answer to no complex societal problem I’ve ever pondered. On the other hand, to some minimal yet meaningful extent, we might decide one day soon to rethink the typically nonnegotiable character of our individualism.
On October 3, 2008, two psychologists from the University of British Columbia, Ara Norenzayan and Azim Shariff, published their review of the empirical evidence for and against religious “prosociality,” more commonly referred to as altruism. (Science 322, 58). The authors began by weighing the strengths and vulnerabilities of various popular theories of religious evolution, but, in the end, found agreement that religious prosociality “may have softened the limitations that kinship-based and … reciprocity-based altruism place on group size.” In other words, natural selection may have favored belief in morally concerned gods that observe, reward, and punish, at least to the extent that religionists have coalesced into relatively large, stable, and cooperative societies of genetically unrelated persons.
Which is not to suggest that Abrahamic monotheists, for example, are more unconditionally or indiscriminately altruistic than others. Although myriad sociological surveys allege that, historically, the religious have been more charitable than the nonreligious, even when controlled for income, education, age, etc., Norenzayan and Shariff point out that all such surveys are subject to fatal presentation biases and, of course, self-deception. “[I]t remains unresolved,” they add, “whether this charity gap persists beyond the ingroup boundaries of the religious groups.” Instead, the authors hypothesize that the religious have merely become “more motivated to maintain a prosocial reputation than the nonreligious.”
Indeed, the behavioral and experimental evidence fails to provide any generalized association between altruism and religiosity per se. In the classic “Good Samaritan” study of 1973, for instance, subjects were led past an apparently ailing victim on their way to an appointment. Predictably, some offered to help and some did not. But religiosity had no significant effect, the authors emphasize, “in this anonymous context.”
In a 1989 study, participants were asked in two ways whether they would volunteer to raise money to pay for a sick child’s medical bills. One group was told that, in the end, they would surely be called to service; the other was informed that they probably would not. A link between religiosity and volunteerism was established only in the second group, where members could have it both ways, indulging in the feeling of altruism without actually suffering its costs. Multiple studies concur, say the authors, that “religiosity predicts prosocial behavior primarily when the prosocial act could promote a positive image for the participant, either in his or her own eyes or in the eyes of observers.”
More recent experiments have examined anonymity in the context of supernatural scrutiny. In a 2006 study, for example, belief in a ghostly presence caused university students to cheat less while completing a fixed computer task. In 2007, researchers reported lower rates of cheating among students subjected to unconscious activation of God concepts, but, interestingly, not among other students who were merely religious. According to Norenzayan and Shariff, “the effect occurred only to the extent that thoughts of a morally concerned divine agent were activated in the moment of decision making.” Apparently, neither religious ideology nor devotion had anything to do with it.
Nevertheless—under certain circumstances—belief in moral gods does seem to inspire the observance of social norms, even in the absence of objective monitoring systems. “This, in turn,” say the authors, “would be expected to expand the reach of such norms, facilitating the emergence of larger cooperative communities which otherwise would be vulnerable to collapse.” Okay, but one could no more ask an empirico-rational humanist to believe in a supernatural agent, no matter what the cumulative and ultimate societal stakes, than one could ask a healthy bird to disregard its gift of flight.
But consider the authors’ findings regarding the value of ritual behavior as well. In a 2003 study of 200 19th-century American communes, religious groups were found to outlast their secular counterparts by a ratio of four to one. Once again, however, religiosity as such had nothing to do with the outcome. Once the number of “costly requirements,” including food taboos, possession limitations, and constraints upon marriage, sex, and outside communication, was controlled, the statistical chasm vanished. One potential implication, the authors suggest, is that “the greater longevity of religious communes with costlier requirements was due to greater intragroup cooperation and trust.” Freeloaders, alas, can easily fake mere belief in morally concerned gods. But they are far less likely to do so, the theory goes, when required to observe onerous rituals and customs before acceptance into the fold.
So what does all of this have to do with us? Although clearly not a religion, humanism is a predominantly moral philosophy. Everyone agrees that morality is about making choices, right? But the humanist realizes as well that a person is most likely to choose wisely for herself and her community when she understands all of the available options. Freedom to investigate and respect for knowledge are paramount. Of course many have argued that it is precisely our devotion to personal liberty and truth that renders every humanist a socially pitiless individual, largely indisposed to sacrifice for the common good.
Then again, consider how far we’ve already come. As Norenzayan and Shariff note, “[t]he cultural spread of reliable secular institutions, such as courts, policing authorities, and effective contract-enforcing mechanisms, although historically recent, has changed the course of human prosociality.” They observe as well that members of contemporary secular organizations are “at least as likely” to be charitable as religious congregants. Indeed, no responsible historian could possibly conclude that the civilized world has not grown increasingly secular in recent centuries. So perhaps the opportunity has arrived for a great humanist awakening of sorts.
It would seem, however, that philosophy alone has proven insufficient to the moral task. I wouldn’t suggest that we pray or “reflect” five times a day in the general direction of the U.S. Supreme Court, or that we sacrifice a finch or two on the eve of Charles Darwin’s birthday. But I think we should habitually re-familiarize ourselves with or, whenever necessary, invent our own unique sets of narratives and icons. We can forget the worn and weary ones of old that defined us as merely irreligious, and embrace more energizing and elevating ones that will distinctly identify us as the most rational yet curiously impassioned and innovative creatures on earth.
Maybe humanists should seriously consider moral maintenance in the mutually reinforcing contexts of ritual and parochial ingroups. Although supernaturalism is clearly an unnecessary and diminishing moral force in the civilized world, we would be grimly remiss to ignore the lessons of religious history. Perhaps now is the time to learn how to recognize and trust one another based on local affiliation and the undeniable power of symbolism and ceremony—no matter how counterintuitive that may sound, at least initially, to the intractably independent freethinker. Of all people, humanists especially should be honest and responsible enough to distinguish the religious baby from its soiled bathwater.