Religion is traditionally defined as the belief in and worship of a superhuman controlling power, especially a personal god or gods. Faith, which goes hand in hand with it, is defined as a strong belief in God or in the doctrines of a religion, based on spiritual apprehension rather than proof. I read something interesting the other day that might offer new insight into why we seem so prone to believe rather than investigate further.
First off, evolutionarily speaking, any preserved trait, physical or psychological, could be considered adaptive in some form. The implication is that if a trait was kept in an organism, it stands to reason that the trait was preserved across generations because it assisted that organism’s survival and evolutionary fitness, and thus its propagation. That is, it was preserved through natural selection. This is a relatively novel, but compelling, theory as to what the psychological crux of “religion” may be (Atran & Henrich, 2010). To put it in a phrase: the psychological construct of religion may have been adaptive for our evolutionary fitness, thus selected for, and hence it still prevails today. However, this proves confusing, as many religious practices are “costly.” Viewed through a binary cost-benefit framework, traits that are “costly” should be less functional and therefore primed to be eliminated by natural selection over time. Religions require odd, counterintuitive, intangible beliefs about how the world works, which historically have involved arbitrary self-mutilation or sacrifice. They require large sacrifices of time, sacrifices of choice via stringent commitments and practices, and, lastly, maybe even self-sacrifice, via volcano (my source on this is King Kong) or suicide bombing (my source on this is, sadly, not the movies). The fact is, religions require, by their very definition, unwavering faith, and thus a revocation of one’s autonomy and a relegation of one’s decision-making to a proxy system, which, at least intuitively, seems very costly.
A possible explanation for this seeming paradox lies in an article I came across, published in the journal Biological Theory (2010) by evolutionary researchers Atran and Henrich. Evolutionary psychology posits, extrapolating from anthropological studies, that the survival of small-scale hunter-gatherer societies depends on foraging and hunting practices passed down generationally, often adopted by subsequent generations without any knowledge of why they work.
According to Atran and Henrich (2010), paraphrasing anthropologist Beck:
“Because of the dependence that human ancestors increasingly had to place on such complex, often nonintuitive, products of cumulative cultural evolution, natural selection may have favored a willingness to rely on culturally acquired information, filtered through our adapted biases, over our direct experience or basic intuitions. To see this, consider that many foragers process plant foods to remove toxins without conscious knowledge of what happens without processing.”
In layman’s terms, foragers in the past learned to process their plants in order to detoxify them, without active knowledge of the negative consequences of not processing them. It has therefore been extrapolated that placing “faith” in things, believing in a source of authority or greater power without question, proved adaptive, because, broadly speaking, rediscovering things by trial and error in each subsequent generation is redundant and time-costly. Thus, our ancestors, and consequently we, relegate much of our worldly knowledge to culturally derived information, often without questioning its validity. This ‘adaptation’ may have contributed to the development of, susceptibility to, and perhaps the “natural” inclination of human beings to adopt religious beliefs.
More on how whacked out our brains are later.