h+ Magazine is a new publication that covers technological, scientific, and cultural trends that are changing human beings in fundamental ways.
Has anyone else noticed modern organized religion is kind of a bummer? Even if your divine belief system isn't violently persecuting another, it seems like you're still trapped in a church singing dirges all Sunday. Modern religion doesn't have any flair. This is why I'd like to offer a modest proposal: Let's bring back the ancient Greek gods. Yes, I mean Zeus, Hera, Apollo, Aphrodite, Ares, the whole shebang — and here's why I think they'd make a significant improvement over our current options.
They're relatable. The Greek gods are definitely gods, but they're also still recognizably human. They have the same emotions, problems and insecurities as regular humans do, and thus, they're far more understandable than nebulous clouds or old bearded men on thrones. The Greek gods actually know what people go through in their lives, because they experience the same feelings. This may make the Greek gods fallible, but it also makes them far more relatable than other divine beings.
They have variety. If you're part of a monotheistic religion, your god is kind of your one-stop shopping for divinity. You're stuck with them, no matter what happens in your life, no matter what your current needs are. But there are tons of Greek gods! Don't think Apollo is getting the job done? Switch to Hephaestus. Have a specific home-related issue coming up? Then pray to Hestia, goddess of the hearth. While a monotheistic god pretty much handles everything for his followers, the Greek gods know how to delegate, giving followers options based on need, preference and situation.
They're easily adaptable to modern life. Most major religions haven't had a serious update for at least a millennium or so. As such, it can be hard to truly integrate these religions into modern times. But thanks to their diversity, the Greek gods would snap right into place. Hermes is obviously the god of cellphones, emails and text messages. As a craftsman, Hephaestus would probably handle all computers and network issues, while Demeter would watch over restaurants. Apollo, the god of YouTube videos. You can't tell me that life wouldn't be at least a little bit easier if we had a god specifically handling YouTube videos.
They are extremely open-minded. Greek gods do not care what you are. They don't care about your gender, the color of your skin, or your sexual preference. They have never told anyone to start a war (except Ares, the god of war, and even then it was just for the sake of having a war, not to persecute other groups). In fact, it was generally the ancient Greeks who started their own wars and then asked the gods for help, at which point the gods would pick sides. All I'm saying is that the Greek gods never inspired any holy wars, never gave anyone shit for not believing in them, and never demanded their followers proselytize. The Greek gods only cared about themselves, and the side benefit of that self-centeredness was a refreshing lack of prejudice.
You know where you stand with a Greek god. The Greek gods are like hormonal teenagers. Their emotions run high, and can change at the drop of a hat. They're easily angered and easily enamored, but they can be managed. You know they're going to be volatile, so you can deal with that — and if they happen to be nice and kind to you, hey, bonus. The Greek gods didn't suddenly change from a brimstone-and-fire-worship-me-or-I'll-smite-you Old Testament-type thing to a love-everybody-hippie-dippie New Testament-thing, completely contradicting themselves. They've always been self-centered jerks, making them consistent, if nothing else.
Greek gods will have sex with you. That's pretty awesome. Just knowing you have a chance to score with a god or goddess adds a certain zest to life. Now admittedly, sometimes the Greek gods got a little… er, rape-y, and that's not cool. On the other hand, Law & Order: SVU would become super exciting.
They make at least as much sense as the other guys. One of the biggest problems with the Judeo-Christian God that Christian scholars have tried to rationalize over the centuries is how a good and loving god could allow evil to exist; while they've come up with plenty of answers, none of them are particularly satisfying. This isn't an issue for the Greek gods, because they aren't pretending to be omnipotent and loving. Like humans, they can be good and evil themselves. You don't have to wonder why the Greek gods let bad things happen to good people, because the Greek gods can simply be assholes. They care about you as long as you're caring/genuflecting/sacrificing bulls to them. Tit for tat. Honestly, just take a look around. Does it seem like the universe is currently being run by one omniscient guy who completely loves everybody or by a bunch of over-emotional, self-centered jerks? I rest my case.
They're so much more fun. Here's a short list of things we could do if we brought back the Greek gods:
This speech was given by Jim Gilliam at the Personal Democracy Forum on June 6th 2011, and has been viewed approximately 500,000 times.
BY VALERIE TARICO, ALTERNET
As we head into a new year, the guardians of traditional religion are ramping up efforts to keep their flocks—or, in crass economic terms, to retain market share. Some Christians have turned to soul searching while others have turned to marketing. Last fall, the LDS church spent millions on billboards, bus banners, and Facebook ads touting “I’m a Mormon.” In Canada, the Catholic Church has launched a “Come Home” marketing campaign. The Southern Baptist Convention voted to rebrand itself. A hipster mega-church in Seattle combines smart advertising with sales force training for members and a strategy the Catholics have emphasized for centuries: competitive breeding.
In October of 2012 the Pew Research Center announced that for the first time ever Protestant Christians had fallen below 50 percent of the American population. Atheists cheered and evangelicals beat their breasts and lamented the end of the world as we know it. Historian of religion Molly Worthen has since offered big-picture insights that may dampen the most extreme hopes and fears. Anthropologist Jennifer James, on the other hand, has called fundamentalism the “death rattle” of the Abrahamic traditions.
In all of the frenzy, few seem to give any recognition to the player that I see as the primary hero, or, if you prefer, culprit—and I’m not talking about science popularizer and atheist superstar Neil deGrasse Tyson. Then again, maybe I am talking about Tyson in a sense, because in his various viral guises—as a talk show host and tweeter and as the face on scores of smartass Facebook memes—Tyson is an incarnation of the biggest threat that organized religion has ever faced: the internet.
A traditional religion, one built on “right belief,” requires a closed information system. That is why the Catholic Church put an official seal of approval on some ancient texts and banned or burned others. It is why some Bible-believing Christians are forbidden to marry nonbelievers. It is why Quiverfull moms home school their kids from carefully screened textbooks. It is why, when you get sucked into conversations with your fundamentalist uncle George from Florida, you sometimes wonder if he has some superpower that allows him to magically close down all avenues into his mind. (He does!)
Religions have spent eons honing defenses that keep outside information away from insiders. The innermost ring wall is a set of certainties and associated emotions like anxiety and disgust and righteous indignation that block curiosity. The outer wall is a set of behaviors aimed at insulating believers from contradictory evidence and from heretics who are potential transmitters of dangerous ideas. These behaviors range from memorizing sacred texts to wearing distinctive undergarments to killing infidels. Such defenses worked beautifully during humanity’s infancy. But they weren’t really designed for the current information age.
Tech-savvy mega-churches may have Twitter missionaries, and Calvinist cuties may make viral videos about how Jesus worship isn’t a religion, it’s a relationship, but that doesn’t change the facts: the free flow of information is really, really bad for the product they are selling. Here are five kinds of web content that are like, well, like electrolysis on religion’s hairy toes.
Radically cool science videos and articles. Religion evokes some of our most deeply satisfying emotions: joy, for example, and transcendence, and wonder. This is what Einstein was talking about when he said that “science without religion is lame.” If scientific inquiry doesn’t fill us at times with delight and even speechless awe at new discoveries or the mysteries that remain, then we are missing out on the richest part of the experience. Fortunately, science can provide all of the above, and certain masters of the trade and sectors of the internet are remarkably effective at evoking the wonder—the spirituality if you will—of the natural world unveiled. Some of my own favorites include Symphony of science, NOVA, TED, RSA Animate, and Birdnote.
It should be no surprise that so many fundamentalists are determined to take down the whole scientific endeavor. They see in science not only a critic of their outdated theories but a competitor for their very best product, a sense of transcendent exuberance. For millennia, each religion has made an exclusive claim, that it alone had the power to draw people into a grand vision worth a lifetime of devotion. Each offered the assurance that our brief lives matter and that, in some small way, we might live on. Now we are getting glimpses of a reality so beautiful and so intricate that it offers some of the same promise. Where will the old tribal religions be if, in words of Tracy Chapman, we all decide that Heaven’s here on earth?
Curated Collections of Ridiculous Beliefs. Religious beliefs that aren’t yours often sound silly, and the later in life you encounter them the more laughable they are likely to sound. Web writers are after eyeballs, which means that if there’s something ridiculous to showcase, then someone is guaranteed to write about it. It may be a nuanced exposé or a snarky list or a flaming meme, but the point, invariably, is to call attention to the stuff that makes you roll your eyes, shake your head in disbelief, laugh, and then hit Share.
The Kinky, Exploitative, Oppressive, Opportunistic and Violent Sides of Religion. Of course, the case against religion doesn’t stop at weird and wacky. It gets nasty, sometimes in ways that are titillating and sometimes in ways that are simply dark. The Bible is full of sex slavery, polygamy and incest, and these are catalogued at places like Evilbible.com. Alternately, a student writing about holidays can find a proclamation in which Puritans give thanks to God for the burning of Indian villages or an interview on the mythic origins of the Christmas story. And if the Catholic “Come Home” plea sounds a little desperate, it may well be because the sins of the bishops are getting hard to cover up. On the net, whatever the story may be, someone will be more than willing to expose it.
Supportive communities for people coming out of religion. With or without the net (but especially with it) believers sometimes find their worldview in pieces. Before the internet existed, most people who lost their faith kept their doubts to themselves. There was no way to figure out who else might be thinking forbidden thoughts. In some sects, a doubting member may be shunned, excommunicated, or “disfellowshipped” to ensure that doubts don’t spread. So, doubters used to keep silent and then disappear into the surrounding culture. Now they can create websites, and today there are as many communities of former believers as there are kinds of belief. These communities range from therapeutic to political, and they cover the range of sects: Evangelical, Mormon, Jehovah’s Witness, and Muslim. There’s even a web home for recovering clergy. Heaven help the unsuspecting believer who wanders into one of these sites and tries to tell members in recovery that they’re all bound for hell.
Lifestyles of the fine and faithless. When they emerge from the recovery process, former Christians and Muslims and whatnot find that there’s a whole secular world waiting for them on the web. This can be a lifesaver, literally, for folks who are trapped in closed religious communities on the outside. On the web, they can explore lifestyles in which people stay surprisingly decent and kind without a sacred text or authority figures telling them what to do. In actuality, since so much of religion is about social support (and social control), lots of people skip the intellectual arguments and exposés, and go straight to building a new identity based in a new social network. Some web resources are specifically aimed at creating alternatives to theism, for example, Good without God, Parenting Beyond Belief, or The Foundation Beyond Belief.
Interspiritual Okayness. This might sound odd, but one of the threats to traditional religion is interfaith communities that focus on shared spiritual values. Many religions make exclusive truth claims and see other religions as competitors. Without such claims, there is no need for evangelism, missionaries or a set of doctrines that I call donkey motivators (i.e., carrots and sticks) like heaven and hell. The web showcases the fact that humanity’s bad and good qualities are universal, spread across cultures and regions, across both secular and religious wisdom traditions. It offers reassurance that we won’t lose the moral or spiritual dimension of life if we outgrow religion, while at the same time providing the means to glean what is truly timeless and wise from old traditions. In doing so, it inevitably reveals the limitations of any single tradition alone. The Dalai Lama, who has led interspiritual dialogue for many years, made waves recently by saying as much: “All the world’s major religions, with their emphasis on love, compassion, patience, tolerance, and forgiveness can and do promote inner values. But the reality of the world today is that grounding ethics in religion is no longer adequate. This is why I am increasingly convinced that the time has come to find a way of thinking about spirituality and ethics beyond religion altogether.”
The power of interspiritual dialogue is analogous to the broader power of the web in that, at its very heart, it is about people finding common ground, exchanging information, and breaking through walls to find a bigger community waiting outside. Last year, Jim Gilliam, founder of NationBuilder, gave a talk titled “The Internet is My Religion.” Gilliam is a former fundamentalist who has survived two bouts of cancer thanks to the power of science and the internet. His existence today has required a bone marrow transplant and a double lung transplant organized in part through social media. Looking back on the experience, he speaks with the same passion that drove him when he was on fire for Jesus:
I owed every moment of my life to countless people I would never meet. Tomorrow, that interconnectedness would be represented in my own physical body. Three different DNAs. Individually they were useless, but together they would equal one functioning human. What an incredible debt to repay. I didn’t even know where to start. And that’s when I truly found God. God is just what happens when humanity is connected. Humanity connected is God.
The Vatican, and the Mormon Quorum of the Twelve Apostles, and the Southern Baptist Convention should be very worried.
By Nick Collins
People with brown eyes were judged to be more trustworthy by a panel of volunteers than their blue-eyed peers in a study by researchers from Charles University in Prague.
But when a separate group of volunteers was shown photographs of men with identical faces whose eye colours had been artificially changed, the men were rated as looking equally trustworthy.
The findings suggest that the key to being trusted lies in facial characteristics which are shared by people with brown eyes but which are not related to eye colour, experts said.
Brown-eyed men, for example, tend to have rounder faces with broader chins, wider mouths with upward-pointing corners, larger eyes and eyebrows closer to each other – qualities which are more masculine and therefore more trustworthy.
In contrast, men with blue eyes shared qualities which made them appear more shifty such as smaller eyes and narrower mouths with downward-pointing corners.
Women with brown eyes were also viewed as slightly more trustworthy than those with blue eyes, but the difference was much less stark than with men.
Writing in the Public Library of Science ONE journal, the researchers concluded: "Although the brown-eyed faces were perceived as more trustworthy than the blue-eyed ones, it was not brown eye colour per se that caused the stronger perception of trustworthiness but rather the facial features associated with brown eyes."
The findings raise the question of why blue eyes are so common in Northern Europe, if they disadvantage their bearers by making them appear less honest and dependable.
It could be that the bright and attention-grabbing colour of blue eyes provides an advantage in sexual selection which offsets the lack of trustworthiness in the face, researchers suggested.
You can read Begley’s article on the Saturday Evening Post’s website.
David Eagleman, PhD, neuroscientist, Tom Slick Research Award in Consciousness recipient, and best-selling author of SUM: Forty Tales from the Afterlives, shared his thoughts on "The Future of Reality." Dr. Eagleman gave compelling examples of how reality is a matter of individual perception and how Nature's adaptations function as "plug-ins" for the brain.
February 25, 2013 • By Ethan Watters
IN THE SUMMER of 1995, a young graduate student in anthropology at UCLA named Joe Henrich traveled to Peru to carry out some fieldwork among the Machiguenga, an indigenous people who live north of Machu Picchu in the Amazon basin. The Machiguenga had traditionally been horticulturalists who lived in single-family, thatch-roofed houses in small hamlets composed of clusters of extended families. For sustenance, they relied on local game and produce from small-scale farming. They shared with their kin but rarely traded with outside groups.
While the setting was fairly typical for an anthropologist, Henrich’s research was not. Rather than practice traditional ethnography, he decided to run a behavioral experiment that had been developed by economists. Henrich used a “game”—along the lines of the famous prisoner’s dilemma—to see whether isolated cultures shared with the West the same basic instinct for fairness. In doing so, Henrich expected to confirm one of the foundational assumptions underlying such experiments, and indeed underpinning the entire fields of economics and psychology: that humans all share the same cognitive machinery—the same evolved rational and psychological hardwiring.
The test that Henrich introduced to the Machiguenga was called the ultimatum game. The rules are simple: in each game there are two players who remain anonymous to each other. The first player is given an amount of money, say $100, and told that he has to offer some of the cash, in an amount of his choosing, to the other subject. The second player can accept or refuse the split. But there’s a hitch: players know that if the recipient refuses the offer, both leave empty-handed. North Americans, who are the most common subjects for such experiments, usually offer a 50-50 split when on the giving end. When on the receiving end, they show an eagerness to punish the other player for uneven splits at their own expense. In short, Americans show the tendency to be equitable with strangers—and to punish those who are not.
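The payoff rules described above are simple enough to sketch in code. Below is a minimal simulation of the ultimatum game; the function implements the rules exactly as stated (rejection leaves both players empty-handed), while the population parameters are purely illustrative assumptions of my own, not Henrich's data:

```python
import random

def ultimatum_round(pot, offer_fraction, min_acceptable_fraction):
    """Play one ultimatum round.

    The proposer offers `offer_fraction` of the pot; the responder accepts
    only if the offer meets their threshold. A rejection means both players
    leave with nothing -- the "hitch" described in the article.
    """
    offer = pot * offer_fraction
    if offer_fraction >= min_acceptable_fraction:
        return pot - offer, offer   # (proposer payoff, responder payoff)
    return 0.0, 0.0                 # rejected: both leave empty-handed

def average_payoffs(offer_mean, threshold_mean, pot=100.0, rounds=10_000):
    """Average payoffs for a population with noisy offer/threshold norms.

    The means below are hypothetical stand-ins for cultural norms:
    near-even offers with punitive thresholds (US-like), versus low
    offers that are almost never refused (Machiguenga-like).
    """
    proposer_total = responder_total = 0.0
    for _ in range(rounds):
        offer = min(max(random.gauss(offer_mean, 0.1), 0.0), 1.0)
        threshold = min(max(random.gauss(threshold_mean, 0.1), 0.0), 1.0)
        p, r = ultimatum_round(pot, offer, threshold)
        proposer_total += p
        responder_total += r
    return proposer_total / rounds, responder_total / rounds

if __name__ == "__main__":
    random.seed(0)
    print("US-like norms:         ", average_payoffs(0.50, 0.30))
    print("Machiguenga-like norms:", average_payoffs(0.25, 0.05))
```

Running the sketch makes the article's point concrete: under punitive thresholds, low offers destroy value for both sides, so even-split norms are self-stabilizing in a way the Machiguenga players had no cultural reason to expect.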
Among the Machiguenga, word quickly spread of the young, square-jawed visitor from America giving away money. The stakes Henrich used in the game with the Machiguenga were not insubstantial—roughly equivalent to the few days’ wages they sometimes earned from episodic work with logging or oil companies. So Henrich had no problem finding volunteers. What he had great difficulty with, however, was explaining the rules, as the game struck the Machiguenga as deeply odd.
When he began to run the game it became immediately clear that Machiguengan behavior was dramatically different from that of the average North American. To begin with, the offers from the first player were much lower. In addition, when on the receiving end of the game, the Machiguenga rarely refused even the lowest possible amount. “It just seemed ridiculous to the Machiguenga that you would reject an offer of free money,” says Henrich. “They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game.”
Joe Henrich and research assistant Vilisi administer the Third Party Punishment Game in the village of Teci on Fiji’s Yasawa Island.
The potential implications of the unexpected results were quickly apparent to Henrich. He knew that a vast amount of scholarly literature in the social sciences—particularly in economics and psychology—relied on the ultimatum game and similar experiments. At the heart of most of that research was the implicit assumption that the results revealed evolved psychological traits common to all humans, never mind that the test subjects were nearly always from the industrialized West. Henrich realized that if the Machiguenga results stood up, and if similar differences could be measured across other populations, this assumption of universality would have to be challenged.
Henrich had thought he would be adding a small branch to an established tree of knowledge. It turned out he was sawing at the very trunk. He began to wonder: What other certainties about “human nature” in social science research would need to be reconsidered when tested across diverse populations?
Henrich soon landed a grant from the MacArthur Foundation to take his fairness games on the road. With the help of a dozen other colleagues he led a study of 14 other small-scale societies, in locales from Tanzania to Indonesia. Differences abounded in the behavior of both players in the ultimatum game. In no society did he find people who were purely selfish (that is, who always offered the lowest amount, and never refused a split), but average offers from place to place varied widely and, in some societies—ones where gift-giving is heavily used to curry favor or gain allegiance—the first player would often make overly generous offers in excess of 60 percent, and the second player would often reject them, behaviors almost never observed among Americans.
The research established Henrich as an up-and-coming scholar. In 2004, he was given the U.S. Presidential Early Career Award for young scientists at the White House. But his work also made him a controversial figure. When he presented his research to the anthropology department at the University of British Columbia during a job interview a year later, he recalls a hostile reception. Anthropology is the social science most interested in cultural differences, but the young scholar’s methods of using games and statistics to test and compare cultures with the West seemed heavy-handed and invasive to some. “Professors from the anthropology department suggested it was a bad thing that I was doing,” Henrich remembers. “The word ‘unethical’ came up.”
So instead of toeing the line, he switched teams. A few well-placed people at the University of British Columbia saw great promise in Henrich’s work and created a position for him, split between the economics department and the psychology department. It was in the psychology department that he found two kindred spirits in Steven Heine and Ara Norenzayan. Together the three set about writing a paper that they hoped would fundamentally challenge the way social scientists thought about human behavior, cognition, and culture.
A MODERN LIBERAL ARTS education gives lots of lip service to the idea of cultural diversity. It’s generally agreed that all of us see the world in ways that are sometimes socially and culturally constructed, that pluralism is good, and that ethnocentrism is bad. But beyond that the ideas get muddy. That we should welcome and celebrate people of all backgrounds seems obvious, but the implied corollary—that people from different ethno-cultural origins have particular attributes that add spice to the body politic—becomes more problematic. To avoid stereotyping, it is rarely stated bluntly just exactly what those culturally derived qualities might be. Challenge liberal arts graduates on their appreciation of cultural diversity and you’ll often find them retreating to the anodyne notion that under the skin everyone is really alike.
If you take a broad look at the social science curriculum of the last few decades, it becomes a little more clear why modern graduates are so unmoored. The last generation or two of undergraduates have largely been taught by a cohort of social scientists busily doing penance for the racism and Eurocentrism of their predecessors, albeit in different ways. Many anthropologists took to the navel gazing of postmodernism and swore off attempts at rationality and science, which were disparaged as weapons of cultural imperialism.
Economists and psychologists, for their part, did an end run around the issue with the convenient assumption that their job was to study the human mind stripped of culture. The human brain is genetically comparable around the globe, it was agreed, so human hardwiring for much behavior, perception, and cognition should be similarly universal. No need, in that case, to look beyond the convenient population of undergraduates for test subjects. A 2008 survey of the top six psychology journals dramatically shows how common that assumption was: more than 96 percent of the subjects tested in psychological studies from 2003 to 2007 were Westerners—with nearly 70 percent from the United States alone. Put another way: 96 percent of human subjects in these studies came from countries that represent only 12 percent of the world’s population.
Henrich’s work with the ultimatum game was an example of a small but growing countertrend in the social sciences, one in which researchers look straight at the question of how deeply culture shapes human cognition. His new colleagues in the psychology department, Heine and Norenzayan, were also part of this trend. Heine focused on the different ways people in Western and Eastern cultures perceived the world, reasoned, and understood themselves in relationship to others. Norenzayan’s research focused on the ways religious belief influenced bonding and behavior. The three began to compile examples of cross-cultural research that, like Henrich’s work with the Machiguenga, challenged long-held assumptions of human psychological universality.
Some of that research went back a generation. It was in the 1960s, for instance, that researchers discovered that aspects of visual perception were different from place to place. One of the classics of the literature, the Müller-Lyer illusion, showed that where you grew up would determine to what degree you would fall prey to the illusion that these two lines are different in length:
Researchers found that Americans perceive the line with the ends feathered outward (B) as being longer than the line with the arrow tips (A). San foragers of the Kalahari, on the other hand, were more likely to see the lines as they are: equal in length. Subjects from more than a dozen cultures were tested, and Americans were at the far end of the distribution—seeing the illusion more dramatically than all others.
More recently psychologists had challenged the universality of research done in the 1950s by pioneering social psychologist Solomon Asch. Asch had discovered that test subjects were often willing to make incorrect judgments on simple perception tests to conform with group pressure. When the test was performed across 17 societies, however, it turned out that group pressure had a range of influence. Americans were again at the far end of the scale, in this case showing the least tendency to conform to group belief.
As Heine, Norenzayan, and Henrich furthered their search, they began to find research suggesting wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic. The distinct ways Americans and Machiguengans played the ultimatum game, for instance, weren’t because they had differently evolved brains. Rather, Americans, without fully realizing it, were manifesting a psychological tendency shared with people in other industrialized countries that had been refined and handed down through thousands of generations in ever more complex market economies. When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit, a call to the Better Business Bureau, or a bad Yelp review) when they feel cheated. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In the small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our sense of fairness; it was the other way around.
The growing body of cross-cultural research that the three researchers were compiling suggested that the mind’s capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do—the rituals, eating preferences, codes of behavior, and the like—but in the way they mold our most fundamental conscious and unconscious thinking and perception.
For instance, the different ways people perceive the Müller-Lyer illusion likely reflects lifetimes spent in different physical environments. American children, for the most part, grow up in box-shaped rooms of varying dimensions. Surrounded by carpentered corners, visual perception adapts to this strange new environment (strange and new in terms of human history, that is) by learning to perceive converging lines in three dimensions.
When unconsciously translated into three dimensions, the line with the outward-feathered ends (B) appears farther away, and the brain therefore judges it to be longer. The more time one spends in natural environments, where there are no carpentered corners, the less one sees the illusion.
As the three continued their work, they noticed something else that was remarkable: again and again one group of people appeared to be particularly unusual when compared to other populations—with perceptions, behaviors, and motivations that were almost always sliding down one end of the human bell curve.
In the end they titled their paper “The Weirdest People in the World?” By “weird” they meant both unusual and Western, Educated, Industrialized, Rich, and Democratic. It is not just our Western habits and cultural preferences that are different from the rest of the world, it appears. The very way we think about ourselves and others—and even the way we perceive reality—makes us distinct from other humans on the planet, not to mention from the vast majority of our ancestors. Among Westerners, the data showed that Americans were often the most unusual, leading the researchers to conclude that “American participants are exceptional even within the unusual population of Westerners—outliers among outliers.”
Given the data, they concluded that social scientists could not possibly have picked a worse population from which to draw broad generalizations. Researchers had been doing the equivalent of studying penguins while believing that they were learning insights applicable to all birds.
NOT LONG AGO I met Henrich, Heine, and Norenzayan for dinner at a small French restaurant in Vancouver, British Columbia, to hear about the reception of their weird paper, which was published in the prestigious journal Behavioral and Brain Sciences in 2010. The trio of researchers are young—as professors go—good-humored family men. They recalled that they were nervous as the publication time approached. The paper basically suggested that much of what social scientists thought they knew about fundamental aspects of human cognition was likely only true of one small slice of humanity. They were making such a broadside challenge to whole libraries of research that they steeled themselves to the possibility of becoming outcasts in their own fields.
“We were scared,” admitted Henrich. “We were warned that a lot of people were going to be upset.”
“We were told we were going to get spit on,” interjected Norenzayan.
“Yes,” Henrich said. “That we’d go to conferences and no one was going to sit next to us at lunchtime.”
Interestingly, they seemed much less concerned that they had used the pejorative acronym WEIRD to describe a significant slice of humanity, although they did admit that they could only have done so to describe their own group. “Really,” said Henrich, “the only people we could have called weird are represented right here at this table.”
Still, I had to wonder whether describing the Western mind, and the American mind in particular, as weird suggested that our cognition is not just different but somehow malformed or twisted. In their paper the trio pointed out cross-cultural studies that suggest that the “weird” Western mind is the most self-aggrandizing and egotistical on the planet: we are more likely to promote ourselves as individuals versus advancing as a group. WEIRD minds are also more analytic, possessing the tendency to telescope in on an object of interest rather than understanding that object in the context of what is around it.
The WEIRD mind also appears to be unique in terms of how it comes to understand and interact with the natural world. Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world. While studying children from the U.S., researchers have suggested a developmental timeline for what is called “folkbiological reasoning.” These studies posit that it is not until children are around 7 years old that they stop projecting human qualities onto animals and begin to understand that humans are one animal among many. Compared to Yucatec Maya communities in Mexico, however, Western urban children appear to be developmentally delayed in this regard. Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.
Given that people living in WEIRD societies don’t routinely encounter or interact with animals other than humans or pets, it’s not surprising that they end up with a rather cartoonish understanding of the natural world. “Indeed,” the report concluded, “studying the cognitive development of folkbiology in urban children would seem the equivalent of studying ‘normal’ physical growth in malnourished children.”
During our dinner, I admitted to Heine, Henrich, and Norenzayan that the idea that I can only perceive reality through a distorted cultural lens was unnerving. For me the notion raised all sorts of metaphysical questions: Is my thinking so strange that I have little hope of understanding people from other cultures? Can I mold my own psyche or the psyches of my children to be less WEIRD and more able to think like the rest of the world? If I did, would I be happier?
Henrich reacted with mild concern that I was taking this research so personally. He had not intended, he told me, for his work to be read as postmodern self-help advice. “I think we’re really interested in these questions for the questions’ sake,” he said.
The three insisted that their goal was not to say that one culturally shaped psychology was better or worse than another—only that we’ll never truly understand human behavior and cognition until we expand the sample pool beyond its current small slice of humanity. Despite these assurances, however, I found it hard not to read a message between the lines of their research. When they write, for example, that weird children develop their understanding of the natural world in a “culturally and experientially impoverished environment” and that they are in this way the equivalent of “malnourished children,” it’s difficult to see this as a good thing.
THE TURN THAT HENRICH, Heine, and Norenzayan are asking social scientists to make is not an easy one: accounting for the influence of culture on cognition will be a herculean task. Cultures are not monolithic; they can be endlessly parsed. Ethnic backgrounds, religious beliefs, economic status, parenting styles, rural upbringing versus urban or suburban—there are hundreds of cultural differences that individually and in endless combinations influence our conceptions of fairness, how we categorize things, our method of judging and decision making, and our deeply held beliefs about the nature of the self, among other aspects of our psychological makeup.
We are just at the beginning of learning how these fine-grained cultural differences affect our thinking. Recent research has shown that people in “tight” cultures, those with strong norms and low tolerance for deviant behavior (think India, Malaysia, and Pakistan), develop higher impulse control and more self-monitoring abilities than those from other places. Men raised in the honor culture of the American South have been shown to experience much larger surges of testosterone after insults than do Northerners. Research published late last year suggested psychological differences at the city level too. Compared to San Franciscans, Bostonians’ internal sense of self-worth is more dependent on community status and financial and educational achievement. “A cultural difference doesn’t have to be big to be important,” Norenzayan said. “We’re not just talking about comparing New York yuppies to the Dani tribesmen of Papua New Guinea.”
As Norenzayan sees it, the last few generations of psychologists have suffered from “physics envy,” and they need to get over it. The job, experimental psychologists often assumed, was to push past the content of people’s thoughts and see the underlying universal hardware at work. “This is a deeply flawed way of studying human nature,” Norenzayan told me, “because the content of our thoughts and their process are intertwined.” In other words, if human cognition is shaped by cultural ideas and behavior, it can’t be studied without taking into account what those ideas and behaviors are and how they are different from place to place.
This new approach suggests the possibility of reverse-engineering psychological research: look at cultural content first; cognition and behavior second. Norenzayan’s recent work on religious belief is perhaps the best example of the intellectual landscape that is now open for study. When Norenzayan became a student of psychology in 1994, four years after his family had moved from Lebanon to America, he was excited to study the effect of religion on human psychology. “I remember opening textbook after textbook and turning to the index and looking for the word ‘religion,’ ” he told me. “Again and again the very word wouldn’t be listed. This was shocking. How could psychology be the science of human behavior and have nothing to say about religion? Where I grew up you’d have to be in a coma not to notice the importance of religion on how people perceive themselves and the world around them.”
Norenzayan became interested in how certain religious beliefs, handed down through generations, may have shaped human psychology to make possible the creation of large-scale societies. He has suggested that there may be a connection between the growth of religions that believe in “morally concerned deities”—that is, a god or gods who care if people are good or bad—and the evolution of large cities and nations. To be cooperative in large groups of relative strangers, in other words, might have required the shared belief that an all-powerful being was forever watching over your shoulder.
If religion was necessary in the development of large-scale societies, can large-scale societies survive without religion? Norenzayan points to parts of Scandinavia with atheist majorities that seem to be doing just fine. They may have climbed the ladder of religion and effectively kicked it away. Or perhaps, after a thousand years of religious belief, the idea of an unseen entity always watching your behavior remains in our culturally shaped thinking even after the belief in God dissipates or disappears.
Why, I asked Norenzayan, if religion might have been so central to human psychology, have researchers not delved into the topic? “Experimental psychologists are the weirdest of the weird,” said Norenzayan. “They are almost the least religious academics, next to biologists. And because academics mostly talk amongst themselves, they could look around and say, ‘No one who is important to me is religious, so this must not be very important.’” Indeed, almost every major theorist on human behavior in the last 100 years predicted that it was just a matter of time before religion was a vestige of the past. But the world persists in being a very religious place.
HENRICH, HEINE, AND NORENZAYAN’S FEAR of being ostracized after the publication of the WEIRD paper turned out to be misplaced. Response to the paper, both published and otherwise, has been nearly universally positive, with more than a few of their colleagues suggesting that the work will spark fundamental changes. “I have no doubt that this paper is going to change the social sciences,” said Richard Nisbett, an eminent psychologist at the University of Michigan. “It just puts it all in one place and makes such a bold statement.”
More remarkable still, after reading the paper, academics from other disciplines began to come forward with their own mea culpas. Commenting on the paper, two brain researchers from Northwestern University argued (pdf) that the nascent field of neuroimaging had made the same mistake as psychologists, noting that 90 percent of neuroimaging studies were performed in Western countries. Researchers in motor development similarly suggested that their discipline’s body of research ignored how different child-rearing practices around the world can dramatically influence states of development. Two psycholinguistics professors suggested that their colleagues had also made the same mistake: blithely assuming human homogeneity while focusing their research primarily on one rather small slice of humanity.
At its heart, the challenge of the WEIRD paper is not simply to the field of experimental human research (do more cross-cultural studies!); it is a challenge to our Western conception of human nature. For some time now, the most widely accepted answer to the question of why humans, among all animals, have so successfully adapted to environments across the globe is that we have big brains with the ability to learn, improvise, and problem-solve.
Henrich has challenged this “cognitive niche” hypothesis with the “cultural niche” hypothesis. He notes that the amount of knowledge in any culture is far greater than the capacity of individuals to learn or figure it all out on their own. He suggests that individuals tap that cultural storehouse of knowledge simply by mimicking (often unconsciously) the behavior and ways of thinking of those around them. We shape a tool in a certain manner, adhere to a food taboo, or think about fairness in a particular way, not because we individually have figured out that behavior’s adaptive value, but because we instinctively trust our culture to show us the way. When Henrich asked Fijian women why they avoided certain potentially toxic fish during pregnancy and breastfeeding, he found that many didn’t know or had fanciful reasons. Regardless of their personal understanding, by mimicking this culturally adaptive behavior they were protecting their offspring. The unique trick of human psychology, these researchers suggest, might be this: our big brains are evolved to let local culture lead us in life’s dance.
The applications of this new way of looking at the human mind are still in the offing. Henrich suggests that his research about fairness might first be applied to anyone working in international relations or development. People are not “plug and play,” as he puts it, and you cannot expect to drop a Western court system or form of government into another culture and expect it to work as it does back home. Those trying to use economic incentives to encourage sustainable land use will similarly need to understand local notions of fairness to have any chance of influencing behavior in predictable ways.
Because of our peculiarly Western way of thinking of ourselves as independent of others, this idea of the culturally shaped mind doesn’t go down very easily. Perhaps the richest and most established vein of cultural psychology—that which compares Western and Eastern concepts of the self—goes to the heart of this problem. Heine has spent much of his career following the lead of a seminal paper published in 1991 by Hazel Rose Markus, of Stanford University, and Shinobu Kitayama, who is now at the University of Michigan. Markus and Kitayama suggested that different cultures foster strikingly different views of the self, particularly along one axis: some cultures regard the self as independent from others; others see the self as interdependent. The interdependent self—which is more the norm in East Asian countries, including Japan and China—connects itself with others in a social group and favors social harmony over self-expression. The independent self—which is most prominent in America—focuses on individual attributes and preferences and thinks of the self as existing apart from the group.
The classic “rod and frame” task: Is the line in the center vertical?
That we in the West develop brains that are wired to see ourselves as separate from others may also be connected to differences in how we reason, Heine argues. Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically. That is, the American mind strives to figure out the world by taking it apart and examining its pieces. Show a Japanese person and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. The difference shows up another way as well: in a different test, analytic Americans will do better on something called the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.
Heine and others suggest that such differences may be the echoes of cultural activities and trends going back thousands of years. Whether you think of yourself as interdependent or independent may depend on whether your distant ancestors farmed rice (which required a great deal of shared labor and group cooperation) or herded animals (which rewarded individualism and aggression). Heine points to Nisbett at Michigan, who has argued (pdf) that the analytic/holistic dichotomy in reasoning styles can be clearly seen, respectively, in Greek and Chinese philosophical writing dating back 2,500 years. These psychological trends and tendencies may echo down generations, hundreds of years after the activity or situation that brought them into existence has disappeared or fundamentally changed.
And here is the rub: the culturally shaped analytic/individualistic mind-sets may partly explain why Western researchers have so dramatically failed to take into account the interplay between culture and cognition. In the end, the goal of boiling down human psychology to hardwiring is not surprising given the type of mind that has been designing the studies. Taking an object (in this case the human mind) out of its context is, after all, what distinguishes the analytic reasoning style prevalent in the West. Similarly, we may have underestimated the impact of culture because the very ideas of being subject to the will of larger historical currents and of unconsciously mimicking the cognition of those around us challenges our Western conception of the self as independent and self-determined. The historical missteps of Western researchers, in other words, have been the predictable consequences of the WEIRD mind doing the thinking.
For readers of a more scientific bent, Phillips' frequent appeals to literature may seem out of place. His style, too — full of head-scratching paradoxes and qualified propositions (couched in phrases like "we might wonder" or "we might even say") — can sometimes be so fluid that it's difficult to pin down exactly what the writer believes. Yet Missing Out isn't supposed to be a scientific treatise on the architecture of the brain. It's a meditation on who we are that forgoes easy answers in favor of better questions. Because Phillips believes that, for imperfect, desiring creatures like us, the easy answers may be the most harmful ones. To avoid slipping into anger or revenge, he concludes, "We need ... to have better — more interesting, more enlivening, more satisfying — conversations about our frustrations." His book is a very good place to start.
This speech was given by Jim Gilliam at the Personal Democracy Forum on June 6th 2011, and has been viewed approximately 500,000 times.
Add your thoughts in the comments below, tweet with #tiimr, or just sign up to stay in the loop on whatever comes next.
LOS ALTOS, CA, January 10, 2013 – A study published in the July 2012 issue of Explore provides further evidence that the prayers of one individual used to treat a physical condition of another may – or may not – help.
How’s that for definitive?
Depending on the methods used, the condition being treated, and the individuals involved, “results may vary” for those scientists trying to figure out if prayer – the most commonly used form of complementary and alternative medicine (CAM), according to one government survey – really works.
In this particular study, a team of researchers from the Institute of Noetic Sciences (IONS) and the University of California, San Francisco (UCSF) set out to determine if distant healing intention (DHI) is effective in treating surgical wounds. DHI is defined by the study’s authors as “a compassionate mental act intended to improve the health and well-being of another person at a distance.” Some of the terms used to describe DHI are intercessory prayer, spiritual healing, intentionality, energy healing, shamanic healing, nonlocal healing, noncontact therapeutic touch, and Reiki.
Seventy-two women undergoing elective surgery – some for reconstruction after breast cancer surgery and some for cosmetic reasons – were divided into three groups: a blinded group receiving DHI, a blinded group not receiving DHI, and an unblinded group receiving DHI. This configuration allowed the researchers to study the effect of their subjects’ expectations by varying the degree to which they knew they were being prayed for.
Here’s what they found:
“The more that participants believed in distant healing, and the more they thought that distant healing was actually focused on them, the worse they did on both objective and subjective measures. In addition, the better the healers thought that they were doing, again, the worse the participants’ outcomes.”
At first blush this would appear to validate the arguments of those who say that prayer is nothing more than wishful thinking – a generally harmless but occasionally dangerous practice. However, the researchers weren’t so quick to come to that same conclusion. While admitting the possibility that DHI effects do not exist, they also considered the possibility that DHI effects do exist but that “the relevant variables that modulate these effects are not well understood and interact in complex ways.”
An increasing amount of evidence indicates that one of the most important variables to consider is the thought of the individual being treated. Qualities of thought such as self-doubt, anger, and fear have long been known to have a negative impact on health, whereas qualities such as forgiveness, gratitude, and compassion – even a belief in a divine power – can have a positive effect.
Then there’s the thought of the one providing treatment.
In a study that mirrors somewhat the work done by IONS/UCSF, a team of researchers led by Harvard Medical School’s Dr. Herbert Benson concluded that intercessory prayer had an adverse effect on patients recovering from coronary bypass surgery who were aware that someone was praying for them. However, what most news organizations reporting on these findings failed to mention was that those who were doing the praying belonged to a religious group that, according to Indiana University religious studies professor Candy Gunther Brown, “have long denied that prayer works ‘miracles,’ and have even called petitionary prayer ‘useless.’”
The question is, will we ever be able to determine if prayer, for another or for one’s self, can have a positive impact on health?
As the IONS/UCSF study suggests, it will likely require further study – even the development of new theories or an entirely different methodology – before we reach anything approaching a definitive answer. In the meantime, those who have found prayer to be an effective means of treating the body, even in lieu of conventional medicine, will likely continue praying. And some day – some day – we just might have a better understanding of the source of their confidence and success.
Eric Nelson is a Christian Science practitioner, whose articles on the link between consciousness and health appear regularly in a number of local, regional, and national online publications. He also serves as the media and legislative spokesperson for Christian Science in Northern California.
The 12 cognitive biases that prevent you from being rational
By George Dvorsky
The human brain is capable of 10¹⁶ processes per second, which makes it far more powerful than any computer currently in existence. But that doesn't mean our brains don't have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless — plus, we're subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about.
Before we start, it's important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).
Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them. Here are some important ones to keep in mind.

Confirmation Bias

We love to agree with people who agree with us. It's why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — a discomfort the social psychologist Leon Festinger called cognitive dissonance. It's this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our world view. And paradoxically, the internet has only made this tendency even worse.

Ingroup Bias

Somewhat similar to the confirmation bias is the ingroup bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called "love molecule." This neuropeptide, while helping us to forge tighter bonds with people in our ingroup, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don't really know.

Gambler's Fallacy

It's called a fallacy, but it's more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they'll somehow influence future outcomes. The classic example is coin-tossing.
After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must certainly now favor tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent, and the probability of any given outcome is still 50 percent.
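That statistical claim is easy to check empirically. The short Python sketch below (the function name and parameters are illustrative, not drawn from any study) simulates a long run of fair coin flips and measures how often heads comes up on the flip immediately following a five-head streak. If the gambler's fallacy were true, that figure would dip below 50 percent; independence predicts it stays near 0.5.

```python
import random

def next_flip_after_streak(trials=200_000, streak=5, seed=42):
    """Estimate P(heads) on the flip immediately following a run of
    `streak` consecutive heads. For independent fair flips, this
    should stay near 0.5 no matter how long the streak is."""
    rng = random.Random(seed)
    run = 0       # length of the current run of consecutive heads
    after = []    # outcomes observed right after a qualifying streak
    for _ in range(trials):
        flip = rng.random() < 0.5  # True = heads
        if run >= streak:
            after.append(flip)
        run = run + 1 if flip else 0
    return sum(after) / len(after)

print(next_flip_after_streak())  # stays close to 0.5, streak or no streak
```

In 200,000 flips, a five-head streak occurs a few thousand times, and the proportion of heads on the very next flip hovers around one half — the coin has no memory.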
Relatedly, there's also the positive expectation bias — which often fuels gambling addictions. It's the sense that our luck has to eventually change and that good fortune is on the way. It also contributes to the "hot hand" misconception. Similarly, it's the same feeling we get when we start a new relationship that leads us to believe it will be better than the last one.

Post-Purchase Rationalization
Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then you rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that's post-purchase rationalization in action — a kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer's Stockholm Syndrome, it's a way of subconsciously justifying our purchases — especially expensive ones. Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.

Neglecting Probability

Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge that the probability of dying in an auto accident is significantly greater than that of getting killed in a plane crash — but our brains refuse to bow to this crystal-clear logic (statistically, we have a 1 in 84 chance of dying in a vehicular accident, compared to a 1 in 5,000 chance of dying in a plane crash [other sources put the odds closer to 1 in 20,000]). It's the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.
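As a quick sanity check on the arithmetic, using only the lifetime odds quoted above (which are the article's figures, not independent data), one can compute just how much riskier the familiar activity is than the feared one:

```python
# Lifetime odds as quoted in the article above (approximate)
car = 1 / 84            # dying in a vehicular accident
plane_high = 1 / 5_000  # dying in a plane crash (higher estimate)
plane_low = 1 / 20_000  # dying in a plane crash (lower estimate)

# Relative risk of driving versus flying under each estimate
print(f"driving is roughly {car / plane_high:.0f}x to {car / plane_low:.0f}x riskier")
# → driving is roughly 60x to 238x riskier
```

Even under the most flying-unfriendly estimate, the car is more than fifty times as dangerous — which is exactly the kind of ratio our intuitions flatten to nothing.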
This is what the legal scholar Cass Sunstein calls probability neglect — our inability to grasp a proper sense of peril and risk — which often leads us to overstate the risks of relatively harmless activities while underrating genuinely dangerous ones.

Observational Selection Bias

This is the effect of suddenly noticing things we didn't notice that much before — and wrongly assuming that their frequency has increased. A perfect example is what happens after we buy a new car and we inexplicably start to see the same car virtually everywhere. A similar effect happens to pregnant women who suddenly notice a lot of other pregnant women around them. Or it could be a unique number or song. It's not that these things are appearing more frequently, it's that we've (for whatever reason) selected the item in our mind, and in turn, are noticing it more often. Trouble is, most people don't recognize this as a selection bias, and actually believe these items or events are happening with increased frequency — which can be a very disconcerting feeling. It's also a cognitive bias that contributes to the feeling that the appearance of certain things or events couldn't possibly be a coincidence (even though it is).

Status-Quo Bias
We humans tend to be apprehensive of change, which often leads us to make choices that guarantee that things remain the same, or change as little as possible. Needless to say, this has ramifications in everything from politics to economics. We like to stick to our routines, political parties, and our favorite meals at restaurants. Part of the perniciousness of this bias is the unwarranted assumption that another choice will be inferior or make things worse. The status-quo bias can be summed up with the saying, "If it ain't broke, don't fix it" — an adage that fuels our conservative tendencies. And in fact, some commentators say this is why the U.S. hasn't been able to enact universal health care, despite the fact that most individuals support the idea of reform.

Negativity Bias

People tend to pay more attention to bad news — and it's not just because we're morbid. Social scientists theorize that it's on account of our selective attention and that, given the choice, we perceive negative news as being more important or profound. We also tend to give more credibility to bad news, perhaps because we're suspicious (or bored) of proclamations to the contrary. More evolutionarily speaking, heeding bad news may be more adaptive than ignoring good news (e.g. "saber-tooth tigers suck" vs. "this berry tastes good"). Today, we run the risk of dwelling on negativity at the expense of genuinely good news. Steven Pinker, in his book The Better Angels of Our Nature: Why Violence Has Declined, argues that crime, violence, war, and other injustices are steadily declining, yet most people would argue that things are getting worse — a perfect example of the negativity bias at work.

Bandwagon Effect

Though we're often unconscious of it, we love to go with the flow of the crowd. When the masses start to pick a winner or a favorite, that's when our individualized brains start to shut down and enter into a kind of "groupthink" or hivemind mentality.
But it doesn't have to be a large crowd or the whims of an entire nation; it can include small groups, like a family or even a small group of office co-workers. The bandwagon effect is what often causes behaviors, social norms, and memes to propagate among groups of individuals — regardless of the evidence or motives in support. This is why opinion polls are often maligned, as they can steer the perspectives of individuals accordingly. Much of this bias has to do with our built-in desire to fit in and conform, as famously demonstrated by the Asch conformity experiments.

Projection Bias
As individuals trapped inside our own minds 24/7, it's often difficult for us to project outside the bounds of our own consciousness and preferences. We tend to assume that most people think just like us — though there may be no justification for it. This cognitive shortcoming often leads to a related effect known as the false consensus bias, where we tend to believe that people not only think like us, but that they also agree with us. It's a bias where we overestimate how typical and normal we are, and assume that a consensus exists on matters when there may be none. Moreover, it can also create the effect where the members of a radical or fringe group assume that more people on the outside agree with them than is actually the case. Or the exaggerated confidence one has when predicting the winner of an election or sports match.

The Current Moment Bias

We humans have a really hard time imagining ourselves in the future and altering our behaviors and expectations accordingly. Most of us would rather experience pleasure in the current moment, while leaving the pain for later. This is a bias of particular concern to economists (i.e. our tendency to overspend now rather than save for later) and health practitioners. Indeed, a 1998 study showed that, when making food choices for the coming week, 74% of participants chose fruit. But when the food choice was for the current day, 70% chose chocolate.

Anchoring Effect
Also known as the relativity trap, this is the tendency we have to compare and contrast only a limited set of items. It's called the anchoring effect because we tend to fixate on a value or number that in turn gets compared to everything else. The classic example is an item at the store that's on sale; we tend to see (and value) the difference in price, but not the overall price itself. This is why some restaurant menus feature very expensive entrees, while also including more (apparently) reasonably priced ones. It's also why, when given a choice, we tend to pick the middle option — not too expensive, and not too cheap.
Images: Lightspring/Shutterstock, Tsyhun/Shutterstock, Yuri Arcurs/Shutterstock, Everett Collection/Shutterstock, Frank Wasserfuehrer/Shutterstock, George Dvorsky, Barry Gutierrez and Ed Andrieski/AP, Daniel Padavona/Shutterstock, wavebreakmedia/Shutterstock.
By JOHN TIERNEY
When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted on people’s self-perceptions.
They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.
New research suggests group conflict can be reduced if a party learns how a rival perceives issues surrounding mortality.
Researchers from the University of Missouri performed a series of experiments that tested the relationship between awareness of death and belief in a higher power.
The study found that thoughts of death increased atheists’, Christians’, Muslims’ and agnostics’ conviction in their own world views.
For example, contrary to the wartime aphorism that there are no atheists in foxholes, thoughts of death did not cause atheists to express belief in a deity.
“Our study suggests that atheists’ and religious believers’ world views have the same practical goal,” said Kenneth Vail, lead author. “Both groups seek a coherent world view to manage the fear of death and link themselves to a greater and immortal entity, such as a supreme being, scientific progress or a nation.
“If people were more aware of this psychological similarity, perhaps there might be more understanding and less conflict among groups with different beliefs.”
Vail believes that morbid imagery, such as news headlines or caricatures of enemies in war propaganda, can reinforce nationalistic and/or religious ideals by keeping death on the mind and subconsciously encouraging denial of opposing ideologies.
Experts believe this research area suggests that religious symbols and stories involving death, such as the crucifix, psychologically remind the faithful of mortality and subconsciously reinforce one particular world view to the exclusion of others.
For the study, Vail and his colleagues conducted a series of three experiments by first encouraging thoughts of death in study participants and analyzing their responses to a questionnaire.
The first experiment examined Christians and atheists in the United States. The results suggested awareness of death in Christians increased their belief in God and denial of other traditions.
Atheists likewise continued to adhere to their world views, although no increase in their denial of other philosophies was observed, since atheists by definition start with no belief in any religious tradition.
The second experiment, conducted in Iran, found that Muslims reacted similarly to Christians when they were thinking about their own mortality.
A third trial observed agnostics and found that thoughts of death tended to increase their belief in a higher power.
However, unlike Christians and Muslims, they did not increase their denial of Buddha, God, Jesus or Allah. Instead agnostics increased acceptance of all of those world views.
“In our study, individuals’ minds appeared to rally around certain personal guiding concepts when faced with fear of death,” said Vail.
“Agnostics seemed to hedge their spiritual bets. They believed more firmly in a higher power. Yet at the same time, they expressed continued belief that the specific nature of that power was beyond human knowledge.”
The study is found in the journal Personality and Social Psychology Bulletin.
Source: University of Missouri