Guest post by Rabbi K’vod Wieder
Some of us are familiar with the story of a group of blind men who heard that a strange animal, called an elephant, had been brought to the town, but none of them were aware of its shape and form. Out of curiosity, they said: “We must inspect and know it by touch, of which we are capable.” So, they sought it out, and when they found it, they groped about it.
The first person, whose hand landed on the trunk, said, "This being is like a thick snake." For another, whose hand reached its ear, it seemed like a kind of fan. Another person, whose hand was upon its leg, said the elephant was a pillar like a tree trunk. The blind man who placed his hand upon its side said the elephant "is a wall." Another, who felt its tail, described it as a rope. The last felt its tusk and declared the elephant to be hard, smooth, and like a spear.
The moral of this parable, which is thought to have originated in India, is that humans have a tendency to claim absolute truth based on their limited, subjective experience while ignoring other people's limited, subjective experiences, which may be equally true. Obviously, each person's experience is their own, but the way they label it and make meaning from it results in a much more limited perspective than that of the person with sight who can experience the elephant as a whole.
If I were to write a second part of the parable, I would use metaphorical language to share what happened next. The one who thought he was touching a thick snake might anxiously tell the others that there are a lot more snakes where that came from, and you can’t trust the Hindus because they’ll bring more snakes into the village, sowing fear. The one who thought he was touching a fan might advocate to start a business selling fans because people surely wouldn’t miss the opportunity for shade from the brutal Indian heat. The one who thought he was touching a tree trunk thought himself above everyone who thought differently, thinking that anyone who thought this solid pillar was a snake or fan was from another planet or absolutely stupid. The one who thought it was a wall turned away and went home, not wanting to engage because the obstacle before him felt insurmountable. And the one who thought it was a spear used his knowledge to threaten injury or death to those who didn’t agree with him.
Whether in our closest interpersonal relationships, being part of a community, or living in the larger global square, believing that we are right about our perception of the world has consequences that divide us from the people around us far more than they connect us. Much of the suffering and brokenness in the world, along with the human vices of greed and power, can be traced to this attachment to being right and having the moral high ground.
And at the same time, as we cling to our own perceptions of what is true and right and good, we realize that reality is not something infinitely malleable and subjective. Actions have consequences, and to accurately assess the complex web of relationships that we are a part of is to approach Truth with a capital "T." In the core of the Amidah prayer, we say, "For You are the God of Truth, and your word is true and enduring." Striving toward truth means living a life in relationship with God. And yet, the Indian parable points out that our assessment of truth is limited and can even be divisive depending on our relationship to it.
A way to understand the importance of seeing our perception as limited is to realize that with advances in access to information and technology, knowledge isn’t just increasing. It’s increasing at an increasing rate. In 2011, you consumed about five times as much information per day as you would have just a quarter century earlier. As of 1950, it took about fifty years for knowledge in medicine to double. By 1980 medical knowledge was doubling every seven years, and by 2010, it was doubling in half that time.
The accelerating pace of change means that we need to question our beliefs more readily than ever before. It’s not an easy task. If we insist on holding on to our beliefs, they tend to become more extreme and more entrenched.
We are quick to recognize when other people need to re-evaluate their beliefs. We question the judgment of experts whenever we seek out a second opinion on a medical diagnosis. Unfortunately, when it comes to our own knowledge and opinions, we often favor feeling right over being right.
The Scientific Mindset
This past year, I’ve been deeply affected by the research and writing of social psychologist Adam Grant in his book “Think Again.” Grant teaches that when many of us think and talk, we often slip into the mindsets of three different professions: preachers, prosecutors, and politicians. In each of these modes, we take on a particular identity and use a distinct set of tools. We go into preacher mode when our sacred beliefs are in jeopardy and deliver “sermons” to protect and promote our ideals. (My family often accuses me of that one.) We enter prosecutor mode when we recognize flaws in other people’s reasoning, causing us to marshal arguments to prove them wrong and win our case. We shift into politician mode when we’re seeking to win over an audience; we campaign and lobby for the approval of our constituents. The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who we believe are wrong, and politicking for support of our perspective that we don’t bother to rethink our own views.
Now, if you are a scientist by trade, rethinking is fundamental to your profession. You’re paid to be constantly aware of the limits of your understanding. You’re expected to doubt what you know, be curious about what you don’t know, and update your views based on new data. In the past century alone, the application of scientific principles has led to dramatic progress. Biological scientists discovered penicillin. Rocket scientists sent us to the moon. Computer scientists built the internet.
But Grant teaches that being a scientist is not just a profession. It’s a frame of mind – a mode of thinking that differs from preaching, prosecuting, and politicking. We move into scientist mode when we’re searching for truth: we run experiments to test hypotheses and discover knowledge. Scientific tools aren’t reserved for people with white coats and beakers.
Just as you don’t have to be a professional scientist to reason like one, being a professional scientist doesn’t guarantee that someone will use the tools of their training. Scientists morph into preachers when they present their pet theories as gospel and treat thoughtful critiques as sacrilege. They veer into politician terrain when they allow their views to be swayed by popularity rather than accuracy. They enter prosecutor mode when they’re bent on debunking and discrediting rather than discovering. After upending physics with his theories of relativity, Einstein opposed the quantum revolution: “To punish me for my contempt of authority, fate has made me an authority myself.” Sometimes even great scientists need to think like great scientists.
Bias and Stereotypes
Two of the most common dynamics that keep us from seeing what's true are our own biases and stereotypes. Our rabbis speak about this using the language of bribery. It is illegal for a judge, who must pursue justice and truth, to take bribes from a defendant or prosecutor because, with the acceptance of a bribe, the judge can form even the slightest inclination or bias, even unconsciously, in favor of the one who gave it. Each one of us can have emotional inclinations to view a situation in a certain way, regardless of the evidence in front of us.
In psychology, there are at least two biases that drive this pattern. One is confirmation bias: seeing what we expect to see. The other is desirability bias: seeing what we want to see. These biases don’t just prevent us from applying our intelligence. They can actually contort our intelligence into a weapon against the truth.
We find reasons to preach our faith more deeply, prosecute our case more passionately, and ride the tidal wave of our political party. The tragedy is that we’re usually unaware of the resulting flaws in our thinking.
A common bias is the "I'm not biased" bias, in which people believe they're more objective than others. It turns out that the more intelligent we are, the harder it can be to see our own limitations. However, when we are in scientist mode, we refuse to let our ideas become ideologies. We don't start with answers or solutions; we lead with questions and puzzles. We don't preach from intuition; we teach from evidence. We don't just have healthy skepticism about other people's arguments; we dare to disagree with our own arguments.
Many of us don't realize how we hold stereotypes of others. And it doesn't have to be along racial, gender, or sexual orientation lines. In our community, folks hold stereotypes around liberal and conservative, around Reform and Conservative.
Once we’ve formed stereotypes, for both mental and social reasons it’s hard to undo them.
Psychologist George Kelly observed that our beliefs are like pairs of reality goggles. We use them to make sense of the world and navigate our surroundings. A threat to our opinions cracks our goggles, leaving our vision blurred. It’s only natural to put up our guard in response – and Kelly noticed that we become especially hostile when trying to defend opinions we know, deep down, are false. Rather than trying on a different pair of goggles, we become mental contortionists, twisting and turning until we find an angle of vision that keeps our current views intact.
We also tend to interact with people who share our beliefs, which makes those beliefs even more extreme. This phenomenon is called group polarization, and it's been demonstrated in hundreds of experiments. Juries with authoritarian beliefs recommended harsher punishments after deliberating together. Citizens who start out with a clear belief on affirmative action or gay marriage develop more extreme views on these issues after talking with a few others who share their stance. Their preaching and prosecuting move in the direction of their politics. Polarization is reinforced by conformity, as peripheral members fit in and gain status by following the lead of the most prototypical member of the group, who often holds the most intense views.
Arrogance and Overconfidence
Besides bias and stereotypes, arrogance and overconfidence can also get in the way of approaching truth. In a famous study about skill and confidence, Justin Kruger and David Dunning found that people who scored lowest on tests of logical reasoning, grammar, and sense of humor had the most inflated opinions of their skills. On average, they believed they did better than 62 percent of their peers, but in reality, they outperformed only 12 percent of them.
Adam Grant writes that it’s when we progress from novice to amateur that we become overconfident. A bit of knowledge can be a dangerous thing. In too many domains of our lives, we never gain enough expertise to question our opinions or discover what we don’t know. We have just enough information to feel self-assured about making pronouncements and passing judgment.
Humility is often misunderstood as a matter of having low self-confidence. However, one of the Latin roots of humility means “from the earth.” It’s about being grounded – recognizing that we are flawed and fallible. Confidence is a measure of how much you believe in yourself. Evidence shows that it’s distinct from how much you believe in your methods. You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present. We may not have the right solution or even be addressing the right problem. We’re blinded by arrogance when we’re utterly convinced of our strengths and our strategies.
When an idea or assumption doesn’t matter deeply to us, we’re often excited to question it. The natural sequence of emotions is surprise, followed by curiosity and thrill. When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information.
It's easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard, protecting our self-image by feeding us comforting lies: you're on the verge of inventing the next popular origami figure, or they're all just jealous of your dead-end relationship, or you're really, really, ridiculously good-looking.
In the early rabbinic text Pirkei Avot, the rabbis give us a model of intellectual debate that shows us how to approach truth. They teach: "Every dispute that is for the sake of Heaven will in the end endure, but one that is not for the sake of Heaven will not endure. Which is a dispute for the sake of Heaven? The dispute between Hillel and Shammai. And which is not for the sake of Heaven? The dispute of Korach and all his congregation" (Avot 5:17).
Hillel and Shammai were more concerned with discovering the truth and staying in relationship with one another than being right. Even though they disagreed about kashrut, the dietary laws, they would still eat in each other’s houses. And the students of Hillel would always teach the opinion of Shammai before their own. This is a discussion for the sake of heaven.
The rebellion of Korach against Moses was motivated by the desire for power, and Korach sought to preach and prosecute his position to overthrow Moses and Aaron. This was not for the sake of heaven. Korach wanted to prove how right he was to steer the situation in his desired direction.
I believe that Hillel and Shammai approached rabbinic problems from a scientific mindset. And being a scientist involves more than just reacting with an open mind. It means being actively open-minded. It requires searching for reasons why we might be wrong – not for reasons why we must be right – and revising our views based on what we learn.
Intellectual humility is knowing what we don't know. Nobel Prize-winning psychologist Daniel Kahneman shares that he genuinely enjoys discovering he was wrong, because it means he is now less wrong than before. He refuses to let his beliefs become part of his identity: "My attachment to ideas is provisional. There's no unconditional love for them."
It's important to distinguish between ideas, beliefs, and values. Most of us are accustomed to defining ourselves in terms of our beliefs, ideas, and ideologies. This can become a problem when it prevents us from changing our minds as the world changes and knowledge evolves. Our opinions can become so sacred that we grow hostile to the mere thought of being wrong, and the totalitarian ego leaps in to silence counterarguments, squash contrary evidence, and close the door on new discoveries.
Who we are should be a question of what we value, not what we believe. Values are our core principles in life – they might be excellence and generosity, freedom and fairness, or security and integrity. Basing our identity on these kinds of principles enables us to remain open-minded about the best ways to advance them.
One of the names of Rosh Hashanah is Yom HaDin, the day of judgment. Most of us have been taught that we are being judged. But what if, instead, we imagine that it is we who are the judges? Each and every one of us is created in the image of God, which means we have the capacity to judge, hold perceptions, and make decisions based on what is true. What if, today, we do not insist that the elephant is a snake, pillar, or spear, but instead open our eyes to discover the elephant in the room? Today celebrates the birthday of creation, and we create through the perceptions that we hold and the choices that we make. May we have the humility to realize what we don't know, to listen carefully to others, to be willing to rethink our opinions and perspectives, and most of all, to commit to our connections with one another as the deepest truth. It's with this intention that we can truly create our lives this year with God's eyes, God's heart, and God's hands.