R & W In My Life

The Tennis of Emotional Welfare

In our less conscious, present, and balanced moments, the automaticity of Right and Wrong can rule our minds and keep us from thinking. When we are right, we are happy, because being right is good. Whether that comes from doing the right thing, knowing the right information, or fitting in with a crowd, we are right, we are good, we are happy. Conversely, being wrong is something to avoid, because being wrong is bad. Whether it is doing something that doesn’t fit social norms or not knowing enough information, we are wrong, we are bad, we ‘should’ be unhappy. Let’s look at this in a few common arenas: school as a teenager, leadership, spiritual beliefs, and the US election.

A Teen in School: Let’s go back to school and think of all the different groups of people. There was the cool group, the nerds, the outcasts, and so on. The ‘cool group’ was stereotypically deemed ‘right’. They were the ones people wanted to be, while the ‘outcasts’ were wrong: the bad people you shouldn’t be like. These categories were based entirely on the social norms of a high school, and we followed this order because it was what was taught to us. But let’s think… what did the cool group do that was ‘good’, or ‘right’? And what did the outcasts do that was ‘bad’, or ‘wrong’? Why did we put them into these categories, and how did the categories serve us at the time? It was all part of our own need for acceptance, connection, and plenty more besides!

Leadership: This thought is rooted more deeply in the idea of coercion and peer pressure. Seth Godin, an author and blogger who has researched this topic extensively, says we act this way in the short term because we see it as serving us. Coercion can look like leadership, and we want to follow the leader because they are ‘right’. This ‘mob mentality’ is how a lot of social convention is created, and it can be dangerous.

Spiritual beliefs: An Orthodox Jew living in the US will usually have a different belief system from an atheist living in Hong Kong. Religion, socioeconomic background, country of origin, and more can all shape your belief system. However, it is the execution of those values that frequently leads us to the familiar right/wrong standoffs we encounter. Our background shapes our own identity in a similar way. When we see someone behaving in a way that does not fit our belief system, it is natural to be triggered. We think, “If that’s who they are, who am I?” If they are ‘right’, and ‘different’, then I must be ‘wrong’. We unconsciously avoid being wrong by judging whatever is different from us and our belief systems.

US Election: Let’s think about the previous presidential election. The nation was quite heated over the two candidates, and there seemed to be strong judgments about people who held different political opinions. Most of the mainstream media implied that voting for Trump was ‘wrong’, and many pop-culture figures voiced this opinion as well. In fact, no major newspaper endorsed Trump, not even the Wall Street Journal, which usually leans Republican. Many people said things like ‘I don’t understand how anyone could vote for him’, and when he was elected, disbelief and negative comments filled social media among non-Trump supporters. ‘Who would vote for him?’ The right and wrong here exist in opinion and choice. Even if a position is more or less sociopolitically justifiable, framing it under the dichotomy of “Right” vs. “Wrong” divides people and clouds cohesion and cooperation.

Overcoming Cognitive Bias

And what of political positions? What of opinions on subjects that truly matter, like the world, economics, the environment… how can we separate facts from myths? How can I use my discernment to form opinions based on truth and my own autonomy? How can I avoid being strung along by false dichotomies, sensationalism, and tribalism? Let’s learn some tips for overcoming cognitive biases.

How to Avoid Confirmation Bias
Confirmation bias is probably the most widespread and insidious form of cognitive bias we fall into. Look for ways to challenge what you think you see. Seek out information from a range of sources, and use an approach such as the Six Thinking Hats technique to consider situations from multiple perspectives.
Alternatively, discuss your thoughts with others. Surround yourself with a diverse group of people, and don’t be afraid to listen to dissenting views. You can also seek out people and information that challenge your opinions, or assign someone on your team to play “devil’s advocate” for major decisions.

And that’s just the start. Think of other ways to overcome the biases we face every day; there is no shortage of them.

The Philosopher's Dilemma

The Scottish philosopher David Hume points out the tendency people have to take how things “are” and use that to draw conclusions about how things “ought” to be. Hume noted that people are too quick to move from observations of facts to judgments about values. For example, someone tells you that you ought not to leave the door to your house open when you leave, because thieves will come in and steal your stuff. If we put that as a formal logical argument, it would go more or less like this:

Premise: If you leave your house open when you go out, thieves will go in and steal your stuff.
Conclusion: You ought not to leave your house open when you go out.

That’s an innocent example, but people often make much more problematic arguments that have a similar shape:

Premise: Humans evolved to eat meat.
Conclusion: Therefore we should eat meat.

The essential problem is this: there is nothing about how the world is that tells you how it ought to be. Hume believed that there is a gap in reasoning (Hume bib). This gap in logic might shed light on the frustration we feel when trying to see someone else’s viewpoint. In the case of these two arguments, we could say that these are the hidden assumptions:

First argument: You ought not to let thieves steal your stuff.
Second argument: People always ought to act in the way that humans evolved to act.

In a sense, Hume didn’t think ethical categories like right and wrong were real; in making moral claims, we are giving our opinions. Using facts to argue that “X is wrong” falls apart because there is no right or wrong in facts. Facts simply are. Many philosophers have weighed in on Hume’s logic, and there isn’t clear consensus by any stretch of the imagination. At the very least, Hume is calling attention to hidden assumptions, which may offer a resourceful perspective for examining our own loaded assumptions.

[Comic from XKCD]

Philosopher John Rawls believed that a person is biased by their own subjective circumstances and interests, particularly with regard to politics and society. He observed that a rich man tends to value free markets and to favor the freedom to earn the fruits of his efforts, whereas a poor individual is likely to support a system that redistributes wealth and fosters more equality. Rawls designed a thought experiment known as the “Veil of Ignorance” to skirt the inherent biases of individuals and tackle morality.

His thought experiment went like this: imagine you are tasked with setting up the rules of a moral society. How would you do it if you did not know what kind of person you would be when you are born into this world? “You know nothing of your sex, race, nationality, or individual tastes. Behind such a veil of ignorance all individuals are simply specified as rational, free, and morally equal beings. You do know that in the ‘real world,’ there will be a wide variety in the natural distribution of natural assets and abilities, and that there will be differences of sex, race, and culture that will distinguish groups of people from each other.”

Rawls frames entering a society as a lottery and insists that, from behind the veil, we’d opt for a much fairer society. His thought experiment hints that who we are, along with our multitude of interests, circumstances, and life experiences, influences our views on morality, yet when we step back to consider humanity from a 100,000-foot view, we generally value the same things.

Robert Solley of The Mission writes, “Right and wrong — and categorical thinking — are aspects of our mind that we project into the world to make sense of it. As the philosopher Immanuel Kant pointed out, when you see three trees on a grassy hill, the number three is not out in the trees. The number three is in our minds. By the same token, red, green, soft, cold, high pitched sounds, low pitched sounds, messy, neat — these are all ways that we divide up the infinity of the universe to try to make sense of it. Same thing with right and wrong.”
Read More: The Right and Wrong Game

So What Does This All Mean?

Objectively, social psychology tells us that the sense of certainty we have about our beliefs of what is right and wrong is shaped by a variety of factors we’re likely not aware of, and philosophers agree that our own subjective circumstances and observations leave us mired in bias. This ought to be a signal to pump the brakes on our beliefs. Take a moment to allow the possibility of being “wrong” to settle in.

And, forgive. Fostering a habit of forgiveness is an empowering and perspective-shifting thing. Letting go of our attachments to Right and Wrong includes forgiving others for actions that have affected us. As a way of being, forgiveness is essentially empathy. When we understand the actions of others as happening through their own lens, Anger and “Shoulds” dissolve into compassion and understanding.

Some people have embraced being wrong, like self-proclaimed “wrongologist” Kathryn Schulz. She addresses the natural aversion we have to the feeling of being wrong. Schulz makes the distinction that it’s not that we don’t like being wrong; it’s that we don’t like finding out we’re wrong. For many of us, finding out we’re wrong brings back ingrained memories of failure and shame. From this perspective, being wrong or mistaken about something is synonymous with being bad or not good enough. To avoid those feelings we operate in a state of error blindness, steadfast in “knowing” we are right. She invites us to flip the paradigm and embrace being wrong. St. Augustine wrote, “Fallor ergo sum,” or “I err, therefore I am.” Schulz reminds us that being wrong isn’t a flaw in the system but part of who we are. Thinkers like these hope to “normalize” being wrong. “If you can do so,” she says, “it’s the single greatest moral, intellectual, and creative leap you can make.” See her TED talk here.

What if we were to start from scratch? What if we re-evaluated the way we look at right and wrong? This can be scary for those of us who hold strong notions of right and wrong. But what if we weren’t trying to control for an outcome and instead were interested in learning from each other’s viewpoints? From that perspective we can make the next jump: to curiosity. Curious people try to get an accurate picture of reality even when it is difficult or uncomfortable. Traits like being open-minded, objective, and virtuous form the cornerstones of curiosity. Individuals with this mindset have an itch to solve the puzzle and are intrigued when they encounter something that contradicts their beliefs. As Julia Galef shares in the story of the shipbuilder: “If you want to build a ship, don’t drum up men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea.” When it comes to the conversation of right and wrong, the way forward isn’t more statistics, evidence, or rhetoric. Instead, it is a yearning to get curious. As Albert Einstein famously said: “I have no special talents. I am only passionately curious.”