Who is more biased: Right or Left?
When it comes to politics, there seems to be more division now than ever. Disagreements over how the government should handle (or refrain from intervening in) certain issues have reached a fever pitch.
Liberals and conservatives have long argued about another issue: Which “side” is more rational and less biased? As it turns out, this is one issue on which both sides are on equal footing, according to two new studies on political bias. As Friedrich Nietzsche said, "Sometimes people don’t want to hear the truth because they don’t want their illusions destroyed."
The first study is titled At Least Bias Is Bipartisan: A Meta-Analytic Comparison of Partisan Bias in Liberals and Conservatives. For this study, the researchers meta-analyzed the results of 41 experimental studies of partisan bias involving over 12,000 participants who identified their political ideology. They found that overall partisan bias was “robust”. Liberals and conservatives showed nearly identical levels of bias across the studies, and the “relative magnitude of bias in liberals and conservatives differed across political topics.”
The second paper, which will be published in the September 2017 issue of the Journal of Experimental Social Psychology, is titled Liberals and Conservatives Are Similarly Motivated to Avoid Exposure to One Another’s Opinions.
Liberals and conservatives are similarly motivated to avoid listening to opposing viewpoints on controversial issues, the study found. In fact, their opposition to listening to different points of view was so strong that approximately two-thirds of them gave up a chance to win extra money in order to avoid hearing from the other side!
The aversion applied to hot-button issues including same-sex marriage, elections, marijuana, climate change, guns, and abortion. The study also found that the aversion is not a product of already being, or feeling, knowledgeable; the unwillingness is linked to something more concerning: “People anticipated that conflicting information would produce cognitive dissonance and harm relationships,” the researchers wrote.
Cognitive dissonance refers to the mental stress or discomfort we experience when we hold two or more contradictory beliefs, ideas, or values at the same time; act in a way that contradicts one of those beliefs, ideas, or values; or are confronted with new information that conflicts with our existing beliefs, ideas, or values.
From A Tale of Grapes, Politics, Cults, and Aliens: Why People Cling to False Beliefs:
It is human nature to dislike being wrong. When we make a mistake, it is hard to admit it.
We resort to mental gymnastics to avoid accepting that our logic – or our belief system itself – is flawed. Lying, denying, and rationalizing are among the tactics we employ to dance around the truth and avoid the discomfort that contradiction creates. We avoid or toss aside information that isn’t consistent with our current beliefs. Emotions trump logic and evidence. Once our minds are made up, it is very difficult to change them.
This is cognitive dissonance.
Back to the second study. The researchers found that:
Ideologically committed people are similarly motivated to avoid ideologically crosscutting information. Although some previous research has found that political conservatives may be more prone to selective exposure than liberals are, we find similar selective exposure motives on the political left and right across a variety of issues.
A high-powered meta-analysis of our data sets (N = 2417) did not detect a difference in the intensity of liberals’ (d = 0.63) and conservatives’ (d = 0.58) desires to remain in their respective ideological bubbles.
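For readers unfamiliar with the statistic, Cohen’s d is a standard measure of effect size: the difference between two group means expressed in units of their pooled standard deviation. A minimal sketch of the definition follows; the notation (x̄, s, n) is assumed here for illustration and is not quoted from the paper:

\[
d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
\]

By Cohen’s conventional benchmarks (roughly 0.2 = small, 0.5 = medium, 0.8 = large), both values fall in the medium-to-large range, consistent with the authors’ conclusion that the desire to stay in one’s ideological bubble is substantial on both sides and that no difference between liberals and conservatives could be detected.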
A study conducted over ten years ago yielded similar findings:
In 2006, Emory University psychology professor Drew Westen, PhD, and colleagues published a study in the Journal of Cognitive Neuroscience that described the neural correlates of political judgment and decision-making.
Using functional magnetic resonance imaging (fMRI), the team examined the brain activity of 30 men. Half were self-described “strong” Republicans and half were “strong” Democrats. The men were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves.
In their assessments, Republicans were as critical of Kerry as Democrats were of Bush, yet both let their own candidate off the hook.
Westen said the results suggest that the notion of “partisan reasoning” is an oxymoron, and that most of the time, partisans feel their way to beliefs rather than use their thinking caps.
In an interview, Westen elaborated on his study’s findings:
We ultimately found that reason and knowledge contribute very little. From three studies during the Clinton impeachment era to the disputed vote count of 2000 to people’s reactions to Abu Ghraib, we found we could predict somewhere between 80 percent and 85 percent of the time which way people would go on questions of presumed fact from emotions alone. Even when we gave them empirical data that pushed them one way or the other, that had no impact, or it only hardened their emotionally biased views.
There is a related cognitive trap to which all of us are susceptible: confirmation bias. It refers to our tendency to search for and favor information that confirms our beliefs while ignoring or devaluing information that contradicts them. This phenomenon is also called confirmatory bias or myside bias.
In politics, confirmation bias explains, for example, why people with right-wing views read and view right-wing media and why people with left-wing views read and view left-wing media. In general, people both:
◾ Want to be exposed to information and opinions that confirm what they already believe.
◾ Want to ignore, or not be exposed to, information or opinions that challenge what they already believe.
Confirmation bias permeates political discussions. It is so pervasive that it largely goes unnoticed. We are used to it. It explains why political debates usually end in gridlock.
And, even when we are open to listening to information that contradicts our current beliefs, we may experience the “backfire effect.”
Coined by Brendan Nyhan and Jason Reifler, the term backfire effect describes how some individuals, when confronted with evidence that conflicts with their beliefs, come to hold their original position even more strongly.
The more ideological and the more emotion-based a belief is, the more likely it is that contrary evidence will be ineffective.
Cognitive dissonance and confirmation bias have become more prevalent in recent years, in part because just about any belief can be “supported” with information found online. Both typically occur outside of our awareness: we don’t consciously notice we are experiencing these psychological states.
We resort to these mental gymnastics because, well, we are human. As much as we would like to believe we are flawless information-gatherers, logical thinkers, and truth-seekers, we are all susceptible to bias, especially when it comes to emotionally charged issues and deeply entrenched beliefs.
Also, who likes being wrong? It takes healthy levels of self-awareness and self-esteem to admit when we’ve made a mistake.
One possible reason people show confirmation bias is that they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way.
And, evidence indicates that maintaining our emotional stability is much more important to us than facing reality.
The good news is that we can avoid letting cognitive dissonance and confirmation bias infiltrate our decision-making and belief systems. The following are tips from You Can’t Handle the Truth: How Confirmation Bias Distorts Your Opinions:
◾ Be open to new information and other perspectives. Don’t be afraid to test or revise your beliefs.
◾ Even if you consider yourself an expert on a topic, approach new information as a beginner would.
◾ Ask someone you trust to play devil’s advocate and challenge your assumptions.
◾ Don’t let a limited amount of past experience (particularly one negative experience) carry too much weight. Be sure to envision the future, not just replay the past.
◾ Remind yourself that your intuition is lazy (designed to make predictions quickly, but not always accurately) and does not want to be challenged. Seek and fully evaluate alternatives before making decisions.
◾ When you believe something strongly but don’t have recent, compelling evidence to support that belief, look for more information.
◾ Check your ego. If you can’t stand to be wrong, you will continue to fall victim to biases. Learn to value truth over the need to be right.
◾ Look for disagreement. If you’re right, disagreement will help confirm it; if you’re wrong, it will help you identify why.
◾ Ask insightful, open-ended questions. Direct them to people who are not afraid to be honest with you. Be quiet and listen to what they say.
◾ Examine conflicting data. Discuss it with people who disagree with you and evaluate the evidence they present.
◾ Consider all the viewpoints you can find, not just the ones that support your current beliefs or ideas.
There are two different types of people in the world: those who want to know, and those who want to believe. – Friedrich Nietzsche
This article was first published online by The Daily Sheeple.