Wednesday, December 6, 2017

Politics and Bias

Over the last year, political schisms have widened. Trump is a fantastic tool for further separating viewpoints, for preventing meaningful discussion, and for tribal behavior all around. Liberals circle their wagons and insist that the media is biased, the Republicans are evil buffoons, and conservatives are out to destroy the world. Conservatives circle their wagons and insist that the media is biased, the Democrats are elitist pricks, and liberals are out to destroy the world.

So in the context of politics, specifically, and with consideration for current events, I'd like to discuss the cognitive biases and fallacies that we ALL fall prey to :D

For an entertaining version, here's an article from Cracked. Note that I've taken some formatting liberties with the quoted elements below - these are snippets, capturing the titles of each section and the related science from each... but skipping over a lot of other text. It's Cracked, so it's entertaining to read in its entirety ;)
http://ift.tt/2jZ4uHe

Quote:

*We're Not Programmed to Seek "Truth," We're Programmed to "Win"
It's called the argumentative theory of reasoning, and it says that humans didn't learn to ask questions and offer answers in order to find universal truths. We did it as a way to gain authority over others. That's right -- they think that reason itself evolved to help us bully people into getting what we want.
.....

*Our Brains Don't Understand Probability
It's called neglect of probability. Our brains are great for doing a lot of things. Calculating probability is not one of them. That flaw colors every argument you've ever had, from the tax code down to that time your friend totally cheated you in a coin-flip.
.....
*We Think Everyone's Out to Get Us
Think about all the people you've disagreed with this month. How many of them do you think were being intentionally dishonest? Experts say you're almost definitely overshooting the truth. It's called the trust gap, and scientists see it crop up every time one human is asked to estimate how trustworthy another one is.
.....
*We're Hard-Wired to Have a Double Standard
It's called the fundamental attribution error. It's a universal thought process that says when other people screw up, it's because they're stupid or evil. But when we screw up, it's totally circumstantial.
.....
*Facts Don't Change Our Minds
Let's go back to the beginning for a moment, and the theory that people figured out how to build arguments as a form of verbal bullying rather than a method of spreading correct information. That means that there are actually two reasons somebody might be arguing with you: because they actually want to get you to think the right thing, and because they're trying to establish dominance over you to lower your status in the tribe (or office or forum) and elevate their own. That means there's a pretty severe cost to being on the wrong side of an issue completely separate from the issue itself.

And a little more to the point:

The “Other Side” Is Not Dumb
http://ift.tt/2k25aeT
Quote:

In psychology, the idea that everyone is like us is called the “false-consensus bias.” This bias often manifests itself when we see TV ratings (“Who the hell are all these people that watch NCIS?”) or in politics (“Everyone I know is for stricter gun control! Who are these backwards rubes that disagree?!”) or polls (“Who are these people voting for Ben Carson?”).
Online it means we can be blindsided by the opinions of our friends or, more broadly, America. Over time, this morphs into a subconscious belief that we and our friends are the sane ones and that there’s a crazy “Other Side” that must be laughed at — an Other Side that just doesn’t “get it,” and is clearly not as intelligent as “us.” But this holier-than-thou social media behavior is counterproductive; it’s self-aggrandizement at the cost of actual nuanced discourse, and if we want to consider online discourse productive, we need to move past this.
And to get some relevant concepts down...
Bias Blind Spot
Quote:

The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others, while failing to see the impact of biases on one's own judgment. The term was created by Emily Pronin, a social psychologist from Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross. The bias blind spot is named after the visual blind spot. Most people appear to exhibit the bias blind spot. In a sample of more than 600 residents of the United States, more than 85% believed they were less biased than the average American. Only one participant believed that he or she was more biased than the average American. People do vary with regard to the extent to which they exhibit the bias blind spot. It appears to be a stable individual difference that is measurable (for a scale, see Scopelliti et al. 2015).

The bias blind spot appears to be a true blind spot in that it is unrelated to actual decision-making ability. Performance on indices of decision-making competence is not related to individual differences in bias blind spot. In other words, everyone seems to think they are less biased than other people, regardless of their actual decision-making ability.

Naive Realism

Quote:

In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.
The Illusion of Asymmetric Insight
Quote:

When Pronin, Ross, Kruger and Savitsky moved from individuals to groups, they found an even more troubling version of the illusion of asymmetric insight. They had subjects identify themselves as either liberals or conservatives, and in a separate run of the experiment as either pro-abortion or anti-abortion. The groups filled out questionnaires about their own beliefs and how they interpreted the beliefs of their opposition. They then rated how much insight their opponents possessed. The results showed liberals believed they knew more about conservatives than conservatives knew about liberals. The conservatives believed they knew more about liberals than liberals knew about conservatives. Both groups thought they knew more about their opponents than their opponents knew about themselves. The same was true of the pro-abortion rights and anti-abortion groups.

The illusion of asymmetric insight makes it seem as though you know everyone else far better than they know you, and not only that, but you know them better than they know themselves. You believe the same thing about groups of which you are a member. As a whole, your group understands outsiders better than outsiders understand your group, and you understand the group better than its members know the group to which they belong.

The researchers explained this is how one eventually arrives at the illusion of naive realism: believing your thoughts and perceptions are true, accurate and correct, and therefore that if someone sees things differently than you or disagrees with you in some way, it is the result of a bias or an influence or a shortcoming. You feel like the other person must have been tainted in some way, otherwise they would see the world the way you do – the right way. The illusion of asymmetric insight clouds your ability to see the people you disagree with as nuanced and complex. You tend to see yourself and the groups you belong to in shades of gray, but others and their groups as solid and defined primary colors lacking nuance or complexity.
It is my opinion that these fairly universal cognitive biases are at play in the majority of our political discourse, and have been exacerbated in the past year.


via International Skeptics Forum http://ift.tt/2BGAA1P
