So I've been reading up on psychology lately, mostly accessible books like 'The Social Animal' and 'Switch', and a common theme is how incredibly irrational the human mind is.
Let us take framing as an example. In a well-known experiment by Amos Tversky and Daniel Kahneman (http://ift.tt/1jIQryB), half the participants were presented with this scenario:
600 people carry a deadly disease. You can employ treatment A, which is guaranteed to save 200 lives, or treatment B, which has a one-in-three chance of saving everyone and a two-in-three chance of saving no one. Which do you choose?
The other half received this scenario:
600 people carry a deadly disease. You can employ treatment A, which will cause 400 deaths, or treatment B, which has a one-in-three chance of causing no deaths and a two-in-three chance of everyone dying. Which do you choose?
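A quick way to see that the two treatments are the same: this snippet is my own back-of-the-envelope check, not something from the paper, and it just uses the numbers in the scenarios above.

Code:

# Expected outcomes computed from the numbers in the two scenarios above.
total = 600

# Treatment A: 200 saved for certain, which is the same as 400 deaths for certain.
expected_saved_a = 200
expected_deaths_a = total - expected_saved_a   # 400

# Treatment B: 1/3 chance everyone is saved, 2/3 chance no one is saved.
expected_saved_b = total / 3                   # 200.0
expected_deaths_b = 2 * total / 3              # 400.0

print(expected_saved_a, expected_saved_b)      # -> 200 200.0
print(expected_deaths_a, expected_deaths_b)    # -> 400 400.0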
Now, both these scenarios are identical in everything except their wording, yet over 70% of people who received the positively framed scenario ("save 200") chose treatment A, while over 70% of those who received the negatively framed scenario ("400 deaths") chose treatment B. Simply changing the wording caused a massive shift in decision making; there is no logical reason to choose one option over the other just because different words were used. There are many other examples of how we are influenced by seemingly insignificant things, like this study presented in The Social Animal:
Quote:
The first experiment served to prime different trait categories; some of the subjects were asked to remember positive trait words (adventurous, self-confident, independent, and persistent), whereas the others were asked to remember negative trait words (reckless, conceited, aloof, and stubborn). Five minutes later, as part of the reading comprehension study, subjects then read an ambiguous paragraph about a fictitious person named Donald. The paragraph described a number of behaviors performed by Donald that could be interpreted as either adventurous or reckless (e.g., skydiving), self-confident or conceited (e.g., believes in his abilities), independent or aloof (e.g., doesn't rely on anyone), and persistent or stubborn (e.g., doesn't change his mind often). The subjects then described Donald in their own words and rated how desirable they considered him to be. The results showed that how they were primed influenced their impressions of Donald. When negative trait categories had been primed, they characterized Donald in negative terms and saw him as less desirable than when positive categories had been primed.
(Actual study: http://ift.tt/1jIQudL)
Then there's confirmation bias. People are likely to trust information that agrees with their world view and distrust information that disagrees with it. You've probably experienced it yourself, even on these forums: a poster uncritically accepts a claim made without any citation, but then goes to great lengths to disprove another claim. Sometimes, if these people can't find problems with the claim, they'll try to find a problem with the poster making it.
It's easy to accuse these posters of simply being unreasonable and not representative of humanity's ability to reason, but ask yourself: have you ever read or heard something and believed it even though you didn't verify it? For example, when I presented the Kahneman and Tversky experiment above, how many of you assumed it was a real experiment that happened the way I described it, without clicking the link to the actual paper?
Of course, confirmation bias can be countered to a certain degree by reminding yourself to be skeptical of everything, whether you agree with it or not, but the problem is that a big part of confirmation bias happens at an uncontrollable level. For example, you are more likely to REMEMBER evidence that supports your view and forget evidence that doesn't (http://ift.tt/1jIQryE).
Even more damning, fMRI studies have shown physical evidence that the reasoning areas of the brain shut down when people are presented with information that causes dissonance (a more precise, scientific explanation is on page 5 here: http://ift.tt/15W4nvn).
So, with all this in mind, do you think that people can be rational, not just occasionally and in certain situations but as a rule, or is irrationality so ingrained in our nature that it's unavoidable?
via JREF Forum http://ift.tt/1jIQryH