Wednesday, January 06, 2010

Cognitive dissonance

Ah, yes... evidence to the contrary. We've all argued a point, presented evidence supporting our position, and watched the other side refuse to see the light. In your experience, have your arguments ever changed another person's opinion? Tell the truth now.

Here are a couple of articles that explain why and how people cling to their beliefs even when those beliefs are wrong. I have encountered every one of the strategies people use to resist information that conflicts with what they believe. Now I know the names.

The first link is a shorter article; the second is a longer study.

http://correspondents.theatlantic.com/lane_wallace/2009/09/all_evidence_to_the_contrary.php
How is it that people can cling to an opinion or view of a person, event, issue of the world, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.

In a recently published study, researchers found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.

In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance" - discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence - selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.

Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?

Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."

I would define a true intellectual as one who cares terribly about being right, and not at all about having been right.  Easy to say, very hard to achieve.


http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf

Strategies for Resisting Information

Here are the strategies used to resist persuasion that social psychologists have identified:
  • counterarguing (directly rebutting the information) — this strategy made up a fairly small portion of the interview sample and was limited to partisans coded as average or above average in political information,
  • attitude bolstering (the most popular strategy: bringing to mind facts that support one's position without directly refuting the contradictory information), and
  • selective exposure (ignoring the information without rebutting it or supporting other positions).

In addition, we identified two other strategies of resisting information that have not been previously noted by social psychologists:
  • disputing rationality (arguing that opinions do not need to be grounded in facts or reasoning), and
  • inferred justification (the most unusual of the findings: inferring evidence that would support the respondent's beliefs). People begin with the situation and then ask themselves what must be true about the world for the situation to hold. People who displayed inferred justification assumed that since someone they trusted said or did it, there must be a good reason for it.
