Sunday, 24 April 2011

The Science of Why We Don't Believe Science

On the website MotherJones.com, Chris Mooney, in a piece written on the psychology of denial, conveys a strong message about how our personal and emotional biases can greatly distort our reasoning. (http://motherjones.com/politics/2011/03/denial-science-chris-mooney)


 He writes that, "When we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers (PDF). Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial."


This insight suggests that the sense of rationally processing ideas may itself be an illusion: what feels like reasoning is often rationalization after the fact.


He goes on to discuss the backfire effect, which he introduces with research from Lord et al. (1979), in which 'pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more "convincing." '
From this, he goes on to say that 'Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.'
Evidence for this comes from Nyhan and Reifler (2006), who found that corrections frequently fail to reduce misperceptions among the targeted ideological group. They also document several instances of a "backfire effect" in which corrections actually increase misperceptions among the group in question.
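
To make the mechanism concrete, here is a toy sketch in Python. This is entirely my own illustration; the numbers and the weighting rule are invented, not taken from Mooney, Lord et al., or Nyhan and Reifler. It just shows how a single asymmetry, discounting uncongenial evidence, can make two readers of the same balanced evidence drift further apart.

```python
# Toy model of biased assimilation (in the spirit of Lord et al., 1979).
# Two readers with opposite priors see the SAME two equally strong studies,
# one pro and one con. Each reader discounts the uncongenial study, so both
# walk away more convinced than before. All numbers are made up.

def update(belief, evidence_direction, strength, discount=0.5):
    """Shift belief by the evidence, discounting uncongenial evidence.

    belief: current position in [-1, 1] (-1 = firmly anti, +1 = firmly pro)
    evidence_direction: +1 for a pro study, -1 for an anti study
    strength: how strong the study is on its own merits (0..1)
    discount: how much weight uncongenial evidence loses (0..1)
    """
    congenial = (belief * evidence_direction) >= 0
    weight = strength if congenial else strength * (1 - discount)
    new_belief = belief + evidence_direction * weight
    return max(-1.0, min(1.0, new_belief))  # clamp to [-1, 1]

pro_reader, anti_reader = 0.4, -0.4   # opposite starting positions
studies = [(+1, 0.3), (-1, 0.3)]      # one pro, one con, equal strength

for direction, strength in studies:
    pro_reader = update(pro_reader, direction, strength)
    anti_reader = update(anti_reader, direction, strength)

print(pro_reader, anti_reader)  # ~0.55 and ~-0.55: both became MORE extreme
```

The point of the toy model is simply that perfectly balanced evidence, filtered through disconfirmation bias, produces polarization rather than convergence.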


For the remainder of his post, Mooney discusses Climategate and the supposed vaccine-autism link, and how people continue to deny the scientific consensus despite the lack of evidence to support said link.


Now, just posting about this may seem trivial, so I feel the need to link it to my previous post about Ramachandran's book Phantoms in the Brain. I shall do this by connecting it to the possible causes of anosognosia that Ramachandran raises multiple times. He suggests that people with a damaged right-brain lack the mechanism to "overhaul" the left-brain's representation of the world, despite a large amount of evidence pointing to the obvious fact that it is wrong. This can be extended to the backfire effect. By this I mean that, perhaps, the more fervent someone's belief, the more evidence there needs to be before the right-brain CAN overhaul the left-brain's worldview. As a result, an influx of new information means little to the person: either they dismiss the contrary evidence through disconfirmation bias, or the backfire effect takes hold and makes their mindset even harder to shift.
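
If it helps to see that idea stated mechanically, here is another toy sketch, again purely my own invention rather than anything Ramachandran or Mooney actually propose. It treats the "overhaul" as a threshold that scales with how fervently the belief is held, with each failed correction entrenching the belief a little further.

```python
# Toy sketch of the "overhaul threshold" idea above (my own illustration).
# Contrary evidence accumulates, but the worldview only flips once the total
# exceeds a threshold that grows with the fervour of the belief. Each failed
# correction also entrenches the belief slightly (the backfire effect).

def confront(fervour, corrections, threshold_per_unit=2.0, backfire=0.1):
    """Return True if the worldview is overhauled, else False.

    fervour: how strongly the belief is held (arbitrary units)
    corrections: list of evidence strengths presented, in order
    """
    accumulated = 0.0
    for strength in corrections:
        accumulated += strength
        if accumulated >= fervour * threshold_per_unit:
            return True        # enough anomaly to force an overhaul
        fervour += backfire    # a failed correction entrenches the belief

    return False

print(confront(fervour=1.0, corrections=[0.5] * 6))   # True: a mild belief eventually gives way
print(confront(fervour=3.0, corrections=[0.5] * 10))  # False: the fervent belief stays put, now more entrenched
```

On this reading, each failed correction raises the bar for the next one, which is exactly why head-on persuasion can leave a fervent believer further from the facts than before.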


Although this is a drastic jump, I feel that similar ideas can be put forward in terms of the left-brain/right-brain balance of power, as it is well documented that the left-brain is far more dominant than the right in terms of expressed control.