The following is from Boston.com. I understand this quite well. I've thought about it a lot as part of my profession.
It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.
In the end, truth will out. Won’t it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
I expect to gain great insights when Taleb publishes his next book, which will highlight what ain't right with brains that are convinced, maybe giving us a way to understand, discount, and moderate false conviction. I will reiterate a couple of my standing propositions at this point.
1. It takes energy and effort to establish and sustain any idea in the public sphere. The more detailed the information, and the larger the audience, the more energy and attention required. Conversely, small, tightly knit groups can be convinced of almost anything.
2. When it comes to policy making and the enforcement of law, it is better to expend less energy on fewer matters. The high energy required to establish a regime of truth, and to maintain discipline to that truth, is corrupting.
I believe that political thought works exactly like the article says, because it tries to establish, at bottom, a justification for actions and policy based on a moral purpose. When facts do not conform to the moral premises, they are perceived as noise or deception.