Facts don't have a lot of effect on how people think. Belief seems to have a stronger hold on our minds.
> If everybody re-evaluated every belief whenever a new fact was presented, we'd never do anything.

If everyone believed in re-examining what they knew when they got new facts, they'd be more useful. I try to do that all the time.
The problem is knowing what is fact and what is not.
> Facts don't have a lot of effect on how people think. Belief seems to have a stronger hold on our minds.

Well...yeah. How else would homeopathy, religion and so on continue to thrive?
Maybe *you* do.
When teaching critical thinking is of less importance to the education process than regurgitating what's found in the textbooks, obedience to authority, lemming-like response to peer pressure, and acquiescence to the worldview of the state, is it a condemnation of the people being educated, or of the system itself, when they fail to become critical thinkers?

I have a fascinating book with that very title...

You should blog about it, or write that up somewhere!
The weeks I'm currently spending with three (now five) teenagers have been eye-opening. And frightening.
I think we should spread the rumor that it's really possible to stop traffic with your beliefs, if you just have enough faith. It would do wonders for the gene pool.

Oh, no, now we're in a nature-vs-nurture debate. I really think environment has a lot more effect than genetics on most people and their beliefs. In fact, your first statement/question suggests it's a learned response as opposed to genetic.
In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs.
Agree. Well said.

I didn't see any other identifying details about the study while skimming the article. Without knowing the precise nature of the "corrected facts in news stories" that were presented to the "misinformed people," this information is nigh useless.
The article seems to emphasize facts in a political context, which is (at least in the US) a deliberately confused battlefield of pseudo-science and emotional appeal.
My guess is that the "facts" were things like clinical details of the abortion process, which jarred pro-lifers and debunked their disinformation-based opposition to it. Or the debunking of "Climategate," and the presentation of other data demonstrating climate change, which jarred those who insist climate change is myth. Etc.
In which case I'm not sure we're really discovering something new about the human mind, but rather that zealots will be zealots, true believers--true believers. In other words, people who hold strong opinions on a largely emotional, irrational basis, will not be swayed by reality. Was this ever in question?
The article itself seems to be a good example of sweeping psychological generalizations based on vague references to "science."
If everybody re-evaluated every belief whenever a new fact was presented, we'd never do anything.
I think it says more about trust (or lack of trust, rather) in the news reporting services than an ability to integrate new information. News reporting is now so partisan that few people trust them to report facts.
I recommend Ben Goldacre's Bad Science for a crash course in critical thinking and media misinformation.

Here's a recent and very pertinent article by him:...
> The classic paper on the last of those strategies is from Lord in 1979: they took two groups of people, one in favour of the death penalty, the other against it, and then presented each with a piece of scientific evidence that supported their pre-existing view, and a piece that challenged it. Murder rates went up, or down, for example, after the abolition of capital punishment in a state, or comparing neighbouring states, and the results were as you might imagine. Each group found extensive methodological holes in the evidence they disagreed with, but ignored the very same holes in the evidence that reinforced their views.

This death penalty example brings up a point about so many political hot potatoes: people tend to decide on such issues based on MORAL views, as opposed to practical issues and "facts" such as whether it will lower crime, the costs involved, etc. If you believe it's wrong for the State to kill a captured and defenseless person, it doesn't matter whether or not crimes might be deterred by the use of capital punishment.
> If everybody re-evaluated every belief whenever a new fact was presented, we'd never do anything.

There's some balance between inhaling and digesting every new fact that comes along and sticking to one's previous learning no matter what. And I figure I don't have to hear about everything to find out something that might be important. Anything "new and exciting" that "I really need to hear" will probably eventually filter down, make its way through several news conduits, and I'll hear about it soon enough.
Beliefs are basically reliance on previous learning. If we encounter a new fact contrary to our previous learning, we reject it. When it supports it, we embrace it.
We all do. Let me expand.
There is an evolutionary benefit to beliefs. A belief is a generalization from specifics. You touch a hot surface and you build a belief that similar surfaces will also be hot. We learn things in many ways -- some from our own experience, some from indirect experience, say from our elders. Experiences that support our beliefs reinforce the learning, and the belief becomes stronger. Those that go against our learned experience (our beliefs) should be viewed with skepticism. As I said, if we re-evaluated our beliefs at every new fact, we'd be paralyzed. Our beliefs allow us to navigate the world we live in.
All I meant was that holding onto beliefs is not a silly thing. Yes, we should be open to new information that contradicts our beliefs, but beliefs themselves are useful.
> You touch a hot surface and you build a belief that similar surfaces will also be hot.

I touch a hot surface (say, an electric hob) and, yes, I build a belief that similar surfaces will be hot. But the next time I have a hankering to touch a similar electric hob, I won't simply adhere to my belief that it is hot. I will slowly move my hand closer in a fact-finding mission to see if it is similarly hot. If I discover that this hotplate is not, in fact, hot, I will quite happily go ahead with my compulsion to touch it.