
The Backfire Effect

geo

The Backfire Effect

David Brooks mentioned this in a recent column:
Classic research has suggested that the more people doubt their own beliefs the more, paradoxically, they are inclined to proselytize in favor of them. David Gal and Derek Rucker published a study in Psychological Science in which they presented some research subjects with evidence that undermined their core convictions. The subjects who were forced to confront the counterevidence went on to more forcefully advocate their original beliefs, thus confirming the earlier findings.
In short, we humans tend to entrench our opinions even more deeply when confronted with contrary evidence and arguments. We dig in our heels rather than admit we are wrong. Apparently, the human brain places feeling "right" above being correct. Instead of incorporating new information into our beliefs, we look for ways to entrench our older beliefs even further.

The Skeptic's Dictionary has a good entry on what is known as the "backfire effect." It is particularly interesting to see how we rationalize our positions, especially in light of contrary information.

http://www.skepdic.com/backfireeffect.html
. . . If one knows that there is a community of believers who share your beliefs and one believes that there is probably information you don't have but which would outweigh the contrary information provided, rationalization becomes easier. It is possible that the rationalization process leads one to give more weight to reinforcement by the community of believers. How much play one's belief gets in the media, versus the play of contrary information may also contribute to the backfire effect. If messages supporting your belief are presented far more frequently in the media than messages contrary to your belief, or presented repeatedly by people you admire, the tendency might be to give those supportive messages even more weight than before
-Geo
Question everything
DWill

Re: The Backfire Effect

That makes a good deal of sense, and it allows a somewhat optimistic view of the situation with people who have decided to shut their eyes to evidence. If their intransigence and vehemence mask a deeper realization that their views are doomed, maybe things are not as bad as they seem. Maybe they could wake up.
geo

Re: The Backfire Effect

DWill wrote: That makes a good deal of sense, and it allows a somewhat optimistic view of the situation with people who have decided to shut their eyes to evidence. If their intransigence and vehemence mask a deeper realization that their views are doomed, maybe things are not as bad as they seem. Maybe they could wake up.
More than anything, it's important to know our own biases and to recognize that our brains are prone to such cognitive errors. 'Know thyself' is an apt aphorism here.
-Geo
Question everything
Interbane

Re: The Backfire Effect

I recall a psychological phenomenon tied to our evolutionary heritage that may be related. We err on the side of seeing false patterns in nature, and this bias had survival benefits. A second bias is that once we see a pattern, we tend to believe it, because unless we actually believe the pattern is real, the survival benefit is lost. We might see stripes in the grass that resemble a tiger, but unless we are also inclined to believe it is a tiger, we won't act accordingly. A third bias that plays a part in pattern seeking is that we tend to maintain the beliefs we already hold.

The example I remember is seeing which other creatures use a certain watering hole. If we see a pattern develop where predators usually stay away from the watering hole during midday, and come more often at night, it is in our best interest to only use the watering hole during midday. That way we can avoid the predators. However, if an anomaly occurs, such as a predator going to the watering hole during midday, we do not want to so casually discard our understanding that predators come more often at night. We maintain the belief in spite of contrary information. Only after repeated sightings of predators during midday at the watering hole would we change our belief.

It's interesting to read about biases from our evolutionary heritage and then see them in action in society. It's almost as if a glitch is manifesting in strangely complex ways, since we are undoubtedly more complex now than we were when these biases evolved.

The longer a belief has had to saturate a person's mind, the more contrary information it takes to change it. If contrary exposures are infrequent, as an anomaly would have been in our distant past, it is all the more advantageous to maintain the belief in spite of them. Our tendency to rationalize seems related to this bias. One higher-order manifestation of this 'maintenance bias' is that we quickly become emotional when a belief is attacked, and we all know how well such emotion polarizes and blinds us.
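
As a side note that is not from the original posts: the watering-hole logic can be sketched as a tiny Bayesian toy model in Python. Every number below is an illustrative assumption, not data; the point is only that a long-saturated belief needs several contrary sightings before it flips, while a weakly held one flips almost at once.

# Toy model (illustrative assumptions only): the belief "predators only visit
# the watering hole at night" is updated with Bayes' rule each time a predator
# is actually seen there at midday.

def update(prior, p_obs_if_true, p_obs_if_false):
    """Posterior probability of the belief after one contrary observation."""
    numerator = prior * p_obs_if_true
    return numerator / (numerator + (1 - prior) * p_obs_if_false)

belief = 0.99  # a long-saturated, confident belief
for sighting in range(1, 6):
    # Assume a midday sighting is five times more likely if the belief is false.
    belief = update(belief, p_obs_if_true=0.1, p_obs_if_false=0.5)
    print(f"after {sighting} midday sighting(s): belief = {belief:.2f}")

# With a 0.99 prior the belief stays above 0.5 until the third contrary
# sighting; a single anomaly does not overturn an entrenched pattern, whereas
# a weakly held belief (a prior of 0.6, say) would flip after one sighting.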

The only way to fight these biases is to practice critical thinking, which is primarily focused on arriving at truthful conclusions while resisting the influence of our biases.