In short, we humans have a tendency to entrench our opinions even deeper when confronted with countering evidence and arguments. We dig in our heels rather than admit we are wrong. Apparently, the human brain places being "right" above being correct: instead of integrating new information into our beliefs, we look for ways to entrench our older beliefs further.

Classic research has suggested that the more people doubt their own beliefs, the more, paradoxically, they are inclined to proselytize in favor of them. David Gal and Derek Rucker published a study in Psychological Science in which they presented some research subjects with evidence that undermined their core convictions. The subjects who were forced to confront the counterevidence went on to advocate their original beliefs more forcefully, confirming the earlier findings.
The Skeptic's Dictionary has a good entry on what is known as the "backfire effect." It is particularly interesting to see how we rationalize our positions, especially in light of contrary information:
http://www.skepdic.com/backfireeffect.html
. . . If one knows that there is a community of believers who share your beliefs, and one believes that there is probably information you don't have but which would outweigh the contrary information provided, rationalization becomes easier. It is possible that the rationalization process leads one to give more weight to reinforcement by the community of believers. How much play one's belief gets in the media, versus the play of contrary information, may also contribute to the backfire effect. If messages supporting your belief are presented far more frequently in the media than messages contrary to your belief, or presented repeatedly by people you admire, the tendency might be to give those supportive messages even more weight than before.