Re: Echo Chambers
This was a fascinating essay. I certainly agree that the distinction he draws is important: between "echo chambers," which actively discredit those who challenge the worldview, and mere "epistemic bubbles," in which one simply never encounters viewpoints different from one's own, and therefore never has one's beliefs challenged. I could quibble about the term: nothing about "echo chamber" brings the discrediting feature to mind, and it seems to me that something along the lines of "wagon circling" or "intellectual immunization" might be more useful. But never mind that.
I also like his connection between trust-building processes and a "social reboot." The author, C Thi Nguyen, holds that this is a narrow and fragile path, even as he (I think he's male, based on his web page) dismisses the hope for greater autonomy of thought (as in, critical thinking) as a pipe dream. I disagree on both counts. But I agree with the notion that direct intellectual assault is a bad idea, and likely to be counterproductive.
He has recognized the main point: people "believe" worldviews that run counter to the evidence for social reasons, based on whom they trust and why, not because they have simply strayed into error at random. Such worldviews sustain themselves for social, not epistemological, reasons, even though the epistemology has to leave room for it. (There are reasons why there is a Flat Earth Society but not a Pyramidal Earth Society or a Toroidal Earth Society, for example.)
Overall, however, I think his analysis is too shallow. An anthropologist looks for how a system works, and a single theory is often not good enough even for a single system. Belief in witchcraft, for example, is multi-faceted and does not come down to a single way it protects itself against lack of evidence or a single way it feeds off the antagonistic energy of a community. When he wants to reduce the matter to a question of "which worldview the subject encounters first" he has just opted for silliness. There are stories of people raised atheist who become religious (though many fewer than the reverse) and there are stories of people raised Protestant who choose Catholicism, and vice versa, and the same for feminism and socialism and veganism and Ultra-Orthodoxy and, well, you name it.
So, much as I like his conclusion, I think he needed to ask a couple of questions. First, what makes a poorly-evidenced theory socially reinforceable? Why do anti-vaxxers, for example, accept evidence from those who agree while rejecting it from more objective sources?
It seems to me the answer has at least three elements. The person must find their own "natural" interpretation to make some sense. ("Of course there's a God; functional complexity does not arise randomly," etc.) There must be a plausible social reason to distrust the skeptics. (They are leftists, who we all know disguise their true motivations, or they are paid experts protecting the pharmaceutical industry, or whatever.) And, usually connected to the second in some way, there must be a genuine social or emotional motivation to choose the side that one's motivated reasoning presents as "true and correct." It has to mean something and matter to you. So we have one cognitive factor and two social factors.
And note the role of assigning secret motives to the critics. It usually isn't enough to claim they are just following their own motivated reasoning. That would invite self-examination and critical thinking. So you need a bit of a paranoid mindset to get caught by these traps. They don't just disagree, they are also trying to manipulate you in some way.
Of course real cases exist of conspiracies to manipulate. Hoaxes often have some conspiracy element. Healing ministries often rely on staged fakery, similarly to spiritualist mediums. The Pentagon Papers exposed a systematic deception of the public and we later learned there was a massive-scale self-deception going on within the government. Communists really did lie for what they saw as the greater good, and Julius Rosenberg really did pass on nuclear secrets. The press really did cover for FDR's reliance on a wheelchair. Harvey Weinstein had cohorts of enablers. And let's not get started about J. Edgar Hoover.
So my start on answering "what makes a poorly evidenced theory socially reinforceable" is that it has to be emotionally meaningful and, superficially at least, intellectually plausible. And when that emotional meaning lends itself to demonizing those who don't share it (sellout materialists, or snobbish elitists, or apologists for their own sinful ways, or heedless rapists of the environment, or whatever), then the potential for an echo chamber, in Nguyen's terminology, is high. Religious fundamentalism, and scapegoating hate theories as well, have fit this pattern pretty well for a long time now.
This leads, in my opinion, to a second question that should be asked. What are the motives of the demonizers? Anyone who thinks Rush Limbaugh is just in it for an honest intellectual evaluation is seriously disoriented. The usual motivations that apply to entertainment figures certainly apply to him, as well as to Sean Hannity and Father Coughlin and, frankly, William F. Buckley, and a long line of similar figures who labored to discredit "mainstream" trust figures. The leadership of fundamentalism includes mercenary figures like Oral Roberts as well as honest status seekers (who keep their inquiry within bounds to safeguard their status) like Russell Moore. The hacks who worked first for Big Tobacco and then for a succession of other causes, documented in "Merchants of Doubt," were well-meaning right-wing scientists who saw leftist manipulation in the work of, e.g., the Union of Concerned Scientists, and wanted to act against scientific evidence in the interest of "sound policy." And these days, demonization happens just to generate clickbait for dollars.
So there are a host of motivations. Why should we bother examining them? Because of the dog that didn't bark. What's missing from this list is the honest search for truth - the thing Rush has no interest in. You do find lonely opposition to mainstream views by such honest searchers - Bjorn Lomborg is a good example - but they are not prone to systematically attacking the motives of those who disagree with them. And the obvious reason is that their views are not driven by signaling some social virtue to some group, a process that then turns disagreement into opposition to virtue.
So, in addition to Nguyen's prescription in terms of building trust by building relationship, I would add demonstrating, on a consistent basis, a purist's regard for evidence and objectivity.
This is why I get so exercised about the criticisms leveled against Stephen Jay Gould and NOMA by anti-theists such as Richard Dawkins. Their vituperation, and their reckless disregard for truth, undermine whatever trust might conceivably be built that they are just assessing evidence and promoting the assessment of evidence.
Whenever someone engages in demonizing those they disagree with, they immediately forfeit any credibility they had for claims of objectivity. It's that simple. I have never seen an exception - attack the opponents, rather than their arguments, and you will not be treated as objective.
Which brings us to the heart of darkness. When I attack Dawkins for lying about Gould, people just hear the aggression. Match that up with the fact that I am a Christian, and they assemble the pieces in a way that discounts my objectivity and treats everything I say as suspect. This is entirely rational, even though it isn't evidence-based. As I said, as soon as you attack the person rather than the claim, you forfeit the presumption of objectivity. Yet, undeniably and with evidence to spare, sometimes people really do conspire against the truth.
In my opinion, the problem we face today is not primarily one of "echo chambers" in which the Rush Limbaughs are caught in self-confirming misinterpretation by suspicion of the other side. Sure, that goes on (as it has for centuries) and the internet facilitates its rapidity and breadth. No, the problem is not motivated reasoning run amok on steroids, as serious as that is. The problem is systematic exploitation of this vulnerability for profit, power or other manipulative goal. But as soon as one points this out, one forfeits the presumption of objectivity.
There is a way out, however, which is accountability. The Creation Institute always just moves on to their next crazy claim, never acknowledging that they got all the previous ones wrong, because they are not trying to get at the truth. Limbaugh, Hannity, and our Dear Leader in the White House do not accept any accountability because they are engaged in a process opposed to truth, with its notorious liberal bias, not a process of learning the truth.
And I am afraid that those who want to pass the credibility test have to submit to that standard. If they can't admit the ones they got wrong, then anyone who takes their claims seriously is part of an echo chamber.