Misinformation and Its Spread

Here’s an interesting article at Science by Kai Kupferschmidt on misinformation – it looks specifically at the work of Carl Bergstrom (whose PNAS paper on the subject is here). Among other things, Bergstrom has worked on ways to map scientific fields for collaboration patterns and to find influential groups and papers, and those tools are unfortunately applicable to what he has no qualms about calling “bullshit” and to the ways our current technologies enable it:

Bergstrom sees social media, like many other things in life, through an evolutionary lens. The popular platforms exploit humanity’s need for social validation and constant chatter, a product of our evolution, he says. He compares it to our craving for sugar, which was beneficial in an environment where sweetness was rare and signaled nutritious food, but can make us sick in a world where sugar is everywhere. Facebook exploits humans’ thirst for contact, in his view, like a Coca-Cola for the mind, allowing people to connect with others in larger numbers during a single day than they might have over a lifetime in humanity’s past.

Such platforms also exploit human curiosity and excitement about interpersonal conflict, the reinforcement of finding others who agree with us, suspicion about the motives of others who disagree with us, and whatever other buttons there are to push that keep people using their sites. It does not say much that’s complimentary about us as a species that the most effective algorithms for monetizing social media tend towards discord and lies, but there’s a lot of evidence that this is the case. As the article notes, though, we really need more data on these effects to understand them, but the social media companies themselves are extremely reluctant to share such data or to allow it to be gathered. You wouldn’t want to give anyone else great ideas about increasing average session times, for one thing, and there’s the no doubt lesser problem that the real picture of how engagement is maximized would not make anyone look good.

We have of course seen all this in real time over the last two years. Huge amounts of misinformation about the coronavirus pandemic and its possible treatments have sluiced all over the world, and continue to do so. If I (or anyone else with any kind of a platform) write something endorsing vaccination, or saying that I do not think that the vaccinations themselves are causing any huge amount of harm in the general population, or saying that I do not think that there is good evidence that the coronavirus itself was a deliberate bioweapon, or saying that I think that ivermectin is not a suitable therapy for it no matter where it came from. . .any of those, and more, sets off an immediate response in my social media feeds and my email. This post will surely follow the same pattern. I have correspondents who are convinced of the opposite of all of those positions above, often convinced of all of those opposites simultaneously, and they are revved up and ready to argue about them.

They have been spun up to their current high RPM, as far as I can tell, by repeated consumption of that bullshit that Bergstrom refers to. People send me links to podcasts and YouTube videos and slide presentations that are just fiestas of misunderstandings, nonsense, and outright lies, but they’re presented with a huge amount of detail in confident, passionate tones by people who sound completely committed to the cause. I can easily see where people who have not spent years in medicine, biology, chemistry and such fields would be convinced that they are truly on to the truth when they see this stuff, and after the fourth, fifteenth, or fifty-third exposure to it, they’re even more so. Many have made this bullshit a treasured part of their lives and a foundation of their view of the world, and will understandably bristle at any hint of attack. This is human nature. Clive James put me on to Montesquieu’s thoughts on Pompey’s flaw when it came to evaluating the threat of Julius Caesar. People don’t want to backtrack or to admit a mistake that has done nothing but grow since they made it:

I believe that the thing above all which ruined Pompey was the shame he felt to think that, in having elevated Caesar the way he did, he had lacked foresight. He accustomed himself to the idea as late as possible; he neglected his defence in order not to avow that he had put himself in danger; he maintained to the Senate that Caesar would never dare to make war; and because he had said it so often, he went on saying it always.

And as James went on to say:

Whether or not Montesquieu was right about Pompey, for example, he was right about you and me. Once we invest our opinion, we hang on to the investment; so the more we have at stake the more we risk, even by doing nothing. And the more powerful we are the more likely we are to stick to our rusty guns: because it was firmness of purpose that made us powerful.

Science, as I and many others have pointed out, is somewhat inhuman in the way that it tries to fight against this. We scientists are supposed to always be open to changing our minds in the face of new evidence, to be ready to give up long-held beliefs if that evidence is really good enough. Now, we don’t always live up to that ideal; a quick look through the history and practice of science up to the present day will leave you no doubts about that. But we at least profess to try, and sometimes we manage to do it. I’m surprised it’s all worked as well as it has. It’s against human nature to suspend judgment, against it to be ready to entertain doubts, and definitely against it to run experiments that might kill off your own cherished ideas right in front of you.

But as Kupferschmidt’s overview makes clear, no one is quite sure what to do about this. Not everyone in the world is a scientist or thinks like one, and God knows plenty of scientists don’t do it enough themselves. Ignoring bullshit only seems to make things worse, but direct engagement doesn’t seem to help that much, either. You keep circling back to the questions of whether people’s views are ever modified much by the news and information that they consume, or whether people seek out or place more value on the information that fits the views they already hold. I’m sure both of those are operating (it’s a false dichotomy like the nature/nurture argument), but that second possibility just seems to send you back around to the question of where those views came from in the first place. And what if one of those views is a generic “I don’t trust what any so-called expert tells me by default – only the outsiders and rebels see the truth”? Where do you start then? Well before all of this I had some thoughts on “expertology”, but I’m no nearer an answer now than I was then.