I’m currently immersed in a research project focused on misinformation and disinformation. For those in need of a quick primer: disinformation is the weaponization or malicious use of misinformation or low-quality information. Misinformation has long existed as a societal problem, but its rapid, tech-enabled dissemination and targeted use mark a novel and formidable challenge for us as humans today.
Through this project, I get to interview AI experts from around the world and hear their take on the challenges we face as a society in accessing trusted information. At the same time, I’m learning all about behavioral psychology, technology, adtech stacks, our attention economy, the human brain, and the list goes on.
Here is one of the more interesting things I’ve recently learned: Once a piece of false information is uttered and out there in the world, it’s very hard to take back with a simple retraction. So, all that fact-checking that happens after a false claim is out there … well, it turns out that humans are actually resistant to factual corrections. One study explains: misinformation “exerts a lingering influence on people’s reasoning after it has been corrected—an effect known as the continued influence effect.” There are “psychological barriers to knowledge revision after misinformation has been corrected.”
In my interviews, one policy expert gave me a perfect example of this. She described being at a museum with her child’s friend and that child’s mother. At one point, the other mother turned to her and marveled at how the museum had built this huge dinosaur exhibit when dinosaurs aren’t in fact real and were just made up for kids. The expert was completely stunned, unable to come up with the right words to debate a woman who vehemently believed that there was no evidence dinosaurs had ever existed.
That dinosaurs never existed was her truth, and a simple retraction in that moment simply wouldn’t have worked. Moreover, the expert didn’t have her wits about her to replace that wrong knowledge with something just as compelling and robust as the woman’s false understanding. And researchers have figured out that a retraction, without an immediate and equally striking refutation that offers a different causal explanation, leaves a gap in our mental models.
This is super important to understand because that gap, that disruption of the causal chain in a person’s mental model, creates real discomfort. The discomfort is strong enough that we would rather disregard or negate the retraction of false information than sit with it. That’s the defensiveness and resistance people show when they’re simply told something is not true.
Some sort of alternative explanation may help fill the gap left by the retracted information, but it’s not always possible to unring the bell, so to speak. There’s actually a research paper called “Unringing the bell,” in which researchers, aided by familial informants, gave participants false but rich memories in a controlled environment, “planting vivid memories of events that never occurred.” The purpose of the protocol was to see how hard it would be to erase those false memories.
Even by the end of the experiment, after participants had been fully debriefed that these memories were in fact fabricated, many still believed the events had happened. Immediately after the de-hoaxing, the majority of participants (68 percent) still believed some part of the fabricated memory. And even three days later, some participants persisted in believing those false memories. That’s pretty amazing and scary: how long those false memories persist! This study shows that it’s far easier to seed something false than to take it back or remove it from our minds.
Now, take this concept that you cannot unring the bell and layer on how fast and how deeply false information spreads. Three MIT researchers have “found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude.”
And now consider the year we’re in: 2024. According to Time magazine: “Globally, more voters than ever in history will head to the polls … representing a combined population of about 49% of the people in the world….” Another estimate says that approximately 4 billion people across 76 nations will vote this year.
I can feel our minds collectively breaking at the scale and urgency of this global challenge. Traditional tactics will not suffice. Fact-checking and retractions will not work in this environment.
One challenge is that research like the pieces I’ve shared here is not widely accessible or easily comprehensible. We have to start putting what we know about the brain and the way humans react and behave into scaled attempts to fight the viral spread of misinformation. This knowledge can’t remain with a select few of us; that potential energy has to become kinetic and action-oriented. As one interviewee put it so well, “The truth needs to be as candy-colored and bright and beautiful as the lies are.”
Like it or not, we live in an attention economy. As Sean Parker, a co-conspirator in building Facebook, put it, “How do we consume as much of your time and conscious attention as possible?” We’re not going to change this attention economy with facts alone.
If we want to counter the glut of misinformation in the world, then we will have to get better at spreading the truth faster than the lies. We will have to figure out how to weaponize storytelling for good.
Special thanks to Jill Nephew for sharing these journal articles with me.
Dr. Michelle R. Weise is the author of Long Life Learning: Preparing for Jobs that Don’t Even Exist Yet and leads Rise and Design, a strategic consulting and advisory service for businesses and higher education institutions.