S1E56: Flattening the Infodemic Curve / Claire Wardle, John Cook, Renee DiResta


“That’s the challenge with infodemics: too much information and not knowing who to trust.” – Claire Wardle

Combating misinformation has become more important than ever during the pandemic. The novel coronavirus, social media, and a polarized political environment created something public health experts have dubbed an “infodemic” — a flood of misleading information and conspiracy theories about the coronavirus and the public response to it. In this episode of EPIDEMIC, we’ll hear how misinformation spreads online, share some tips on how to spot it, and find out what needs to change to keep misinformation from causing serious harm.

This podcast was created by Just Human Productions. We’re powered and distributed by Simplecast. We’re supported, in part, by listeners like you. #SARSCoV2 #COVID19 #COVID #coronavirus

Renee DiResta: So it was just sort of almost like a conspiracy correlation matrix. It just said, you know — hey, you’re interested, you follow these conspiracy groups, maybe you’d like these other conspiracy groups.

Claire Wardle: We’re never going to solve information pollution. We need to equip people with the skills to navigate it.

Céline Gounder: You’re listening to EPIDEMIC, the podcast about the science, public health, and social impacts of the coronavirus pandemic. I’m your host, Dr. Celine Gounder. 

Back in 2007, John Cook was at a family party. He was having a good time until he got pulled into one of those uncomfortable moments. He got into an argument with his father-in-law.

John Cook: So he is a climate denier, and at our family get-together, he threw all these different climate myths at me, and I went away and researched them, just to try to find out what the truth was about these different arguments.

Céline Gounder: John wasn’t really all that interested in climate change. He wasn’t an activist and didn’t really follow it that closely. But that didn’t matter. John just didn’t like the idea of losing.  

John Cook: Like any competitive son-in-law that wants to make sure they win the next argument with their father-in-law, I started building a list of the possible arguments that I might encounter at the next family get-together.

Céline Gounder: But as he was trying to one-up his father-in-law, John actually learned a lot about climate change. He got familiar with the science and learned that the consensus was that people were impacting the climate. He showed up to the next family gathering armed with the facts. 

John Cook: Eventually, I realized that other people had family members like that, who might find this a useful resource as well. And so I published a website debunking climate misinformation.

Céline Gounder: The website was called Skeptical Science… and it changed his life. 

John Cook: Publishing, uh, the debunkings and just beginning public engagement on the issue of climate change, uh, really sent me down the road of learning more about how to debunk misinformation, eventually leading me to do a PhD in cognitive science and, um, really studying, in depth, the psychology of science denial.

Céline Gounder: Today, John is a professor at George Mason University studying misinformation and how to counter it. Over the last ten years, John discovered a very effective way to fight science denial. Something called inoculation theory. 

John Cook: Inoculation theory is a branch of psychological research dating back to the fifties. And it basically borrows the idea of vaccination and applies it to knowledge. If you expose people to a weakened form of misinformation, then that builds up their immunity, their resistance, so that when they encounter actual misinformation out in the real world, they’re less likely to be influenced.

Céline Gounder: So what does this tool for fighting climate change denial have to do with the pandemic?

John Cook: In my research, I've also found that inoculating people against the techniques of denial in one topic also inoculates them against the same techniques in other topics. So inoculating people against climate denial can inoculate them against vaccine misinformation, COVID misinformation, or evolution misinformation.

Céline Gounder: Combating misinformation has become more important than ever during the pandemic. The novel coronavirus, social media, and a polarized political environment created something public health experts have dubbed an infodemic — a flood of misleading information and conspiracy theories about the coronavirus and the public response to it. 

In this episode of EPIDEMIC, we’re going to hear more about the techniques John developed to fight misinformation and how he’s applying them to COVID. 

We’ll hear from experts how misinformation spreads online…  

Renee DiResta: I started getting quite a lot of, um, recommendations into groups that were not anti-vaccine, but that were highly conspiratorial. And sometimes actually, you know, sort of increasingly violent.

Céline Gounder: Learn how to spot misinformation…  

John Cook: There’s really just five main techniques that you need to be aware of.

Céline Gounder: And hear what needs to change to stop misinformation before it causes serious harm… 

Claire Wardle: So we focus on a post and say, is that true or false? Should it be labeled or taken down? And we’ve been so focused on these individual atoms that we fail to see the way that those individual atoms create these larger narratives. 

Céline Gounder: Today on EPIDEMIC, how to stop an infodemic. 

Claire Wardle is the U.S. director of First Draft. 

Claire Wardle: We’re a nonprofit that helps organizations tackle the challenges of misinformation. 

Céline Gounder: Since the pandemic started, her organization has been focused on coronavirus and — most recently — vaccine misinformation. I know firsthand from my time working in HIV… and later… as an Ebola aid worker that misinformation and conspiracy theories are not new when it comes to public health. But Claire says misinformation in the time of COVID is spreading like never before.

Claire Wardle: So back in March, we’d see the same rumor popping up in a WhatsApp group in Mexico City and then we’d see it pop up in Perth, in Australia, and then in Delhi, in India. And the interconnected nature of those rumors – I’d never seen anything like that before in an era of social networks. So we’ve had other health emergencies where rumors have been a serious problem, but back then we didn’t have the technology that now means those same rumors are crossing borders in seconds.

Céline Gounder: Social media and closed messaging apps like WhatsApp have created an environment where misinformation can spread like wildfire. So the World Health Organization, or W-H-O, asked First Draft to help combat this COVID infodemic.

Claire Wardle: The WHO, right at the beginning of the pandemic, very clearly identified that this wasn’t just a problem with the virus. It was also a serious problem in terms of the information ecosystem. So, on one hand, you’ve got all of these rumors and falsehoods spreading in your family WhatsApp groups and other spaces. But also, when I go to try and find information about vaccines or masks or airborne transmission, there’s all sorts of people out there trying to give me information, and I’m overwhelmed because not all of it is consistent. So that’s the challenge with infodemics: too much information and not knowing who to trust.

Céline Gounder: And the stakes are high. First Draft did a case study of how misinformation about a polio vaccination program in Pakistan snowballed with deadly results. 

In 2019, a fake video showed children in a hospital who were allegedly sick because of the polio vaccine. The video spread quickly on social media, and even local news picked up the story.  

Claire Wardle: Some fake videos started circulating, and if you watched the whole video, it was quite clearly staged. But most people didn’t even click on the link. They just saw the screen grab. And over three days, all of a sudden there was this huge response, and people started pulling their children out of vaccination programs. Then they started going to health centers, burning them down. A couple of health practitioners lost their lives. And it was a reminder of how quickly we can have misinformation crises. And what we saw in that moment was that government officials, UNICEF, other influencers and newsrooms – they didn’t know how to respond quickly enough. They didn’t know how to damp down those rumors. And that absence of any response meant that the rumor just took off and caused real, real harm.

Céline Gounder: Since then, Claire’s organization has worked with the WHO to train infodemic managers. These people help local governments around the world proactively respond to rumors like the one in Pakistan before they get out of control.  

But this kind of misinformation happens in the United States too. Renee DiResta knows about it firsthand. 

Renee DiResta: Sure. So my name’s Renee DiResta and I’m the research manager at Stanford Internet Observatory.

Céline Gounder: In 2015, Renee helped start a parent-led community called Vaccinate California. The organization supported SB277, a bill — which is now California law — that requires all children to get school immunizations, regardless of personal beliefs. Renee saw how misinformation spreads while she was volunteering with the group. 

Renee DiResta: So when I was looking at anti-vaccine content in 2015, I joined a bunch of anti-vaccine groups and I wanted to try to understand what those communities were like, what they saw, um, what they were talking about.

Céline Gounder: Renee says she never made comments in these groups. She just read what others were posting in them. 

Renee DiResta: What started to be much more interesting than the content itself or any of the anti-vaccine groups or pages themselves, was actually the way in which the recommendation engines behaved when I communicated that I was interested in this content.

Céline Gounder: Once Facebook’s algorithm saw she was in an anti-vaccine group, it started to suggest she join other conspiratorial groups. 

Renee DiResta: And there was one that it pushed to me yesterday called “Viruses are Not Contagious.” Which sort of disputes the very foundational science by which we understand how diseases are transmitted.

Céline Gounder: She started to get recommendations for things like flat Earth groups. But the more groups she joined, the more far-out and even violent the groups became. 

Renee DiResta: And so that dynamic was, was pretty troubling: the extent to which this content was being pushed at people as opposed to them proactively going and searching for it. This account never did any proactive searching for any of this. It was just pushed this content constantly because it had indicated an interest in a wholly unrelated conspiratorial concept. 
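What Renee calls a “conspiracy correlation matrix” at the top of the episode can be pictured with a very simple co-membership recommender. The sketch below is purely illustrative, with invented users and group names, and makes no claim about Facebook’s actual system; it just suggests whichever groups most often co-occur with the ones you already joined.

# Illustrative sketch only: made-up memberships, not Facebook's algorithm.
from collections import Counter

# Which (hypothetical) users belong to which (hypothetical) groups.
memberships = {
    "user1": {"anti_vax_group", "flat_earth"},
    "user2": {"anti_vax_group", "chemtrails"},
    "user3": {"flat_earth", "chemtrails"},
    "user4": {"anti_vax_group", "chemtrails", "flat_earth"},
    "user5": {"gardening"},
}

def recommend_groups(my_groups, memberships):
    """Suggest the groups that co-occur most often with the groups I already joined."""
    scores = Counter()
    for their_groups in memberships.values():
        if their_groups & my_groups:                # this user shares at least one group with me
            for group in their_groups - my_groups:  # count the groups they have that I don't
                scores[group] += 1
    return [group for group, _ in scores.most_common()]

# Joining a single conspiratorial group is enough to pull the adjacent ones to the top.
print(recommend_groups({"anti_vax_group"}, memberships))  # ['flat_earth', 'chemtrails']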

Céline Gounder: Another contributor to the spread of misinformation is something called a data void. When there’s a new topic in the news, people start searching for that term online or on social media.

Renee DiResta: And what pops up is not always the most reliable information. It’s sort of the content that got there for that term first.

Céline Gounder: Many search engines rely on the wisdom of the crowd for their results. Basically, the more people reference or link to a specific source, the more that gets interpreted as a signal that that’s where to find the best or most accurate information. But for new terms — things like “coronavirus” back in March — there’s relatively little out there. That’s the data void. 

Renee DiResta: So the voids are any spaces where there’s just not enough content, not enough reputable content to provide people with the information they’re looking for. And so unscrupulous actors, oftentimes spammers, but occasionally people with a political agenda as well will attempt to kind of squat on these keywords and gain pride of place in search results.
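To make the wisdom-of-the-crowd signal and the resulting void concrete, here is a toy ranking sketch in Python. The page names and link counts are made up, and real search engines are far more sophisticated; the point is only that when a term is new and reputable pages are scarce, whoever published first rises to the top.

# Toy illustration with invented pages and inbound-link counts.
def rank_by_inbound_links(pages):
    """Sort page URLs by how many other pages link to them, highest first."""
    return sorted(pages, key=pages.get, reverse=True)

# Mature topic: reputable sources have had years to accumulate links.
flu_pages = {
    "health-agency.example/flu-basics": 1200,
    "university.example/flu-research": 800,
    "miracle-cures.example/flu-secret": 15,
}

# Brand-new term: barely any reputable content exists yet (the data void),
# so a keyword squatter who published first tops the list.
new_virus_pages = {
    "miracle-cures.example/new-virus-truth": 40,
    "local-news.example/new-virus-first-report": 5,
}

print(rank_by_inbound_links(flu_pages)[0])        # health-agency.example/flu-basics
print(rank_by_inbound_links(new_virus_pages)[0])  # miracle-cures.example/new-virus-truth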

Céline Gounder: Renee saw a version of this when she studied the way anti-vaccine advocates used Twitter.

Renee DiResta: One of the things that we would see as we examined the Twitter conversation around the hashtag for the bill is that a very, very small fraction of a percent of the participants in the conversation were producing more than half of the content in the hashtag. So a very, very small number of accounts producing huge amounts of content and propaganda related to the bill. So if you were to go search for it, that’s what you would see. 
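The kind of concentration Renee describes, a handful of accounts producing a majority of a hashtag’s content, is easy to measure once the tweets are collected. Below is a minimal sketch with a made-up author list; it is not the Stanford Internet Observatory’s code or data.

# Hypothetical data: each entry is the author of one tweet in the hashtag.
from collections import Counter

tweet_authors = [
    "acct_a", "acct_a", "acct_a", "acct_a", "acct_a", "acct_a", "acct_a",
    "acct_b", "acct_b",
    "acct_c", "acct_d", "acct_e",
]

counts = Counter(tweet_authors)
total = len(tweet_authors)

# How few accounts does it take to cover more than half of all tweets?
covered = 0
heavy_hitters = []
for author, n in counts.most_common():
    heavy_hitters.append(author)
    covered += n
    if covered / total > 0.5:
        break

print(f"{len(heavy_hitters)} of {len(counts)} accounts "
      f"({len(heavy_hitters) / len(counts):.0%} of participants) "
      f"wrote {covered / total:.0%} of the tweets")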

Céline Gounder: This kind of misinformation is really hard to fight once it’s in the world. But what if there was a way to protect people from this kind of misinformation before they share it? We’ll circle back to John Cook’s research into how people can get inoculated against misinformation … after the break. 

* * *

Céline Gounder: Inoculation theory applies many of the same lessons from vaccination to misinformation. 

John Cook: I’m reluctant to use the word cure, and we may not change a denier’s mind, but we can reduce their influence and stop misinformation from spreading if we inoculate. Uh, and theoretically, if we can inoculate enough of the public, if critical thinking can become widespread enough, then you can eradicate certain aspects of misinformation, achieve herd immunity.

Céline Gounder: So how does this work? 

John Cook: An inoculating message basically consists of two things. Firstly, warning people of the threat that they might be misled. And then secondly, counter-arguments explaining how the misinformation misleads.

Céline Gounder: One of the more common techniques used to spread misinformation is fake experts. 

John Cook: And the example I used was the tobacco industry… They used to use fake experts back in the fifties and the mid 20th century. … And they would run these newspaper ads showing tens of thousands of doctors endorsing a particular brand of cigarettes or reassuring us that smoking soothed the throat and wasn’t harmful to our health. 

Céline Gounder: In his research, after John explained to a group of study participants how cigarette companies used fake experts to sell their products, he showed the same group a piece of climate denial misinformation that also used fake experts.

John Cook: Climate misinformation, which is most effective with political conservatives, was completely neutralized with conservatives as well as with political liberals. So across the political spectrum, that misinformation was neutralized. So if you explain to people the techniques that can be used to trick them, those techniques are no longer effective on them.

Céline Gounder: This is a powerful technique because it’s effective across the political spectrum, and for any kind of misinformation. But it requires critical thinking skills. John says a lot of people are never taught in school how to assess the merits of an argument. So John and others developed a game to help students and adults learn these skills. They called it Cranky Uncle. It just came out in December for iOS and Android. 

John Cook: The main character in the game, he’s a cartoon cranky uncle, and he basically mentors you on how to become a science-denying cranky uncle yourself.

Céline Gounder: As the character teaches you his tricks, you’re then asked to spot the logical fallacies in different examples. 

John Cook: So we test them on vaccination and evolution and flat-earthism and climate change, and just general everyday fallacious arguments. And their ability to spot those fallacies improves across all the different issues.

Céline Gounder: For the average listener, what would be maybe a cheat sheet of what they should be aware of so that they’re not fooled by misinformation or conspiracy theories when they encounter them?

John Cook: There’s really just five main techniques that you need to be aware of: fake experts, logical fallacies, impossible expectations, cherry picking, and conspiracy theories. 

Céline Gounder: The first one John already mentioned: fake experts.

John Cook: A good example in the COVID realm is the, the Great Barrington Declaration that was a petition of mostly non-experts and just a handful of actual experts, but they were, they were really the outliers in the medical community. 

Céline Gounder: Next, logical fallacies.

John Cook: Through the course of this year, the logical fallacy I seemed to see the most often was false equivalence.

Céline Gounder: Think about the debate this past summer over whether or not to re-open schools in the United States. 

John Cook: The argument was, well, Germany and France are opening their schools, so why shouldn’t we? But the thing was, at that time Germany and France had crushed the curve and their COVID cases were down to very low levels, while U.S. levels were extremely high. Which meant that you really couldn’t compare the two.

Céline Gounder: Another one that’s very common with vaccines is impossible expectations. 

John Cook: We see people arguing that the vaccine should be 100% safe before, before we use it. But nothing is ever 100% safe. It’s an impossible expectation. And the risks that are there with vaccines are being mitigated and managed.

Céline Gounder: John says many of the extreme examples cited with vaccines are a form of cherry-picking — highlighting an outlier case to sow doubt. This is often used to attribute a serious ailment, like paralysis, to a vaccine.

John Cook: And these kinds of stories can really tug at the heartstrings. They are, first, anecdotal. And secondly, they can commit the fallacy of confusing correlation with causation: thinking that just because two things happen close to each other, one must have caused the other. 

The last thing is conspiracy theories, and conspiracy theories are always associated with science denial.

And there’s the 5G conspiracy theory — that the COVID-19 pandemic was caused by 5G, just because they both happened at the same time: 5G came in 2019, and COVID-19 also came in 2019. That logic is like arguing that Baby Yoda also came in 2019, so maybe he created COVID-19 as well. Just because two things happen at the same time, again, doesn’t mean that one caused the other.

Céline Gounder: These are important things for any citizen in a democracy to be able to identify, but inoculating people against misinformation isn’t enough. Science communicators need to change how they get their message out.

John Cook: If I had to sum up all the psychological research into debunking in six words, it would be: fight sticky myths with stickier facts. Myths are sticky. They grab people’s attention and stay in their memories because they’re simple, they evoke an emotional response using concrete examples or anecdotes, and they’re often in the form of a story. Science communicators, or anyone trying to explain the facts to people, should be looking at these traits, not the deceptive ones, but just the basic traits that make messages stickier, and applying that to their facts.

Céline Gounder: Claire Wardle agrees. 

Claire Wardle: Bad actors are really good at flooding the zone and creating all types of different memes, gifs, videos. Very, very, very visual. Those of us on the quality information side are really good at creating 87-page PDFs. 

We need to get much better at creating really emotional, engaging content.

Céline Gounder: Claire says the other missing piece is a community-driven narrative. 

Claire Wardle: And I think those kinds of misinformation experts who like to think that if we just push out the right messages on Facebook, we’ll be okay, completely fail to understand that, in terms of countering rumors and misinformation, it has to be at the community level. And it has to be about working with the community to think about the most effective messages, the most effective formats.  

Céline Gounder: What does that kind of community engagement look like more concretely? 

Claire Wardle: So it’s about connecting with community organizations. It’s connecting with faith leaders. It’s thinking about finding ways to bring the community into feeling that they’re part of the response.

Céline Gounder: This is hard work. It requires a lot of patience and empathy. And right now, those are both in short supply.

Claire Wardle: The failure to listen to each other means that we’re failing to understand why people are susceptible to these rumors and these conspiracies.

This is about co-creating messages with different communities that are culturally relevant and specific and empathetic. And I think in a country like the U.S., you know, understanding how these conversations and messages have to be shaped with the communities themselves is critical. At the moment, the conversations are pretty patronizing, patriarchal — this belief that audiences are passive. I find it deeply troubling. And until we change that mindset, I don’t think we’re going to get very far. 

 

CREDITS

 

EPIDEMIC is brought to you by Just Human Productions. We’re funded in part by listeners like you. We’re powered and distributed by Simplecast. 

Today’s episode was produced by Zach Dyer and me. Our music is by the Blue Dot Sessions. Our interns are Annabel Chen and Bryan Chen.

If you enjoy the show, please tell a friend about it today. And if you haven’t already done so, leave us a review on Apple Podcasts. It helps more people find out about the show!

Follow EPIDEMIC on Twitter and Just Human Productions on Instagram to learn more about the characters and big ideas you hear on the podcast.

We love providing this and our other podcasts to the public for free… but producing a podcast costs money… and we’ve got to pay our staff! So please make a donation to help us keep this going. Just Human Productions is a 501(c)(3) non-profit organization, so your donations to support our podcasts are tax-deductible. Go to EPIDEMIC.fm to make a donation. That’s EPIDEMIC.fm.

And if you like the storytelling you hear on EPIDEMIC, check out our sister podcast, AMERICAN DIAGNOSIS. On AMERICAN DIAGNOSIS, we cover some of the biggest public health challenges affecting the nation today. Past seasons covered topics like youth and mental health; the opioid overdose crisis; and gun violence in America.

 

I’m Dr. Celine Gounder. Thanks for listening to EPIDEMIC.

Guests
Claire Wardle
John Cook
Renee DiResta
Host
Dr. Celine Gounder