S1E74: Techno-Racism & COVID at Home and Abroad / Mútale Nkonde, Corin Faife, Heidi Larson, Imran Ahmed

“They benefit from traffic no matter if it’s good information or malignant misinformation.” -Imran Ahmed

During the pandemic, disinformation campaigns have targeted people of color with lies, like claims that African Americans can’t get COVID, or denials that the pandemic is even real. In this episode, we’re going to hear more about how these disinformation networks are gaming social media algorithms. We’ll hear how the United States has become a hub for disinformation exported around the world, and what legislators need to do to tackle bad actors.

This podcast was created by Just Human Productions. We’re powered and distributed by Simplecast. We’re supported, in part, by listeners like you.

EPIDEMIC S1E74

Imran Ahmed: They benefit from traffic, no matter if it’s good information or malignant misinformation.

Heidi Larson: Stories are a really powerful means of communication… and this is where the gap between science and the public, I think, is the biggest.

Céline Gounder: You’re listening to EPIDEMIC, the podcast about the science, public health, and social impacts of the coronavirus pandemic. I’m your host, Dr. Céline Gounder.

Disinformation has been a problem since the early days of the internet, but it reached a fever pitch during the 2016 presidential election. Robert Mueller’s investigation into election interference found some surprising results about where that disinformation was aimed.

Mútale Nkonde: One of the major findings that came out of the report by special counsel Robert Mueller into election interference was that African Americans were the most targeted by disinformation in the 2016 election.

Céline Gounder: This is Mútale Nkonde. She’s the founder of AI for the People. It’s a non-profit organization whose research focuses on how technology impacts the lives of Black people… from voter turnout to vaccine confidence.

Mútale Nkonde: And what was happening, according to the report, was Russian agents were creating false online communities.

Céline Gounder: The goal of these disinformation campaigns in 2016 was to depress the Black vote in swing states. The campaigns would focus on an accepted truth… and then springboard into something controversial. 

Mútale Nkonde: In this case, that there is systemic racism in the United States, that it manifests itself through economics, so there is a need for reparations. And then they’ll insert a massive lie. So the massive lie, in this case, was: unless reparations are promised day one, Black Americans shouldn’t vote.

Céline Gounder: One effort asked Black people to do something called voting “down ballot.” This means skipping the presidential race at the top of the ballot, either leaving it blank or writing in a hashtag, and voting only in the races further down. Down-ballot voting makes it difficult for election officials to determine the voter’s choice for president, and those votes are tossed out. In 2016, this push to vote down ballot played out in the city of Detroit.

Mútale Nkonde: And what happened in 2016 was 70,000 people in this, in this place, didn’t vote at the top of the ballot. They voted down ballot, and that swung the state from Obama in 2012 to Trump in 2016.

Céline Gounder: In 2017, many of these suspicious accounts arguing for reparations went quiet… but not for long.

Mútale Nkonde: And all of a sudden we see, um, a hashtag pop up, which, please forgive me if I don’t name because I don’t want to amplify it, but it was by a group that at that point were really interested in foreigners not being in the US.

Céline Gounder: Mútale was surprised by where this hashtag started to appear.

Mútale Nkonde: The folks that were using this hashtag, who were Black people online, it seemed, would always really side with white supremacists and conspiracy theorists, and we thought that was super, super interesting.

Céline Gounder: Mútale says this hashtag started to be associated with posts claiming Black people were immune to COVID. Once it became clear that people of color were disproportionately impacted by COVID, the hashtag started to deny the pandemic altogether. It was the perfect storm for disinformation networks to swoop in and target groups for their campaigns. Mútale describes it as an information fog.

Mútale Nkonde: And it just puts you into a situation where you log onto your Twitter account and you literally can’t see, you don’t know what’s real. And the idea of waking up and walking out into a fog felt really similar: you just can’t really see anything, and it’s not really your fault, but you somehow have to, you know, kind of make your way in the world anyway.

Céline Gounder: In this episode, we’re going to hear more about how disinformation networks are gaming social media algorithms…

Imran Ahmed: What they do is they find existing fissures in our society to cause a mini earthquake.

Céline Gounder: How the United States has become a hub for disinformation exported around the world…

Heidi Larson: Colleagues running the infodemic work out of the WHO Africa office have said that a lot of what they’re seeing has been imported from the U.S.

Céline Gounder: And what legislators need to do to tackle bad actors…

Imran Ahmed: So it’s going to take all of us. Government, civil society, individuals, but most of all, the tech companies. They are addicted to the profits that come from misinformation and hate.

Céline Gounder: Today on EPIDEMIC, how social media algorithms and disinformation campaigns target vulnerable populations. Mútale says African Americans and other people of color may be at higher risk of exposure to misinformation. One reason, she says, is historical.

Mútale Nkonde: One of the things that you’ll find is that there is a lack of trust of the news media because of historic racial bias and wrongful framing of Black people and the Black experience. And so they were creating alternative news systems online.

Céline Gounder: So for many, social media became a place where they could find news and commentary on things that spoke to their lived experience. And social media is no small front in the disinformation war.

Mútale Nkonde: Yeah, there’s actually some really great data that came out of the Pew Research Center in 2019 that found that 55% of all American adults get their primary news from social media, which was really scary to us, because on social media, the news that comes to our news feeds is curated by algorithms.

Céline Gounder: Once the algorithm learns what you engage with, it will show you more and more information similar to that…

Mútale Nkonde: Those algorithms are charged with identifying patterns in our engagement behaviors. So what do you like? What do you retweet? What do you comment on? And then serving you more news.
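
As a rough illustration of the kind of curation Mútale describes, here is a minimal Python sketch of engagement-based ranking. The signals and weights are hypothetical; no platform’s actual ranking code is public. The point is structural: nothing in the scoring checks whether a post is true.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    reshares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: reshares and comments are treated as
    # stronger engagement signals than likes. Accuracy is not a factor.
    return post.likes + 2 * post.reshares + 3 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Serve the most-engaged-with content first, true or not.
    return sorted(posts, key=engagement_score, reverse=True)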

Céline Gounder: The people who are learning to game the algorithms have an agenda…

Mútale Nkonde: They were propagating disinformation. Disinformation as a form of warfare.

Céline Gounder: And they can be very good at fooling people.

Mútale Nkonde: They have their own news sites that they reference stories from. So for journalists, it’s pretty hard to source.

Céline Gounder: It’s hard to even comprehend the sheer volume of disinformation they produce.

Mútale Nkonde: When we were talking about false accounts, the two that always stand out in my mind were ones that apparently tweeted 147,000 times in a day.

Céline Gounder: Well, and to put that number in context, you said 147,000 tweets in a day. That would be more than a tweet per second.

Mútale Nkonde: Yes.

Céline Gounder: [laughs] …Obviously, that’s not a real person. 

Mútale Nkonde: It’s not ten people! It’s not. But then there are also commercial services where we could make a system to tweet 20 times a second, and it would happen.
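
A quick back-of-the-envelope check on that rate, using only the numbers from the conversation above:

# 147,000 tweets in a day, checked against the seconds in a day.
TWEETS_PER_DAY = 147_000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

rate = TWEETS_PER_DAY / SECONDS_PER_DAY
print(f"{rate:.2f} tweets per second")  # ~1.70, more than one per second,
                                        # far beyond what any human can sustain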

Céline Gounder: What can make the problem even more complex is that social media ads are often targeted to certain groups. That means that bad actors can choose what demographics will see their content.

Corin Faife: So you could just use the ad credits to say, “Let’s spend a thousand dollars and show this to just a million people between the ages of 18 and 65,” or something like that. The Facebook algorithm will then have to make a whole lot of choices about who exactly those people are.

Céline Gounder: This is Corin Faife. He’s a data reporter at The Markup. Earlier this year, Corin published a report that had some surprising findings about who gets access to accurate information on social media. Facebook has a documented history of allowing racially discriminatory systems on its platform. In 2016, ProPublica published a report showing that advertisers could block certain racial groups from seeing housing advertisements. The resulting fallout led to a 2019 lawsuit by the Department of Housing and Urban Development accusing Facebook of violating the Fair Housing Act.

Corin Faife: Because of that, Facebook changed the targeting system to use something called ethnic or cultural interests.

Céline Gounder: But this change isn’t that different from the previous system.

Corin Faife: So you can’t say show this to a Black person, but you can say show this ad to someone interested in African American culture, or interested in Asian culture, or it might be a different proxy.
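
To make the proxy concrete, here is a hypothetical sketch of what such a targeting spec might look like. The field names are invented for illustration; this is not Facebook’s actual ads API.

# A hypothetical ad-targeting spec illustrating the proxy problem Corin
# describes: the advertiser never names a race, but an interest category
# can stand in for one.
ad_spec = {
    "budget_usd": 1_000,
    "audience_size": 1_000_000,
    "age_range": (18, 65),
    # No explicit racial category, but "interests" can act as a proxy:
    "interests": ["African American culture"],
}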

Céline Gounder: This means even after the lawsuit, Facebook continues to give advertisers the power to discriminate. And this divergence in who sees what ads on Facebook doesn’t end with housing. Earlier this year, Corin published a report examining who saw sponsored posts from public health agencies on Facebook.

Corin Faife: Only 3% of the Black people in our panel saw any of these sponsored posts from any kind of health agency. And that was compared to 6.6% of white panelists and 9.5% of Asian panelists, which is quite a big discrepancy.

Céline Gounder: This might sound illegal… but it isn’t.

Corin Faife: There isn’t really a legal standard where we say that public health information has to reach people, uh, equally and not discriminate by racial groups or any other kind of protected classes.

Céline Gounder: I can’t imagine that the CDC or a local public health department would intentionally want to target, I don’t know, only wealthy white people with their public health information, you know, COVID information. So how does this actually play out, that this advertising is missing some of the most vulnerable communities?

Corin Faife: We don’t really know from the view that we have with The Markup’s Citizen Browser. We don’t know that. One thing I could speculate on is that perhaps the CDC is just taking out ads that aren’t really targeted at all.

Céline Gounder: The findings from the report aren’t definitive. It is a relatively small sample size… but Corin stresses that this is evidence enough that some changes need to be made.

Corin Faife: At the end of the day, the point of this reporting is just to say, we really need more transparency, especially if Facebook is going to play such a big role in distributing public health information.

Céline Gounder: But that alone isn’t enough.

Corin Faife: There are two big pieces to the puzzle. On the one hand, there’s putting out accurate information and making sure the right people see it. But then the other part is actually misinformation and trying to keep a lid on it.

Céline Gounder: And just like a virus… if you allow disinformation to spread anywhere… it will… eventually… spread everywhere. We’ll hear more about that after the break.

***

Céline Gounder: If the algorithm is indeed making many of these decisions, it could lead to inequities in the distribution of information. And the disinformation that fills those gaps often starts in the United States.

Heidi Larson: I mean, we’ve seen it on our social media mapping, how much the U.S. is a generator of a lot of the things we’re seeing circulating globally.

Céline Gounder: This is Heidi Larson. She is a Professor of Anthropology at the London School of Hygiene and Tropical Medicine and the Director of the Vaccine Confidence Project. Heidi and her team spend a lot of time tracking the sources of disinformation online.

Heidi Larson: It’s a bit like a weather room. We watch where things move and how fast they move. We capture, for instance, a word or phrase in a piece of news or a rumor, and you can see where it goes around the world, and then who picks it up and who amplifies it.
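
A minimal sketch of the kind of phrase-tracking Heidi describes, in Python. The data structure and fields are hypothetical; real tracking systems ingest platform data at a far larger scale.

from collections import Counter
from dataclasses import dataclass

@dataclass
class SocialPost:
    text: str
    country: str
    day: str  # e.g. "2020-04-01"

def track_phrase(posts: list[SocialPost], phrase: str) -> Counter:
    # Count mentions of the phrase by (country, day) to see where a
    # rumor starts, who picks it up, and how fast it moves.
    hits = Counter()
    for post in posts:
        if phrase.lower() in post.text.lower():
            hits[(post.country, post.day)] += 1
    return hits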

Céline Gounder: When it comes to making disinformation that sticks, you have to know your audience.

Heidi Larson: Those seeds, those beginnings of a rumor come from far away. And then it gets embellished with local culture, local politics, local anxieties, local histories.

Céline Gounder: One example of this was a rumor started by an American doctor early in the pandemic that a malaria medication was an effective treatment for COVID.

Heidi Larson: She’s had a number of different, um, rumors out there, but the other one was that hydroxychloroquine was a cure for COVID.

Céline Gounder: This drug was famously endorsed by former President Trump.

Donald Trump: Many doctors think it is extremely successful, the hydroxychloroquine, coupled with the zinc and perhaps the azithromycin, but many doctors think it’s extremely good…

Céline Gounder: And that disinformation would spread like wildfire throughout Nigeria.

Heidi Larson: When you get the president of the U.S. saying, you know, just go for hydroxychloroquine, or finding these alternative cures… it just endorses at a different level, with a different kind of authority.

Céline Gounder: Information voids and lack of trust in health systems and government in both communities of color and the Global South provide fertile soil for disinformation to take root.

Heidi Larson: There’s not enough alternative credible information before you start getting to the potentially harmful mis- and disinformation.

Céline Gounder: A large part of the problem has been that science communicators haven’t cracked the code on how to engage people with the right information, and government agencies like the CDC don’t have substantial communications budgets.

Heidi Larson: I think, as a health community, we need to do a better job of getting creative narratives out there that are accurate, that are credible, that are appealing in the ways that some of these, um, alternative voices are.

Céline Gounder: Public health officials can only do so much when algorithms are working against them. Political leaders on both sides of the aisle are increasingly recognizing that misinformation and disinformation on social media are threats to public health. On March 25th, the U.S. House Subcommittee on Communications and Technology held a hearing on disinformation and social media. In attendance were the CEOs of Facebook, Twitter, and Google.

Rep. Mike Doyle: There are two faces to each of your platforms… Facebook has the family-and-friends neighborhood, but it is right next to the one where there is a white nationalist rally every day. YouTube is a place where people share quirky videos, but down the street, anti-vaxxers, COVID deniers, QAnon supporters, and flat-earthers are sharing videos.

Céline Gounder: At the end of the hearing, the CEOs made a surprising pledge.

Imran Ahmed: Well, the social media companies said that they would take action.

Céline Gounder: This is Imran Ahmed. He’s the CEO of the Center for Countering Digital Hate. Imran traces how patterns of hate and misinformation spring up online and how our data is being used to feed them.

Imran Ahmed: Quite often, the dynamic that allows the spread on social media, the algorithmic amplification of misinformation, has been the fact that we’ve engaged with bad information, whether we support it or we don’t support it.

Céline Gounder: A new report released by his organization offered some insight into how our social media feeds work. The report found that more than half of the posts recommended to users in the study contained misinformation about COVID, and one-fifth promoted inaccuracies about vaccines. One of the most prominent propagators of that disinformation on social media is Robert F. Kennedy, Jr., son of the late Robert Kennedy.

Imran Ahmed: RFK Jr. has said that African-American kids should not take vaccines because they have stronger reactions than white kids. He has said that African blood is different. He’s spoken scientific nonsense, and the end result of his misinformation, if it were believed by the African-American community, would be death.

Céline Gounder: The vast majority of this disinformation comes from a surprisingly small number of accounts, among them Robert F. Kennedy, Jr. Imran’s organization identified just twelve accounts that they say are responsible for more than 60% of all anti-vaccine content on social media; a report this spring dubbed them the “Disinformation Dozen.” Since its release, twelve state attorneys general and U.S. lawmakers from both parties have asked social media platforms to take down these accounts.

Imran Ahmed: However, the social media companies have, once again, shown their reluctance to take comprehensive action. And while there has been some deplatforming, and we’re really pleased about that, nevertheless most of those networks are still up. Most of the “Disinformation Dozen” are still on one or more of the platforms.

Céline Gounder: The solution to this seems simple. Why can’t the social media companies just change their algorithms? Imran says they have an incentive to keep things as they are.

Imran Ahmed: You have someone who seeks to make money, who writes an algorithm that says, okay, find the most engaging content, the most chewy. I don’t care if it’s true or not. I don’t care if it causes public harm, but I want to maximize the time spent on platform. If you can keep people scrolling for three more posts every day, you can show them one more ad. Let’s say one ad is 5 cents. Five cents across a billion people is actually $50 million. $50 million a day, over a year, that’s 20 billion dollars.
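
The arithmetic behind that estimate, sketched with the figures Imran quotes (the yearly total is rounded up in his telling):

# One extra 5-cent ad impression shown to a billion people.
CENTS_PER_AD = 5
USERS = 1_000_000_000

daily_usd = (CENTS_PER_AD / 100) * USERS  # $0.05 * 1e9 = $50 million/day
yearly_usd = daily_usd * 365              # ~$18.25 billion/year, which
                                          # Imran rounds to "$20 billion"
print(f"${daily_usd:,.0f} per day, ${yearly_usd:,.0f} per year")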

Céline Gounder: Social media has become a crucial part of how we communicate with one another. But the pandemic has made it ever more clear that the government needs to step in to stop the spread of disinformation online.

Imran Ahmed: Typically, we regulate to manage those negative externalities: there are costs imposed on society, so we create costs back, so we can find ways to either mitigate or dissuade people from creating those harms. They have managed to finesse their way through and say, “Oh, we can’t do anything about it.” It’s just not good enough. No other industry is held to such low standards.

Céline Gounder: In April, the Senate held another hearing where Facebook executive Monika Bickert falsely testified that Facebook had deplatformed ten of the “Disinformation Dozen.”

Imran Ahmed: We have a culture of impunity now, not just for misinformation, but for the social media companies, who are protected by Section 230 of the Communications Decency Act, which says they cannot be held accountable in law, either civilly or criminally, for any content on their platforms from another party. And they have really hidden behind that. They’ve gamed it in an aggressive way, saying, you know what? We will obfuscate, we’ll delay, we’ll lie where necessary. We’ll just get through the next day so we can continue to reap as many profits as possible, because they know, as do we all, that the time of impunity will come to an end.

Céline Gounder: Imran says that deplatforming bad actors is one of the most effective tools we’ve got to fight disinformation.

Imran Ahmed: We’ve been tracking the filings that they’ve been putting into courts, suing Facebook, suing YouTube over their deplatforming. And in those, they’re extremely explicit: it damages their ability to fundraise, it damages their ability to recruit, and it lessens the amount of information they get about the particular likes and dislikes of their audience. So we know deplatforming is incredibly effective in disrupting the malignant activity of these actors.

Céline Gounder: But already, others are taking steps to defang one of our best tools. Late last month, Florida passed a bill that would prohibit social media companies from deplatforming political candidates. The social media companies are pushing back… saying this violates their First Amendment rights as private businesses. We as voters have a crucial role to play in holding our elected representatives accountable to represent the public’s best interest.

Imran Ahmed: Write your senator and say: you need to be coming up with solutions on this, because I will reward anyone that can find a way to ensure the quality of the information that is consumed by all citizens. And we are highly interdependent on the information that others consume, so that we can build a coherent worldview that allows us to prevent things like violent extremism, QAnon, conspiracism, and anti-vax misbelief, which actually cause damage to the whole of society. So write your representative and make them aware that you care about this issue.

CREDITS  

EPIDEMIC is brought to you by Just Human Productions. We’re funded in part by listeners like you. We’re powered and distributed by Simplecast.

Today’s episode was produced by Temitayo Fagbenle, Zach Dyer, and me. Our music is by the Blue Dot Sessions. Our interns are Annabel Chen, Bryan Chen, and Sophie Varma.

If you enjoy the show, please tell a friend about it today. And if you haven’t already done so, leave us a review on Apple Podcasts. It helps more people find out about the show!

Follow EPIDEMIC on Twitter and Just Human Productions on Instagram to learn more about the characters and big ideas you hear on the podcast.

We love providing this and our other podcasts to the public for free… but producing a podcast costs money… and we’ve got to pay our staff! So please make a donation to help us keep this going. Just Human Productions is a 501(c)(3) non-profit organization, so your donations to support our podcasts are tax-deductible. Go to justhumanproductions.org/donate to make a donation. That’s justhumanproductions.org/donate.

And if you like the storytelling you hear on EPIDEMIC, check out our sister podcast, AMERICAN DIAGNOSIS. On AMERICAN DIAGNOSIS, we cover some of the biggest public health challenges affecting the nation today. Past seasons covered topics like youth and mental health; the opioid overdose crisis; and gun violence in America.

I’m Dr. Céline Gounder. Thanks for listening to EPIDEMIC.

END

Guests
Corin Faife
Heidi Larson
Imran Ahmed
Mútale Nkonde
Host
Dr. Céline Gounder