Claire Wardle (Ep. 7)
BY Future of StoryTelling — May 13, 2020

Misinformation expert and First Draft cofounder Claire Wardle discusses how misinformation spreads, why it's so powerful, and how journalists can leverage storytelling techniques to turn the tide.

Available wherever you listen to your podcasts:

Apple Podcasts  |  Spotify  |  Google Podcasts  |  Stitcher  |  iHeartRadio 

Charlie Melcher: Hi, I'm Charlie Melcher. Welcome back to the Future of Storytelling podcast. I wanted to start today by saying how thrilled we are to have PBS back as our partner. They are in the process of capturing this amazing multimedia portrait of America, created for and by us—all of us—who are here in the country at this critical time. It's called PBS American Portrait, and it's a storytelling project that invites you to participate in a national conversation about what it means to be an American today. I'm proud to be a part of it. I just posted my own story on their website, and I encourage you to do the same by visiting

My guest today is Claire Wardle, a leader on the front lines of the battle to combat misinformation on the social web. As the co-founder and US director of First Draft News, one of the world's leading nonprofits addressing the threat of mis- and disinformation, she has worked tirelessly to train journalists in the best practices of information verification, to conduct cutting-edge research into the information landscape, and to help newsrooms work together to identify and debunk misinformation. In the midst of the COVID-19 crisis, with social media usage at an all-time high and dangerous conspiracy theories spreading like wildfire, her work is more crucial than ever before. I'm very pleased to welcome Claire Wardle to the FoST podcast. Claire, welcome to the podcast. It's such a delight to have you here with me today.

Claire Wardle: Thanks for inviting me.

Charlie Melcher: So, you spend every day on the front lines of tackling misinformation. I'd really like to know what drives you and why is combating misinformation so important?

Claire Wardle: So, my background is I'm an academic, and I'd done some research back in 2007 about user-generated content: people were increasingly taking photos of things that they were seeing and emailing them to organizations like the BBC. And I was really interested in what motivated people to do that and how newsrooms coped with this influx of information that they couldn't really trust because they hadn't filmed it themselves. So, I am British and we had these terrible bombings, the 7/7 bombings in 2005, and it was the first time that the BBC led with footage that had been filmed on somebody's camera phone inside the tube. And I thought, there's a shift coming.

So, I started being really interested in how do you verify material online? And it was a very niche freakish subject, but then as you can imagine over the last five years, it's become the subject that lots of people have cared about. And the reason that this drives me is that I've watched this space for the last 15 years, go from very occasionally you might get a hoaxster trying to push false information, to now we're swimming in these rivers, these very polluted information streams. And I just keep going on and saying, "We can't live like this."

Charlie Melcher: So, how do you and your team at First Draft go about confronting this problem?

Claire Wardle: So, we're very fortunate, we're funded to have a number of staff. We have about 17 journalists around the world who wake up every day and basically use keywords to search for information on a number of different platforms, whether that's Facebook, Instagram, Reddit, YouTube, and they are monitoring, essentially, the kind of content that's being shared to understand what aspects of that content might be misconstrued, what might be causing people's behavior to be somewhat harmful. And we make decisions about how do we work with journalists to help audiences understand these rumors and falsehoods? So, we're kind of the first alert system to try and see some of this content before it causes harm.

Charlie Melcher: Claire, can you describe the difference between misinformation and disinformation?

Claire Wardle: Disinformation is false information that the people creating it and sharing it know it's false and they're sharing it to cause harm. That is distinct from misinformation, which is also false, but the people sharing it don't realize it's false. So, this could be my mom resharing something on Facebook, she doesn't know it's false and she certainly doesn't want to cause any harm, but she's been confused. And if we want to think about how to slow down misinformation, that's different to the kind of techniques we might use to stop disinformation.

And I actually also use a third category called malinformation, which is genuine information that's shared to cause harm. So, for example, revenge porn or the leaking of certain emails, where taking them from private to public is done to cause harm. Of course, if it's a whistleblower who's not trying to cause harm, it's in the public interest. But if we don't think about these terms, then we can't actually think about interventions and solutions.

Charlie Melcher: But from the earliest days of the written word, there's always been misleading information, people putting out propaganda. What makes it so much worse or toxic today?

Claire Wardle: So, yes, it certainly isn't new. And as humans we've always gossiped, it's a way that we connect with each other. We've always had white lies, as you say, we've always had propaganda. What we've never had is an ability for anyone to be a publisher in real time. And when they publish, the distribution mechanism means that it speeds across the globe in seconds, and we've never had a distribution platform that worked at speed. So, the content itself isn't that different, we use the same techniques that everybody's always used, but it's the speed. It's the fact that all of us, unfortunately, are nodes sharing and pushing this content out. If none of us shared anything, we wouldn't have a problem.

Charlie Melcher: We're in an age now where technology is enabling us to have these incredibly convincing deep fakes, whether that's video or audio. If we had trouble discerning between things that were true and things that were intentionally misleading before, aren't we about to enter into an era that's going to be all that much more difficult, when we get these really convincing deep fakes?

Claire Wardle: Deep fakes are definitely something that we need to be nervous about and I would say that I don't think that they will play a role in the 2020 election. I would argue by 2022, the technology will be so good that it will be very difficult for anybody to verify it because the AI systems will become so sophisticated. The glimmer of hope is that there's a lot of very, very smart computer scientists who are working very hard on detection technology, so that we would be able to tell the difference.

The other thing about deep fakes is that they tend to be used against politicians and famous people. So, it's easier to say, "Well, did Donald Trump ever really say that?" But the other really terrifying thing about deep fakes is a lot of this is about women whose faces or bodies get put into porn films. So deep fakes are certainly a technology we need to be really concerned about. However, I do worry a little bit that we've had a lot of sensational reporting around deep fakes, like headlines that scream, "We'll never be able to trust anything again." And I would argue that that in itself is a problem because we need to be pretty responsible to say, "Yes, it's a concern, but there are very smart people working on it and hopefully this will be a kind of an arms race where there's a detente, when they cancel each other out."

Charlie Melcher: So, I'm somebody who grew up believing that the worldwide web was going to bring us together. It was going to be this incredible tool for helping to connect the global village and remind us all of our shared humanity. But it seems that the ability of the internet to share disinformation is actually doing the opposite right now. What did I miss? What did the rest of us tech utopians not understand?

Claire Wardle: That's a difficult question to answer because I was in your camp as well and I really hoped that the good information would outweigh any bad information, but I think what we didn't recognize is that the internet was always only going to reflect human nature. And in the same way as there are incredible people doing wonderful work every single day, we also have people, for many different reasons, who commit crimes and people recognized that like everything in society, they could make money from it, they could harm somebody's reputation, they could just see what they could get away with.

Charlie Melcher: Is there something about the medium that makes it more difficult to understand what's true versus what's false?

Claire Wardle: So, as humans, we're always looking for heuristics, these mental shortcuts that help us understand and make sense of the world. So, if I'm having a conversation with a group of people on the street, there are visual cues about who they are. It might be that I'm looking to see how they're dressed or what words they're using or how eloquent they are. And the same is true online, but unfortunately in the online space, everything looks the same. For example, if I'm reading the New York Times, it looks very professional versus kind of a ... It used to be the case of a not very professional blog, but now on my Facebook feed, everything looks similarly professional. So, our brains are overloaded, desperately seeking these credibility heuristics, and there are very few of them to help us make sense of that information we're seeing.

Charlie Melcher: What kind of misinformation are you seeing right now related to the COVID-19 pandemic?

Claire Wardle: Around the beginning of March we really saw the growth, firstly of worried text messages about lockdowns and that the army were going to be coming out, and there was going to be no food in the shops. And so, we saw that initial wave and many of them were on WhatsApp and text messaging, which was kind of surprising for Western Europe and the US, although we have seen that kind of behavior in places like India and Brazil. And then the next couple of weeks it was all about, well if you gargle with salt water, if you hold your breath for 10 seconds, I mean, I think we all saw it, different versions of it, but like 10 things you need to know about coronavirus. And then the last two to three weeks we've seen a really worrying trend of these conspiracy theories, whether it's people suggesting that 5G phone masts are causing coronavirus, whether it's people saying that this was created in a lab by Bill Gates because he wants to vaccinate everybody, wants to make it mandatory and then wants to microchip us.

We're seeing these kinds of conspiracies, which are, I would say, pretty surprising, because we're seeing them move into ... We've seen celebrities and influencers talking about them. We've certainly seen the president of the United States talking about some of these ideas that previously we would not have thought to be mainstream at all. So, it has been a wild journey. Unfortunately, we're seeing increasing amounts of information that I think might cause real world harm.

Charlie Melcher: These kinds of misinformation really do have profound or in some cases, life or death implications. Can you give an example of that?

Claire Wardle: This weekend the president was talking about drinking bleach. Well, we've seen forms of that in conspiracy communities for a while. So, for example, there are Facebook groups that tell mothers whose children are autistic to feed them bleach. I mean, we do see Facebook groups and communities that really recommend behaviors that are exceptionally dangerous.

And so, right now we saw a spike over the weekend of people going to emergency rooms because they had ingested bleach and poisonous liquids. I mean, that in itself is really harmful behavior. We're seeing many, many people now believing that this is a hoax, that the lockdown is designed by the Left to keep people at home, so that then we'll have to have a mail-in election in November. So, we're already seeing, in April, kind of the planting of narratives around the safety of the election and trying to dismiss the integrity of the election. So, again, all of these things are interlinked in really interesting ways.

Charlie Melcher: So, who do you think bears responsibility for testing the veracity of these things or helping to clear them out of our social media feeds? Is it the companies or is it the individuals?

Claire Wardle: None of this is easy, but nobody, I would say, singularly bears the responsibility on this. I think the platforms have to be more transparent about their decisions and what they're taking down. Governments need to put more pressure on the platforms to say, "Release the data so we can study what's happening and what's not." But also as a public we have to say, "We're part of this mess." And actually we need to hold each other to account, and then if your crazy uncle in the family email group keeps forwarding conspiracies, you need to sit your uncle down and talk about what this kind of pollution means for our public sphere.

Charlie Melcher: And I think people are just not aware. I mean, if it's something that's reaffirming your filter bubble, then you don't think that's a conspiracy theory, you think that's something everyone needs to see.

Claire Wardle: So, when you've kind of got a scandal-ridden White House or you've got, in my country, we've just gone through Brexit and there was all sorts of hoo-ha about that. And so, in those environments ... We're still reeling from the 2008 financial crisis, there's a lot of reasons why people have mistrust in institutions. And so, then when that's your foundation, you layer on the kind of conspiracies and falsehoods we're seeing now and people are like, "I told you so. I told you you couldn't trust those guys in DC." And it fits into all of that, and so we can't understand the information in isolation from that kind of socioeconomic foundation that's actually feeding so many of these conspiracies.

Charlie Melcher: There's such a challenge between the people who are trying to deal in facts and the people who are free to deal in fictions or stories that are maybe based on some facts, but really they have complete reign to make stuff up. Can you talk about that challenge of the newsroom versus the person who's actively putting out the misinformation?

Claire Wardle: It's definitely an asymmetrical fight. So, I often do trainings for newsrooms or large organizations like the WHO or CDC and say, "Listen, this is how anti-vaxxers push their messages. Very engaging visuals, lots of emotion, bite-sized, compelling, shareable pieces of content. And this is what we create, either 800-word fact checks with no images, or an 87-page PDF with a picture of a dripping needle on the front cover." I mean, that's how the facts-based organizations work, we love text. We like a lot of text, we love complexity, and all the psychological literature shows that actually we need to be much more simple and we need to be much more visual. I wish that we could put designers in every newsroom, in every health authority and say ... Look at the flatten-the-curve graphic. The flatten-the-curve graphic is probably the most effective description of how we had to respond to COVID-19, and that was a very simple diagram. We just need to get much, much better at thinking about visual ways, visual storytelling essentially.

Charlie Melcher: Visual and also emotional, right?

Claire Wardle: Yeah. And the emotional part makes newsrooms feel really uncomfortable. Newsrooms like distancing language and we are the gatekeepers and we're telling you what to believe. And that doesn't work in an environment where people want more authentic storytelling and they want more emotional storytelling, that's what people engage around, and that's what drives the algorithms of all the platforms. And so, we need to find a way that newsrooms can understand the use of emotion, but to do so in a way that's not sensational, to do so in a way that doesn't break ethical norms, but we have to find a way to loosen up that, I would argue, news reporting in a way that doesn't do damage to the facts, but otherwise this is an asymmetrical war and the other side is definitely winning.

Charlie Melcher: It makes me think that there is a tremendous role for better storytellers when it comes to dissemination of news and trying to get people to trust or believe in fact-based information. Do you think that the business model is one of the things that's led to people's trust in news organizations being undermined or lessened? Their understanding that they are ultimately owned by these big media conglomerates, many of them, that they're out there to make a profit, does that sort of taint the way people think about their journalism?

Claire Wardle: I mean, it's definitely a struggle when you look around the American news ecosystem to see how few corporations own so many outlets. And we also have to recognize that the way that headlines are written or the way that 24 hour TV channels drive certain narratives, it's because actually, again, we're human, we're emotional and so what do we click on? What do we yearn for? It used to be that you would get your newspaper delivered and you would read all the way through. You might start at the cartoons, but you'd kind of be flicking through the Syria coverage as you went through.

Now, when you can put into Google whatever you want to search for, and particularly YouTube, you find the thing you love and the algorithm just keeps serving up what you want more of and you could argue the same is true for a kind of a CNN or some of these other outlets, that they have the metrics, they know what works. And so, unfortunately it's easier because people need money, to carry on giving people more of what they want, but as humans is what we want always the best thing for us?

Charlie Melcher: So, what do you recommend to individual people to help them solve the problem of disinformation?

Claire Wardle: We have to become really good at developing emotional skepticism. So, if you see a piece of information that either makes you really angry, or really sad, or really scared, or makes you immediately want to buy something to cure something, that means that you have had a visceral reaction to information and that's when you should say, "Okay, take a breath." All of the research shows that if we built friction into the system, so for example, if I couldn't retweet something on Twitter until I had read the article or even I couldn't retweet for two minutes, we would be in a lot better shape than we are.

Two months ago we were being told not to wear masks. Two months ago we were told if you were under 60 you'd be fine. As things are changing, people are getting angry that that information is changing. The best thing, I would say, most people can do right now is not to share unless it's from a really reputable organization. And even then there needs to be a caveat which is like, "This is what the New York Times are saying today, but it might be different next week." And that feels very uncomfortable when it feels like facts are on quicksand, but we have to be realistic. We've never had a virus that seems to be changing and our knowledge about it changes almost on a weekly basis.

Charlie Melcher: Can we talk a little bit about the disinformation campaign around Bill Gates and vaccines? Can you use that as a case study, maybe, and help us understand how it's spread and then how you could help counteract that narrative?

Claire Wardle: So, most famously he gave this TED Talk in 2015 where he warned of a pandemic and said we wouldn't be able to cope. So, you've got this footage of Bill Gates wandering around a stage with a huge virus behind him. So, exhibit A, you've got that working. You then have got the fact that he's worked very closely with the WHO for years. He's also a well-known face, so it works globally. There's already a lot of anti-Western feeling in many countries that believe that the West is coming in to force mandatory vaccination. So, in countries like Pakistan, for example, there's this sense that this is the West, this is about population control of nonwhite people.

So, again, you've got this foundation to build off. Bill Gates is now saying he's going to build seven labs to try to create seven different vaccines. So, he's very actively involved in something, and then you can argue that he's doing this to make money. I would argue that we need to be more honest about the fact these conspiracies are spreading, we need to explain why conspiracies take hold, and we need to do a better job of explaining what is actually happening, because it's these information vacuums that allow these to breed. So, absence of transparency is also what causes this. We just need to be better at the way that we talk about these things as opposed to pretending that they're not happening.

Charlie Melcher: It always makes me think, though, that we're on the defensive. The way you described that is, we have to explain why this is wrong, we have to try to use facts, we have to debunk. And then, in a sense, you're already on your back foot. I mean, do we fight by defending or do we fight by creating wonderfully powerful stories and narratives that can get across the point that we want to make?

Claire Wardle: Oh, we need wonderful narratives. The problem right now with coronavirus is, because there isn't a story of how this started, it's allowing these conspiracies to breed. I mean, if we allow 12 to 18 months of a drip, drip, drip of conspiracies around Bill Gates and vaccines, we could have a situation next year where we only have 35% of the global population who decide to take it and then we're in trouble. So, this is actually critical. And rather than saying, "Oh, those crazy conspiracies." We need to be like, "Hang on, these messages are really being shared widely. How do we start now to create a pro-vaccine narrative, that means that when the vaccine is available, people trust it?" And I'd say that's absolutely critical.

Charlie Melcher: Do you ever get depressed, feel like we're overwhelmed and it's not going to go ultimately in the right direction?

Claire Wardle: Sometimes it's hard, I'm not going to lie, it's hard looking at this stuff 14 hours a day. But I think we have to recognize that this is going to be a long-term cultural change, and we've seen those changes happen in societies. One of those is, for example, drink driving. Thirty years ago, if I was at a party with you, Charlie, and you were drunk at the end of the night, I'd be like, "Ooh, I hope you got home safely." And maybe the next day I would call and hope you weren't dead. But now if I saw you walk out of the house, I would have to take the keys away from you, because society would shame me for not stopping you. And that has taken 30 years to come to pass, but it was a societal shift.

And I think the way that we interact with information as a society will be similar. This stuff matters and I don't think we are doing enough to teach one another how to talk to one another about this. We say, "Oh, we need more media literacy training." We actually need training in conflict resolution and how do you have difficult conversations with other people who believe different things? And that skillset I don't think is really being developed.

Charlie Melcher: What gives you hope?

Claire Wardle: It seems like an odd answer, but I've been tackling this problem for a long, long time now and I would argue that through the last three months there has been a recognition that speech matters, that speech can lead to harmful behaviors and that we need to take it seriously. And I've certainly had people at dinner parties being like, "Ugh, Claire, whatever. It's just hoaxes. Why are you studying this out of everything you could study?" And I think, I hope that people are now starting to take it seriously. We're seeing the platforms step up. We're seeing much more of a conversation about the harmful effects of misinformation. So, I'm hopeful that this might be a bit of a turning point, that as a society we start having the conversations with each other about what kind of information environment do we want? And where's the line we want to draw between absolute freedom of expression versus understanding that some sorts of speech might cause harm?

I think the public needs to be part of those conversations. These shouldn't be conversations with white men in Silicon Valley. This needs to be a global conversation around speech and the harmful effects of speech. So, I hope after this horrific time passes that there's also time for some conversations about the role of speech and information and the kind of societies that we want.

Charlie Melcher: I think that's a lovely place for us to end today's discussion. So, thank you so very much for participating. It's really a pleasure to get to chat with you and I learned a tremendous amount, so thank you. Thank you, Claire.

Claire Wardle: Thanks for having me.

Charlie Melcher: Thank you for joining us today and a special thanks to PBS American Portrait for their generous support, and to Claire Wardle for her enlightening conversation. If you enjoyed listening and would like to hear more, we'd appreciate if you'd subscribe to and rate our podcast. In addition, if you know anyone else who might enjoy our show, please be sure to pass it along and a big thank you to our production partner Charts and Leisure. We'll see you next week for another conversation, until then, please be safe, be strong and story on. For more information about Future of Storytelling or to subscribe to our newsletter, FoST in Thought, visit us at