NPR: Fake News Is Scary. Here's How To Spot Misinformation

MILES PARKS, HOST:

This is your NPR LIFE KIT on information and, maybe more importantly, false information and how to see through it or avoid it. I'm Miles Parks with the NPR Politics team. Let's start with a story that shows just how high the stakes are when fake news spreads. Caitlin Dickerson is an immigration reporter for The New York Times. She covered this story. And it took place in Twin Falls, Idaho, before the term fake news was even in the national vocabulary.

CAITLIN DICKERSON: City council meetings are usually pretty low stakes. But at this meeting in the summer of 2016, during the public comment period where anybody can come to the microphone and just offer comments on what's happening around town, a series of people get up one after the other...

PARKS: All asking about the same thing. They say a horrible crime has been committed in their community.

DICKERSON: And they start to ask about a sexual assault case involving refugees, Muslim refugees.

PARKS: It's an appalling story. Basically, the people in Twin Falls are concerned about a 5-year-old who people say was assaulted by a group of Syrian refugees. But at the meeting...

DICKERSON: The city council members have no idea what they're talking about.

PARKS: There's lots of confusion. The local government officials say they'll look into it. But over the next few days and weeks, it goes from these sort of fringe, highly partisan blogs into the national media. It goes viral at a time of intense fears about refugee resettlement and Islamic terrorism. People from all over the country and even the world are accusing the government of a cover-up operation.

DICKERSON: Members of the local government - the mayor, the city council members, local judges, the county prosecutor - they were basically inundated for months on end with threats, violent threats, very visceral and descriptive threats from all over the world.

PARKS: The mayor's wife received a disturbing threat on her work voicemail. It was in a monotone, sort of slow and deliberate voice.

DICKERSON: He basically asked the mayor's wife, you know, what would it be like for you if you were raped and assaulted and nobody paid any attention to you? Very, very scary and unsettling stuff.

PARKS: All of this outrage, it was based around something that didn't even happen. It's a false set of facts. There were no Syrian refugees resettled in Twin Falls at the time. The alleged assault that took place, it involved three kids - a 5-year-old, a 7-year-old and a 10-year-old videotaping on another kid's phone. Disturbing, yes, but nothing like the sort of violence that was circulating on the Internet and in the minds of the people threatening the city officials.

The story tapped into people's worst fears, and it took over the city for months. Bad information - whether it's lies, propaganda or just ignorance - can have real-world impacts. If you want to be a member of a functioning democracy, you've got to avoid it. And we want to help you avoid taking it in and avoid sharing it.

(SOUNDBITE OF MUSIC)

PARKS: So as an immigration reporter, Caitlin runs into misinformation all the time. We'll get into why that is in just a bit. But before we get into the nitty-gritty, I wanted her to give her biggest overarching tip for avoiding wrong information - the thing you should bring with you every time you go online and every time you pick up something to read.

DICKERSON: I think the most important thing when you pick up a news story is to read it with skepticism. And I don't view that as a judgment of the reporter who wrote the story or of the outlet that published the story. I just think that's a smart way to read the news. It's the way that I've always done it. It's the way that I hope people do it, including when they're reading my stories.

PARKS: There's a fine line between that healthy skepticism and making sure it doesn't turn into cynicism about the truth. Caitlin isn't saying you can't trust anything. She's just saying don't follow anything blindly and unquestioningly.

DICKERSON: A really good sign for me, one that makes me feel comfortable, you know, with what I'm reading and believing what I'm reading is when a reporter shows their work. Not only do they tell me what it is they know, but they tell me how they know it. They tell me who told them.

PARKS: A lot of important stories use anonymous sources. That doesn't have to be a red flag all the time. People who are close to a story might just not be authorized to speak about it publicly. That doesn't mean what they're saying isn't true.

DICKERSON: I at least want to know what that person's job is, what's their connection to the story itself, and how do they know the information?

PARKS: So let's start there with your first tip. Wherever you're getting your information, be skeptical. Ask the source to prove their work. And in general, it's probably good to take in information from a bunch of different sources even if you have a favorite.

So I want to take a step back at this point and look at the bigger picture. Your second tip is a little bit of a bigger ask - understand the misinformation landscape. For help on this, I went to Carl Bergstrom and Jevin West. They teach an overwhelmingly popular class at the University of Washington. The topic is relevant, but it's actually probably the title that attracts the swaths of 18- to 22-year-olds.

JEVIN WEST: Are we allowed to use the term?

PARKS: For our public radio purposes, we'll call it calling BS.

WEST: Let's go with BS.

PARKS: Let's just go with BS to be safe.

WEST: OK.

CARL BERGSTROM: Yeah, totally.

PARKS: There's something inside me dying.

(LAUGHTER)

BERGSTROM: It's OK.

PARKS: Yeah, let's do BS to be safe.

They started planning the class long before the 2016 election that put information consumption front and center in America. After all, the idea of BS or bad information, it isn't new, explains Carl.

BERGSTROM: I'm an evolutionary biologist. And this, you know, notion of misinformation predates humans.

PARKS: Take the raven, for example. When a raven has food, it likes to hide some of it for later - also known as caching.

BERGSTROM: But it will look around first and see if anyone's watching it. And if other ravens are watching it, then it'll do what's called fake caching.

PARKS: The bird will actually fake like it's burying its food and then just move along.

BERGSTROM: So that when they try to steal it, it's gone off and hidden it somewhere else.

PARKS: So we humans aren't the only ones trying to deceive each other. But Jevin also says when it comes to bad information, there's a key difference between people who are outright lying versus people who are peddling BS.

WEST: Liars actually know the truth and they're just sort of moving you away from the truth. Whereas people that BS, you know, don't really care so much about the truth as much as wanting to impress and persuade you and to sort of grab your attention.

PARKS: Which is where, in the modern age, the Internet comes in. The 2016 presidential election was followed by this huge reckoning. The U.S. intelligence agencies agreed that Russia attempted to influence the results, which also isn't new for Russia. The country has engaged in information warfare, also sometimes known as active measures, for decades. What was new was that the American public was this giant target, all tied to a platform that was tailor-made to spread misinformation like wildfire - social media.

Carl says the reason social media is such an effective tool for spreading BS is that in some way it's in the best interests of the Facebooks, the Twitters and the YouTubes of the world to allow some level of this persuasive content, regardless of the truth. They're for-profit companies, after all.

WEST: They don't care about what you're seeing. They don't care if you're watching a video that explains how the earth is flat. They only care that you're on the platform. They just want your attention. I think they're sort of like the grandest of BS-ers (ph).

PARKS: Carl says the sites are basically conducting science experiments on our brains, figuring out what will be the most addictive.

BERGSTROM: The content that we're delivered has been curated and selected by a set of machine learning algorithms that are basically running large-scale experiments on all of the users of the platform to see what keeps people clicking, what keeps people on the site.
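To make that concrete, here is a toy sketch - not any platform's actual code - of the kind of experiment Carl is describing. It's an epsilon-greedy bandit, and everything in it is assumed for illustration: the post names, the click rates and the 100,000 simulated users are all invented.

```python
import random

# Toy model: each candidate post has a hidden probability that a user
# clicks on it. These click rates are invented for illustration; a real
# platform would be estimating them from live traffic.
TRUE_CLICK_RATES = {"calm_news": 0.02, "cute_dog": 0.05, "outrage_bait": 0.11}

def run_experiment(n_users=100_000, epsilon=0.1):
    """Epsilon-greedy bandit: mostly show whichever post has the best
    observed click rate so far, but keep exploring the alternatives."""
    shows = dict.fromkeys(TRUE_CLICK_RATES, 0)
    clicks = dict.fromkeys(TRUE_CLICK_RATES, 0)
    for _ in range(n_users):
        if random.random() < epsilon or min(shows.values()) == 0:
            post = random.choice(list(TRUE_CLICK_RATES))  # explore
        else:
            post = max(shows, key=lambda p: clicks[p] / shows[p])  # exploit
        shows[post] += 1
        if random.random() < TRUE_CLICK_RATES[post]:  # simulated user click
            clicks[post] += 1
    return shows, clicks

shows, clicks = run_experiment()
for post in sorted(shows, key=shows.get, reverse=True):
    print(f"{post:>13}: shown {shows[post]:6d} times, "
          f"observed click rate {clicks[post] / shows[post]:.3f}")
```

Nothing in that loop knows or cares whether a post is true. It only measures what keeps users clicking, so it tends to funnel most of the traffic to the outrage bait.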

PARKS: The big social media sites have improved their platforms since 2016 - deleting hundreds of millions of fake accounts, adding fact-checking widgets - but the fundamental problems remain. Their business model depends on engagement, not on any sort of obligation to the truth.

I'm wondering why the answer isn't just use social media less.

WEST: Well, for sure. I mean, that's something I advocate in my personal life and also even in the classroom. I wouldn't want to go back to a world without it. I think it provides a lot of good things for the world. But I think we are using it too much.

PARKS: So your second tip - understand the environment. Misinformation isn't new, but the platforms for engaging with it are. We'll get into some more tips about how you can use social media in smart ways. But maybe the first thing to think about is just maybe use it less.

OK. So misinformation isn't new. That also means it can be sort of predictable. Our third tip is to recognize the types of topics that draw a lot of false information, and be vigilant when you're consuming news or media about those topics. So I want to come back to something we touched on earlier. Caitlin covers immigration for The New York Times. She says she runs into misinformation all the time, whether it's online or from politicians or from advocacy groups.

DICKERSON: It's frustrating, you know. It's something that both Democratic and Republican politicians do. And it's something that totally predates the Trump administration, although it's probably reaching a peak.

PARKS: I asked her exactly why it is that immigration is such a magnet for bad info. It's clear she's thought about that question a lot, and she breaks it down into two key factors.

DICKERSON: One is that immigration is incredibly complicated. You know, lawyers who study it compare it to the tax code. People don't have time, you know. People have lives. They have jobs. They don't have time to familiarize themselves with the nuances of immigration law. And I don't begrudge that, but I think it becomes easy then for activists to capitalize on the lack of information and boil it down in ways that aren't fully accurate.

PARKS: When she mentions activists, she's talking about anyone who has a stake on any side of the issue. These are the people who want to persuade, and that means they can have a lot of motivation to want to bend the truth. This is broader than immigration, too. It's true for any topic with complicated policies. Voting, for instance, is something that's a little different in every single state, so it leaves open the door for a lot of misinformation.

DICKERSON: The other thing is that immigration is just this incredibly emotional issue. It relates to race. It relates to demographics. It relates to religion. It relates to people's feelings about the national identity as well as their personal identity. And so it's this emotional, hot-button issue that people get really excited about on every side of the political spectrum.

PARKS: And that emotional response should be a red flag. That's something I talked about with Peter Adams. He's the senior vice president of the News Literacy Project. He says that if you're having that emotional response, it actually means you should double-check whatever information you're getting, whether it's from a politician's stump speech or from reading your news feed. If it makes your blood boil, check it.

PETER ADAMS: If something is causing you to be fearful or outraged and to experience strong emotions like that, take a moment and maybe do a quick web search to see if it's been debunked or if anyone else is reporting it who is credible.

PARKS: The same goes for stories that are rapidly changing or developing. Peter says people who make money from clickbait or fake news stories don't need to follow the same ethical guidelines that media organizations do.

ADAMS: Propagandists and disinformation agents and trolls and just chaos mongers seek to sort of shoot the curiosity gap or get something out in the first 15 minutes before legitimate news outlets are even able to get information and verify it and get it out. You know, news outlets aspire to standards - right? - and they have to verify information before they share it. And that takes time. Sometimes it takes five or 10 minutes, which is pretty quick. But in that five or 10 minutes, other folks with bad intentions online can also push a lot of garbage.

PARKS: So tip No. 3 - recognize the times when bad info works best. Just ask yourself, is this a complicated subject? Is it something that's hitting on my emotional triggers, or is this a breaking news story that's going to change?

OK. So when you see a story about gun ownership or reproductive rights or immigration, you're going to be on the lookout. But what do you do when you sniff it out? How are you going to double-check what you're reading? Peter's group wanted to answer exactly that, so they're working on an app called Informable. If you're really well-read - or well-listened, in the case of podcasts - you might be thinking you can tune out at this point, that the people who fall for this stuff are sort of simpletons. Peter says not so fast.

ADAMS: We all have our blind spots. Especially if you consume a lot of information, it means you are often sort of info grazing. So the bigger the news junkie, the more instances you might have where you're quickly making a judgment waiting in line at the grocery store, you know, refreshing your timeline or whatever. And that actually puts you at a disadvantage if you're making decisions about sharing and liking things.

PARKS: The app focuses on a few categories that you can use to model your own thinking. I asked Peter to walk me through it. The first category is ad or not. This is basically discerning whether something you're reading or seeing is there because someone paid for you to see it. If someone paid, it's a biased source of information, and telling the difference is especially tough on social media, where organic posts from your family and friends get mixed with paid content from companies and even politicians.

ADAMS: We have, first off, a shot of a piece published on BuzzFeed. And it says at the top, you know, in pretty prominent all caps, PAID POST. And the headline is "10 Ways To Succeed In College According To People Who Went To College." And further down it lists Course Hero as the, quote, "brand publisher."

PARKS: Second is evidence or not. When a source makes a claim, you just need to question whether there's reliable information that proves their point. If someone is talking about population, for instance, and they cite census data, that's evidence. But as Peter's playing through this section in the app, he runs into a sketchy tweet.

ADAMS: It has a picture of some Baby Ruth candy bars and Airheads and Cheez-Its and cans of tuna next to a cardboard box. And the claim is these are the FEMA "meals" my brother received today, and meals is in quotes. And it has a hashtag, you know, #HurricaneMaria.

This may well be true, but it doesn't really provide evidence that this is, in fact, a FEMA meal. Anyone could take a picture of some Baby Ruth and other candy and Cheez-Its and say this is a FEMA meal. So that would be something that would warrant looking into further right before you sort of accept that as evidence.

PARKS: Next, he runs into a very cute Instagram post that shows two dogs embracing.

ADAMS: And it says, you know, this dog hugs every other dog he sees during his walk. It is just a still image of two dogs that appear to be in an embrace, but it is not evidence that that dog, in fact, does that.

PARKS: Now, this part was tough for me to swallow. I mean, I have to question every time I see a cute puppy post? Is nothing sacred anymore? I asked Peter about this after he showed me a meme of some kids goofing off in the back of an auditorium. The caption says, university students hard at work during a lecture. And one of the kids is sort of half asleep on, like, three different chairs. And another kid is watching "Game Of Thrones" on a laptop.

You know, there's some people who would say it's entertainment at that point.

ADAMS: Right.

PARKS: And if I'm laughing about what these two students are doing supposedly in the back of this auditorium during class, does it matter to me whether they actually did that or whether - or does it matter that I got a laugh out of it?

ADAMS: You know, I think it does because one thing you can't predict is how misinformation is perceived by different people, you know. If someone sees that and it affirms a belief that they already have or it strengthens a belief that they already have that, you know, kids today are lazy, that they're not learning anything, that college is a waste of time, it may reinforce those beliefs. So, you know, for you, it might be funny because you don't necessarily have that belief, but someone else who does, it may act as evidence for them.

PARKS: Yeah. Like, if I was against like public funding for universities or something like that, then it would be - have this completely different effect on me than the person who, like, originally shared it.

ADAMS: Exactly.

PARKS: When it comes to evidence, numbers come up a lot. The problem is that not all numbers are created equal. Here's Jevin from the University of Washington.

WEST: You can always remove one variable. You can always, you know, only include men instead of women. There's all sorts of ways to cut up numbers and tell stories. And that's the whole point that we want the public to be aware of, that when they see a number, to question, you know, where that number comes from.
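As a toy illustration of Jevin's point, here is how the same dataset can support two different-sounding headlines depending on which slice you report. All of the numbers below are invented.

```python
# Invented example scores for two groups. The numbers are made up purely
# to show how choosing a slice changes the story a statistic tells.
data = [
    ("men", 4), ("men", 5), ("men", 6),
    ("women", 8), ("women", 9), ("women", 10),
]

def mean(values):
    return sum(values) / len(values)

overall = mean([v for _, v in data])                          # 7.0
men_only = mean([v for group, v in data if group == "men"])   # 5.0

print(f"Headline A: 'Average score is {overall:.1f}'")
print(f"Headline B: 'Average score is only {men_only:.1f}' (men only, unstated)")
```

Both numbers are "true." Only the framing differs, which is exactly why it matters where a number comes from.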

PARKS: When Caitlin is covering immigration, for instance, a number she has to deal with a lot is the number of border crossings every month at the U.S.-Mexico border.

DICKERSON: For example, there were months over the summer when tens of thousands of people were crossing the border in a single month, you know, 60, 70, 80,000 people, which sounds like a huge, huge amount.

PARKS: Those numbers were accurate, according to the Department of Homeland Security. But how someone chooses to present them can vary really widely depending on the story they're trying to tell.

DICKERSON: There weren't enough resources along the border to actually be able to house and care for people who are crossing in a safe way. And we saw the effects of that. But, you know, we also have had our reporters who cover the economy do stories about how when you look at the overall population and you look at the economy, it can, in fact, sustain those very large numbers of people. And in many economic ways, we actually need them to support the economy.

So when you look at a number like 70, 80,000 people on its face and you think, oh, my God, that's huge. That's untenable. We have to change something. And then you actually look at the numbers and find out, well, actually, it's quite easy for those people to get jobs and to integrate into society. That's another way of looking at it.

PARKS: When it comes to numbers, context is key. A number like a city's murder rate might look big and scary, but if you check what it was 10 years ago and see that it's gone down substantially, your takeaway will be really different. The same goes for comparing it to a neighboring community.

Jevin and Carl teach their students one quick trick: when you're looking at a graph, check the x-axis and the y-axis to make sure they're not overly zoomed in or zoomed out to show a convenient slice of the data.
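Here is a minimal matplotlib sketch of that trick, using an invented murder-rate series like the one above. The same declining numbers read as a dramatic swing or a modest trend depending on how the y-axis is set.

```python
import matplotlib.pyplot as plt

# Invented numbers: a city's murder rate per 100,000 residents over a decade.
years = list(range(2010, 2020))
rate = [9.8, 9.5, 9.1, 8.8, 8.4, 8.1, 7.7, 7.4, 7.2, 7.0]

fig, (ax_zoomed, ax_honest) = plt.subplots(1, 2, figsize=(10, 4))

# Zoomed-in y-axis: the same decline looks like a wild swing.
ax_zoomed.plot(years, rate)
ax_zoomed.set_ylim(6.9, 9.9)
ax_zoomed.set_title("Zoomed y-axis (dramatic)")

# Zero-based y-axis: the trend reads as a steady, modest decline.
ax_honest.plot(years, rate)
ax_honest.set_ylim(0, 12)
ax_honest.set_title("Zero-based y-axis (context)")

for ax in (ax_zoomed, ax_honest):
    ax.set_xlabel("Year")
    ax.set_ylabel("Murders per 100,000")

plt.tight_layout()
plt.show()
```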

BERGSTROM: The important thing to recognize is that the person presenting the numbers does have - even if they're true numbers - does have a lot of power over controlling how you feel about them, how you respond about them.

PARKS: The last really good indicator that something is suspect is whether you know the source. On social media, that means knowing the creator of the meme, not just the person sharing it. Let's say your Uncle Burt retweets a meme on Twitter. You see the picture and a notification that Uncle Burt reshared it, and your brain initially treats Uncle Burt as the source. But you need to rewire that instinct: either don't take the meme as fact, or follow the rabbit hole down to the actual creator. If you can't get a firm grasp on who that is, then Carl says you shouldn't recirculate it.

BERGSTROM: Uncle Burt is the one that retweeted it. But then if you don't know the person that sent it, then, you know, you don't really know where that information is coming from, you know. So sharing information that you don't know where it's coming from is kind of like picking up candy on the street and just eating it.
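In data terms, Carl's advice amounts to walking a chain of reshares back to its root before trusting anything. Here is an illustrative sketch; the post records and field names (such as reposted_from) are hypothetical, not any real platform's API.

```python
# Hypothetical post records: each repost points at the post it reshared.
# Field names are invented for illustration; real platforms expose
# something similar through their own APIs.
posts = {
    "p3": {"author": "uncle_burt", "reposted_from": "p2"},
    "p2": {"author": "some_aggregator", "reposted_from": "p1"},
    "p1": {"author": "unknown_meme_page", "reposted_from": None},
}

def original_author(post_id, posts):
    """Walk the repost chain to the root post and return its author."""
    seen = set()
    while True:
        if post_id in seen:
            raise ValueError("cycle in repost chain")
        seen.add(post_id)
        post = posts[post_id]
        if post["reposted_from"] is None:
            return post["author"]
        post_id = post["reposted_from"]

print(original_author("p3", posts))  # unknown_meme_page, not uncle_burt
```

If the walk dead-ends at an account you can't identify, that's the signal not to recirculate.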

PARKS: So that's your fourth tip - ask some questions of what you're seeing and reading. Is it paid for by a company or a politician or another biased source? Is there good evidence? And are the numbers in context?

OK. Now that you're a pro in information spotting, you're going to start noticing that people in your life share bad stuff sometimes. You see it on Facebook or at a family party or you might hear about it at work. It's just going to happen, I promise. We want to help you with that because it's important to value the truth, but it's also really hard to tell someone they're wrong if it's about something they care about.

DICKERSON: I think the most important thing is to recognize that when people believe strongly in something, even if it's not borne out by the facts, there's a reason. You know, they've read something or they've been exposed to something, they've been told something that was compelling to them and that seemed legitimate. And so it's worth acknowledging that, you know. And that - their beliefs may also be informed by their experiences and challenges that they've faced. And so I think, you know, meeting people where they are is the first and most important step to having a productive discussion over disagreement.

PARKS: Jevin and Carl said they're careful, as they teach their course, to walk a fine line.

BERGSTROM: We start with a few rules for calling BS that we really want to, you know, instill in our students because we don't want to create a legion of jerks or the well-actually guy or whatever.

PARKS: But truth is important for our democracy and for our health, so we have to be able to talk to each other. The first ground rule if you're going to try and correct someone who's wrong is to make sure you're right.

BERGSTROM: So you want to make sure that you've got your facts straight and you don't leap in and, you know, call BS when you're wrong. That's a sure way to undermine your credibility.

PARKS: So once you've got your case mapped out, think about opening with common ground and a question. Carl used the example of vaccines as one that's really charged but also really important. There's no scientific link between vaccines and autism, and yet a chunk of the population believes there is. That's led to outbreaks in recent years of diseases like measles. So if you're in the day care pickup and you start talking to somebody who starts telling you everything they know about the negative consequences of vaccines, Carl says start with the common ground.

BERGSTROM: It's really stressful being a parent. There's a lot of decisions you have to make. And so, you know, what have you been reading? What have you been looking at? Then that'll give you a chance. You know, you can hear from them.

Yeah, you've got this common ground. You guys both really care about your kids. And then you can kind of share what you've learned and say, well, you know, from what I've learned, measles is not a minor disease. It's really serious. And I care about my kids a lot. And so since there aren't - there isn't this evidence, you know, that vaccination is causing autism, I was careful to get them vaccinated as soon as I could.

Just finding that common ground - if you set it up as like, you know, you're a dumb hippie, you're a big pharma tool, then you're not going to get anywhere, right? If you're both just, you know, you care about your kids, then you can kind of go from there.

PARKS: OK. So clearly this is not an easy conversation to have. And honestly, when misinformation is online, it can be even tougher to engage. Think about a comments section and how often these things just turn into this bickering back-and-forth instead of a productive conversation.

BERGSTROM: One of the problems with the discussions that take place on Twitter and Facebook is that they're typically public discussions. And so instead of you taking me aside and saying, hey Carl, look, like, you know, I understand why you think that, that's actually not true and here's why - you're sort of very publicly saying, Carl, what you said is BS and here's why, and you shouldn't have said that.

And you're calling me out in public. And so, you know, it sort of seems that the informal first rule of discussion on the Internet is to always double down on your own stupidity. And I think some of that is a response to being called out in public for getting things wrong...

PARKS: Like, it seems like it, like, ups the ante. It's like me going up to you and kind of, like, shoving you and expecting you to, you know, react well to what I'm saying.

BERGSTROM: Exactly. I think that's right.

PARKS: So it's not that you shouldn't engage when you see something hateful or wrong. But if you're tempted to write a comment, maybe just think about sending an email instead. Or if it's someone who you see regularly, think about bringing it up in person. Which brings us to your last tip - if you know you're right and you want to help correct misinformation, be humble. Don't assume bad intentions or stupidity, just meet the other person where they are, and be curious. Try to have the conversation in person or at least in a private online setting.

So that's it. You're an information pro now. You're basically an encyclopedia. Let's go back over the key points when it comes to finding good info.

No. 1 - be a little skeptical of everything you read and see, even if it's your favorite reporter or your favorite magazine. It's probably best to take in news from a few different places, too. Tip No. 2 - know the landscape. Social media sites don't have to tell the truth, so don't treat them like they do. Tip No. 3 - look out for red flag topics. Be really careful if it's a complicated subject or a breaking news story or an emotional trigger.

ADAMS: A lot of misinformation exploits our values. It exploits our patriotism. It exploits our religious faith. It exploits our dedication to ideals like equality.

PARKS: Tip No. 4 - take in information with a few questions at the top of your mind. Is it paid for by a company or a politician or another biased source? Is there good evidence? Are the numbers in context?

Tip No. 5 is, basically, don't be a jerk. If someone believes wrong information about something, there's a reason for that. Be curious and helpful.

(SOUNDBITE OF MUSIC)

PARKS: For more NPR LIFE KIT, check out our other episodes. I hosted one on how to vote in an election and how to run for office yourself. You can find those at npr.org/lifekit. And while you're there, subscribe to our newsletter so you never miss an episode. And here, as always, a completely random tip, this time from listener Viva Dadwal (ph).

VIVA DADWAL: What's my life hack? Well, it includes talking to strangers, including finding all sorts of excuses to be able to do so. So sometimes I take a telescope and I set it up on a corner and invite people to come look up at the sky. And more often than not, it works.

PARKS: If you've got a good tip or want to suggest a topic, email us at [email protected]. This episode was produced by Sylvie Douglis. Meghan Keane is the managing producer. Beth Donovan is the senior editor. And this episode was edited by Brett Neely. Our digital editor is Beck Harlan. And our project coordinator is Clare Schneider. I'm Miles Parks. Thanks for listening.
