PBS Interview: Will Facebook Be More Cautious on Social Issues?

   JUDY WOODRUFF: Now questions about the ever-growing scope of Facebook's empire and social network, and whether the company is embracing enough responsibility for its reach.

  Today, Facebook CEO Mark Zuckerberg announced that the company will add 3,000 more people to monitor live video, after problems with violence and hate speech.
  Hari Sreenivasan takes it from there.
  HARI SREENIVASAN: The decision comes after a series of cases in which people shared live video of murder and suicide. Recent examples include a murder in Cleveland last month that was posted live on Facebook, and a man in Thailand who posted video of himself murdering his 11-month-old daughter; that video wasn't removed for 24 hours.
  Once Facebook makes the announced hires, it will have 7,500 employees monitoring the thousands of hours of video uploaded constantly.
  Farhad Manjoo is a tech columnist for The New York Times who has been closely covering Facebook. He joins me now to talk about this issue and other questions facing the company.
  Farhad, so let's first — today's news, how significant is this?
  FARHAD MANJOO, The New York Times: I think it's significant.
  I mean, it's a significant step up in their ability to monitor these videos, and it should help. The way it works is, there are lots of videos running on Facebook all the time. If somebody sees something that looks bad, that looks like it may be criminal or some other, you know, terrible thing, they flag it, and the flagged videos go to these reviewers.
  And just having more of these reviewers should make the whole process faster. So, it should help. I mean, I think the question is why it took them a year to do this.
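  [Editor's note: the flag-and-review pipeline described here is, at bottom, a work queue whose latency falls as reviewer headcount rises. The Python sketch below is purely illustrative; the class, the per-review timing, and the backlog figure are assumptions, not Facebook's actual system.]

    from collections import deque

    REVIEW_SECONDS_PER_VIDEO = 90  # assumed average time for one human review

    class ReviewQueue:
        def __init__(self, num_reviewers):
            self.num_reviewers = num_reviewers
            self.flagged = deque()

        def flag(self, video_id):
            # A viewer flags a live video; it joins the shared review pool.
            self.flagged.append(video_id)

        def backlog_seconds(self):
            # Rough wait until the pool is cleared: pending reviews
            # divided across the available reviewers.
            return len(self.flagged) * REVIEW_SECONDS_PER_VIDEO / self.num_reviewers

    queue = ReviewQueue(num_reviewers=7500)
    for v in range(10000):  # suppose 10,000 flagged videos are pending
        queue.flag("video-%d" % v)
    print("%.0f seconds until the last flag is reviewed" % queue.backlog_seconds())

  [Under the same assumptions, raising num_reviewers from 4,500 to 7,500 cuts the estimated wait by 40 percent, which is the arithmetic behind the announced hires.]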
  HARI SREENIVASAN: So, put the scale in perspective here. If they have 1.2 billion active users a month, or whatever it is that they talk about, even if one-half of 1 percent of them wanted to harm themselves and put it on Facebook, that's six million people.
  How do these 7,500 stop that?
  FARHAD MANJOO: Yes.
  I mean, the way that tech companies generally work is, they manage scale by, you know, leveraging computers, basically. There's a lot of algorithmic stuff that goes into it; they try to, you know, cut down the pool that the human reviewers have to look at.
  And there is some experience in this in the Valley. I mean, YouTube has had to deal with this sort of thing for years. And the way they have really come around to doing it is a similar process. Like, they have thousands and thousands of hours of videos uploaded essentially every minute, and they count on kind of the viewers to flag anything that's terrible, and then it goes to these human reviewers.
  So, it's a process that can work. The difficulty in Facebook's case is, it's live video, so they have to get it down much more quickly. And so, you know, it's possible that they may need more people or some other, you know, algorithmic solution, but I think this is a — you know, it should be an improvement over what they have now.
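  [Editor's note: the algorithmic narrowing Manjoo alludes to can be pictured as a simple routing rule: a video reaches a human reviewer only if viewers flagged it or an automated model scores it as risky. A hypothetical Python sketch follows; the threshold, the model score, and the field names are all assumptions.]

    AUTO_REVIEW_THRESHOLD = 0.8  # assumed risk score above which a human must look

    def risk_score(video):
        # Stand-in for a real classifier trained on audio/visual signals.
        return video.get("model_score", 0.0)

    def needs_human_review(video):
        # Route to a reviewer if viewers flagged it or the model finds it risky.
        return video["flag_count"] > 0 or risk_score(video) >= AUTO_REVIEW_THRESHOLD

    stream = [
        {"id": "a", "flag_count": 0, "model_score": 0.10},
        {"id": "b", "flag_count": 3, "model_score": 0.20},
        {"id": "c", "flag_count": 0, "model_score": 0.95},
    ]
    print([v["id"] for v in stream if needs_human_review(v)])  # ['b', 'c']

  [The point of the cutoff is scale: automated scoring shrinks the pool that the 7,500 reviewers must actually watch, which matters even more for live video, where takedowns have to be fast.]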
  HARI SREENIVASAN: You mentioned it took them so long to get to this point. Why?
  FARHAD MANJOO: I think this is a real sort of cultural blind spot for Facebook in general.
  Oftentimes, they go into these projects — you know, Facebook Live is an example, but many of the other things they have done — with, you know, tremendous optimism.
  As a company, and Mark Zuckerberg as a technologist, he has tremendous optimism in technology. And they often fail to see or appreciate the possible kind of downsides of their technology and the ways that it could be misused.
  I mean, I think that what we have seen with live — with the live video is a small example. The way that Facebook has sort of affected elections, the way that — you know, the fake news problem we saw in the U.S. election, the way it's been used as a tool for propaganda in various other parts of the world, you know, those are huge examples of, you know, what looked like a fairly simple solution technologically, like we're going to get everyone connected and have them share the news.
  You know, it brings up some real deep social questions that they are only lately beginning to confront in a serious way.
  HARI SREENIVASAN: So, this combination of, I guess, an optimism in the technology and design and a faith that users are ultimately good and will make the right choice, is that the sort of core cultural concern or problem that keeps the company making these sorts of decisions?
  FARHAD MANJOO: That's part of it. And the other thing to remember is, you know, they're a technology company, and speed is of the utmost concern for them.
  One of the things that was happening in the tech industry last year is that a whole bunch of other companies were rolling out live video systems, and Facebook didn't want to be left behind. And so they created their live video system.
  And it became, you know, the biggest, because they're the biggest social network. But with that sort of size comes, you know, an increased opportunity for misuse and more power, right? Like, a video on Facebook that can potentially be seen by many more people has a lot more potential for being misused.
  And I think they — it's not right to say that they don't consider those things, but it seems like it's on a back burner for them. And I think what's happening at Facebook is a shift toward thinking about these issues at an earlier stage.
  And we have really seen this more recently in their work with the news industry. I mean, after what happened in the election and the controversy about fake news, they have rolled out a bunch of initiatives to improve how news is seen on Facebook. They have added fact-checkers and other things.
  So, I think their attitude is changing, but it may be changing too slowly, compared to how quickly the technology they're rolling out is changing.
  HARI SREENIVASAN: All right, Farhad Manjoo of The New York Times, thanks so much.
  FARHAD MANJOO: All right, great. Thanks so much.
  Original source: http://www.tingroom.com/lesson/pbs/pbssy/406426.html