JUDY WOODRUFF: Let's turn now to a social media story that's been generating lots of reaction, including anger, over the past few days.
It's in response to a study Facebook conducted with hundreds of thousands of its users. The study in question goes back to 2012, when Facebook manipulated the incoming content of pages belonging to almost 700,000 of its users for a week, without telling them. It was designed to see how people's attitudes were affected when they read either a stream of more positive posts or more negative ones in their so-called news feeds.
The results were published in a respected scientific journal in June. As that information has come to light, many are upset at what Facebook did and how they did it. It's also prompted concerns about the ethics of the research, the journal where it was published and much more.
To fill in the details, we're joined now by Reed Albergotti of The Wall Street Journal.
Welcome to the program.
Reed, first of all, where did the idea for this study come from? What did Facebook hope it was going to accomplish by doing this?
REED ALBERGOTTI, The Wall Street Journal: Well, around the time of this study, there was sort of a meme going around the Internet that when you go on Facebook and you see all these wonderful things that your friends and family are posting about their lives, you start to feel a little bad about your own life.
And there was some research, some academic research at the time that really kind of backed up that theory. And Facebook wanted to find out whether or not that was true. And that's why they embarked on this research project. And they say they have debunked that theory, and they weren't shy about it. They worked with Cornell to publish the study and tell the public what they'd found.
JUDY WOODRUFF: Did they raise any questions internally? Is it known about the propriety or the ethics of doing this?
REED ALBERGOTTI: No.
And, in fact, Cornell issued a statement saying they looked at it and they decided they were not subject to federal guidelines, laws actually, that require informed consent of human research subjects, because the study was done by Facebook without the involvement of the Cornell researchers at the time.
JUDY WOODRUFF: But what about Facebook? Do we know if there was discussion about whether they should have let people know ahead of time?
REED ALBERGOTTI: Well, Facebook says that it has an internal review process, but it said at the time it wasn't as rigorous as it is now. And it's one thing that we have been pressing Facebook to tell us more about, is, you know, how did this internal review process evolve? And what are really the procedures in place now?
JUDY WOODRUFF: So, Reed Albergotti, what are the ethical — I mean, first of all, are there any legal considerations to this, that maybe they violated a law by doing this?
REED ALBERGOTTI: Well, I think, right now, it's really more of a question of ethics.
The laws really apply to government institutions — institutions that receive federal funding, like Cornell University, and not really to private companies. In fact, Facebook isn't the only social media company or tech company that's gathering reams of personal data and using it in these scientific experiments.
But Facebook is one that publishes it publicly more than other companies.
JUDY WOODRUFF: So, if — set aside any legal question. What about the ethics of it? What are you, what are others saying about what ethical lines might have been crossed here?
REED ALBERGOTTI: Well, look, I have talked to a lot of academic researchers here about this study, and I think really there's a consensus sort of being formed that there needs to be a strong, hard look at the ethics of this.
It's a growing trend really in the scientific community, private companies, corporations using their data in conjunction with research institutions for scientific studies. And, right now, it's really an ethical gray area.
And I think researchers would like to see something like another level of informed consent that Facebook would put in front of its users when they enter them into these types of studies. But, right now, it's so early, I think we will have to look at how this backlash shakes out to see if that actually happens.
JUDY WOODRUFF: And just to clarify, give us an example of how the news — so-called news feed was manipulated. As we said earlier, they were in some cases making sure they were seeing more positive information, in other cases more negative. What's an example of how that worked?
REED ALBERGOTTI: Well, there was actually an algorithm, a computer algorithm, that had certain words that were associated with positive or negative news feed posts.
So the algorithm was run totally automatically without any hands-on involvement of the data scientists at Facebook. And that's because they wanted to keep these research subjects totally anonymous. So the algorithm decided which posts were positive and negative, and then automatically removed those from the news feeds of those users for about a week.
And then after that week was up, some of those posts might have been reintroduced to those news feeds and the users might have eventually seen them.
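For readers curious how a word-list-based filter of this kind might operate, here is a minimal, hypothetical sketch in Python. The word lists, the classify and filter_feed helpers, and the sample feed are assumptions made for this illustration only; they are not Facebook's actual code or word categories.

    # Hypothetical sketch of a word-list-based feed filter, for illustration only.
    # The word lists below are placeholders, not Facebook's actual lexicons.
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
    NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

    def classify(text):
        """Label a post 'positive', 'negative', or 'neutral' by simple word matching."""
        words = set(text.lower().split())
        if words & POSITIVE_WORDS:
            return "positive"
        if words & NEGATIVE_WORDS:
            return "negative"
        return "neutral"

    def filter_feed(posts, suppress):
        """Return the feed with posts of the suppressed sentiment removed."""
        return [p for p in posts if classify(p) != suppress]

    feed = [
        "Had a wonderful day at the beach",
        "Feeling sad and awful about the news",
        "Posting a photo of my lunch",
    ]
    # A user placed in a reduced-negative-content condition would see:
    print(filter_feed(feed, suppress="negative"))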
JUDY WOODRUFF: I just wanted to read. We — on our Web site, we asked some of our visitors what they thought about this. We got comments both positive or at least — at least not so critical and others.
I'm just going to read two quickly, one from someone named Carrie. She said: “So, read the TOS, terms of service, and don't sign if you don't agree. That's the point. People don't read terms of service, and then they get upset when Facebook does something that the terms allows.”
And then from another visitor, Scott. He wrote: “The problem is that the terms of service is deliberately so vague that they can basically claim that they do whatever they want at any time. Would you buy a TV from Sony if the manual said that they could for any reason decide what programs you could watch on their TV?”
How typical would you say those reactions are?
REED ALBERGOTTI: Oh, I think they're very typical.
We saw similar reactions on our own website in the comments section. And I think what academic researchers are saying is, yes, Facebook has these terms of service that really indemnify them against any legal repercussions, although that may be debated in the future, but there really needs to be — in order for this academic research to be ethical, according to very acceptable — accepted guidelines, there needs to be another terms of service.
Users need to be asked again when they're being entered into a study if really they want to and they need to be told about the risks. In this case, the risk could have been, if someone was predisposed to depression, that might have triggered some sort of emotional instability. So there are big questions that we need to answer here.
JUDY WOODRUFF: Certainly are. I think a lot of people didn't even realize how much there's just a regular adjustment of what people see on their Facebook pages. But that's a subject for a future conversation.
Reed Albergotti, we thank you.
REED ALBERGOTTI: Thanks for having me.