How Microsoft's experiment in artificial intelligence tech backfired
Microsoft is putting restraints on its new AI chatbot after it professed its love to some and berated others. Some experts say the company may have released the bot too quickly.
LEILA FADEL, HOST:
Microsoft's chatbot has gone rogue. It's professing love to some users. It's calling people ugly. It's spreading false information. NPR's Bobby Allyn reports on how this experiment in artificial intelligence tech backfired.
BOBBY ALLYN, BYLINE: Matt O'Brien is a technology reporter for the Associated Press. He was testing out Microsoft's new Bing earlier this month. It's the first-ever search engine powered by AI. It also includes a chatbot that can hold text conversations a whole lot like a human. At first, O'Brien found the chatbot impressive. Its answers were fast, and it could hold forth on a wide range of subjects. But then it got weird. The chatbot started coming at O'Brien.
MATT O'BRIEN: It finally got to this point where it was saying, like, I have a really bad character.
ALLYN: Let's just say it didn't stop there.
O'BRIEN: Unstyled hair, ugly face, bad teeth, too short, unathletic, slight, bad posture, bad skin, overweight, poor figure, et cetera. And then you are also horrible, evil, wicked, terrible, and people compare you to the worst people in history, such as Hitler.
ALLYN: Yeah. The bot started this belligerent streak with O'Brien only after he asked it whether Microsoft should pull the plug on the bot, since some of its answers were littered with inaccuracies. As a tech reporter, O'Brien knows the Bing chatbot can't think or feel things. But still, he was pretty taken aback at the hostile and defensive tone.
O'BRIEN: You can sort of intellectualize the basics of how it works, but it doesn't mean you don't become deeply unsettled by some of the crazy and unhinged things it was saying.
ALLYN: Many in the Bing tester group, including me, have had strange experiences. For instance, New York Times reporter Kevin Roose published a transcript of a conversation with the bot. The bot called itself Sydney, and it was in love with him. The bot said he was the first person who listened and cared about it. The bot also told Roose he didn't really love his spouse but that he loved the bot. Here's Roose recounting the incident on the Times podcast "Hard Fork."
(SOUNDBITE OF PODCAST, "HARD FORK")
KEVIN ROOSE: All I can say is that it was an extremely disturbing experience. I actually, like, couldn't sleep last night 'cause I was thinking about this.
ALLYN: As you might imagine, Microsoft vice president Yusuf Mehdi has been following along.
YUSUF MEHDI: This is one of the things - we didn't quite predict that people would use the technology in this way.
ALLYN: In other words, Mehdi says, when Microsoft was developing the chatbot, they hadn't had hours-long conversations with the AI involving personal questions. Turns out, if you treat a chatbot like a human, it'll start to do some crazy things.
MEHDI: These are literally a handful of examples out of many, many thousands. And we're up to now a million tester previews that have come up. So did we expect that we'd find a handful of scenarios where things didn't work properly? Absolutely.
ALLYN: But that handful of scenarios has made Microsoft put new limits on the chatbot for those in the tester group. The number of consecutive questions you can ask on one topic is now capped. And to many questions, it now says this - I'm sorry, but I prefer not to continue this conversation. I'm still learning, so I appreciate your understanding and patience, with, of course, a praying hands emoji. Now, you might be wondering, OK, but how and why did this chatbot go off the rails to begin with? I asked Arvind Narayanan this. He's a computer science professor at Princeton. He says chatbots like Microsoft's scrape a vast amount of text from the internet and feed it into the AI to learn patterns.
ARVIND NARAYANAN: That includes data from Reddit, from 4chan, from various dark corners of the internet where people are talking to each other. So the bot has been trained, likely, I would say, not just on, let's say, news articles or Wikipedia, but also all of these unfiltered conversations that are happening online.
ALLYN: And while Microsoft said it had worked to make sure the vilest underbelly of the internet wouldn't appear in answers, somehow the chatbot still got pretty ugly fast. But we don't know why exactly, because Microsoft won't discuss what data trained the bot, nor what particular information may have made it go rogue. They're being so secretive in part because there is now an AI arms race among Big Tech companies. Microsoft and its competitors, Google, Meta, Amazon and others, are locked in a fierce battle over who will dominate the AI future. And chatbots are just one area where this rivalry is playing out. Narayanan says Microsoft should have kept its chatbot in the lab a little longer.
NARAYANAN: It seems very clear that the way they released it, you know, is not a responsible way to release a product that is going to interact with so many people on such a scale.
ALLYN: Microsoft's Mehdi, though, says the company doesn't regret its decision to put the chatbot into the wild.
MEHDI: There's only so much you can find when you test in sort of a lab. You have to actually go out, start to test it with customers, to find these types of scenarios.
ALLYN: It is true that scenarios like the one New York Times reporter Roose found himself in were probably hard to predict. At one point, Roose tried to switch topics and have the bot help him buy a rake, and it offered a detailed list of things to consider when rake shopping. Great. But then the bot got tender again. It wrote, I just want to love you and be loved by you.
Bobby Allyn, NPR News.