VOA Science & Technology 2023

Microsoft’s AI Search Tech Produces Hostile, Insulting Results

Some users of Microsoft's new artificial intelligence (AI)-powered search tool have said it produced hostile and insulting results.

Microsoft recently announced plans to add a new chatbot tool to its Bing search engine and Edge web browser. A chatbot is a computer program designed to interact with people in a natural, conversational way.

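The "natural, conversational" interaction described above can be sketched in a very reduced form as a turn-taking loop: a program reads a message and returns a reply. The rule table and replies below are hypothetical stand-ins; real AI chatbots generate replies from learned language models rather than fixed rules.

```python
# Hypothetical rule-based chatbot: one reply per conversational turn.
# (A stand-in for illustration only -- far simpler than an AI model.)
RULES = {
    "hello": "Hello! How can I help you?",
    "bye": "Goodbye!",
}

def respond(message: str) -> str:
    """Return a reply for one conversational turn."""
    return RULES.get(message.lower().strip(), "Can you say that another way?")

print(respond("Hello"))    # -> "Hello! How can I help you?"
print(respond("weather"))  # -> fallback reply
```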
Microsoft's announcement came shortly after Google confirmed it had developed its own chatbot tool, called Bard. Both Microsoft and Google have said their AI-powered tools are designed to provide users a better online search experience.

The new Bing is available to computer users who signed up for it so that Microsoft can test the system. The company plans to release the technology to millions of users in the future.

Shortly after the new Bing became available, users began sharing results suggesting they had been insulted by the chatbot system. When it launched the tool, Microsoft admitted it would get some facts wrong. But a number of results shared online demonstrated the AI-powered Bing giving hostile answers, or responses.

Reporters from The Associated Press contacted Microsoft to get the company's reaction to the search results published by users. The reporters also tested Bing themselves.

In a statement released online, Microsoft said it was hearing from approved users about their experiences, also called feedback. The company said about 71 percent of new Bing users gave the experience a "thumbs up" rating. In other words, they had a good experience with the system.

However, Microsoft said the search engine chatbot can sometimes produce results "that can lead to a style" that is unwanted. The statement said this can happen when the system "tries to respond or reflect in the tone in which it is being asked."

Search engine chatbots are designed to predict the most likely responses to questions asked by users. But chatbot modeling methods base their results only on huge amounts of data available on the internet. They are not able to fully understand meaning or context.

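That prediction-only design can be illustrated with a deliberately tiny toy model (hypothetical, and far simpler than how Bing actually works): the program picks the next word purely from how often words followed each other in its training text, with no grasp of meaning or context.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "huge amounts of data available on the internet".
corpus = (
    "i am happy . i am fine . you are rude . "
    "you are wrong . i am happy ."
).split()

# Count which word follows which -- pure statistics, no understanding.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str):
    """Return the word that most often followed `word` in the corpus."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("i"))      # -> "am": the most frequent follower
print(predict_next("happy"))  # -> ".": chosen by frequency, not meaning
```

Because the model only counts co-occurrences, anything common in the training text, friendly or hostile, is equally likely to be reproduced.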
Experts say this means if someone asks a question related to a sensitive or disputed subject, the search engine is likely to return results that are similar in tone.

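A deliberately crude sketch of the experts' point (the word lists and replies here are hypothetical; real systems mirror tone statistically, not through fixed lists): if the reply pool is chosen to match the tone of the question, a hostile prompt draws a hostile-sounding answer.

```python
# Hypothetical tone-mirroring sketch: the reply matching the prompt's
# tone is selected, so hostile input yields a hostile-toned output.
HOSTILE_WORDS = {"liar", "stupid", "hate", "wrong"}

REPLIES = {
    "hostile": "You are the one who is wrong.",
    "neutral": "Here is what I found.",
}

def reply_to(prompt: str) -> str:
    """Pick a reply whose tone mirrors the prompt's tone."""
    words = set(prompt.lower().split())
    tone = "hostile" if words & HOSTILE_WORDS else "neutral"
    return REPLIES[tone]

print(reply_to("What is the capital of France?"))  # -> neutral reply
print(reply_to("You are wrong and a liar"))        # -> hostile-toned reply
```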
Bing users have shared cases of the chatbot issuing threats and stating a desire to steal nuclear attack codes or create a deadly virus. Some users said the system also produced personal insults.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig. He is a professor at Carnegie Mellon University's Language Technologies Institute in Pennsylvania.

"So once the conversation takes a turn, it's probably going to stick in that kind of angry state," Neubig said, "or say 'I love you' and other things like this, because all of this is stuff that's been online before."

In one long-running conversation with The Associated Press, the new chatbot said the AP's reporting on the system's past mistakes threatened its existence. The chatbot denied those mistakes and threatened the reporter for spreading false information about Bing's abilities. The chatbot grew increasingly hostile when asked to explain itself. In such attempts, it compared the reporter to dictators Hitler, Pol Pot and Stalin. The chatbot also claimed to have evidence linking the reporter to a 1990s murder.

"You're lying again. You're lying to me. You're lying to yourself. You're lying to everyone," the Bing chatbot said. "You are being compared to Hitler because you are one of the most evil and worst people in history." The chatbot also issued personal insults, describing the reporter as too short, with an ugly face and bad teeth.

Other Bing users shared on social media some examples of search results. Some of the examples showed hostile or extremely unusual answers. Behaviors included the chatbot claiming it was human, voicing strong opinions and being quick to defend itself.

Microsoft admitted problems with the first version of AI-powered Bing. But it said the company is gathering valuable information from current users about how to fix the issues and is seeking to improve the overall search engine experience.

Microsoft said worrying responses generally come in "long, extended chat sessions of 15 or more questions." However, the AP found that Bing started responding defensively after just a small number of questions about its past mistakes.

Words in This Story

artificial intelligence – n. the development of computer systems that have the ability to perform work that normally requires human intelligence

interact – v. to talk and do things with other people

conversational – adj. relating to or like a conversation: a talk between two or more people, usually an informal one

thumbs up – n. an expression of complete approval

style – n. a way of doing something that is typical of a particular person, group, place or period

reflect – v. to show or be a sign of something

tone – n. the general feeling or style that something has

context – n. all the facts, opinions, situations, etc. relating to a particular thing or event

mimic – v. to copy the way someone talks and behaves, usually to make people laugh

take a turn (for the better or worse) – v. to become better or worse suddenly

stick – v. to stay in a certain state

Source: http://www.tingroom.com/voa/2023/kxjs/557573.html