Google reportedly moved to increase its control over research reports written by its scientists.
In 2020, the company created a new review, or examination, process for research writings, Reuters news agency reports. The process asks researchers to get advice from legal, policy and public relations teams before writing about some issues.
Review Process
The new review process aims to identify subjects that could be considered "sensitive" for the company. In at least three cases, Google officials asked writers not to show its technology in a bad light, Reuters reported. The news agency said its report was based on Google documents it was able to examine, as well as information from researchers involved in the work.
One company explanation told researchers that technology progress and growing complexities in the outside "environment" had led to projects that could raise moral, legal or other problems.
Reuters could not confirm the date of the company communication. Three current employees said the policy began in June. Google did not have any comment on the Reuters story.
The new process for "sensitive" subjects adds more careful study to Google's usual review process for research papers, eight current and past employees said.
One subject considered "sensitive" was how some of Google's machine learning-powered services might be biased against some groups of people. Other subjects included the oil industry, China, Iran, Israel, COVID-19, home security, location data, religion, self-driving vehicles, telecommunications and systems designed to suggest websites.
For some projects, Google officials intervened later in the research. Earlier this year, a Google official reviewing a study on content suggestion technology told the writers to "take great care to strike a positive tone," documents provided to Reuters showed.
"Strike a positive tone" is an expression that means to avoid negative or critical language.
The official added, "This doesn't mean we should hide from the real challenges" created by the software.
Additional messages from a researcher to reviewers show that the writers made changes "to remove all references to Google products."
Four researchers, including scientist Margaret Mitchell, said they believe Google is starting to interfere with important studies on the possible harms of technology.
"If we are researching the appropriate thing given our expertise6, and we are not permitted to publish that on grounds that are not in line with high-quality peer review, then we're getting into a serious problem of censorship," Mitchell said.
Google states on its public website that its scientists have "substantial" freedom. "Substantial" is a term that means a large amount.
Disagreements between Google and some of its employees broke into public view this month after research scientist Timnit Gebru said she had been fired by the company. Gebru, along with Mitchell, led a 12-person team that studied ethics, or moral decisions, in artificial intelligence (AI) software.
Gebru says she was fired after questioning an order not to publish research claiming that AI software that copies speech could hurt some groups. Google said it had accepted Gebru's resignation. Reuters could not confirm whether Gebru's paper had gone through a "sensitive" subjects review.
Google Senior Vice President Jeff Dean said in a statement this month that Gebru's paper discussed possible harms without discussing efforts underway to address them.
Dean added that Google supports AI ethics research and said the company was "actively working on improving our paper review processes."
Words in This Story
sensitive – adj. likely to cause people to become upset
biased – adj. having or showing a bias : having or showing an unfair tendency to believe that some people, ideas, etc., are better than others
location – n. a place or position
challenge – n. a difficult task or problem : something that is hard to do
reference – n. the act of mentioning something in speech or in writing : the act of referring to something or someone
appropriate – adj. right or suited for some purpose or situation
peer – n. a person who belongs to the same age, education or social group as someone else
censorship – n. the system or practice of censoring books, movies, letters, etc.
artificial intelligence – n. an area of computer science that deals with giving machines the ability to seem like they have human intelligence