The fields of artificial intelligence and machine learning are moving so quickly that any notion of ethics is lagging decades behind, or left to works of science fiction.
This might explain a new study out of Shanghai Jiao Tong University, which says computers can tell whether you will be a criminal based on nothing more than your facial features.
Shanghai Jiao Tong University develops AI that identifies criminals through facial recognition
In a paper titled "Automated Inference on Criminality using Face Images," two Shanghai Jiao Tong University researchers say they fed "facial images of 1,856 real persons" into computers and found "some structural features for predicting criminality, such as lip curvature, eye inner corner distance, and the so-called nose-mouth angle."
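Whatever one makes of the claim itself, the pipeline the paper describes boils down to ordinary supervised classification over a few numeric facial measurements. The sketch below is purely illustrative and is not the authors' actual method (which used several different classifiers): it trains a simple nearest-centroid model on synthetic feature vectors whose names mirror the measurements quoted above. Every value and label here is made up for demonstration.

```python
import math

# Each sample: (lip_curvature, eye_inner_corner_distance, nose_mouth_angle).
# All numbers are synthetic; labels "A"/"B" are arbitrary classes.
train = [
    ((0.21, 0.35, 1.10), "A"),
    ((0.19, 0.33, 1.05), "A"),
    ((0.30, 0.40, 1.30), "B"),
    ((0.32, 0.42, 1.28), "B"),
]

def centroid(vectors):
    """Mean feature vector of a list of equal-length tuples."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def fit(data):
    """Group samples by label and compute one centroid per class."""
    by_label = {}
    for x, y in data:
        by_label.setdefault(y, []).append(x)
    return {y: centroid(xs) for y, xs in by_label.items()}

def predict(model, x):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    return min(model, key=lambda y: math.dist(x, model[y]))

model = fit(train)
print(predict(model, (0.20, 0.34, 1.08)))  # near the class "A" centroid
```

The point of the sketch is that nothing in the mechanics distinguishes this from any other tabular classification task; the validity of the whole exercise rests entirely on what the labels and features mean, which is precisely what the article goes on to dispute.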
They conclude that "all classifiers perform consistently well and produce evidence for the validity of automated face-induced inference on criminality, despite the historical controversy surrounding the topic."
In the 1920s and 1930s, the Belgians, in their role as occupying power, put together a national program to try to identify individuals' ethnic identity through phrenology, an abortive attempt to create an ethnicity scale based on measurable physical features such as height, nose width and weight.
The study contains virtually no discussion of why there is a "historical controversy" over this kind of analysis — namely, that it was debunked hundreds of years ago.
Rather, the authors trot out another discredited argument to support their main claims: that computers can't be racist, because they're computers.
Unlike a human examiner/judge, a computer vision algorithm or classifier has absolutely no subjective baggage, having no emotions, no biases whatsoever due to past experience, race, religion, political doctrine, gender, age, etc.
Besides the advantage of objectivity, sophisticated algorithms based on machine learning may discover very delicate and elusive nuances in facial characteristics and structures that correlate to innate personal traits.