SCOTT SIMON, HOST:
Axon, formerly known as TASER International, makes tasers and body cameras for police departments. And in the near future, body cams may be equipped with facial recognition software. The company has created a new ethics board to consider some of the implications of this software and the emerging use of artificial intelligence in local policing. Axon CEO Rick Smith joins us now from their offices in Scottsdale, Ariz. Thanks so much for being with us.
RICK SMITH: Thanks for having me on today.
SIMON: As I don't have to tell you, Mr. Smith, you announced your ethics board, and a group of more than 40 civil rights and tech groups wrote you a letter in which they said you should never develop real-time facial recognition through police body cameras because they say the risks of misidentification are too high. Innocent people could be pursued by the police - sometimes suffer fatal consequences. How do you respond to that?
SMITH: Well, we agree philosophically with the issues that were raised. But I think it's counterproductive to say that a technology is unethical and should never be developed. I think what we need to do is take a look at how this technology could evolve. What are the risks? But basically today, an individual officer might have to make life-or-death decisions based only on their own perceptions and prejudices. Do we think that computers getting information to those officers could help them make better decisions and move the world in the right direction? And I think the answer is unequivocally, yes, that could happen.
SIMON: As the technology stands now - as we've heard reported any number of times, the technology's especially faulty when it comes to seeing the differences in darker faces.
SMITH: I think that has to do with the types of training data sets that have been used historically. Certainly that's one of the issues we would take a very hard look at before we deployed anything in the field.
SIMON: I gather Chinese police, for example, routinely use facial recognition technology. Some of them even have sunglasses that come equipped with cameras that can identify faces in real time. They say it's allowed them to arrest suspected criminals. And in, you know, China, criminals can include people who just believe in free speech. Do you have reservations about using that technology here?
SMITH: Well, you know, for example, there are police forces around the world that use batons and guns in very abusive ways. And yet ultimately, we know that our police, in order to do their job, need to have those same types of tools. We understand that these technologies could be used in ways that we don't want to see happening in our society. However, I think it's too blunt an instrument to say that because there is a risk of misuse, we should just write them off. I think we need to dig a layer deeper and understand what are the benefits and what are the risks.
SIMON: What are the benefits in your mind?
SMITH: Well, I mean, you could imagine many benefits. I think one example you can look at is DNA technology. You know, when DNA was first being introduced, there was much concern about false positives and false matches. And yet ultimately, I think DNA technology has done more than any other key technology in exonerating people who were wrongfully convicted. I think we'll see other biometrics, including facial recognition technology, that properly deployed with the right oversight over the coming decades could ultimately reduce prejudice in policing and help catch dangerous people that we all agree we don't want out in our communities, and do it in a way that, at the same time, respects police transparency and the privacy rights of the average citizen.
SIMON: Maybe this is generational, but, Mr. Smith, how do you feel about the fact that we might soon have a technology that - well, when you leave the office today, it'll recognize you and know when you get into the elevator. It will recognize you when you're in the parking lot. It will recognize you when you stop at a stoplight on your way home and know where you are all the time.
SMITH: Well, it's certainly an interesting world that we're moving into, where notions of privacy are changing pretty dramatically. And what's most interesting is I think people are actually opting into these systems. Knowingly and willingly, they're deploying these types of technologies for the convenience that they offer to themselves. And then that opens questions of what does privacy mean in the world we live in today. And frankly, what's it going to mean in another 10 or 20 years?
SIMON: Rick Smith, CEO of Axon, formerly known as TASER International. Thanks so much for being with us, sir.
SMITH: Great. And thank you for the thoughtful questions.