U.S. Reps. Rashida Tlaib and Justin Amash raised questions this week in a hearing about the use of facial recognition technology by law enforcement agencies, including Detroit Police Department.
The hearing was held Wednesday by the U.S. House Committee on Oversight and Reform.
The hearing came after a report by Georgetown Law's Center on Privacy and Technology found that Detroit has become a pioneer in deploying facial recognition technology to try to find suspects in criminal cases.
The Georgetown study says Detroit signed a three-year, $1 million contract with DataWorks Plus, a South Carolina company that offers real-time face recognition services on video streams.
The study notes that the facial recognition system was designed to be able to connect to the hundreds of video streams from around the city that the police department monitors through a program called Project Green Light. And it could go even further, according to the study.
At Wednesday's hearing, U.S. Rep. Rashida Tlaib, D-Mich., whose district includes parts of Detroit, raised questions about how the Detroit Police Department has used the new technology.
"With little to no input, the city of Detroit created one of the nation's most pervasive and sophisticated surveillance networks with real-time facial recognition technology," Tlaib said during the hearing.
She said the residents of her district already face economic problems and structural racism. On top of that, she said, policing has become more militarized and flawed.
"Now we have for-profit companies pushing so-called technology that has never been tested in communities of color, let alone been studied enough to conclude that it makes our communities safer," Tlaib said.
The ability of computer programs to recognize and identify faces has grown rapidly in recent years. Many people encounter it when they upload a photo of a friend to Facebook, and Facebook identifies the friend, and suggests tagging them in the photo. iPhone users can also now unlock their phones using their face.
But this technology has also been increasingly used by law enforcement to identify suspects. And that's raised serious concerns for many people who track the technology.
Joy Buolamwini is a researcher at MIT who has studied the biases embedded in how computer programs identify human faces.
She testified that facial recognition algorithms are often trained with photo sets that disproportionately contain photos of white men. So the algorithms for facial recognition, such as Amazon Rekognition, make more mistakes on people with darker skin, and people who are not men.
"They had error rates of over 30 percent for darker-skin females and zero percent error rates for lighter-skin men," Buolamwini said at the hearing. "So it is the case that there is verifiable research that shows you have issues with Amazon Rekognition."
Buolamwini has also found biases in the software created by IBM and Microsoft. In one study, she analyzed photos of famous, iconic black women. The programs frequently made mistakes identifying the women.
"In one test I ran, Amazon Rekognition even failed on the face of Oprah Winfrey, labeling her male," Buolamwini told members of the committee.
These biases, Buolamwini and others testified, could have serious consequences as the technologies are applied to police work. A misidentification could lead to innocent people being detained or arrested. And because the programs make mistakes more often with people of darker skin, those mistakes will affect people of color more than others.
The policy in Detroit
Still, the Detroit Police Department is defending the use of the technology. Chief James Craig wrote a letter responding to the Georgetown study, saying he "takes great umbrage" at the suggestion that the department is using the technology to monitor innocent people in real time.
"If there is a report of a crime or a crime is witnessed by a DPD member, the crime is reported to sworn members to investigate," Craig wrote. "If there is articulable reasonable suspicion that an individual is observed or reported to have committed a crime, only then is their still image provided for analysis with the Facial Recognition Program."
The Detroit Police Department has a nine-page policy on when and how the facial recognition system can be used. Georgetown Law's Center on Privacy and Technology obtained a copy of the policy and posted it online, along with other materials it received from the city regarding the program.
The policy outlines who has access to the software and says it can only be used in active investigations where there is reasonable suspicion that a person committed a crime, or is planning to commit one.
The policy also allows officers to request permission from the department's legal advisors to collect face images at First Amendment-protected events, which could include political rallies.
The department has not said if it has ever done so. Nor has it disclosed how many times the facial recognition software has been used overall, or how effective it's been.
Fourth Amendment concerns
At Wednesday's hearing, there were other questions about whether the technology should ever be used, even in cases where there's a legitimate criminal suspect being investigated.
"Even if we require law enforcement to get a warrant to run a search on face recognition data from surveillance cameras, would it be possible for such cameras to use such face recognition technology in public areas without effectively gathering or discovering information on innocent people who are not the subject of an investigation?" asked U.S. Rep. Justin Amash, R-Mich.
"No," responded Clare Garvie, the researcher at Georgetown who was the lead author of the facial recognition study. "That's not the way face recognition systems work. Unfortunately, in order to identify the face of the person you're looking for, you have to scan every single face of everybody else, who you're not looking for."
One question this raises is whether such a scan would violate a person's rights under the Fourth Amendment to the Constitution, which protects people in America from unreasonable searches.
"A lot of agreement"
At Wednesday's House oversight committee hearing, lawmakers from both sides of the aisle seemed to agree that the use of facial recognition searches by law enforcement should at least be regulated, if not banned.
"There is a lot of agreement here, thank God," said House oversight committee chairman Elijah Cummings, D-Maryland, who added that he's optimistic the hearings will result in legislation.
Even one of the experts who testified during the more than two-hour-long hearing seemed to change his mind as the hearing went on.
"I certainly would rather not see a moratorium," said Cedric Alexander, the former president of the National Organization of Black Law Enforcement Executives. "However, if the issues that have been articulated here today are as serious as we believe them to be, then we have to go back and ask ourselves that question."
For now, the Detroit Police Department hasn't announced any changes to its own policy of using facial recognition software.