NPR: It Ain't Me, Babe: Researchers Find Flaws In Police Facial Recognition Technology

RENEE MONTAGNE, HOST:

And here's a statistic that surprised me - nearly half of all American adults have been entered into law enforcement facial recognition databases. That's according to a report from Georgetown University Law School. NPR's Laura Sydell reports on the forensic use of that technology and the many problems with its accuracy.

LAURA SYDELL, BYLINE: There's a good chance your driver's license is in one of the databases. The Georgetown report says there are more than 117 million photos. Facial recognition can be used, say, when investigators have a picture of a suspect and they don't have a name. They can run the photo through a facial recognition program to see if it matches any of the license photos. It's kind of like a very large digital version of a lineup.
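The search Sydell describes is, in essence, a nearest-neighbor lookup: each photo is reduced to a numeric feature vector ("embedding"), and the probe image is compared against every vector in the database. The sketch below is a minimal, hypothetical illustration of that idea; the `embed` function is a stand-in for a real face-embedding model, and all names and sizes are invented for demonstration.

```python
import numpy as np

def embed(photo_id):
    # Stand-in for a real face-embedding model: deterministically maps a
    # photo ID to a unit-length 128-dimensional vector. A real system would
    # run the pixels through a trained neural network instead.
    rng = np.random.default_rng(photo_id)
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

# Mock "license photo" database: 1,000 embeddings keyed by photo ID.
database = {pid: embed(pid) for pid in range(1000)}

def best_matches(probe, k=5):
    # Score every database entry by cosine similarity (dot product of
    # unit vectors) and return the k highest-scoring photo IDs.
    scores = {pid: float(probe @ vec) for pid, vec in database.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# A probe identical to database entry 42 ranks that entry first.
print(best_matches(embed(42)))
```

Note that the function always returns k candidates, whether or not the suspect is actually in the database at all, which is one reason these systems surface look-alikes of innocent people.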

JONATHAN FRANKLE: Instead of having a lineup of five people who have been brought in off the street to do this, the lineup is you. You're in that lineup all the time.

SYDELL: This is Jonathan Frankle, a computer scientist and one of the authors of the report just put out by Georgetown University Law School's Center on Privacy and Technology. Frankle says the photos the police may have of a suspect aren't always that good. They're often from a security camera.

FRANKLE: Security cameras tend to be mounted on the ceiling. They get great views of the top of your head, not very great views of your face. And you can now imagine why this would be a very difficult task, why it's hard to get an accurate read on anybody's face and match them with their driver's license photo.

SYDELL: And Frankle says their study also found evidence that facial recognition software didn't work as well with people who are dark-skinned. There's still limited research on why this is. Some critics say the developers aren't testing the software against a diverse enough group of faces, or it could just be lighting.

FRANKLE: Darker skin has less color contrast. And these algorithms rely on being able to pick out little patterns in color to be able to tell people apart.

SYDELL: Because of its flaws, facial recognition technology does bring a lot of innocent people to the attention of law enforcement. And Patrick Grother says most people have a few doppelgangers out there. He's a computer scientist with the National Institute of Standards and Technology, part of the U.S. Commerce Department.

PATRICK GROTHER: The larger you go, the greater the chance of a false positive. Inevitably, if you look at a billion people, you will find somebody that looks quite similar.
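Grother's point can be made concrete with a little probability. If each unrelated person in the database falsely matches a probe with some small probability p, then the chance of at least one false positive across N people is 1 - (1 - p)^N, which climbs toward certainty as N grows. The numbers below use an assumed one-in-a-million single-comparison false-match rate, chosen only for illustration.

```python
def p_any_false_match(p_single, n):
    """Probability of at least one false positive when a probe is compared
    against n unrelated people, each falsely matching with prob. p_single."""
    return 1 - (1 - p_single) ** n

# Even a one-in-a-million false-match rate becomes near-certain at scale:
for n in (1_000, 1_000_000, 100_000_000):
    print(f"{n:>11,}: {p_any_false_match(1e-6, n):.4f}")
```

At a thousand comparisons the risk is negligible; at a million it is roughly 63 percent; at a hundred million, comparable to the 117 million photos in the report, a false match is a near certainty.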

SYDELL: And even with the photos taken at the Department of Motor Vehicles, there can be differences in how they're shot. Grother thinks if those photos are going to be used for facial recognition, there need to be more uniform standards for lighting, height and focus.

GROTHER: Without those things, without those technical specifications, then face recognition can be undermined.

SYDELL: And yet, it's sophisticated enough to be useful in critical situations. Anil Jain, a computer science professor at Michigan State University, did an experiment after the Tsarnaev brothers, who committed the Boston Marathon bombing, were caught. He wanted to see if facial recognition technology could have helped police name them sooner. Police had photos of them from a security camera. Jain ran those photos against a database of a million driver's licenses. The software found 10 matches for the younger brother.

ANIL JAIN: We were able to locate him in the top 10 candidates. But the older brother, we couldn't locate, and the reason was he was wearing those dark glasses.

SYDELL: Of course, it did identify nine people who were not guilty. In a statement responding to the Georgetown study, the FBI says it only uses facial recognition as an investigative lead, not for positive identification. The Georgetown authors aren't saying that this technology should never be used, only that lawmakers need to create standards. Otherwise, it can be misused and harm innocent people. Laura Sydell, NPR News.
