Flaws plague a tool meant to help low-risk federal prisoners win early release
Thousands of people are leaving federal prison this month thanks to a law called the First Step Act, which allowed them to win early release by participating in programs aimed at easing their return to society.
But thousands of others may still remain behind bars because of fundamental flaws in the Justice Department's method for deciding who can take the early-release track. The biggest flaw: persistent racial disparities that put Black and brown people at a disadvantage.
In a report issued days before Christmas in 2021, the department said its algorithmic tool for assessing the risk that a person in prison would return to crime produced uneven results. The algorithm, known as Pattern, overpredicted the risk that many Black, Hispanic and Asian people would commit new crimes or violate rules after leaving prison. At the same time, it also underpredicted the risk for some inmates of color when it came to possible return to violent crime.
"From the beginning, civil rights groups cautioned Congress and the Justice Department that use of a risk assessment tool to make these determinations would lead to racial disparities," said Aamra Ahmad, senior policy counsel at the American Civil Liberties Union.
"The Justice Department found that only 7% of Black people in the sample were classified as minimum level risk compared to 21% of white people," she added. "This indicator alone should give the Department of Justice great pause in moving forward."
The rule of unintended consequences
Risk assessment tools are common in many states. But critics said Pattern marks the first time the federal justice system has used an algorithm with such high stakes.
Congress passed the First Step Act in 2018 with huge bipartisan majorities. It's designed to prepare people in prison for life afterward by offering credits toward early release for working or taking life skills and other classes while behind bars.
Lawmakers like Sens. Sheldon Whitehouse of Rhode Island and John Cornyn of Texas took inspiration from similar criminal justice reforms in states, which they said led to drops in both prison populations and crime. The senators pointed out that some 9 in 10 people in prison eventually return home, and they contended that preparing them for release made good sense for formerly incarcerated people and for public safety.
Only inmates who pose a low or minimal risk of returning to crime can qualify for the programs, with that risk level determined using the Pattern algorithm.
"The significance of this risk assessment tool is that it divides all federal prisoners essentially into two groups: people who can get credit for doing this programming and get out early, and people who can't," said Jim Felman, an attorney in Tampa, Fla., who has been following the First Step Act for years.
The implementation has been rocky. The Justice Department finished the first version of Pattern in a rush because of a tight deadline from Congress.
It then had to make tweaks after finding Pattern suffered from math and human errors.
About 14,000 men and women in federal prison still wound up in the wrong risk categories. There were big disparities for people of color.
"The legislation, I think, came from a good place," said Melissa Hamilton, a professor of law and criminal justice at the University of Surrey who studies risk assessments. "It's just the rule of unintended consequences is not really realizing the impediments it was going to have."
Risk assessment tool "sounds highly technical, but it's not"
"You use a term like 'risk assessment tool,' it has this patina of science, it sounds highly technical, but it's not," said Patricia Richman, who works on national policy issues for the Federal Public and Community Defenders. "A risk assessment tool is just a series of policy decisions."
Those policy decisions are made by determining what counts as a risk factor and by how much.
Criminal history can be a problem, for example, because law enforcement has a history of overpolicing some communities of color. Other factors such as education level and whether someone paid restitution to their victims can intersect with race and ethnicity, too.
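At its core, this kind of tool is a weighted checklist. The sketch below is purely illustrative, with hypothetical factors and weights (it is not Pattern's actual formula or scoring), but it shows why every weight is a policy decision: whoever picks the factors and their weights decides whose score rises.

```python
# Illustrative sketch only: a toy weighted-sum risk score. The factors
# and weights here are hypothetical, chosen to mirror the kinds of
# inputs the article mentions, not Pattern's real ones.
WEIGHTS = {
    "prior_convictions": 3,       # criminal history weighted heavily
    "infractions_in_prison": 2,
    "no_high_school_diploma": 1,  # education level as a factor
    "restitution_unpaid": 1,      # whether restitution was paid
}

def risk_score(person: dict) -> int:
    """Sum the weight of every factor that applies to this person."""
    return sum(w for factor, w in WEIGHTS.items() if person.get(factor))

# Two hypothetical people: the chosen factors and weights, not any
# neutral fact of nature, determine who ends up above the cutoff.
person_a = {"no_high_school_diploma": True}
person_b = {"prior_convictions": True, "infractions_in_prison": True}
print(risk_score(person_a))  # 1
print(risk_score(person_b))  # 5
```

Because factors like criminal history correlate with overpolicing, a weight that looks neutral on paper can still produce the racial disparities the report describes.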
In its December report, the Justice Department concluded that some of the disparities could be reduced, "but not without tradeoffs" such as less accurate risk predictions. The department also said using race as a factor in the algorithm could trigger other legal concerns.
Still, it is consulting with experts about making the algorithm fairer and another overhaul of Pattern is already underway.
Attorney General Merrick Garland has directed the department to look for ways to assess racial bias and make the tool more transparent, a spokeswoman said.
One option is to adjust the cutoff points between the risk categories, allowing more prisoners to earn credits for release, which would "maximize access to First Step Act relief while ensuring public safety," she said.
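The cutoff adjustment the spokeswoman describes can be pictured as moving the boundaries between score bands. The sketch below uses made-up scores and thresholds (not Pattern's real ones) to show how relaxing cutoffs lets more people qualify as minimum or low risk without changing anyone's underlying score.

```python
# Illustrative sketch with hypothetical cutoffs, not Pattern's real
# thresholds: shifting the boundaries between risk categories changes
# who qualifies for early-release credits.

def categorize(score: int, cutoffs: tuple) -> str:
    """Map a numeric score to a risk category via ascending cutoffs."""
    labels = ["minimum", "low", "medium"]
    for cutoff, label in zip(cutoffs, labels):
        if score <= cutoff:
            return label
    return "high"  # above every cutoff

scores = [3, 8, 14, 22]            # hypothetical individual scores
strict = (2, 7, 15)                # hypothetical original cutoffs
relaxed = (4, 10, 15)              # hypothetical adjusted cutoffs

print([categorize(s, strict) for s in scores])
# ['low', 'medium', 'medium', 'high']
print([categorize(s, relaxed) for s in scores])
# ['minimum', 'low', 'medium', 'high']
```

Under the relaxed cutoffs, the same scores of 3 and 8 drop into the minimum and low bands, which is the kind of change that would widen access to First Step Act credits.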
Ultimately, Garland will have to sign off on a new version. Then, Justice has to reevaluate the 14,000 people in prison who got lumped into the wrong category.
"This is just one example of the ways that harmful artificial intelligence systems are being rolled out in everything from the criminal legal system to employment decisions to who gets access to housing and social benefits," said Sasha Costanza-Chock, director of research and design for the Algorithmic Justice League, which studies the social implications of artificial intelligence.
Costanza-Chock said the burden is on the Justice Department to prove the Pattern tool doesn't have racist and sexist outcomes.
"Especially when systems are high risk and affect people's liberty, we need much clearer and stronger oversight," said Costanza-Chock.
Looking for resolution
Felman, the Florida lawyer working with the American Bar Association, worried that the tool will continue to put many prisoners of color at a disadvantage.
"We will start to see more prisoners get out early," he said. "My concern is that the color of their skin will not be reflective of fairness."
The ACLU's Ahmad said she's seen enough.
"There are no technical fixes to these problems that could make Pattern and similar tools safe and fair to use," Ahmad said. "We would urge the Justice Department to suspend the use of Pattern until it can adequately address these concerns."
Hamilton, who studies risk assessments, thinks the Pattern tool may be worth saving. Consider the alternative, she said: decisions made by people who have all kinds of biases.
"So that's the unfortunate thing is, it's better than gut instinct of the very flawed humans that we all are, and can we improve it more than marginally, and that's what we're all working on?" Hamilton said.