Hari Sreenivasan: A recent study published in Science magazine found significant racial bias in an algorithm used by hospitals across the nation to determine who needs follow-up care and who does not. Megan Thompson recently spoke with STAT's Shraddha Chakradhar, who explained what the researchers found.
Megan Thompson: Where exactly was this bias coming from?
Shraddha Chakradhar: There are two ways that we can identify how sick a person is. One is how many dollars are spent on that person. You know, the assumption being the more health care they come in for, the more treatment they get, the more dollars are spent, and presumably the sicker they are if they're getting all that treatment. And the other way is that we can measure actual biophysical things, you know, from lab tests, what kind of conditions or diseases they might have. So it seems like this algorithm was relying on the cost prediction definition. In other words, the more dollars an insurance company or a hospital was projected to spend on a patient, the sicker that patient was assumed to be. And that seems to be where the bias emerged.
Megan Thompson: I understand that the researchers then sort of re-ran the algorithm using a different type of data. Can you just tell us a little bit more about that? What did they use?
Shraddha Chakradhar: Yeah. So instead of relying on just costs to predict which patients are going to need follow-up care, they actually used biometric data, physical, biophysical, physiological data, and they saw a dramatic difference. You know, in the previous model, the algorithm missed some 48,000 extra chronic conditions that African-American patients had. But when they rejiggered the algorithm to look more at actual biological data, they brought that down to about 7,700. So it was about an 84 percent reduction in bias.
Megan Thompson: Do we know anything about how the use of this biased algorithm actually affected patient care?
Shraddha Chakradhar: We don't actually know that. But as I mentioned, the algorithm is used by hospitals to help them flag patients who might need extra care in the coming year, whether it's, you know, an at-home nurse or making sure that they come in for regularly scheduled doctor's appointments. So we can only presume that if black patients, sicker black patients, weren't being flagged accurately, they also missed out on this follow-up care.
Megan Thompson: Are there any consequences for the company, Optum, that was behind this algorithm?
Shraddha Chakradhar: Yes. So the day after the study came out, actually, New York regulators, the Department of Financial Services and the Department of Health, sent a letter to the company saying they were investigating this algorithm and that the company had to show that the way the algorithm worked wasn't in violation of anti-discrimination laws in New York. So that investigation is pending. One encouraging thing is that when the researchers did the study, they actually reached out to Optum and let them know about the discrepancy in the data. And the company was glad to be told about it. And I'm told that they're working on a fix. And the other encouraging thing is that the researchers have actually now launched an initiative to help other companies that may be behind similar algorithms fix any biases in their programs. So they've launched a program based out of the University of Chicago's Booth School to do this work on a pro bono basis, so that they can sort of catch these things in other algorithms that might be used across the country.
Megan Thompson: All right, Shraddha Chakradhar of STAT, thank you so much for being with us.
Shraddha Chakradhar: Thank you for having me.