Revisiting the idea of whether AI might help those dealing with isolation, depression
The need for mental health care far exceeds the supply of therapists. Could AI technology help bridge the gap ethically and safely?
STEVE INSKEEP, HOST:
So many people need advice on their mental health in this country that there are not enough professionals to meet their needs. So what if a computer could help? Some people seek answers from an app on their phones. Artificial intelligence might address isolation or depression, although it also raises new ethical questions. NPR's Yuki Noguchi reports.
YUKI NOGUCHI, BYLINE: Chukurah Ali overcame a traumatic childhood and, several years ago, opened Coco's Desserts in St. Louis, Mo. Her ornate cakes looked fit for baking shows. But those aren't even her favorite.
CHUKURAH ALI: Chocolate chip cookies (laughter). So simple (laughter). Those are my favorite. My grandma used to make them.
NOGUCHI: But last February, things fell apart. A car accident left Ali, a single mom who also cares for her mother, hobbled by injury, from head to knee.
ALI: I could barely talk. I could barely move. I felt like I was worthless because I could barely provide for my family at that moment. And now I lost my car. I can't even take care of my daughter.
NOGUCHI: Darkness and depression engulfed Ali.
ALI: The pain, my emotions, migraines.
NOGUCHI: Her orthopedist urged her to find a therapist, but none were available. Plus, Ali could no longer afford health insurance. She had to close the bakery.
ALI: That's stressful, too. That was my second baby.
NOGUCHI: So her doctor suggested a mental health app called Wysa. Its chatbot-only service is free, though it also offers teletherapy services with a human. The chatbot asks questions like, how are you feeling, or what's bothering you? It analyzes answers but doesn't generate its own responses. Instead, it draws from a database of psychologist-approved messages that deliver support or advice about managing chronic pain, say, or grief. That is how Ali found herself on the frontier of technology and mental health. Initially, she felt silly opening up to a robot.
ALI: I thought it was weird at first 'cause I'm like, OK, I'm talking to a bot. It's not going to do nothing. I want to talk to a therapist (laughter). But that bot helped.
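The design Noguchi describes is what engineers call a retrieval-based chatbot: it classifies the user's message and selects a reply from a fixed set of clinician-approved responses rather than generating new text. Below is a minimal sketch of that pattern in Python; the topics, keywords and messages are invented stand-ins, not Wysa's actual data or code.

    # Retrieval-only bot: it never writes text of its own, it only picks
    # from pre-approved messages. Everything below is a hypothetical example.
    APPROVED_MESSAGES = {
        "pain": "Chronic pain is exhausting. Would a short breathing exercise help right now?",
        "grief": "I'm sorry you're going through this. Naming what you feel can be a first step.",
        "default": "Thanks for sharing. What's bothering you most right now?",
    }

    TOPIC_KEYWORDS = {
        "pain": {"pain", "hurt", "ache", "migraine", "migraines"},
        "grief": {"grief", "loss", "mourning", "died"},
    }

    def respond(user_text: str) -> str:
        """Match the message to a topic, then retrieve an approved reply."""
        words = set(user_text.lower().split())
        for topic, keywords in TOPIC_KEYWORDS.items():
            if words & keywords:
                return APPROVED_MESSAGES[topic]
        return APPROVED_MESSAGES["default"]

    print(respond("my migraines and the pain are back"))  # -> the "pain" reply

Because every possible reply is written and vetted in advance, the bot can be audited by psychologists before deployment, which is the trade-off this design makes against the flexibility of generative systems.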
NOGUCHI: Confined to her bed, she could text it at 3 a.m.
ALI: I would just start chatting with it. How are you feeling today? I'm not feeling it. Then it would give me these little options that I could do.
NOGUCHI: Like a simple exercise or deep breathing or listening to soothing music. It focused Ali on things other than pain, and it reminded her of the in-person therapy she did years ago.
ALI: What I noticed it was doing - CBT therapy, the cognitive behavioral therapy. It's not a person but it makes you feel like it's a person because it's asking you all the right questions.
PAOLA PEDRELLI: And that is really what a therapist does.
NOGUCHI: Paola Pedrelli is a psychologist and professor at Harvard researching uses of AI to monitor mental health.
PEDRELLI: Reflect back what you're saying, naming and labeling your emotion. And now chatbots are able to do that.
NOGUCHI: Companies and researchers like Pedrelli are looking at various ways technology might improve therapy. Data from motion sensors, online activity and apps might help flag a patient's worsening mood. AI might also alert therapists when patients skip medications or might keep more detailed notes about a patient's tone or behavior during meetings. Other forms of AI interact directly with patients like Chukurah Ali, serving up suggestions based on known therapeutic methods.
Skeptics warn there hasn't been enough research or regulatory review and point to dangers of a chatbot misunderstanding or responding inappropriately. Many people may not be receptive to it. But research also shows some people prefer machines; there's no stigma when there's no human at the other end. Ali says as odd as it might sound, she relies on her chatbot.
ALI: I think the most I talked to that bot was, like, seven times a day (laughter).
NOGUCHI: She says mostly, it helps her help herself.
ALI: Or I will go to my physical therapist appointment, when before I'm like, no, I can't do it today. I'm going to have to reschedule it.
NOGUCHI: That's precisely why Ali's doctor, orthopedist Abby Cheng, suggested she use the app. Cheng treats physical ailments but says mental health challenges almost always accompany them.
ABBY CHENG: Sometimes, if we can't address the mental health aspect of things, we feel stuck.
NOGUCHI: And patients, in turn, get stuck because of a lack of therapists, transportation, insurance, time or money.
CHENG: In order to address this huge mental health crisis we have in our nation and even globally, I think digital treatments and AI can play a role in that and at least fill some of that gap in the shortage of providers and resources that people have.
NOGUCHI: But getting to that future also requires figuring out thorny issues like health privacy and legal liability. And even AI's proponents argue computers aren't ready to replace human therapists, especially for handling people in crisis.
Cindy Jordan is CEO of Pyx Health, a company that uses AI as part of its service to help people who feel chronically lonely. She worries, for example, about a chatbot responding to a suicidal person.
CINDY JORDAN: Oh, I'm sorry to hear that or, worse, I don't understand you. That makes me nervous. You know, we have not reached a point where - in an affordable, scalable way - AI can understand every sort of response that a human might give, particularly those in crisis.
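A standard mitigation for the risk Jordan names is a human-in-the-loop gate: screen each incoming message for crisis signals and route flagged users to a person before any automated reply goes out. The sketch below is a hypothetical illustration of that gate, not Pyx Health's actual screening logic; real systems use far richer signals than a keyword list.

    # Hypothetical crisis gate; the terms and labels are illustrative only.
    CRISIS_TERMS = ("suicide", "suicidal", "hurt myself", "end my life")

    def route(user_text: str) -> str:
        """Return 'human' to escalate to call-center staff, else 'bot'."""
        text = user_text.lower()
        if any(term in text for term in CRISIS_TERMS):
            return "human"  # a person reaches out before any automated reply
        return "bot"        # routine support continues with approved messages

    print(route("I feel lonely tonight"))       # -> bot
    print(route("I've been feeling suicidal"))  # -> human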
NOGUCHI: So as a backup, Pyx staffs a call center with people who call users when the system identifies them as potentially in crisis. But for more routine support, Chukurah Ali says she believes technology could help many more people and recommends the app to all her friends. She constantly finds herself passing along mental health advice she learns from it.
ALI: I wasn't like this before, but now it's like, so what you going to do today to make you feel better? How about you try this today (laughter)?
NOGUCHI: It isn't just a technology trying to act human, she laughs. She's now mimicking the technology.
Yuki Noguchi, NPR News.
INSKEEP: Hey, listen, I'm telling you, person to person, if you or someone you know may be considering suicide or in crisis, call or text the 988 Suicide & Crisis Lifeline. Just three digits - 988.