Drew, a 21-year-old in Irvine, California, needs help: He’s transgender, and after starting hormone replacement therapy he’s facing harassment from coworkers. It’s gotten so bad, Drew tells a crisis counselor via a text-based chat session, that he’s considering suicide. He can’t quit his job, however, because he needs the money.
“i think about killing myself pretty constantly these days,” Drew types.
The counselor reassures Drew — thanking him for reaching out to talk, telling him he’s not alone — and draws out details about how Drew plans to kill himself.
“Have you done anything today to try to kill yourself?” the counselor asks.
After a pause, Drew responds, “no i haven’t done anything today.”
It’s a hard conversation to read, even with the knowledge that Drew isn’t a real person, but rather an artificially intelligent chatbot created by The Trevor Project, a suicide prevention and crisis intervention group for LGBTQ youth.
While chatbots are often thought of as a necessary (and at times obnoxious) outgrowth of online customer service, Drew's purpose is far different from helping customers return a pair of pants or get an insurance quote. Drew simulates conversations with volunteer crisis counselors-in-training who will go on to staff The Trevor Project's always-available text- and chat-based helplines (the group also runs a 24/7 phone line staffed by counselors). LGBTQ youth are at a higher risk of depression and suicide than other young people, and research indicates that risk may have worsened during the pandemic, driven in part by the isolation of school closures.
The overall training process for new counselors who will respond to texts and chats takes months, and role-playing is a key part of it. The hope is that, with the aid of a capable chatbot like Drew, the nonprofit can train many more counselors, and train them faster, than it could with role-play sessions staffed entirely by people.
“You can watch a lot of training videos and you can read all the handbooks. You can get cognitively how this is supposed to go. But actually doing it and feeling the feelings of being in one of these conversations, even if it’s simulated, is just a different kind of learning experience,” said Dan Fichter, head of AI and engineering for The Trevor Project.