Intervention via Automation
Organs can be transplanted. Advanced cancer treatments are saving lives. Babies are being born to previously infertile couples. Without a doubt, the medical field has come a long way.
But despite plenty of innovative treatment plans, one area that has held steady is the number of people with alcohol and drug addictions. In 2007, 22.3 million people aged 12 or older were classified with substance dependence or abuse—and that number has barely budged since 2002, according to the 2007 National Survey on Drug Use and Health from the Department of Health and Human Services’ Substance Abuse and Mental Health Services Administration (SAMHSA).
As with many diseases, proactive early intervention can make all the difference in getting a person on the road to recovery and good health. That’s exactly what’s driving biomedical researchers at Boston Medical Center (BMC) who, with one foot planted firmly in medicine and the other in information technology, have designed an automated, voice-activated phone system they’re now using to screen patients for possible substance abuse.
"Research shows many people will answer sensitive questions for any kind of computerized system when they will not necessarily to a real person," says Dr. Amy Rubin, director of automated programs for the BMC team and a counseling psychologist. "Using speech, what we’re trying to do, and what the research also shows is helpful, is to identify people early on and to treat them in a nonjudgmental medical setting. If you wait until someone is seriously ill or dependent, then they may have already lost jobs and their families—and that’s really very late."
Keeping Tabs
Even before SAMHSA issued an RFP looking for universal ways that primary care practices could better detect alcohol and drug problems among their patients, BMC had been using interactive voice response (IVR) systems to care for patients with chronic health conditions. For example, in the late 1980s, BMC’s Medical Information Systems Unit team built a system to monitor patients with hypertension between their doctor appointments.
"In the ICU we’re monitoring people second to second, [but out of the office] how do we ensure that they actually do what we tell them?" says Dr. Robert Friedman, head of the BMC team developing voice solutions and a physician who has worked in the field of biomedical informatics since the early 1960s. "We rely too much on patients and family members. I thought, we need a way to somehow monitor these patients between visits. Then I thought, ah, this phone thing, this IVR thing. What if we just called them up and somehow could get their blood pressure?"
That was the start of a system that proved successful in getting patients to take their blood-pressure medications, Friedman says, and paved the way for other IVR systems designed to monitor people with more complicated diseases, such as diabetes. Friedman’s team also automated a simple telephony-based screening instrument used by the World Health Organization to detect drinking problems, followed by a federally funded program that involved building an IVR system to treat heavy drinkers. But before that project could be completed, SAMHSA’s request came along. That was early 2006. BMC contacted the Massachusetts Department of Public Health, which, given the way federal agencies work, had to apply for the five-year, $15 million program. The program kicked in the following January.
"SAMHSA funded 11 programs nationally, but we were the only one chosen who proposed doing an automated program," Friedman says. "The people who are in charge are quite supportive because these human-based programs where you’re paying counselors to make 7 zillion interviews just aren’t practical. You can do it in a funded project, but once the project is over and the funding goes away, it can’t be supported."
What BMC proposed, and then went on to develop, is an IVR-based substance-abuse screening program that cycles through a series of doctor-designed questions—as many as 44, though only a few patients have gone through them all—that were recorded by a professional voice actor and imported into the system via Nuance Communications’ speech software. Answers to the screening questions are ported into a Web-based questionnaire and received by BMC’s specially trained staff, who counsel patients who screened positive about their treatment options when they arrive at the clinic.
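The article doesn’t describe the underlying software in any detail, but the flow it outlines (recorded prompts played in sequence, with the collected answers forwarded to a Web-based questionnaire for clinic staff) might be modeled roughly as in the sketch below. The names, question structure, and submission endpoint are illustrative assumptions, not BMC’s actual code.

```python
# Illustrative sketch only; not BMC's actual implementation.
# Models a bank of recorded screening questions and forwards the
# collected answers to a staff-facing, Web-based questionnaire.
import json
import urllib.request
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str           # question identifier, e.g. "narcotics_3mo" (hypothetical)
    prompt_audio: str  # file name of the professionally recorded prompt
    topic: str         # topic group, used later for skip logic

@dataclass
class ScreeningSession:
    patient_id: str
    answers: dict = field(default_factory=dict)

    def record_answer(self, question: Question, answer: str) -> None:
        self.answers[question.qid] = answer

    def submit(self, endpoint: str) -> None:
        """Port the answers to the Web-based questionnaire that clinic
        staff review before the patient's visit (endpoint is hypothetical)."""
        payload = json.dumps(
            {"patient_id": self.patient_id, "answers": self.answers}
        ).encode("utf-8")
        req = urllib.request.Request(
            endpoint,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
```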
For now, four clinics within BMC, as well as a few nearby community health centers, use the system, which runs on Envox Communications Development Platform 7, a standards-based voice solution platform from Envox Worldwide. "We were attracted to using Envox because, first of all, it’s [VoiceXML]-based and object-oriented, which is considered the accepted approach to programming compared to alternatives," Friedman says. "The other part that was attractive was that Envox had developed the environment around it in terms of tools to make it easy [for our team] to program."
The system went live in August, and so far the results are encouraging. In the first month, outbound calls were made to nearly 600 people, 48 percent of whom were reached directly (rather than the call hitting a busy signal, an answering machine, a disconnected number, or someone other than the patient, all outcomes the system is designed to handle), according to Rubin. Of those patients reached, about 19 percent answered enough questions to be classified as negative or positive, she says.
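Purely as an illustration of the call-outcome handling mentioned above, the dispatch might look something like the following. The outcome names and the retry policy are assumptions, not details from BMC.

```python
# Illustrative only: how outbound-call outcomes might be classified
# and handled. Outcome names and retry policy are assumptions.
from enum import Enum, auto

class CallOutcome(Enum):
    PATIENT_REACHED = auto()
    BUSY = auto()
    ANSWERING_MACHINE = auto()
    DISCONNECTED_NUMBER = auto()
    WRONG_PERSON = auto()

def handle_outcome(outcome: CallOutcome) -> str:
    if outcome is CallOutcome.PATIENT_REACHED:
        return "begin prescreening"
    if outcome in (CallOutcome.BUSY, CallOutcome.ANSWERING_MACHINE):
        return "schedule a retry before the appointment"
    if outcome is CallOutcome.DISCONNECTED_NUMBER:
        return "flag the record for the clinic to update"
    # Someone other than the patient answered the call.
    return "end the call without disclosing the reason for calling"
```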
"We can take the following perspective," Friedman adds. "Since [before] no one was screened, even if a half or one-third of the people do it, that’s a lot of people. The technology is scalable, as well. We’re planning to screen hundreds of people a day, but you can go up to tens of thousands or hundreds of thousands."
Screening in Action
So, let’s say you’re a BMC patient with an upcoming office visit. About a week before your appointment, you’ll receive an automated outbound call from the clinic as a reminder. Once the system has ensured it’s speaking to the right person, it will begin the prescreening process, explaining that your answers are confidential, just like your medical records. How you answer four basic prescreening questions, like whether you’ve used narcotic pain medications, sedatives, or amphetamines in the past three months, will dynamically determine which questions follow; the system knows to drop a topic and move to the next if you answer no. Should you screen positive for signs of an addiction, the system will let you know that when you go to your appointment, someone in the clinic will speak to you regarding treatment.
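The branching described here, where a handful of prescreening answers determine which follow-up questions are asked and a "no" drops a topic entirely, could be implemented along the lines of the sketch below. The question wording, topic names, and follow-up identifiers are invented for illustration; only the skip-on-no behavior comes from the article.

```python
# Illustrative skip logic: a "no" to a prescreening question drops
# that topic's follow-up questions. Question text and topics are invented.
PRESCREEN = {
    "alcohol":    "In the past three months, have you had a drink containing alcohol?",
    "narcotics":  "In the past three months, have you used narcotic pain medications?",
    "sedatives":  "In the past three months, have you used sedatives?",
    "stimulants": "In the past three months, have you used amphetamines?",
}

FOLLOW_UPS = {
    "alcohol":    ["alcohol_q1", "alcohol_q2"],
    "narcotics":  ["narcotics_q1", "narcotics_q2"],
    "sedatives":  ["sedatives_q1", "sedatives_q2"],
    "stimulants": ["stimulants_q1", "stimulants_q2"],
}

def next_questions(prescreen_answers: dict[str, bool]) -> list[str]:
    """Return the follow-up questions to ask, dropping any topic
    the patient answered 'no' to during prescreening."""
    queue: list[str] = []
    for topic in PRESCREEN:
        if prescreen_answers.get(topic, False):
            queue.extend(FOLLOW_UPS[topic])
    return queue

# Example: a "yes" only on narcotics asks only the narcotics follow-ups.
# next_questions({"alcohol": False, "narcotics": True,
#                 "sedatives": False, "stimulants": False})
# -> ["narcotics_q1", "narcotics_q2"]
```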
"They are also told that if they don’t want to talk to anyone about it, it’s OK, and they’ll still get their regular medical care," Rubin says. "We also give them choices along the way. We’ll say, ‘We have a few more questions now. Do you have time to answer them? Or would you like to reschedule or finish this in the clinic?’"
Should you decide you need a break, the system can call you back and pick up where it left off. If you decide to finish the questions at your doctor’s office, the person whose job it is to interview you will have your completed answers at hand. Going through the process in one shot can take up to 20 minutes, Rubin says.
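Pausing and resuming implies that each session’s progress is saved between calls. A bare-bones version of that idea (the file-based storage and field names here are stand-ins, not BMC’s design) might be:

```python
# Illustrative session persistence so a callback can resume mid-screening.
# File-based storage is a stand-in for whatever BMC actually uses.
import json
from pathlib import Path

STATE_DIR = Path("sessions")

def save_progress(patient_id: str, answers: dict, next_qid: str) -> None:
    """Record the answers so far and the question to resume at."""
    STATE_DIR.mkdir(exist_ok=True)
    state = {"answers": answers, "resume_at": next_qid}
    (STATE_DIR / f"{patient_id}.json").write_text(json.dumps(state))

def resume(patient_id: str) -> dict | None:
    """Return saved state if the patient paused earlier, else None."""
    path = STATE_DIR / f"{patient_id}.json"
    if not path.exists():
        return None
    return json.loads(path.read_text())
```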
However long the process takes, the real question, of course, is whether an addiction can be successfully treated by automated speech technology. "We wouldn’t do it if we didn’t think so," Friedman says. "If you go and see a human professional—and I don’t care what kind it is—there’s a bias to please that person, tell them what they want to hear, and not embarrass yourself. That doesn’t really exist with computer systems, and that has been shown in multiple studies."