Why Voice Technology is the Next Frontier of the Patient Experience
Lower costs, a more "human" experience and fast information access
If there’s one industry where self-service is urgently mandated, it’s healthcare.
As treatment costs balloon, patients seek ways to self-diagnose symptoms, track vitals like sleep quality and heart rate using wearables and query licensed doctors through chat applications regarding garden-variety ailments.
Voice technology has emerged as the next frontier for self-service in healthcare, promising a more “human” experience and enabling users to access information quickly without navigating a complicated interface.
While the use cases are still nascent, they offer myriad benefits to practitioners and patients alike, especially the elderly, the disabled, and those managing chronic disease or living in rural areas.
The most critical applications of voice include disease management (symptom tracking, journaling, medication adherence), data collection and cost reduction. In the future, the technology could evolve into a diagnostic tool using voice biomarkers like tone, inflection, breathing patterns and more to detect abnormalities.
Amazon has patented technology through which a voice assistant detects abnormal physical or emotional conditions in a user’s voice, such as coughing, sniffling or fatigue, and serves targeted audio content back to the user.
Justifying the business case for voice
Like any novel technology solution, voice must solve a business problem, such as engaging patients between doctor’s visits, improving access for patients in clinical trials and removing friction in overall treatment.
“Voice is not a solution; it’s a tool in your toolbox. You have to think about the problem holistically and see where voice is applicable and where it’s not,” said Anna Kravets, chief digital officer for health services and solutions at Merck, in a panel discussion at the VOICE Summit in Newark.
An expert panel at VOICE Summit. Photo credit: Kindra Cooper for CCW Digital
Healthcare organizations are investing in innovation hubs staffed by professionals like chief information officers, chief digital officers and even tech-literate chief nursing officers. These teams are dedicated to probing the tech landscape for uncharted use cases backed by a verifiable customer need.
While Apple’s iPhone is a classic example of the product-centric approach of selling gadgets people didn’t know they needed, rather than starting from a proven customer need, there’s a balance to strike between pioneering emerging technologies and listening to the market.
“One of the challenges with voice in particular is that it can solve a lot of small problems, and innovation centers are often designed to solve smaller problems, but businesses consume solutions to big problems,” said Dan Solomon, a professor of medicine at Harvard Medical School.
Think big from the outset and avoid framing a hyper-specific problem; otherwise, you downplay the potential ROI and alienate stakeholders.
“Start big and then find a place where you can prove the value at a small scale,” said Jonathan Berman, senior manager of consulting, strategy and analytics at Deloitte. “When you start thinking big you broaden the aperture on the problem and the places where it connects.”
Voice for patient engagement
The most mission-critical use case for voice so far is symptom tracking for patients with chronic illness. These patients may see their doctor only every two or three months, but between visits, voice assistants log and track their symptoms, support medication adherence with reminders, and prompt them to schedule their next appointment.
Hospitals are experimenting with automated interactive phone calls through voice assistants, which resemble robocalls at face value except that they are neither irrelevant, disruptive nor unsolicited.
These automated check-ins prompt patients to schedule appointments, help them prepare for procedures and standardize care information provided before and after treatment. Every patient interaction with a voice assistant generates real-time data, which leads to more personalized care.
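At its simplest, an automated check-in of this kind is a short scripted dialogue that branches on the patient's answers and logs every response for the care team. The sketch below illustrates that idea in miniature; the prompts, branching and yes/no answer model are invented for the example, not drawn from any specific hospital system.

```python
# Minimal sketch of an automated voice check-in: a scripted dialogue
# that branches on yes/no answers and records each exchange so the
# care team can review it later. Prompts and branching are illustrative.

CHECKIN_SCRIPT = {
    "start": {
        "prompt": "Have you taken today's medication?",
        "yes": "pain", "no": "remind",
    },
    "remind": {
        "prompt": "Please take it with food. Shall I set a reminder?",
        "yes": "pain", "no": "pain",
    },
    "pain": {
        "prompt": "Is your pain worse than yesterday?",
        "yes": "schedule", "no": "end",
    },
    "schedule": {
        "prompt": "Would you like to schedule an appointment?",
        "yes": "end", "no": "end",
    },
    "end": {"prompt": "Thank you. Your answers have been logged."},
}

def run_checkin(answers):
    """Walk the script with a sequence of 'yes'/'no' answers and
    return the transcript of (prompt, answer) pairs generated."""
    state, transcript = "start", []
    for answer in answers:
        node = CHECKIN_SCRIPT[state]
        transcript.append((node["prompt"], answer))
        state = node.get(answer, "end")
        if state == "end":
            break
    transcript.append((CHECKIN_SCRIPT["end"]["prompt"], None))
    return transcript

log = run_checkin(["no", "yes", "yes", "no"])
```

Each `(prompt, answer)` pair in the returned transcript is exactly the kind of real-time data point the article describes: structured, timestamped-able patient input gathered without a staff member on the line.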
Currently, most healthcare consumers use voice assistants for information services, such as Mayo Clinic’s First Aid skill on Alexa and WebMD’s symptom tracker. Because voice assistants are essentially a “black box” lacking visual cues, they’re suited to quick hits and guided interactions, not lengthy or complicated information.
Similarly, Solomon and his team created a library of FAQs with “Harvard-approved answers to simple questions.”
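At its core, an FAQ library like this reduces to matching a spoken question against a small set of vetted answers. The sketch below shows that matching step with a stdlib fuzzy match; the questions and answers are invented for illustration (not Harvard's actual content), and a production skill would rely on the voice platform's own intent matching instead.

```python
import difflib

# Hypothetical FAQ library: vetted answers keyed by canonical questions.
# Content is made up for illustration only.
FAQ = {
    "what is a normal resting heart rate":
        "For most adults, a normal resting heart rate is 60 to 100 beats per minute.",
    "how much water should i drink per day":
        "Common guidance is around 8 cups a day, but needs vary by person.",
    "can i take ibuprofen on an empty stomach":
        "It is usually recommended to take ibuprofen with food.",
}

def answer(utterance, cutoff=0.6):
    """Match a spoken question to the closest vetted FAQ entry, or
    defer to a clinician when no entry is close enough."""
    match = difflib.get_close_matches(utterance.lower(), FAQ, n=1, cutoff=cutoff)
    if match:
        return FAQ[match[0]]
    return "I'm not sure. Please ask your care team."

print(answer("what's a normal resting heart rate"))
```

Deferring to a human when no close match exists is the important design choice here: a voice assistant reciting a vetted answer is safe, while one guessing at medical advice is not.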
Voice has also been shown to improve information retention: patients retain medical advice better when it is delivered aloud than when they read it in a pamphlet.
One healthcare provider in Minnesota ran an assessment of patients recently diagnosed with Type II diabetes. One group was sent home with a standard paper brochure, while the other received a dedicated voice assistant in addition to the brochure to answer basic questions about their condition.
Researchers found that patients given the voice assistant retained information better than those who only read the pamphlet.
Removing friction for healthcare providers
Solomon also uses voice technology to administer assessment surveys on a daily or weekly basis, preemptively detecting red flags and staying up to date on the patient’s condition before their next visit.
“As a provider, I struggle with, OK, so what were you like two months ago? And how have you been over the last ten weeks? Patients can’t tell you unless they called before or they’re crazy diary keepers,” Solomon added.
Despite being highly trained, clinicians aren’t exempt from administrative duties. Traditionally, they spend a lot of time entering information on their latest patient interaction in an electronic health record.
Advanced voice assistants equipped with natural language processing can pick up context in a conversation between doctor and patient and automatically generate patient notes, while others enable doctors to dictate their notes using speech-to-text capabilities.
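As a rough illustration of the transcript-to-notes step, a dictated transcript can be split into note sections by scanning for section cue words the clinician speaks aloud. The cues and the sample dictation below are invented for the example; real ambient-documentation systems use trained NLP models rather than the simple regex split shown here.

```python
import re

# Hypothetical section cues a clinician might speak while dictating.
# A simple regex split illustrates the transcript -> structured-note
# idea; production systems use trained NLP models for this.
SECTION_CUES = ["subjective", "objective", "assessment", "plan"]

def structure_note(transcript):
    """Split a dictated transcript into note sections keyed by cue word."""
    pattern = r"\b(" + "|".join(SECTION_CUES) + r")\b[:,]?\s*"
    parts = re.split(pattern, transcript.lower())
    note, i = {}, 1  # parts[0] holds any preamble before the first cue
    while i < len(parts) - 1:
        note[parts[i]] = parts[i + 1].strip()
        i += 2
    return note

note = structure_note(
    "Subjective: patient reports less knee pain. "
    "Objective: mild swelling, full range of motion. "
    "Plan: continue ibuprofen, follow up in six weeks."
)
```

The payoff is the one the article describes: the clinician speaks naturally once, and the structured fields land in the electronic health record without a second round of manual data entry.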