In 2012, a miscommunication led Rotem and Omri Shor’s diabetic father to take an additional dose of insulin when he shouldn’t have. This was a life-threatening mistake, and one the brothers later learned is quite common. Patients frequently forget to take their medications or, forgetting they have already taken their usual dose, take another. The brothers founded Medisafe in an effort to prevent mistakes like these.
The Medisafe digital drug companion began as a “mobile pillbox app”: a simple, visual way for patients to track their medications and remember to take them on time. The early version combined on-phone notifications and a visual interface with educational resources and the ability to push notifications to patients’ family members if a patient misses a medication check-in. But the digital drug companion has evolved since then as AI and machine learning capabilities have expanded, becoming more personalized to each patient’s needs.
In the past year, the company developed a machine learning model that can predict when a patient is likely to engage with the app by indicating they took their medication, or by responding to a reminder notification. Thanks in large part to increased data storage and processing capacity, plus the growing availability of AI and machine learning applications, this complex model can run on a very large and ever-growing dataset at or near real-time speed. This means the app can tailor its notifications to a patient’s specific situation. Instead of sending a patient a reminder to take their medication at the same time every morning, for example, the app sends notifications when the patient is likely to be able to take medication.
Big Data Science Trends to Watch in the Near Future
- Increased access to as-a-service AI and machine learning solutions that companies without in-house data science teams can apply.
- Advances in natural language processing, like OpenAI’s GPT-3 model, that allow for a deeper understanding of context in human communication.
- Automated agent technology, particularly in customer service experiences and the knowledge workforce.
For example, if a patient is driving — based on GPS data coupled with past engagement behavior — it wouldn’t make sense to remind them to take their medication. The assistant can auto-snooze the notification until the user leaves the car.
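The scheduling logic described above can be sketched roughly as follows. This is a hypothetical illustration, not Medisafe’s actual code: `engagement_prob` stands in for the output of a trained model scoring how likely the patient is to respond in each hourly slot, and the driving check stands in for the GPS-based context signal.

```python
from datetime import datetime, timedelta

def best_reminder_hour(engagement_prob):
    """Pick the hour of day with the highest predicted engagement."""
    return max(engagement_prob, key=engagement_prob.get)

def schedule_reminder(engagement_prob, context, now):
    """Snooze while the patient is driving; otherwise fire at the best hour."""
    if context.get("driving"):
        # Auto-snooze: re-check after a short interval instead of notifying.
        return now + timedelta(minutes=15)
    hour = best_reminder_hour(engagement_prob)
    return now.replace(hour=hour, minute=0, second=0, microsecond=0)
```

With engagement scores peaking at 9 a.m., `schedule_reminder({8: 0.2, 9: 0.7, 10: 0.4}, {"driving": False}, now)` would target 9:00, while the same call with `{"driving": True}` would defer the notification by 15 minutes.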
Rotem Shor, Medisafe’s chief technology officer, told Built In that the company expects to increase the platform’s personalization abilities to the point of identifying the level of assertiveness each patient needs in their support. While some patients need to be reminded repeatedly in quick succession to take their medication, for example, others need only one reminder.
This level of empathetic AI-powered customization might sound a bit like science fiction, but it may not be that far off. Shor and other experts Built In talked to for this story point to the growing access companies have to AI and machine learning tools — particularly in the area of natural language processing — as likely to create a future with tools and assistants that are more personalized, contextualized and empathetic.
Machine Learning Tools Are More Accessible Today
In recent years, big players like Google and Microsoft have built supercomputers to train AI models for — among other things — natural language processing. These supercomputers can take up more space than several single-family homes, they require massive amounts of power and water to run, and operating one demands several teams with highly specialized skills.
While many companies can benefit from the results of these processes, few can afford the investment in hardware and personnel. Luckily, the massive investments by big companies have created a sprawling ecosystem of as-a-service cloud computing that includes AI and machine learning solutions. This has allowed more companies without in-house data scientists to apply “elegant and high-powered machine learning and AI solutions” like image recognition, text translation or voice analysis to their business, according to Frederik Hagstroem, chief technology officer at Emergn, a tech-focused consulting firm.
A large-scale example is the BBC using Microsoft’s Azure Cognitive Services and Bot Service to create BEEB, its own branded voice assistant that helps readers (or viewers, or listeners) find content they’re interested in verbally rather than via text. Like Siri, but for BBC content.
It’s a good thing AI and machine learning tools are becoming more widely available, because more companies are going to need them, according to Rodrigo Liang, co-founder and CEO of SambaNova Systems, an AI innovation company.
In the coming years, Liang said, these tools will enable inventions that we wouldn’t even think about a decade ago, but that will seem indispensable in retrospect. As an example, he pointed to driver-assist features like automatic emergency braking.
“Who actually would not want to stop the car hitting the car in front of them?” Liang asked. “Once we have some of these capabilities that are really enabled by AI, as a planet, we just won’t go back.”
Improved Language Processing Will Drive More Personalization
Natural language processing is a key AI tool every type of business will need in the near future, according to Liang. He and others we spoke to pointed to the increasing value of models that can understand colloquial human speech and its context.
Liang gave the example of conversational AI assistants like Siri, Alexa and Google Assistant, which a quarter of U.S. adults used regularly as of 2020, according to NPR’s Smart Audio Report. Ask one “What is the weather today?” and it will readily pull up an overview of the day’s forecast for your area. Ask it instead, “Hey, how’s it look out there?” and it will be confused. A person would understand the informal question as a request for the same information as the first, more specific one, but that context is lost on current conversational AIs.
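The brittleness Liang describes can be seen in a toy example. This is not any real assistant’s code — it only illustrates why systems built on canned phrase matching, rather than contextual understanding, recognize the canonical question but not an informal paraphrase.

```python
# A minimal intent table: only the canonical phrasing is known.
INTENTS = {
    "what is the weather today": "weather_report",
}

def match_intent(utterance):
    """Look up an intent by normalized exact match; None means 'confused'."""
    return INTENTS.get(utterance.lower().strip("?! "))
```

Here `match_intent("What is the weather today?")` finds the intent, but `match_intent("Hey, how's it look out there?")` returns `None` — exactly the failure a context-aware model is meant to avoid.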
And context is key going forward with natural language processing for businesses, according to Liang. He pointed out that we don’t send lawyers into the medical field and expect them to understand medical jargon and terminology without training, for example.
“Machines are the same,” he said. “They need context.”
According to Liang, AI’s growing sophistication in understanding context can be attributed in large part to the GPT-3 model pioneered by OpenAI. It is unique for its size — containing 175 billion parameters, compared with the 1.5 billion of its predecessor, GPT-2, and the low tens of billions in competing NLP models — and it is a generalized transformer that can be applied widely to a variety of applications instead of to just one use case. Given its size, it can better extrapolate context from any given piece of text.
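What “generalized” means in practice is that different tasks are expressed as different text prompts to the same model, rather than as separately trained systems. The prompt templates below are hypothetical illustrations (the actual call to a hosted model is omitted), showing how one interface can cover translation and sentiment analysis alike.

```python
def translation_prompt(text, target_language):
    """Frame a translation task as plain text for a general language model."""
    return f"Translate to {target_language}:\n{text}\n"

def sentiment_prompt(text):
    """Frame a classification task for the same model, just with different text."""
    return f"Classify the sentiment of this review as positive or negative:\n{text}\n"
```

Both prompts would be sent to the identical model; only the wording of the task changes.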
Liang also sees GPT-3’s translation abilities as having the potential to open up business opportunities for companies currently blocked by language barriers by reducing the cost associated with translation.
“You can take this model — a single model — and translate to over 100 languages worldwide,” he said. Until now, the cost of translating a company’s materials into new languages, particularly languages with smaller speaker bases, has usually been too high to justify expanding into those markets.
“What if you can actually have one source and automatically the machine can translate to [those] languages? Colloquially?” Liang asked.
He noted that “we’re still not there” perfectly, but he sees the doors opening to that possibility soon. And he is not the only one watching OpenAI and its language models for what they can do for translation. Hagstroem said that OpenAI’s language models have made the translation of text almost “a solved problem.” While translation abilities are not yet perfect, he noted that built-in translation options on websites, in programs and in digital documents — far less common five years ago — have spread rapidly. He also expects translation to move beyond text into other mediums, such as audio.
Hagstroem also sees another kind of AI- and machine-learning-powered, language-based personalization coming to customer support experiences: situational personalization, based on the real-time contextual information that can be gleaned from customers’ voices and behaviors in a specific interaction.
“Emotions, including stress, disappointment, confusion, happiness and aggression can be relatively easy to detect using AI through facial recognition, voice or text analysis,” said Hagstroem.
Anger is one of the easiest to detect, he added. If an AI-powered customer support tool recognizes the signs of an angry customer, that customer might receive a survey designed to get a better sense of their frustrations, delivered in a more apologetic tone of voice. An angry customer in an e-commerce setting might get a faster response, speedier checkout and fewer extra options or suggestions. A customer frustrated with a registration process might be given a more guided, step-by-step user flow or additional help text.
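The routing Hagstroem describes could look something like this sketch. The emotion label is assumed to come from an upstream AI classifier (not implemented here), and the adjustment rules are invented for illustration.

```python
# Map a detected emotional state to hypothetical experience adjustments.
RESPONSES = {
    "angry": {"tone": "apologetic", "checkout": "expedited", "suggestions": False},
    "confused": {"flow": "guided", "help_text": True},
}
DEFAULT = {"tone": "neutral"}

def personalize(emotion):
    """Return experience adjustments for a detected emotional state."""
    return RESPONSES.get(emotion, DEFAULT)
```

An “angry” label yields an apologetic tone and expedited checkout; an unrecognized state falls back to the neutral default.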
“So therefore we will personalize your experience to be not just for you, but for the frame of mind we think you’re in,” said Hagstroem.
AI Will Drive Personalization in Medicine, from R&D to Patient Support
The medical field also has an eye toward improving and personalizing patients’ experiences through increased use of AI and machine learning, according to Shor. In addition to his own company’s efforts to apply AI to patient support, he sees the pharmaceutical industry adopting AI solutions on the research and development side.
“Pharma companies [will] start moving into this area and invest more and more in the companies that are going to help them find the next drug,” he said, pointing to drugmaker Sanofi’s recent $180 million investment in AI startup Owkin as an example. Not only could AI expedite the drug-discovery process, but Shor sees personalized drugs — pharmaceutical formulations tailored to the specific needs of the individual patient taking them — as the next logical step.
Shor also expects the pairing of AI-powered tools with the medical field to give patients more options in other areas, including his own field of patient support.
“I’m looking at the automated agents,” he said, specifically referencing the Google Duplex calling assistant. In the world of patient support, sometimes patients need a human voice to interact with.
“I believe that the automated agents will be able to save a lot of money for the industry to be able to interact with the patients around the bureaucracy, insurance and stuff like that. And the clinical support will be there as an escalation point,” Shor said. He described these potential automated medical agents as being yet another tool in the patient support toolbox of the future alongside texts, emails and calls from actual people.
Digital Colleagues Changing the Face of Knowledge Work
Several people Built In talked to anticipate a future in which automated agents become part of the professional lives of knowledge workers and part of consumers’ experiences. This is especially true of repetitive tasks in data-intensive fields. Think paperwork.
Hagstroem gave a current example of automation in the auto insurance claims process, which in some cases can be automated end to end today. Documents can be read and processed automatically, and if something needed to complete a claim is missing, the automated agent can even reach out to the claimant and request it. In the past, such automated contact was easy for most people to spot.
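The outreach step in that claims pipeline can be sketched as follows. This is a hypothetical illustration: the field names and claim record shape are invented, and in a real system the fields would be extracted from documents by an upstream model.

```python
# Fields an automated agent might require before a claim can proceed.
REQUIRED = ("policy_number", "incident_date", "damage_photos")

def missing_fields(claim):
    """List required fields that are absent or empty in the claim record."""
    return [f for f in REQUIRED if not claim.get(f)]

def outreach_message(claim):
    """Draft a request to the claimant, or None if the claim is complete."""
    missing = missing_fields(claim)
    if not missing:
        return None  # nothing to request; the claim can proceed
    return "To complete your claim, please provide: " + ", ".join(missing)
```

A claim missing its photos would trigger a message naming exactly what is needed, while a complete claim flows straight through with no human contact at all.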
“But we are now getting to the point where you wouldn’t necessarily know,” Hagstroem said. Or people wouldn’t necessarily have reason to think about it. “You would just have the service provided and then maybe later think, ‘I wonder if there was a human involved or not,’” he said.
“Five years ago, we were on the safe side of the Turing test,” Hagstroem said. “Now we’re not.”
More than just assistants sending well-timed reminders and unobtrusively providing useful services, automated agents will become “our day-to-day working partners” with which (whom?) we can converse, according to Peter Miscovich, managing director of strategy and innovation at JLL, a real estate and investment management firm.
But if automated colleagues are going to be the norm in the knowledge workplace of the future, knowledge workers will have to upskill and shift their focus. Hagstroem described that potential future as one where the knowledge worker shifts more into the role of topic expert who trains and supervises a machine learning algorithm — sort of like a Jedi master/padawan relationship. Rather than doing the work directly, the knowledge worker would have the responsibility to check the work of the automated agent and work to improve its decision-making abilities.
“That’s how we think we’re going to move the white-collar work to be a little bit of overseeing and operating and working on exceptions more than working on routine and standard decisions,” he said.
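That “working on exceptions” model has a simple shape: the agent auto-handles decisions it is confident about and queues the rest for a human expert. This sketch is hypothetical — the confidence threshold and the decision record format are invented for illustration.

```python
# Decisions at or above this confidence are handled without human review.
CONFIDENCE_THRESHOLD = 0.9

def route(decisions):
    """Split model decisions into auto-handled cases and human exceptions."""
    auto, exceptions = [], []
    for d in decisions:
        if d["confidence"] >= CONFIDENCE_THRESHOLD:
            auto.append(d)
        else:
            exceptions.append(d)
    return auto, exceptions
```

The human expert then reviews only the exceptions queue — and each correction can become training signal that shrinks that queue over time.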
Miscovich acknowledged that it sounds a bit like science fiction. But it might not be that far off.
“Similar to what happened to the factory worker, the assemblage folks, the warehouse folks who were replaced by robots, we will see it by 2025, and certainly beyond, certain segments of the white-collar workforce will be replaced by machine intelligence and intelligent automation.”
While that can sound threatening, Miscovich said it is also an opportunity for a large portion of the workforce.
“I think it could free us up from a lot of what I would call some of our challenging work activities,” he said. “And it will hopefully allow more time for all of us to be more creative and more innovative and let us do our best work for our best selves.”