Why Every Customer Service Rep Will Soon Have an AI Assistant
Umesh Sachdev brims with optimism and self-belief.
“If it isn’t innovative, if it isn’t groundbreaking, if it isn’t something that hasn’t been attempted or done, I’m not touching it,” he said. “But if someone tells me about a very hard problem, one that the biggest companies in the world cannot solve, I go after it with a vengeance.”
That aggressive optimism and self-confidence are traits he has needed to succeed both as an immigrant — he grew up in India — and as the founder and CEO of Uniphore, a conversational AI company. Uniphore began life back in 2008 and has since amassed more than $220 million in funding.
The company makes AI that listens in on customer service calls and performs manual tasks like pulling up customer information, logging data into the CRM and applying changes to an account as the customer requests them. The idea, Sachdev said, is to equip customer service professionals with a digital assistant that takes notes and performs mundane tasks while the humans talk.
In the wake of its most recent round — a $140 million investment led by Sorenson Capital Partners — Sachdev sat down with Built In to explain his long-term vision for conversational AI and how his background informs his approach to corporate leadership.
What are some of the long-term technology trends you’re watching in your industry right now?
The benefit of living life on a 20- to 30-year time horizon is that you’re forced to question what will make you relevant 15 years from now, and you sow the seeds of that foundation early.
If I think about the human race as a species, our lowest common denominator in terms of communication — irrespective of where we were born, rich or poor, the language we speak — is our voice. We express ourselves through our words and our tone of voice. It’s a very complex and exciting area of research, and it’s not one that’s going to be fully solved from an R&D standpoint in our lifetime. I could spend my entire life trying to get computers as close to understanding human beings as I can and still not get there. That has always fascinated me.
Lately I’ve also realized, especially during the pandemic, that we don’t just communicate with our voice and tone. It’s one of the reasons we choose to meet on Zoom calls instead of an audio-only connection: non-verbal communication. When you nod, when you raise your eyebrow, when you do the iPhone prayer, when you look away and I know you’re distracted — those are forms of communication, and I adjust what I’m saying accordingly. So if we really want to get good at making people more efficient in the workplace while they’re having conversations, we’re going to combine verbal communication, tonal communication and visual communication, just like the human brain cognitively does. AI will have to take all three into consideration in real time. It could tell you when you lost your audience, or the parts of your speech or conversation that they really liked.
So how are you executing against that long-term vision right now?
We have some of the top scientists and PhDs in the world working on solving the problem of computational linguistics, on getting the machines to understand our words in as many languages and dialect variations as we possibly can.
Natural language processing is also critically important for us. Once you’ve transcribed a meeting or a call, how do you get the machine to understand exactly what was going on? So we’re working on intent extraction, entity extraction and topic modeling to help AI make sense of a conversation.
“It’s not like somebody’s ever challenged me to build the biggest startup I can. But I have this internal hunger and desire to prove something to myself.”
The third area we’re working on is computer vision. Earlier this year we acquired a bright young Spanish startup called Emotion Research Lab that does facial emotion recognition. We’re in the process of combining their technology — which can recognize nods, facial expressions and other movements — with our verbal and tonal technology. It’s a massive exercise in inclusion, too, because we are making sure that our algorithm is trained on people from more than 20 countries, across gender, across age groups, across race. So when this technology is looking at video conversations of several speakers, it hopefully has zero bias in deciphering what it sees, what it’s analyzing and the recommendations it makes about people’s emotions and sentiments.
How would you say your approach to leadership affects Uniphore as a business?
If I know that what I’m doing has a good shot at impacting millions of lives and making the world a better place, that’s what drives me. I don’t fault somebody for being driven by just building great products, selling them off and going on to the next thing. If you don’t want to scale it, you hand it over to a larger corporation, sell your IP and move on. But I’m not cut from that cloth.
I was born and raised in India. I founded this company in India before moving to the U.S. and scaling it into a global company, now headquartered in Silicon Valley. The perspective I’ve gained from all these years of meeting immigrants from different parts of the world who come to the United States of America is that most immigrants have a strong desire to prove something to somebody. It’s an unexplainable, insatiable hunger. It’s not necessarily driven just by money. Money is usually one of the top three motivators, but never the number one.
I see that in myself, too. It’s not like somebody’s ever challenged me to build the biggest startup I can. But I have this internal hunger and desire to prove something to myself, that I can move to the United States of America, the mecca of technology and innovation, and make something of myself. I think that is unique to this country.