We often take for granted the fact that our friends and family recognize us when we walk into a room. Or that they can read our facial expressions and know if we’re frustrated, worried, happy or excited. Or that they smile and roll their eyes if we say something outrageous or exaggerated.

But now, thanks to advances in artificial intelligence and related technologies like computer vision, which helps computers understand images and videos, robots are able to do many of these same things, albeit without the empathy and true emotional understanding that would make them, you know, sentient.

Today, despite this lack of consciousness, these machines, commonly referred to as social robots, provide companionship and emotional and learning support to children and adults, who play, talk and even snuggle with them like they would a pet. Social robots also work in warehouses and factories transporting goods, and serve as research and development platforms, allowing researchers to study how humans interact with robots and make further advances in the field.

What Is a Social Robot?

A social robot is a robot capable of interacting with humans and other robots. Social robots are developed using artificial intelligence and are often equipped with sensors, cameras, microphones and other technology so they can respond to touch, sounds and visual cues much like humans would. 

 

Social robots come in many different shapes and sizes, from human-like faces on static pedestals to furry, tail-wagging puppies. Their aesthetic design is an important element to foster human engagement and interaction, but like humans, what often matters most is what’s on the inside.

What most people think of when they think of a social robot is a robot that is capable of communicating and understanding intent and mood, much like humans are able to do, according to Alexander Kernbaum, interim director of the Robotics Laboratory at SRI International, the research institute that developed Siri (later acquired by Apple) and the first telerobotic surgery system, da Vinci.

But there are many ways robots can do that, whether by deciphering facial expressions and countering with a smile, or tracking humans with their eyes to show they’re paying attention. Recently, in an experiment to improve pedestrian safety around autonomous vehicles, researchers in Japan slapped a large pair of googly eyes on the front of a self-driving golf cart to see how pedestrians would respond.

“If you’re crossing in front of the car, the car’s eyes will follow you and then you know that the car sees you,” Kernbaum told Built In. “It’s a kind of communication.”

Whether people will accept a car with googly eyes plastered on the front is another question, Kernbaum said, but those googly eyes do increase engagement with humans, adding a layer of interactivity humans normally enjoy with one another.
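The gaze behavior Kernbaum describes can be reduced to simple geometry. Here is a minimal, hypothetical Python sketch (not code from any system mentioned in this article) that converts a pedestrian’s position into a pupil offset for a 2D animated eye, so the eye appears to follow the person:

```python
import math

def gaze_offset(eye_pos, target_pos, max_offset=1.0):
    """Return a (dx, dy) pupil offset that points an animated eye
    toward a target in the same 2D plane.

    eye_pos, target_pos: (x, y) positions, e.g. in metres.
    max_offset: how far the pupil may travel from the eye's centre,
    in display units.
    """
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)  # target is at the eye itself; look straight ahead
    # Normalise the direction vector and scale it to the pupil's range.
    return (max_offset * dx / dist, max_offset * dy / dist)

# A pedestrian 3 m to the right of and 4 m ahead of the cart's "eye":
print(gaze_offset((0, 0), (3, 4)))  # prints (0.6, 0.8)
```

A real system would feed detections from a perception stack into a function like this every frame; the names and coordinate conventions here are invented for illustration.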


How Are Social Robots Used?

Today, social robots are most often found working as companions and support tools for childhood development, specifically autism therapy and social-emotional learning. Social robot pets can even be an effective form of therapy for people with dementia.

Social robots also work as concierges in hotels and other settings like malls, where they provide customer service.

And depending on how loose one’s definition of a robot is, social robots have become even more personal. The smartphones living in our pockets use built-in conversational AI tools like Siri to help us avoid traffic jams, compose texts and add events and meetings to our calendars.



Social Robot Examples

Whether in the workplace, on our streets or working with loved ones, social robots continue to evolve. Here are a few examples of social robots being used today.

 

Furhat by Furhat Robotics

Furhat is a customizable social robot used for prototyping and application development. Researchers and developers can modify Furhat’s code to test verbal and non-verbal response modes. It can also be outfitted with different masks to represent a variety of human likenesses across age, gender and race; Furhat can even depict dogs and anime characters. More than 200 voices are also available.

According to Furhat Robotics’ website, their social robot is also capable of communicating through facial expressions, head movements and eye gaze, and can even raise its eyebrows for added emphasis during conversation. Equipped with computer vision for face tracking, Furhat can analyze facial expressions and interact with as many as 10 people at the same time.
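When several people are in view at once, a robot like this has to decide whom to attend to in each frame. A toy illustration in Python (hypothetical, not Furhat’s actual logic), using face bounding-box area as a rough proxy for who is closest:

```python
def pick_attention_target(faces):
    """Given face bounding boxes as (x, y, w, h) tuples from a
    vision pipeline, return the index of the face to attend to,
    using box area as a rough proxy for proximity.

    Returns None if no faces are detected.
    """
    if not faces:
        return None
    areas = [w * h for (_, _, w, h) in faces]
    return max(range(len(faces)), key=areas.__getitem__)

faces = [(40, 60, 50, 50),     # small box: person far away
         (200, 80, 120, 120),  # large box: person nearby
         (300, 70, 80, 80)]
print(pick_attention_target(faces))  # prints 1: the nearest (largest) face
```

Production systems weigh far more signals, such as who is speaking or making eye contact, but the frame-by-frame “pick one target” structure is the same.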

 

Jennie by Tombot

Jennie is a robotic dog that’s used to provide emotional support to people living with health conditions like dementia, helping to minimize the effects of depression, anxiety and loneliness. Equipped with touch sensors and voice command software, Jennie is a social robot that’s modeled after a golden Labrador retriever puppy. It can wag its tail and make real puppy sounds.

 

Misty by Misty Robotics

Misty is a programmable mobile social robot from Misty Robotics, a social robot company recently acquired by Furhat. Misty has been used to perform temperature screenings in workplaces and healthcare settings, and is now being marketed to researchers as a tool for working with people with autism and Alzheimer’s, according to TechCrunch. Using its eyes and sounds, Misty is capable of expressing a range of emotions, from joy to sadness, and its head, neck and arms move to communicate curiosity or excitement. According to Misty Robotics’ website, the robot responds to touch, can recognize and remember people, and uses artificial intelligence to detect 80 different object classes.

 

Moxie by Embodied

Named one of Time’s best inventions of 2020, Moxie, a social robot developed by robotics and artificial intelligence company Embodied, supports social-emotional learning in children by engaging them in play-based activities. Moxie is designed for children roughly ages 5 to 10 and is able to respond to conversation, eye contact and facial expressions. It’s also capable of remembering people, places and things. In a six-week study of 12 children who interacted with Moxie for 15 minutes three times a week, many of the children experienced a 55 percent increase in self-esteem and a 43 percent increase in emotional regulation, while improving their conversation and relationship skills.

 

Orbit

Orbit is a small social robot developed by Ben Powell, a graduate of Loughborough University in England, to help children with autism develop social skills through storytelling, physical touch and engagement, as well as visual communication like facial expressions. According to a Loughborough University press release, Powell, who has been diagnosed with mild, high-functioning autism, created Orbit so children could “see emotions in context and develop social skills independently.”

“By giving Orbit a personality, children can build a connection with the robot and then empathize with it,” Powell said in the release. “This will teach the user social appropriateness and help them recognize how their actions may make the robot feel — i.e., if they press Orbit too hard or hit it, the robot will look sad or scared.”


QTrobot by LuxAI

QTrobot is another social robot used by educators and families looking to support children with autism, at school and at home. According to LuxAI’s website, QTrobot is able to boost attention, engagement and concentration while reducing anxiety and overstimulation in a learning environment. LuxAI has also developed a research and development platform, QTrobot V2, equipped with 3D cameras, high-quality microphones, text-to-speech, skeleton tracking, and emotion and gesture recognition, which institutions and researchers can purchase to study social robotics and human-robot interaction.


The Future of Social Robots

It’s not just how social robots evolve, but also how humans evolve alongside them, that will likely determine their future. One of the key issues will be whether humans will be capable of understanding a social robot’s intent, given that we’re not always capable of understanding one another as it is.

Design, like the tracking eyes on self-driving cars, will be one key to improving human-robot interaction: common, human-like indicators we’re already familiar with will not only increase safety, but also help manage human expectations of robots.

“Once we know those social cues, then we just listen, right?” Kernbaum said. 

But managing expectations can be a complex design problem, according to Brad Porter, founder and CEO of Collaborative Robotics, a California-based company developing a collaborative robot, or cobot, for industrial settings, where Porter believes human-robot interactivity and social awareness play an important role.

“I think if you’re trying to do something in a workplace environment, even if the robot is socially interactive and collaborative, you can do a lot in signaling with the design — what it’s capable of and what it’s not,” Porter said.

A social robot geared toward children, or used in other social support settings, has to behave and interact differently than one in a workplace, where expectations are much different.

“If you make your robot look like a human-like cartoon character, then you kind of expect that it has child-like behaviors and interaction ability,” Porter said. “That expectation could be quite hard to achieve, because children are unique and funny and creative, and trying to endow all that into a robot can be very hard.”

Besides design, artificial intelligence will also be critical to the future of social robots.

When it comes to AI, Kernbaum believes it’s important to consider what artificial intelligence — and the robotic platforms we ultimately house it in — can do better than humans given their tremendous access to data. “I’m not worried about them taking our jobs,” Kernbaum said. “But there are going to be things that they’re better at and maybe we want to focus on those.”

Today, AI-powered social robots are showing promise when it comes to working as personalized learning and support tools for children and older adults. But like in other areas of robotics, it’s not that social robots are always able to perform a specific task better than a human counterpart, it’s that they are able to do so consistently — something we’re not able to do, given the everyday struggles of being a human.
