Mobile World Congress always has more than its fair share of weird. Last week at MWC, the prize for bonkers went to a Korean company called Hyodol, which proudly showed off a disturbing-looking ChatGPT-enabled companion doll aimed at older adults. The $1,800 AI-enabled doll may well look like something you’d find in a haunted attic, but it’s actually meant to act as an interactive digital pal for people experiencing loneliness or living in long-term care facilities.
Thanks to the large language model stuffed inside, the Hyodol can supposedly hold conversations with its owner, as well as provide health reminders, such as when to take medication or eat a meal. It’s every bit as connected as you’d imagine, with a companion app and a web platform that let caretakers keep tabs on the device and its user from afar.
It’s meant as a balm for the epidemic of loneliness, which has affected everyone from older adults in nursing homes to college students. Elizabeth Necka, a program director at the US National Institute on Aging, says there’s something to this kind of tech, particularly in nursing homes that are already suffering from widespread staffing shortages.
“The idea that there might be a low-cost solution that can mitigate feelings of loneliness is very attractive,” Necka says. “Whether or not ChatGPT can actually achieve those feelings of connections, to me, it seems a little bit premature to say that.”
There is certainly an industry for these devices. The market for adorable social robots is especially active in countries such as Japan, where products like Lovot and Qoobo (“a tailed cushion that heals your heart”) have made cuddly companion bots in vogue. The devices have found some use in Western countries as well, though with far less cultural acceptance. Still, the current tendency for companies to put generative AI into everything means everywhere is probably due for a wave of these conversational Chuckies.
“I think the industry is still trying to understand the market,” says Lillian Hung, an assistant professor and Research Chair in Senior Care at the University of British Columbia School of Nursing. “It’s still in its infancy, but it has certainly taken off.”
Not that there haven’t been other attempts. Jibo, a social robot roommate that used AI and endearing gestures to bond with its owners, had its collective plug unceremoniously pulled just a few years after being put out into the world. Meanwhile, another US-grown offering, Moxie, an AI-powered robot aimed at helping with child development, is still active.
It’s hard not to look at devices like this and shudder at the possibilities. There’s something inherently disturbing about tech that plays at being human, and that uncanny deception can rub people the wrong way. After all, our science fiction is replete with AI beings, and many of their stories are tales of artificial intelligence gone horribly wrong. The easy, and admittedly lazy, comparison to something like the Hyodol is M3GAN, the 2023 film about an AI-enabled companion doll that goes full murderbot.
But aside from off-putting dolls, social robots come in many forms. They’re assistants, pets, retail workers, and often socially inept weirdos that just kind of hover awkwardly in public. Sometimes they’re also weapons, spies, and cops. It’s with good reason that people are suspicious of these automatons, whether they come in a fluffy package or not.
Wendy Moyle is a professor at the School of Nursing & Midwifery at Griffith University in Australia who works with patients experiencing dementia. She says her work with social robots has angered people, some of whom see giving robot dolls to older adults as infantilizing.
“When I first started using robots, I had a lot of negative feedback, even from staff,” Moyle says. “I would present at conferences and have people throw things at me because they felt that this was inhuman.”
However, the atmosphere around assistive robots has grown less hostile in recent years as positive use cases have accumulated. Robotic companions are bringing joy to people with dementia. During the Covid pandemic, caretakers used devices like Paro, a small robot made to look like a baby harp seal, to help ease loneliness in older adults. Hyodol’s smiling dolls, whether you see them as sickly or sweet, are meant to evoke a similarly friendly response.
Hyodol isn’t alone in its AI companion endeavors for older adults. ElliQ, an AI-enabled product made by the Israeli company Intuition Robotics, has been used in trial programs assisting older adults in New York. It’s less cuddly, though, coming in the form of a lamp-like bulb that can sit on a nightstand. What Hyodol is aiming to do is combine that functionality with the fuzzy, feel-good form factor of the big-eyed Paro seal. (Hyodol did not respond to multiple requests for comment.)
Even without AI in them, these pseudo-sentient companion dolls have garnered their share of concerns. Moyle, who has helped oversee studies of the devices in elder care, says that in some cases people who depend on them for care can become too attached to the dolls.
“One of the negative aspects we had to contend with was some residents loved their doll so much, they were babies to them,” Moyle says. “They were babies that they could carry, they could have with them. Some small majority loved them so much that it became too much a part of their life. We had to try and reduce the amount of time that they were using them.”
Putting language capabilities into a companion doll, especially a model as prone to hallucination and weirdness as ChatGPT, means it might take a high dive straight into the uncanny valley. Infusing these devices with AI invites all the same concerns skeptics have raised about sticking AI into everything else. Generative AI hallucinates, spouts false information, and is subject to all sorts of potential security issues; with a ChatGPT integration, user data also flows back to OpenAI. Like any device that monitors a person and shares that data, the dolls raise privacy and security concerns as well. And there are even more practical points of failure, like caretakers over-relying on a robot to remind patients to take their medication.
“There’s a lot of work that needs to be done to make sure that robot conversations are going to be safe,” Hung says. “That it’s not going to guide people to do anything that’s unethical. It’s not going to collect any information. The robot is not going to ask the older person, what’s your credit card number?”
These are the inherent risks that come when companies ask people to turn to their products in their most vulnerable moments. Moyle says she’s of two minds about the issue.
“If we give somebody an opportunity to talk to AI, does that remove all other social opportunities?” Moyle says. “Does that mean that families stop visiting? Does that mean that the staff stops talking to the person?” It’s a risk, but she says, in her experience, many older adults in care facilities are often left by themselves for the vast majority of their days and nights as it is. “Giving them something, if it makes them happy, is much better than giving them nothing at all.”
Of course, these devices aren’t the same as a human. Large language models don’t understand the person interacting with them; they’re just very good at predicting what will sound like a good response. And they certainly can’t fully grasp a person’s emotions or mental state.
“People can be displaying quite challenging emotions that are not being picked up by the AI,” Moyle says. “As AI becomes more sophisticated, that probably will get better, but at the moment it’s certainly not.” She pauses, then laughs and adds, “But a lot of humans can’t assess emotions very well either, so…”
For lots of people, it doesn’t really matter whether a robot can love them back. It’s why we still mourn our robots dying slow, somber deaths, and hold funerals for robot dogs. It’s why we want personalities in our sexbots and trust them with our deepest desires. When a human interacts with a robot, it’s less about whether the robot can love you back, and more about how people derive value from pouring their own feelings into someone (or something) else.
“What the cat and what the baby gave us is a sense that they need our love, and that’s what we are longing for as humans,” Hung says. If someone is looking to interact with a cute and cuddly robot, it’s often to fulfill that same function. “We buy these robots because we want to give our love to them—so we feel that the robot needs our love, so we feel that there’s something who needs us. That’s the nature of humans.”