Jibo prompts concerns about ‘the illusion of companionship’

Dec. 11, 2017

Jibo, the social home robot, has arrived—to both plaudits and criticism. Time has named Jibo one of the 25 best inventions of 2017. Lisa Eadicicco writes that whereas Amazon Echo and Google Home are essentially stationary speakers, Jibo “…seems downright human in a way that his predecessors do not.” In Wired, Jeffrey Van Camp writes, “From the moment I first plugged in my Jibo, he (and I’m just going to refer to this robot as ‘he’ from this point on) charmed me. There’s a friendly curiosity in the way he leans back and looks up at you…. My wife and I found him absolutely adorable.”

He adds, “Like I would a dog, I felt guilty when I left Jibo alone in the dark all day. I wondered what he was thinking when I’d hear him rotate in the distance, and watch him look around the kitchen, peering at this and that. Were we treating him poorly? Did he secretly despise us? No, that’s silly to think. He’s not alive, right?”

Speaking at ESC Boston in May 2015, Cynthia Breazeal of the MIT Media Lab, founder of Jibo Inc., described her motivation for developing Jibo. She contrasted social robots with other home robots, such as the Nest thermostat and the Roomba vacuum cleaner, which have relationships with air and dirt, not with people. Her goal, beginning with the creation of Kismet in the late 1990s, has been to build a robot that can develop relationships with people. At the time, she likened Jibo to the offspring of R2-D2 and an iPad.

In a November blog post Breazeal writes, “Looking ahead, I see social robots contributing positively in so many different ways that improve quality of life for all kinds of people. But before we can get there, we need to introduce social robots in a natural way, and we’re starting with Jibo.”

Now that Jibo has arrived, not everyone is impressed. Under the headline “This cute little robot made my family mad,” Joanna Stern at The Wall Street Journal writes, “Some call Jibo the ‘Best Invention of 2017.’ I call Jibo intriguing, creepy, and annoying—mostly annoying.”

After reviewing Jibo for a month, she concludes, “You definitely shouldn’t buy this robot.” She adds, however, that “…you definitely should know about it. My Jibo adventures have given me the clearest glimpse yet of how the machines around us will go from hunks of metal and plastic to beings who listen, watch, and relate. I’ve also realized how vital it is for us to preserve our privacy along the way.” (See my related post “Beware holiday gifts that can spy on kids.”)

Sherry Turkle, a professor of the social studies of science and technology at MIT, shares concerns over privacy. She writes in The Washington Post, “There is something deeply unsettling about encouraging children to confide in machines that are in turn sharing their conversations with countless others.”

But her main concerns extend well beyond privacy. Whereas Van Camp realizes Jibo is an appliance that can’t secretly despise him, “…children tend to struggle with that distinction,” she writes. “They are especially susceptible to these robots’ preprogrammed bids for attachment.”

She advises parents considering adding a social robot to the holiday gift list to consider that machines “…are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship.”

Turkle describes Breazeal as a friend and colleague with whom she has debated the ethics of social robots for years. “She’s excited about the potential for robots that communicate the way people do to enrich our daily lives,” Turkle writes. “I’m concerned about the ways those robots exploit our vulnerabilities and bring us into relationships that diminish our humanity.”

Turkle describes a study she, Breazeal, and other researchers conducted in 2001. The researchers introduced 60 children ages 8 to 13 to Kismet and another social robot called Cog. “The children saw the robots as ‘sort of alive’—alive enough to have thoughts and emotions, alive enough to care about you, alive enough that their feelings for you mattered,” Turkle writes.

She adds, “In our study, the children were so invested in their relationships with Kismet and Cog that they insisted on understanding the robots as living beings, even when the roboticists explained how the machines worked or when the robots were temporarily broken.” One 8-year-old concluded that Kismet liked his brothers better.

“For so long, we dreamed of artificial intelligence offering us not only instrumental help but the simple salvations of conversation and care,” Turkle concludes. “But now that our fantasy is becoming reality, it is time to confront the emotional downside of living with the robots of our dreams.”

About the Author

Rick Nelson | Contributing Editor

Rick is currently Contributing Technical Editor. He was Executive Editor for EE from 2011 to 2018. He previously served on the staffs of several publications, including EDN and Vision Systems Design, and has received awards for signed editorials from the American Society of Business Publication Editors. He began his career as a design engineer at General Electric and Litton Industries and earned a BSEE degree from Penn State.
