Science & Technology

Fuzzy Robots: Karon MacLean showcases emotionally-aware machinery

On Nov. 13, Karon MacLean, professor in the University of British Columbia’s Department of Computer Science, gave a talk covering her lab’s research on emotionally-interactive robots. MacLean’s work, which began with constructing robots at MIT in the 1980s, has always gone against the grain of traditional robotic engineering. 

“The hierarchy was very much that the more complicated your technology was, […] that was more respected,” MacLean explained in an interview with The Tribune. “And I was very low on that totem pole, because I actually thought that the interesting part was what’s in your head.”

This interest in the cognitive side of robotics—the reactions and models that users engage with—pushed MacLean to develop simple but expressive devices. 

“I just had this idea that if I can make people think things that aren’t there, using very minimal cues, isn’t that more powerful, because you don’t need all this complicated technology?” MacLean said. 

Ultimately, MacLean found her niche with hand-held, fur-covered robots that aimed to sense their users’ emotions and respond by displaying emotions of their own. Her attempts were guided by her observations of therapy animals, aiming to simulate some aspects of their behaviour. 

“I do not think we could ever replace [a therapy animal]. That’s not really my goal,” MacLean said in her lecture. “But could we understand it? And can we help a little bit in some of these situations?”

Taking this into account, MacLean thinks of her lab’s robots as part of a feedback loop. First, the user, perhaps someone experiencing psychological distress, picks up the robot and touches it as though it were a small animal like a cat or a rabbit. The robot then uses machine learning to interpret the touch data, forming an educated guess about the user’s emotional state, and responds by displaying an emotion of its own, for example changing its breathing rate or moving its robotic ears up and down. The user notices this display, and the interaction has the potential, MacLean’s research suggests, to actually change how the user is feeling, ideally leaving them calmer and more relaxed. The robot then picks up on this new emotion, and the cycle continues. 
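The loop described above can be sketched in a few lines of code. This is a purely illustrative mock-up, not code from MacLean’s lab: the function names, touch features, and thresholds are all hypothetical, and a simple rule stands in for the machine-learning model that would interpret real sensor data.

```python
# Hypothetical sketch of the sense -> interpret -> respond feedback loop.
# All names, features, and thresholds are illustrative assumptions.

def classify_emotion(touch_pressure, stroke_rate):
    """Stand-in for the ML model: guess the user's state from touch features
    (both normalized to the range 0..1)."""
    if touch_pressure > 0.7 and stroke_rate > 0.7:
        return "agitated"
    if stroke_rate < 0.3:
        return "calm"
    return "neutral"

def choose_response(user_state):
    """Pick a display behaviour for the robot, e.g. how fast it 'breathes'
    and what its ears do. An agitated user gets a slow, soothing response."""
    return {
        "agitated": {"breath_rate": 0.3, "ears": "lowered"},
        "neutral":  {"breath_rate": 0.5, "ears": "relaxed"},
        "calm":     {"breath_rate": 0.6, "ears": "perked"},
    }[user_state]

def feedback_step(touch_pressure, stroke_rate):
    """One pass around the loop: sense -> interpret -> respond."""
    state = classify_emotion(touch_pressure, stroke_rate)
    return state, choose_response(state)

state, behaviour = feedback_step(0.9, 0.8)
print(state, behaviour)  # hard, fast touch -> guessed "agitated" -> slow breathing
```

In a real system this step would run continuously: each new reading of the touch sensors reflects the user’s reaction to the robot’s last display, closing the loop.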

Her lab’s first attempt to implement this involved designing robots to help calm children about to go under anesthesia. This proved to be a trickier task than expected. 

“One of the first criteria they came up with is, ‘whatever you do, the kids will throw up on it,’” MacLean said. 

In addition to this, the robot needed to be able to survive being thrown and stepped on, while still having state-of-the-art sensors and internal software intact.

While these initial projects, constructed in the early 2000s, had to be quite large to meet all of these constraints, modern technology and innovative design now allow MacLean’s lab to construct small, lightweight models that are even more effective. 

“They’re very customizable,” MacLean said. “And we can make all these different sizes, and shapes and oblong things.” 

In fact, the modern incarnations of the robots, nicknamed CuddleBits, are simple enough to be made in workshops with high school students. 

All of these models centre on MacLean’s primary research interest: The role of haptics—technology that engages with the sense of touch—in human-machine interaction. Common examples of this are the way a smartphone vibrates in response to button presses, and how some video game controllers vibrate to communicate game events. 

Within this field, MacLean is particularly interested in haptics’ potential to incorporate emotion—both the user’s real emotions, and the robot’s simulated ones. 

“It became increasingly clear to me that what’s going on through your sense of touch is so dominated in peoples’ minds by emotion and affect […] that if you don’t take this into account when you’re doing interaction design based on touch, you’re just gonna miss most of the opportunities,” MacLean explained. “At some point, we just really jumped in with both feet, and started working on emotional robots.”
