Non-Dystopian Emotional Robots
Is emotionally connected hardware the end of the world?
After some thought, I've decided: I'm probably not building your dystopian future.
After hearing about my work with emotional robots, several friends sent me articles like this one, which warns against handing the responsibilities of emotional maintenance over to non-humans. Some people may assume that this is exactly my intention with the emotionally connected objects I have been working on this year. Nothing could be further from the truth.
Robots that imitate and understand human emotion can feel unfamiliar, even bizarre. But they don't release humans from the "burden" of emotional care. People need to connect with emotions, both their own and those of other people, regardless of any robotic presence. Emotional technology does not communicate a new emotional experience; it evokes the memories of emotions and emotional attachments users have already formed. Less "This robot has created an entirely new emotional experience for me," more "This reminds me of previous emotional experiences."
Paro, the robotic seal recently featured in Aziz Ansari's "Master of None," is often presented as the poster child for robotic replacements for emotional care. I think this misreads Paro's interaction design. Paro is not an emotional replacement; it is a medium upon which existing emotions and memories can be imposed. The way people interact with it bears this out. In a WSJ article about Paro, one nursing home resident said, "I love this baby." Paro is not creating a new emotional experience; it is stirring up emotional memories of infants the resident has interacted with in the past. Paro is a doll, a toy. A baby doll or a stuffed puppy would stir up similar memories in that resident. Paro's movements and sounds just give it more stirring power.
But my point is, toys aren't dangerous, and neither is Paro. We impose emotions on our toys all the time. It doesn't mean we genuinely believe the toys love us back or experience emotions of their own.
When I was a medical student, I engaged in play therapy sessions with inpatient children at the psychiatric hospital. I noticed that kids could be emotionally attached to their toys, but that attachment didn't seem to depend on how convincingly the toy portrayed an emotion. Give two kids the exact same doll, and one will say the doll loves apples while the other will say it despises them. In play, the toy is an extension of the self. Saying the doll loves you doesn't necessarily mean you think it truly has feelings and loves you. But it might mean that somewhere in yourself, you think you are worthy of being loved.
The WSJ quotes one woman as saying, after holding Paro, "I know you're not real, but somehow, I don't know, I love you." I know you're not real, I know it's just me, but I care and am capable of care. We don't think our toys are real. But the emotions they create, and what those emotions say about us, most certainly are.
There are a lot of things to worry about with emotional robots. The social dynamics of care, for example, shaping the design of these objects and reinforcing harmful stereotypes. Subtle emotional manipulation by careless people. The collection and use of emotional data, and the intentions behind collecting it in the first place. User-protective design is going to be very important in the era of emotional technology. But I don't think we need to fear anthropomorphizing emotional things at a basic level, and I don't think we need to worry that emotional objects will replace emotional humans. Objects have always been emotional things, because we have created and used them, and we are emotional beings. Using technology to enhance what we already see in ourselves and reflect in our creations sounds neither scary nor innocuous. It sounds human.