Should we make robots that feel pain? And is that even a good idea?
Hailing from Osaka University in Japan, Affetto is a robot designed to realistically mimic the facial expressions of a toddler. Why? Its three creators — Hisashi Ishihara, Yuichiro Yoshikawa, and Minoru Asada — are working on what they call “cognitive development robotics.”
From the original paper published in 2011:
“The research aims to better understand the development of human intelligence through the use of robotics. The Affetto robot is designed to mimic the facial expressions of a young child, in the one to two year age range, and will be used to study the early stages of human social development.”
When I last saw Affetto, it looked a bit different. For example:
Pretty haunting. In 2018, the robot received a new face, which added nuance to its expressions. And hair.
“The researchers have now found a system to make the second-generation Affetto more expressive. Their findings offer a path for androids to express greater ranges of emotion, and ultimately have deeper interaction with humans.”
That’s a little better, but I’ll probably still have some major nightmares later tonight.
Two years on, work on Affetto is still progressing, and the team at Osaka University is now experimenting with ways to make a robot “feel” pain. They’ve added soft, touch-sensitive sensors to Affetto; when the sensors are pressed, the robot responds with wincing facial expressions, as seen in the following video:
“Affetto shows wincing faces when the amplitude of the input signal added onto a tactile sensor exceeds a threshold.”
Sorry about that, Affetto.
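The mechanism the quote describes is, at its core, a simple trigger: when a tactile signal’s amplitude crosses a threshold, the robot produces a pained expression. Here’s a minimal sketch of that idea in Python. All names and values are hypothetical for illustration, not taken from the actual Affetto system.

```python
# Hypothetical sketch: map a tactile reading to a facial response.
# The threshold value and function names are invented for illustration.

PAIN_THRESHOLD = 0.7  # assumed normalized pressure level (0.0 to 1.0)

def facial_response(signal_amplitude: float) -> str:
    """Return a facial expression based on tactile signal amplitude."""
    if signal_amplitude > PAIN_THRESHOLD:
        return "wince"
    return "neutral"

# A light touch stays below the threshold; a hard press crosses it.
print(facial_response(0.2))  # neutral
print(facial_response(0.9))  # wince
```

The real system is of course far more involved, blending sensor readings with nuanced actuator control for the face, but the threshold logic quoted above is the basic gate.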
According to Professor Asada, making robots feel pain may one day allow them to better empathize with humans. How this “feeling” will ultimately be created or programmed into robots remains to be seen.
However, as ScienceNews points out, some researchers working on the idea believe such artificial “feelings” may arise naturally through soft robotics and deep learning, as a robot learns to protect itself.
That will be an interesting threshold to cross.