Drones have pretty much revolutionized our lives over the past few years. Think about it: in 2021, depending on where we live, we can eat snacks delivered by drones while we watch movies shot by drones about disaster victims being saved by drones.
For all their talents, there’s still something not exactly, well, friendly about drones – you probably wouldn’t want one as a pet, for instance. But now, researchers at Ben-Gurion University of the Negev (BGU) have set out to change that perception. In research presented at the recent virtual ACM Conference on Human Factors in Computing Systems, they showed for the first time that humans can recognize emotions in drones and empathize with them.
“There is a lack of research on how drones are perceived and understood by humans,” explained Professor Jessica Cauchard of the BGU Department of Industrial Engineering & Management. “For the first time, we showed that people can recognize different emotions and discriminate between different emotion intensities.”
The research team conducted two studies, showing online participants static images and video footage, respectively, of drones “feeling” various emotions. The drones showed their emotional state the same way you or I do: through their facial expressions.
Although you might think it’s a simple thing to draw a face, a lot of research went into creating the right facial features for the drones. Perhaps surprisingly, the team realized that they needed to keep the faces simple and unrealistic for the experiment, both to reduce the cognitive effort required to interpret the “emotion” and to stop the participants from getting freaked out by how lifelike the drones looked. The drones were therefore given 2D cartoon faces with just four facial features: eyes, eyebrows, pupils, and a mouth.
With these core facial features, study participants were able to recognize with a high degree of accuracy which emotion the drone was conveying – and how intensely it was “feeling” it. The only emotion people weren’t great at picking up was disgust, with less than one-third of participants discerning it from facial cues. However, joy, sadness, fear, anger, and surprise were all recognized in both static images and dynamic video by up to 99 percent of the participants, with fear being the only emotion that fared worse in the video experiment.
What the researchers didn’t quite expect, though, was just how much the study participants would connect with the drones.
“Surprisingly, participants created narratives around the drone's emotional states,” explained Professor Cauchard. “[They] included themselves in these scenarios.”
In other words, the study participants didn’t just deduce that the drone was experiencing some emotion – they also came up with a reason why it was feeling that way. As one participant volunteered (and unwittingly named the resulting research paper), “the drone looks like it’s in love!”
And amazingly, not only did participants come up with possible causes for the drones’ feelings, but their perceptions of the drones’ emotions also colored what they actually saw. Despite the drones flying at a constant speed throughout all the videos, many participants reported that a “sad” drone flew slower, or that an “angry” one sped up. As one participant reported: “Its rotors even seemed to spin faster the madder it got.”
The researchers at BGU hope this discovery will inform the development of drones for social use and for everyday support and companionship. In particular, they believe their work has applications in health and behavior change, since it turns out that people don’t like sad drones and want to cheer them up.
So are we heading for a future full of faithful drone companions? Maybe, but it doesn’t sound too terrible. After all, as one participant put it: “just looking at its happy face made me feel happy for a moment and uplifted.”