Scientists Are Teaching Computers To Read Mice's Facial Expressions

Mice have different facial expressions in response to a block of cheese, something they can't identify, and a prick to their feet, and computers have been taught to recognize these. MPI of Neurobiology/Kuhl

Animal research is hindered by the fact that animals can't tell us how a treatment makes them feel. We can read their blood pressure, measure their life span, or take their temperature, but it's a lot harder to tell if an antidepressant is improving their mood or a painkiller is easing their distress. A team of scientists at the Max Planck Institute of Neurobiology are now working to resolve that by teaching computers to read animals' expressions, starting with mice.

Many animal researchers have inferred animal emotions based on signs of distress or calm in their subjects. These tend to be subjective and potentially influenced by experimental bias, however – particularly if the researcher knows which animal is on the drug and which is on the placebo. To make such assessments more reliable and scalable, Dr Nadine Gogolla turned to computers.

Across human cultures, certain facial expressions have a constant meaning. We smile with pleasure, widen our eyes in fear, and engage in a complex but easily recognizable set of movements when something disgusts us. Most people think they can spot the same patterns in their pets, but Gogolla set out to obtain something more objective.

Gogolla started by filming the responses of mice in situations where we can be confident about how they feel. “Mice that licked a sugar solution when they were thirsty showed a much more joyful facial expression than satiated mice,” Gogolla said in a statement.

Similarly, Gogolla was able to identify what mouse disgust looks like by giving her charges water so salty that one taste was enough to stop them trying again. Proving that the expressions reflect inner mood rather than a mere physical response, foods that initially induced expressions of pleasure evoked disgusted faces after the mice learned to associate them with something unpleasant.

Gogolla then trained an artificial intelligence system by showing it frames of the mice before and after certain stimuli and tested its capacity to predict future reactions, reaching greater than 90 percent success. Pleasure, disgust, nausea, pain, and fear could all be reliably recognized.
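The article doesn't spell out the team's actual pipeline, but frame-based classification like this is commonly done by extracting image features and training a supervised classifier. The sketch below, using HOG features and a random forest on synthetic stand-in "frames", is purely illustrative: the image size, the patch that mimics a facial-muscle movement, and the model choice are all assumptions, not details from the study.

```python
# Illustrative sketch: classifying "expressions" from image frames.
# Synthetic data only; not the study's actual method or dataset.
import numpy as np
from skimage.feature import hog
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_frame(label):
    """Synthetic 32x32 'face' frame; the label shifts a bright patch,
    standing in for a distinct facial-muscle configuration."""
    img = rng.normal(0.5, 0.1, (32, 32))
    row = 8 if label == 0 else 20
    img[row:row + 6, 12:18] += 0.5
    return np.clip(img, 0.0, 1.0)

# Build a labeled dataset of frames and extract HOG features.
labels = rng.integers(0, 2, 200)
features = np.array([
    hog(make_frame(y), pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    for y in labels
])

# Train on one split, then test prediction on held-out frames,
# mirroring the train-then-predict evaluation described above.
X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

Because the synthetic classes are cleanly separable, the toy classifier scores far above chance; on real video frames, the hard part is the feature extraction and the labeling of ground-truth emotional states.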

In Science, Gogolla and co-authors report they were even able to identify single neurons whose activation correlated with specific facial expressions, a first step toward understanding the neural basis of how we show our feelings. When these neurons were activated with light, the mice produced the associated facial expressions. This suggests mice, and probably humans as well, have “emotion neurons” tied to a particular expression and the emotions that cause it.

Gogolla sees this as a stepping stone to investigating the neural underpinnings of human emotions and why they sometimes go wrong, leading to outcomes like anxiety disorders. Meanwhile, with computers newly trained to read rodent reactions, animal experimenters may be able to expand the ways they can objectively measure animals' responses to whatever is being tried.
