Researchers have used a machine-learning algorithm to decipher the seemingly inscrutable facial expressions of laboratory mice. The work could have implications for pinpointing neurons in the human brain that encode particular expressions.
Their study “is an important first step” in understanding some of the mysterious aspects of emotions and how they manifest in the brain, says neuroscientist David Anderson at the California Institute of Technology in Pasadena.
Nearly 150 years ago, Charles Darwin proposed that facial expressions in animals might provide a window onto their emotions, as they do in humans. But researchers have only recently gained the tools — such as powerful microscopes, cameras and genetic techniques — to reliably capture and analyse facial movement, and investigate how emotions arise in the brain.
“I was fascinated by the fact that we humans have emotional states which we experience as feelings,” says neuroscientist Nadine Gogolla at the Max Planck Institute of Neurobiology in Martinsried, Germany, who led the three-year study. “I wanted to see if we could learn about how these states emerge in the brain from animal studies.” The work is published in Science1.
Gogolla took inspiration from a 2014 Cell paper2 that Anderson wrote with Ralph Adolphs, also at the California Institute of Technology. In the study, they theorized that ‘brain states’ such as emotions should exhibit particular characteristics — they should be persistent, for example, enduring for some time after the stimulus that evoked them has disappeared. And they should scale with the strength of the stimulus.
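The two hallmarks Anderson and Adolphs proposed — persistence after stimulus offset, and scaling with stimulus strength — can be illustrated with a toy model. The sketch below is purely hypothetical (the trace shape, time constant and threshold are assumptions, not the paper's analysis): a simulated "brain state" that rises during a stimulus, then decays slowly enough to outlast it, with amplitude proportional to stimulus strength.

```python
import numpy as np

def simulate_state(strength, stim_dur=20, total=100, tau=15.0):
    """Toy 'brain state' trace (hypothetical model): ramps up while the
    stimulus is on, then decays exponentially after it ends."""
    t = np.arange(total, dtype=float)
    return np.where(
        t < stim_dur,
        strength * (t + 1) / stim_dur,          # ramp during stimulus
        strength * np.exp(-(t - stim_dur) / tau)  # slow decay afterwards
    )

def is_persistent(trace, stim_dur, window=10, frac=0.25):
    """Persistence check: `window` steps after stimulus offset, the trace
    still holds at least `frac` of its peak value."""
    return trace[stim_dur + window] >= frac * trace.max()

weak = simulate_state(1.0)
strong = simulate_state(3.0)
print(is_persistent(weak, 20))          # the state outlasts the stimulus
print(strong.max() / weak.max())        # amplitude scales with strength
```

Under this toy model, both proposed properties hold: the trace is still well above baseline ten steps after the stimulus ends, and tripling the stimulus strength triples the peak response.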
Artificial Intelligence: Face value

Gogolla’s team fixed the heads of mice to keep them still, then provided different sensory stimuli intended to trigger particular emotions, and filmed the animals’ faces. For example, the researchers placed either sweet or bitter fluids on the creatures’ lips to evoke pleasure or disgust. They also gave mice small but painful electric shocks to the tail, or injected the animals with lithium chloride to induce malaise.
The scientists knew that a mouse can change its expression by moving its ears, cheeks, nose and the upper parts of its eyes, but they couldn’t reliably assign the expressions to particular emotions. So they broke down the videos of facial-muscle movements into ultra-short snapshots as the animals responded to the different stimuli.
Machine-learning algorithms recognized distinct expressions, created by the movement of particular groups of facial muscles (see ‘Emotional animals’). These expressions correlated with the evoked emotional states, such as pleasure, disgust or fear. For example, a mouse experiencing pleasure pulls its nose down towards its mouth, and pulls its ears and jaw forwards. By contrast, when it is in pain, it pulls back its ears and bulks out its cheeks, and sometimes squints. The facial expressions had the characteristics that Anderson and Adolphs had proposed — for example, they were persistent and their strength correlated with the intensity of the stimulus.
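A minimal way to picture this kind of analysis: represent each ultra-short video snapshot as a vector, then score its similarity to prototype frames for each expression. The sketch below uses raw pixel correlation against hypothetical prototypes; it is an illustration of the general idea, not the study's actual pipeline (the labels, frame sizes and similarity measure here are assumptions).

```python
import numpy as np

def expression_similarity(frame, prototypes):
    """Compare one flattened face frame with a prototype frame per
    expression; return the best-matching label and its Pearson
    correlation. `prototypes` maps label -> 2-D array of the same shape."""
    v = frame.ravel().astype(float)
    v = (v - v.mean()) / (v.std() + 1e-12)   # z-score the frame
    best_label, best_r = None, -np.inf
    for label, proto in prototypes.items():
        p = proto.ravel().astype(float)
        p = (p - p.mean()) / (p.std() + 1e-12)
        r = float(np.dot(v, p) / v.size)     # Pearson correlation
        if r > best_r:
            best_label, best_r = label, r
    return best_label, best_r

# Two made-up 8x8 prototype "expressions" for illustration
pleasure = np.zeros((8, 8)); pleasure[:4, :] = 1.0
pain = np.zeros((8, 8)); pain[:, :4] = 1.0
protos = {"pleasure": pleasure, "pain": pain}

rng = np.random.default_rng(0)
noisy_frame = pleasure + 0.1 * rng.standard_normal((8, 8))
label, r = expression_similarity(noisy_frame, protos)
print(label)  # a noisy pleasure frame still correlates best with "pleasure"
```

In practice one would use texture descriptors rather than raw pixels, but the principle — scoring each snapshot against expression prototypes and checking that the scores track the evoked emotional state — is the same.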
“This way of looking at facial expression is a big advantage because it avoids any biases of the experimenter,” says Camilla Bellone at the University of Geneva in Switzerland, who studies neuropsychiatric disorders.
The scientists then searched for brain cells that might encode these emotions. Using a technique called optogenetics, they targeted individual neuronal circuits in mice that had previously been shown to trigger particular emotions in humans and other animals. When the authors directly stimulated these circuits, the mice assumed the corresponding facial expressions.
Finally, the team used a technique called two-photon calcium imaging to identify individual neurons in the mouse brain that fired