Facial Expressions in Mice Reveal Latent Cognitive Variables and Their Neural Correlates
Revealing Latent Cognitive Variables Through Facial Expressions: An In-depth Analysis of Recent Research in Nature Neuroscience
Background: The Silent Dialogue Between Brain and Behavior
In animal and human behavioral science, traditional theories have focused mainly on “intentional,” adaptive actions: movements produced to accomplish tasks or pursue goals. However, scientists have long noticed that brain activity can also “leak” unintentionally into the body, producing involuntary, non-adaptive movements and expressions. These are termed “incidental movements” or “incidental facial expressions.” Popular psychology has long linked them to emotions and internal states, but systematic evidence from neuroscience has been lacking.
In recent years, researchers have discovered that bodily movements, especially facial expressions, are not merely markers of emotion but can reflect complex cognitive variables in animals, such as memory, decision-making, and internal states. For example, the posture of rodents can reflect the contents of working memory (references 3,4); reflex gain in human arms during decision-making can track the accumulation of sensory evidence (reference 5); and changes in pupil diameter in animals, beyond being influenced by luminance, can reflect states of arousal and uncertainty (references 10–12). These findings broaden the boundaries of traditional cognitive science and behavioral science, raising a fundamental question: can bodily manifestations serve as noninvasive indicators to “read out” the ongoing hidden computational processes of the brain? And are these “leaked” variables merely related to task performance, or can they transcend actual behavioral purposes to reveal deeper cognitive dynamics?
However, the field faces a central challenge: do bodily movements truly reflect cognitive variables, or are they simply associated with the physical requirements of task execution? Some incidental movements, for instance, may be present only because they accompany the reporting action (such as pressing a button or turning the head) rather than reflecting internal cognitive computation. The key test, therefore, is to find bodily expressions of “abstract” cognitive variables that are independent of how the task is reported and executed.
Paper Source and Author Team
This original research was published in Nature Neuroscience (volume 28, November 2025, pages 2310–2318) under the title “Facial expressions in mice reveal latent cognitive variables and their neural correlates.” The author team includes Fanny Cazettes, Davide Reato, Elisabete Augusto, Raphael Steinfeld, Alfonso Renart, and Zachary F. Mainen, affiliated with the Champalimaud Foundation (Lisbon, Portugal), the Institut de Neurosciences de la Timone (Marseille, France), and Departement BEL, Centre CMP, Mines Saint-Etienne (Gardanne, France). The team spans neuroscience, cognitive science, animal behavior, and related fields.
Research Workflow and Experimental Details
This study used mice as experimental subjects and adopted a highly innovative multimodal research design. The main workflow includes:
1. Behavioral Task Design — Probabilistic Foraging Decision Task
Experimental mice were head-fixed on a linear treadmill and could freely switch between two artificial foraging sites. The reward status of each site was controlled by a hidden state (rewarding or depleted). Mice licked a spout to receive a sucrose-water reward: in the rewarding state, each lick had a 90% chance of delivering 1 μL of water. Each lick also had a 30% chance of “depleting” the current site, after which the mouse had to switch to the other site for further reward. The mouse’s core decision was to judge when to leave the current site, which calls for cognitive strategies such as evidence accumulation and reward integration.
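The hidden-state dynamics of this task can be illustrated with a minimal simulation. The 90% reward and 30% depletion probabilities come from the text above; the three-consecutive-failure leave rule, the lick cap, and the function name are hypothetical choices for illustration only.

```python
import random

def simulate_site_visit(p_reward=0.9, p_deplete=0.3, max_licks=50, rng=None):
    """Toy simulation of one visit to a foraging site.

    While the site is in the hidden 'rewarding' state, each lick yields
    reward with probability p_reward and can silently flip the site to
    'depleted' with probability p_deplete. The agent leaves after 3
    consecutive unrewarded licks (a hypothetical inference-style rule).
    Returns (rewards_collected, consecutive_failures_at_leaving).
    """
    rng = rng or random.Random(0)
    rewarding = True
    rewards, consecutive_failures = 0, 0
    for _ in range(max_licks):
        if rewarding and rng.random() < p_reward:
            rewards += 1
            consecutive_failures = 0
        else:
            consecutive_failures += 1
        if rewarding and rng.random() < p_deplete:
            rewarding = False          # site silently depletes
        if consecutive_failures >= 3:  # toy leave rule, not from the paper
            break
    return rewards, consecutive_failures
```

Because depletion is hidden, the mouse can only infer it from a run of unrewarded licks, which is exactly why decision variables like “consecutive failures” matter.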
Behavioral Strategy Differentiation and Modeling — LM-HMM Algorithm
Researchers employed a Hidden Markov Model combined with Linear Regression (LM-HMM) to model the mice’s decision strategies and dissect the “decision variables” used at each moment:
- Inference-based strategy: fully relies on the accumulation of consecutive failures, resetting upon reward.
- Stimulus-bound strategy: depends on accumulated reward (a negative value signal); the more reward received, the longer the mouse stays, so the decision variable tracks reward accumulation.
- Impulsive inference strategy: a transient, impulsive mode determined by high “bias.”
Using nested cross-validation and maximum likelihood estimation, the model robustly identified switches among the three strategies across different time periods and captured the decision variables expressed under each strategy.
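The two main decision variables can be sketched as simple update rules over the lick-outcome sequence. This is a toy illustration rather than the paper's fitted model: the leak parameter `gamma` and the exact form of the value update are assumptions.

```python
def decision_variables(outcomes, gamma=0.8):
    """Compute two candidate decision variables from lick outcomes
    (1 = rewarded, 0 = unrewarded).

    - consecutive failures: failures since the last reward (drives the
      inference-based strategy; resets to 0 on each reward).
    - negative value: negative of a leaky (discounted) sum of rewards
      (drives the stimulus-bound strategy). gamma is a hypothetical
      leak parameter, not a value from the paper.
    """
    cf, value = 0, 0.0
    cf_trace, neg_value_trace = [], []
    for o in outcomes:
        cf = 0 if o == 1 else cf + 1   # reset on reward, else count up
        value = gamma * value + o      # leaky reward accumulation
        cf_trace.append(cf)
        neg_value_trace.append(-value)
    return cf_trace, neg_value_trace
```

Note how the two traces diverge: consecutive failures jumps back to zero on any reward, while the leaky value decays gradually, which is what lets the LM-HMM tell the strategies apart.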
2. Synchronous Recording of Neural and Behavioral Data
Simultaneous recording includes:
- Behavioral Data: High frame-rate (60 fps) cameras captured facial videos during licking.
- Neural Data: Multi-electrode Neuropixels arrays (374 recording points) collected multidimensional neuronal ensemble activity in regions such as the secondary motor cortex (M2).
- Video Data Processing: Open-source tool Facemap extracted “motion energy” features from the videos, and Singular Value Decomposition (SVD) produced 100-dimensional principal components (PCs) of facial movement.
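The video-processing step can be sketched in plain NumPy: motion energy as the absolute frame-to-frame difference, followed by SVD to obtain movement PCs. This mimics what Facemap does at a high level only; the real pipeline's preprocessing (ROI selection, binning, normalization) differs, and the function name here is hypothetical.

```python
import numpy as np

def facial_motion_pcs(frames, n_pcs=100):
    """Sketch of Facemap-style processing.

    frames: (T, H, W) array of grayscale video frames.
    Motion energy is |frame[t+1] - frame[t]|; SVD of the resulting
    (T-1, pixels) matrix yields low-dimensional movement PCs.
    Returns a (T-1, n_pcs) matrix of PC projections.
    """
    motion = np.abs(np.diff(frames.astype(float), axis=0))  # (T-1, H, W)
    M = motion.reshape(motion.shape[0], -1)                 # flatten pixels
    M -= M.mean(axis=0)                                     # center over time
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    n = min(n_pcs, S.size)
    return U[:, :n] * S[:n]                                 # PC projections
```

The rows of `Vt` (not returned here) are spatial masks over pixels; these are what the study reweights and superimposes to visualize where on the face each variable is expressed.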
3. Multimodal Decoding of Facial Expressions and Hidden Cognitive Variables
- Multivariate Regression (GLM): Used movement PCs as predictors for hidden cognitive variables (consecutive failures/negative value) and adopted regularized linear models with strict cross-validation.
- Controlling Potential Confounding Factors: Orthogonalized decision variables against action outcomes and lick rate to eliminate correlations and isolate each variable’s “unique” contribution.
- Spatial Representation Analysis: Superimposed weighted facial masks to reconstruct the spatial expression pattern of hidden variables on the mouse’s face and compared expression consistency across individuals and task conditions.
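The decoding-with-orthogonalization logic above can be sketched as follows. A simple closed-form ridge regression stands in for the paper's cross-validated regularized models; `alpha` and both function names are illustrative assumptions.

```python
import numpy as np

def orthogonalize(y, confounds):
    """Remove the component of y linearly explained by the confound
    regressors (e.g., lick rate), leaving y's 'unique' part."""
    X = np.column_stack([np.ones(len(y)), confounds])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

def ridge_decode(pcs, target, alpha=1.0):
    """Ridge regression of a cognitive variable on facial movement PCs;
    returns the in-sample predictions. alpha is a hypothetical
    regularization strength (the paper cross-validates this)."""
    X = pcs - pcs.mean(axis=0)         # center predictors
    y = target - target.mean()         # center target
    w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
    return X @ w + target.mean()
```

In the study's logic, decoding the *orthogonalized* variable is the stringent test: whatever accuracy remains cannot be explained by licking or trial outcomes.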
4. Neural Source Analysis and Causal Testing
- Latency Analysis: Applied sliding-window lagged GLM regression between neuronal firing and facial movement, comparing decoding accuracy and latency across regions including M2, orbitofrontal cortex (OFC), and olfactory cortex (OC) to probe which signals lead.
- Optogenetic Intervention: Used Vgat-ChR2 mice, activating GABAergic neurons with blue light to inhibit M2. On a random 30% of trials, M2 was inactivated while facial movements were monitored in real time, and the effect on the accuracy and latency of decoding hidden cognitive variables from facial expressions was analyzed.
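The latency comparison can be illustrated with a toy lagged-correlation analysis standing in for the paper's sliding-window lagged GLM; the signals, lag range, and scoring used here are all hypothetical.

```python
import numpy as np

def best_lag(neural, facial, max_lag=10):
    """Estimate the lag (in frames) at which a neural signal best
    predicts a facial signal, by scanning lagged correlations.
    A positive result means the neural signal leads the facial one.
    Toy stand-in for a sliding-window lagged GLM analysis."""
    n = len(neural)
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = neural[:n - lag], facial[lag:]      # neural leads by lag
        else:
            a, b = neural[-lag:], facial[:n + lag]     # facial leads
        scores[lag] = abs(np.corrcoef(a, b)[0, 1])
    return max(scores, key=scores.get)
```

A roughly 50 ms lead of M2 over the face, as reported, would appear here as a positive best lag of a few frames at 60 fps.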
Key Experimental Results and Logical Relationships
(1) Recognition of Behavioral Strategies and Hidden Cognitive Variables
The LM-HMM distinctly identified three strategies, each relying on different decision variables. Multivariate logistic regression revealed that the consecutive failures variable better explained inference-based strategies, while the negative value variable was closely related to the stimulus-bound strategy.
(2) Facial Expressions Encode a Variety of Cognitive Variables
- Facial movement principal components (PCs) can accurately decode not only the cognitive variables driving current behavior, but also “latent” cognitive variables being computed by the brain yet not expressed behaviorally.
- The motion energy of different facial regions showed distinct spatial patterns for different variables: for example, the negative value variable was reflected mainly in nasal movements, whereas consecutive failures produced a subtler facial pattern.
- These expressions were highly consistent across individuals and very stable across repeated experiments in the same mouse.
(3) Facial Expressions Serve as a “Reservoir” of Cognitive Variables
The study found that even when a mouse was not currently employing a specific cognitive strategy, its corresponding cognitive variable was still clearly represented in its facial expression, and decoding accuracy was independent of strategy. This suggests that facial expressions can “leak” multiple cognitive variables being simultaneously computed in the brain, forming a “cognitive reservoir.”
(4) Neural-Facial Latency and Causality
- Cognitive variables could be decoded with similar accuracy from facial expressions and from M2 ensemble neural activity. Their expression in the face lagged slightly behind M2 activity (by about 50 ms), while associations with OFC and OC were weaker and more delayed.
- Optogenetic inactivation of M2 altered facial expressions: the accuracy of decoding hidden cognitive variables from the face dropped significantly and latency increased, and the magnitude of these changes correlated strongly with the change in facial motion energy.
Research Conclusions and Scientific Significance
This research comprehensively demonstrates that animal facial expressions are not only reflections of emotion but also a real-time window into the brain’s computation of complex cognitive variables. These micro facial movements can accurately and stably represent multiple decision variables being computed by the brain in parallel—even when these variables are not expressed through behavior. Furthermore, the expression of these cognitive variables in facial expressions partially originates in M2 neural activity, making facial expression an important interface between neural, cognitive, and behavioral levels.
From a scientific perspective, this research reveals the “multi-channel leakage” of cognitive variables, challenging previous views that behavioral output merely reflects immediate decisions, and provides a new perspective for animal cognition and neuroethology. In terms of practical value, noninvasive monitoring of facial expressions could help us understand internal mental processes, and may partially replace neural recording technologies in diagnosis, human-computer interaction, smart health, and more.
Research Highlights and Innovations
- Innovative Workflow: Multimodal, simultaneous recording of neural and facial movement; innovative LM-HMM model enabling orthogonal analysis of multiple cognitive variables.
- Methodological Innovation: For the first time, high-dimensional video motion energy principal components are used to decode both behavior-related and non-behavior-related cognitive variables, validating the scientific rationale for facial expressions as a “window for latent cognitive variable leakage.”
- Causal Validation: Optogenetic causal manipulation of M2 demonstrates the neural basis for cognitive variable expression in facial movement.
- High Consistency: Across animals and in repeated sessions for the same mouse, facial expression patterns are highly consistent and stable, with broad biological relevance and generalizability.
- Task Generalizability: Facial expressions can decode slow cognitive variables (such as trial number) and show robust decoding across tasks (e.g., auditory two-alternative forced choice), independent of task specifics.
Additional Considerations
The article also highlights privacy concerns raised by the boom in noninvasive video and biometric technologies, warning that biological signals like facial expressions have great potential for revealing “internal activity,” and anticipates that ethical and regulatory protections will need to keep pace. Meanwhile, the analytical tools used in this study (e.g., Facemap and the LM-HMM models) are open source, providing valuable resources for researchers worldwide.
Summary and Outlook
This work not only deciphers the link between animal facial expressions and the brain’s cognitive variables, but also pushes cognitive neuroscience toward a new “expression-cognition” paradigm beyond “behavior-brain.” As the technology matures, this line of research may find applications in psychiatric diagnosis, intelligent monitoring, and human-computer interaction, and become a cornerstone for interdisciplinary collaboration.