Paving the Way for Social Touch at a Distance: Sonifying Tactile Interactions and Their Underlying Emotions
Academic Background
Touch is one of the earliest developed human senses and is crucial for physical and mental well-being. However, with the increasing prevalence of virtual communication, the lack of tactile interaction in remote exchanges may contribute to psychological issues such as anxiety and loneliness. Previous studies have shown that touch can effectively convey emotions (e.g., stroking conveys love, hitting conveys anger), but whether this emotional information can be carried over other modalities (e.g., audition) for remote transmission had not been established.
This study integrates cutting-edge findings from social touch and movement sonification to propose an “audio-touch” technology, addressing the following questions:
1. Can the physical characteristics of tactile interactions (e.g., force, speed) be accurately transmitted through sound?
2. Can the socio-emotional intentions behind touch (e.g., anger, sympathy) be recognized through sonification?
3. Does the material of the interacting surfaces (skin vs. plastic) influence auditory perception?
Source of the Paper
- Authors: Alexandra de Lagarde (Sorbonne Université), Catherine Pelachaud (CNRS), Louise P. Kirsch (Université Paris Cité), Malika Auvray (Sorbonne Université)
- Journal: PNAS (May 6, 2025, Volume 122, Issue 19)
- DOI: 10.1073/pnas.2407614122
Research Process and Results
1. Experimental Design
The study comprised four online experiments involving 264 participants. Audio-touch stimuli were played through headphones to test participants' ability to recognize tactile gestures, emotional intentions, and surface materials.
Experiment 1: Acoustic Recognition of Tactile Gestures
- Stimuli: Vibrational signals from four skin-contact gestures (rubbing, stroking, tapping, hitting) were recorded using a piezoelectric transducer to capture forearm vibrations (Figure 1a).
- Procedure:
- No context: Participants freely described the sounds; 58.6% of responses were touch-related (e.g., “rubbing sound”).
- Tactile context: When informed the sounds originated from skin contact, gesture recognition accuracy rose to 74% (e.g., 85.7% for rubbing).
- Forced-choice task: With four options, accuracy further improved to 93.5% (hitting) and 86.3% (stroking).
- Key finding: The rhythm and spectral features of the sounds (Figure 1b) directly correlated with the physical properties of tactile gestures (e.g., speed, pressure).
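The link between acoustic features and the physical properties of gestures can be illustrated with two simple descriptors. The sketch below (not from the paper; function names and the synthetic signals are hypothetical stand-ins) computes a spectral centroid as a brightness proxy and an amplitude-envelope onset rate as a crude rhythm measure, using only NumPy:

```python
import numpy as np

def spectral_centroid(signal, sr):
    """Magnitude-weighted mean frequency, a rough proxy for perceived brightness."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

def event_rate(signal, sr, threshold=0.5):
    """Count upward crossings of half the peak envelope, per second."""
    env = np.abs(signal)
    above = env > threshold * env.max()
    onsets = np.sum(np.diff(above.astype(int)) == 1)
    return float(onsets * sr / len(signal))

# Synthetic stand-ins: "tapping" as sparse impulses, "rubbing" as sustained noise.
sr = 8000
tapping = np.zeros(sr)
tapping[::2000] = 1.0                       # an impulse every 0.25 s
rubbing = 0.3 * np.random.default_rng(0).standard_normal(sr)

print(event_rate(tapping, sr))              # discrete events -> distinct onsets
print(spectral_centroid(rubbing, sr))       # broadband noise -> high centroid
```

Discrete gestures (tapping, hitting) yield clear onset counts, while continuous gestures (rubbing, stroking) show sustained energy and a broadband spectrum, which is the kind of mapping the paper reports between sound and gesture physics.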
Experiment 2: Acoustic Transmission of Tactile Emotions
- Stimuli: Six emotional intentions (anger, attention, fear, joy, love, sympathy) were sonified.
- Results:
- Free description: Anger (violent hitting sounds) was often described as “tense,” while love (gentle stroking) was associated with “soothing.”
- Forced-choice task: Joy (light tapping) had the highest recognition rate (66%); love and sympathy were frequently confused with each other (recognized at 52% and 37%, respectively) because both rely on similar gentle gestures.
- Valence ratings: Positive emotions (e.g., sympathy) scored significantly higher than negative ones (e.g., anger) (p < 0.001), aligning with tactile valence.
Experiments 3-4: Influence of Surface Material
- Expanded stimuli: Compared skin (3 types) and plastic (3 thicknesses).
- Findings:
- Gesture recognition: Stroking (63.9% vs. 36.1%) and hitting (89.7% vs. 81%) were better recognized on skin (p < 0.004).
- Emotion recognition: Sympathy was more accurately identified on skin (64.6% vs. 29.5%) and rated as more pleasant (p < 0.001).
- Material classification: Plastic sounds were easier to identify (93.4% accuracy vs. 55.8% for skin).
Conclusions and Value
Scientific Significance
- Cross-modal perception: First demonstration that social touch information (gestures, emotions, materials) can be remotely transmitted via sonification, offering new evidence for multisensory integration theory.
- Technical methodology: The “audio-touch” technology requires only vibrational signal capture and acoustic mapping, providing a low-cost solution for virtual tactile interaction.
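To make the low-cost pipeline concrete, here is a minimal sketch (assumptions: the function name, the synthetic "tap" transient, and the output path are illustrative, not from the paper) of the capture-to-audio step: a raw vibration trace is peak-normalized and written out as ordinary 16-bit mono audio using only the standard library's `wave` module plus NumPy:

```python
import wave
import numpy as np

def vibration_to_wav(vibration, sr, path):
    """Normalize a raw vibration trace and write it as 16-bit mono audio."""
    x = np.asarray(vibration, dtype=float)
    peak = np.max(np.abs(x))
    x = x / peak if peak > 0 else x             # peak-normalize to [-1, 1]
    pcm = (x * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)                       # 16-bit samples
        f.setframerate(sr)
        f.writeframes(pcm.tobytes())

# A decaying transient as a stand-in for a piezo-recorded skin "tap".
sr = 8000
t = np.arange(sr // 4) / sr                     # 0.25 s of signal
tap = np.exp(-30 * t) * np.sin(2 * np.pi * 180 * t)
vibration_to_wav(tap, sr, "tap.wav")
```

In a real setup the input array would come from a piezoelectric transducer on the forearm, as in the paper's recordings; the point is that no synthesis stage is needed between capture and playback.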
Applications
- Virtual reality: Enhances social realism of virtual agents when combined with audiovisual signals.
- Mental health: Mitigates touch deprivation in social isolation, e.g., remote “acoustic hugs” for comfort.
Highlights
- Innovative method: Direct sonification of skin vibrations avoids artificial sound synthesis.
- Ecological validity: Uses real interpersonal touch prototypes (e.g., Hertenstein’s anger hits), not lab-simplified actions.
- Interdisciplinary integration: Combines psychology (emotion coding), engineering (signal processing), and neuroscience (multisensory interaction).
Additional Findings
- Limitations: Excludes complex tactile behaviors (e.g., hugging); future work could expand signal capture with air microphones.
- Future directions: Machine learning could further analyze mappings between acoustic features (e.g., spectral peaks) and emotional categories.
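As a toy illustration of that future direction, the sketch below (entirely hypothetical: the feature values, cluster centers, and labels are invented, not the paper's data) trains a nearest-centroid classifier on synthetic two-dimensional acoustic features (spectral centroid, event rate), one cluster per emotional intention:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature clusters (spectral centroid in Hz, events per second);
# real work would extract such features from the recorded audio-touch stimuli.
centers = {"anger": (3000.0, 6.0), "love": (800.0, 1.0), "joy": (1500.0, 4.0)}
X, y = [], []
for label, (c, r) in centers.items():
    X.append(np.column_stack([rng.normal(c, 100, 50), rng.normal(r, 0.3, 50)]))
    y += [label] * 50
X = np.vstack(X)

def nearest_centroid(x, X, y):
    """Predict the label whose per-class feature mean is closest (after z-scoring)."""
    mu, sd = X.mean(0), X.std(0)
    Xz, xz = (X - mu) / sd, (np.asarray(x, dtype=float) - mu) / sd
    labels = sorted(set(y))
    cents = {l: Xz[np.array(y) == l].mean(0) for l in labels}
    return min(labels, key=lambda l: np.linalg.norm(xz - cents[l]))

print(nearest_centroid((2900, 5.8), X, y))   # point near the "anger" cluster
```

A learned mapping of this kind could quantify which acoustic features (e.g., spectral peaks, onset rate) carry the most emotional information, complementing the participant judgments reported above.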