
A Multimodal Emotion Recognition Approach for Socially Assistive Robots

Tamantini, Christian;Zollo, Loredana;Cordella, Francesca
2026-01-01

Abstract

Socially assistive robots need emotion recognition to enable adaptive and user-centered interactions. Traditional facial expression recognition is often unreliable, especially in real-world settings. Moreover, most studies in the literature address the classification of binary affective conditions based on Russell's affective state model. This study proposes a multimodal emotion recognition system that integrates physiological monitoring with kinematic analysis from RGB-D skeletal tracking, capturing both autonomic and postural markers of emotion to estimate the seven basic emotions of Ekman's model. Results show that Random Forest achieves the highest accuracy (71.30±3.47%) when using multimodal features to classify the seven basic emotions, significantly outperforming physiological and kinematic data alone. The findings confirm that body movements and autonomic responses provide complementary information, enhancing emotion classification. This approach offers a robust alternative to methods that rely on a single modality, improving the feasibility of real-time emotion recognition in social robotics.
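The pipeline described in the abstract can be illustrated with a minimal sketch: concatenate per-sample physiological and kinematic feature vectors (feature-level fusion) and classify the seven emotion labels with a Random Forest. This is not the authors' code; the feature counts, sample sizes, and synthetic data below are assumptions for demonstration only, using scikit-learn.

```python
# Illustrative sketch (not the authors' implementation): early fusion of two
# modalities followed by Random Forest classification of seven emotion labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class = 30                       # hypothetical: 30 samples per emotion
n_samples = 7 * n_per_class

# Hypothetical feature blocks: e.g. cardiac/electrodermal statistics
# (physiological) and posture/joint descriptors from RGB-D skeletal
# tracking (kinematic). Real features would replace this synthetic data.
physio = rng.normal(size=(n_samples, 8))
kinematic = rng.normal(size=(n_samples, 12))

# Feature-level (early) fusion: concatenate the two modalities per sample.
X = np.hstack([physio, kinematic])
y = np.repeat(np.arange(7), n_per_class)   # seven basic emotion labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # stratified 5-fold accuracy
print(f"mean CV accuracy: {scores.mean():.3f}")
```

With random features the accuracy hovers near chance (~1/7); the point of the sketch is the fusion-then-classify structure, not the score.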
2026
9789819523818
9789819523825
Affective Computing; Emotion Recognition; Social Robotics
Files for this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12610/93977
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: n/a