An autonomous cognitive empathy model responsive to users’ facial emotion expressions

Research output: Contribution to journal › Special issue

3 Citations (Scopus)

Abstract

Successful social robot services depend on how well robots interact with users. Effective service requires
smooth, engaging, humanlike interactions in which the robot reacts appropriately to the user's affective
state. This paper proposes a novel Autonomous Cognitive Empathy Model (ACEM) for humanoid robots that achieves
longer and more engaged human-robot interactions (HRI) by recognizing users' emotions and responding to
them appropriately. The proposed model continuously detects a user's affective state from facial
expressions and generates the desired empathic behavior, either parallel or reactive, adapted to the
user's personality. Affective states are detected using a stacked autoencoder network that is trained and
tested on the RAVDESS dataset.
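
As a rough illustration of this detection stage, the sketch below implements a stacked autoencoder
emotion classifier in PyTorch. Only the eight RAVDESS emotion labels are taken from the dataset; the
input dimensionality, layer sizes, and greedy layer-wise pretraining step are illustrative assumptions,
not the configuration reported in the paper.

    import torch
    import torch.nn as nn

    # The eight emotion labels annotated in RAVDESS.
    EMOTIONS = ["neutral", "calm", "happy", "sad",
                "angry", "fearful", "disgust", "surprised"]

    class StackedAutoencoder(nn.Module):
        """Stacked autoencoder classifier over facial-expression features.
        Input size and hidden sizes are illustrative assumptions."""

        def __init__(self, input_dim=2048, hidden_dims=(512, 128)):
            super().__init__()
            dims = (input_dim, *hidden_dims)
            # One (encoder, decoder) pair per layer; each pair can be
            # pretrained greedily before fine-tuning the whole stack.
            self.encoders = nn.ModuleList(
                nn.Sequential(nn.Linear(i, o), nn.ReLU())
                for i, o in zip(dims, dims[1:]))
            self.decoders = nn.ModuleList(
                nn.Linear(o, i) for i, o in zip(dims, dims[1:]))
            self.classifier = nn.Linear(hidden_dims[-1], len(EMOTIONS))

        def pretrain_step(self, x, layer, optimizer):
            # Greedy layer-wise step: reconstruct this layer's input
            # from its own encoding, keeping earlier layers frozen.
            with torch.no_grad():
                for enc in self.encoders[:layer]:
                    x = enc(x)
            recon = self.decoders[layer](self.encoders[layer](x))
            loss = nn.functional.mse_loss(recon, x)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            return loss.item()

        def forward(self, x):
            for enc in self.encoders:
                x = enc(x)
            return self.classifier(x)  # emotion logits

    model = StackedAutoencoder()
    features = torch.randn(4, 2048)  # stand-in for extracted face features
    predictions = model(features).argmax(dim=1).tolist()
    print([EMOTIONS[p] for p in predictions])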
The overall empathic model is validated in an experiment in which different emotions are elicited in
participants and empathic behaviors are then applied according to the proposed hypotheses. The results
confirm the effectiveness of the model in terms of the social and friendship qualities that participants
perceived while interacting with the robot.
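
The abstract does not specify the rules that map a detected emotion to a parallel or reactive behavior,
so the following sketch only illustrates the idea: parallel empathy mirrors the user's emotion, reactive
empathy counters it, and a coarse personality trait gates the choice. The negative-emotion set, the
extroversion gate, and the returned expressions are all hypothetical.

    # Parallel empathy mirrors the user's emotion; reactive empathy
    # responds so as to improve the user's state. The rules below are
    # illustrative assumptions, not the paper's actual mapping.
    NEGATIVE = {"sad", "angry", "fearful", "disgust"}

    def select_empathic_behavior(emotion: str, extroverted: bool):
        """Return (strategy, robot_expression) for a detected emotion,
        adapted to a coarse personality trait."""
        if emotion in NEGATIVE and extroverted:
            # Assumed rule: counter a negative emotion with an
            # uplifting expression for extroverted users.
            return "reactive", "happy"
        # Default: mirror the user's state (parallel empathy).
        return "parallel", emotion

    print(select_empathic_behavior("sad", extroverted=True))     # ('reactive', 'happy')
    print(select_empathic_behavior("happy", extroverted=False))  # ('parallel', 'happy')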
Original language: English
Article number: 3341198
Pages (from-to): 1-23
Number of pages: 23
Journal: ACM Transactions on Interactive Intelligent Systems
Volume: 10
Issue number: 3
DOIs
Publication status: Published - 8 Nov 2020

Keywords

  • Empathy
  • Non-verbal Behavior
  • Adaptive Interaction
  • Facial Emotion Detection
  • Social Robots
  • Human-Robot Interaction (HRI)
