An autonomous cognitive empathy model responsive to users’ facial emotion expressions

Elahe Bagheri, Pablo Gomez Esteban, Hoang-Long Cao, Albert De Beir, Dirk Lefeber, Bram Vanderborght

Research output: Special issue, peer reviewed

21 Citations (Scopus)

Abstract

Successful social robot services depend on how well robots can interact with users. Effective service requires smooth, engaged, human-like interactions in which robots react appropriately to a user's affective state. This paper proposes a novel Automatic Cognitive Empathy Model (ACEM) for humanoid robots, aimed at achieving longer and more engaged human-robot interactions (HRI) by recognizing users' emotions and responding to them appropriately. The proposed model continuously detects a user's affective state from facial expressions and generates the desired empathic behavior, either parallel or reactive, adapted to the user's personality. Users' affective states are detected using a stacked autoencoder network that is trained and tested on the RAVDESS dataset.
The overall empathic model is verified through an experiment in which different emotions are triggered in participants and empathic behaviors are then applied according to the proposed hypotheses. The results confirm the effectiveness of the proposed model in terms of the social and friendship concepts that participants perceived while interacting with the robot.
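The abstract's emotion-detection component is a stacked autoencoder. As a rough illustration of that technique, the sketch below greedily pretrains two tied-weight autoencoder layers, layer by layer, so each layer learns to reconstruct the codes of the previous one; the deepest codes would then feed an emotion classifier. All layer sizes, learning rates, and input features here are assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class AutoencoderLayer:
    """One tied-weight autoencoder layer trained by plain gradient descent."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b_enc = np.zeros(n_hidden)
        self.b_dec = np.zeros(n_in)

    def encode(self, X):
        return sigmoid(X @ self.W + self.b_enc)

    def decode(self, H):
        # Tied weights: the decoder reuses W transposed.
        return sigmoid(H @ self.W.T + self.b_dec)

    def fit(self, X, epochs=50, lr=0.5):
        for _ in range(epochs):
            H = self.encode(X)
            R = self.decode(H)
            # Gradients of the squared reconstruction error through both sigmoids.
            dR = (R - X) * R * (1 - R)          # decoder pre-activation gradient
            dH = (dR @ self.W) * H * (1 - H)    # encoder pre-activation gradient
            self.W -= lr * (dR.T @ H + X.T @ dH) / len(X)  # tied-weight update
            self.b_dec -= lr * dR.mean(axis=0)
            self.b_enc -= lr * dH.mean(axis=0)

class StackedAutoencoder:
    """Greedy layer-wise pretraining: each layer reconstructs the codes below it."""
    def __init__(self, layer_sizes):
        self.layers = [AutoencoderLayer(a, b, seed=i)
                       for i, (a, b) in enumerate(zip(layer_sizes, layer_sizes[1:]))]

    def fit(self, X):
        H = X
        for layer in self.layers:
            layer.fit(H)
            H = layer.encode(H)
        return H  # deepest codes, ready for an emotion classifier

if __name__ == "__main__":
    # Stand-in for facial-expression feature vectors (32 samples, 64 features).
    X = np.random.default_rng(1).random((32, 64))
    codes = StackedAutoencoder([64, 32, 8]).fit(X)
    print(codes.shape)  # → (32, 8)
```

In the paper's pipeline, the stand-in random features would be replaced by facial-expression features extracted from RAVDESS frames, and the deepest codes would be mapped to discrete emotion labels by a supervised output layer.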
Original language: English
Article number: 3341198
Pages (from-to): 1-23
Number of pages: 23
Journal: ACM Transactions on Interactive Intelligent Systems
Volume: 10
Issue number: 3
DOIs
Status: Published - 8 Nov 2020
