A Novel Model For Emotion Detection From Facial Muscles Activity

Elahe Bagheri, Azam Bagheri, Pablo Gomez Esteban, Bram Vanderborght

Research output: Unpublished contribution to conference › Unpublished paper



Accounting for human emotion in applications and systems has received substantial attention over the last three decades. The dominant approach to emotion detection has been to first extract hand-crafted features and then apply a traditional machine learning classifier, such as an SVM, to predict the emotion class. Recently, however, Deep Learning (DL) based models have been shown to outperform these traditional machine learning approaches.
This paper proposes a novel deep learning based facial emotion detection model that uses facial muscle activity as raw input to recognize the type of expressed emotion in real time. To this end, we first use OpenFace to extract the activation values of the facial muscles, which are then presented to a Stacked Auto-Encoder (SAE) as the feature set. The SAE learns the combination of muscles that best describes a particular emotion, and the resulting features are fed to a soft-max layer to perform the multi-class classification task.
The proposed model has been applied to the CK+, MMI, and RAVDESS datasets and achieved average accuracies of 95.63%, 95.58%, and 84.91%, respectively, for six-class emotion detection, outperforming state-of-the-art algorithms.
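The pipeline described above (OpenFace muscle activations → SAE feature abstraction → soft-max classification) can be sketched as follows. This is a minimal illustrative forward pass, not the authors' implementation: the layer sizes, the use of 17 Action Unit intensities (the set OpenFace reports), and the randomly initialised weights standing in for a trained network are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: OpenFace reports intensities for 17 facial Action
# Units; the paper classifies six emotion categories. The hidden-layer
# sizes of the stacked auto-encoder are illustrative guesses.
N_AUS, HIDDEN1, HIDDEN2, N_CLASSES = 17, 12, 8, 6

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the row max for numerical stability before exponentiating.
    z = np.exp(x - x.max(axis=-1, keepdims=True))
    return z / z.sum(axis=-1, keepdims=True)

# Randomly initialised weights stand in for the trained SAE encoder
# and soft-max classifier described in the abstract.
W1 = rng.normal(0, 0.1, (N_AUS, HIDDEN1));    b1 = np.zeros(HIDDEN1)
W2 = rng.normal(0, 0.1, (HIDDEN1, HIDDEN2));  b2 = np.zeros(HIDDEN2)
W3 = rng.normal(0, 0.1, (HIDDEN2, N_CLASSES)); b3 = np.zeros(N_CLASSES)

def classify(au_intensities):
    """Map a batch of AU-intensity vectors to emotion-class probabilities."""
    h1 = sigmoid(au_intensities @ W1 + b1)  # first encoder layer
    h2 = sigmoid(h1 @ W2 + b2)              # second encoder layer: compressed features
    return softmax(h2 @ W3 + b3)            # probabilities over the six emotions

# One synthetic frame of AU intensities (OpenFace reports values in [0, 5]).
frame = rng.uniform(0, 5, (1, N_AUS))
probs = classify(frame)
print(probs.shape)  # (1, 6)
```

In the actual model, each encoder layer would first be trained to reconstruct its input (the auto-encoder objective), after which the decoder halves are discarded and the soft-max layer is fine-tuned on the labeled emotion classes.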
Original language: English
Number of pages: 13
Publication status: Published - 28 Aug 2019

Keywords

  • Facial Emotion Recognition
  • Feature Abstraction
  • Facial Muscles Activity

