Fusion in Multimodal Interactive Systems: An HMM-Based Algorithm for User-Induced Adaptation

Bruno Dumas, Beat Signer, Denis Lalanne

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

18 Citations (Scopus)
50 Downloads (Pure)

Abstract

Multimodal interfaces have been shown to be ideal candidates for interactive systems that adapt to a user, either automatically or based on user-defined rules. However, user-based adaptation demands advanced software architectures and algorithms. We present a novel multimodal fusion algorithm, based on hidden Markov models (HMM), for the development of adaptive interactive systems. In order to select relevant modalities at the semantic level, the algorithm is linked to temporal relationship properties. The presented algorithm has been evaluated in three use cases, from which we were able to identify the main challenges involved in developing adaptive multimodal interfaces.
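The paper itself does not include code in this record, but the idea of HMM-based fusion can be illustrated with a minimal, hypothetical sketch: time-ordered events from individual modality recognisers (speech, gesture) serve as HMM observations, candidate user commands serve as hidden states, and Viterbi decoding selects the command sequence that best explains the events. The state names, event labels, and all probabilities below are invented for illustration and are not taken from the paper; an adaptive system would re-estimate such parameters from user behaviour.

```python
# Hypothetical sketch of HMM-based multimodal fusion (not the paper's
# actual algorithm): hidden states are candidate user commands, and
# observations are time-ordered events from the modality recognisers.

# Candidate commands the user may be issuing (illustrative).
STATES = ["move", "select"]

# Hand-tuned probabilities for the sketch; a user-adaptive system would
# learn these instead.
START = {"move": 0.6, "select": 0.4}
TRANS = {
    "move":   {"move": 0.7, "select": 0.3},
    "select": {"move": 0.4, "select": 0.6},
}
EMIT = {
    "move":   {"speech:put": 0.5, "speech:that": 0.2, "gesture:point": 0.3},
    "select": {"speech:put": 0.1, "speech:that": 0.4, "gesture:point": 0.5},
}

def viterbi(events):
    """Return the most likely command sequence for a list of modality events."""
    # trellis[t][s] = (probability of best path ending in state s, backpointer)
    trellis = [{s: (START[s] * EMIT[s][events[0]], None) for s in STATES}]
    for e in events[1:]:
        row = {}
        for s in STATES:
            prev, p = max(
                ((ps, trellis[-1][ps][0] * TRANS[ps][s]) for ps in STATES),
                key=lambda x: x[1],
            )
            row[s] = (p * EMIT[s][e], prev)
        trellis.append(row)
    # Backtrack from the best final state.
    best = max(STATES, key=lambda s: trellis[-1][s][0])
    path = [best]
    for row in reversed(trellis[1:]):
        path.append(row[path[-1]][1])
    return list(reversed(path))

if __name__ == "__main__":
    # A classic "put that there"-style interaction: speech followed by a
    # pointing gesture.
    print(viterbi(["speech:put", "speech:that", "gesture:point"]))
    # → ['move', 'select', 'select']
```

The sketch ignores the temporal-relationship properties (e.g. overlap and sequencing constraints between events) that the paper uses for modality selection; in a fuller implementation those would constrain which observations may be emitted together.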
Original language: English
Title of host publication: Proceedings of the 4th International Conference on Engineering Interactive Computing Systems (EICS 2012), Copenhagen, Denmark, June 2012
Publication status: Published - 25 Jun 2012
Event: 4th ACM SIGCHI Symposium on Engineering Interactive Computing Systems - IT University of Copenhagen, Copenhagen, Denmark
Duration: 25 Jun 2012 - 28 Jun 2012

Publication series

Name: Proceedings of the 4th International Conference on Engineering Interactive Computing Systems (EICS 2012), Copenhagen, Denmark, June 2012

Conference

Conference: 4th ACM SIGCHI Symposium on Engineering Interactive Computing Systems
Country/Territory: Denmark
City: Copenhagen
Period: 25/06/12 - 28/06/12

Keywords

  • multimodal interaction
  • multimodal fusion
  • HMM-based fusion
  • user interface adaptation
