From Multimodal Interaction to Multimodal Synchrony to Behavioral Adaptivity and Back: a Multi-Adaptive Agent Modeling Approach (Keynote Speech)

  • Sophie Hendrikse (Speaker)
  • Treur, J. (Speaker)

Activity: Lecture / Presentation (Academic)

Description

This joint keynote speech focuses on agent modeling for multimodal interactions of both humans and artificial agents. Human interaction often takes place through different modalities such as movements, facial expressions, and verbal utterances. These multimodal human interactions often become attuned to each other; for instance, partial mimicry of movements and facial expressions emerges. Such types of attunement are also referred to as interpersonal synchrony, which can occur across different modalities. Interpersonal synchrony usually leads to increased behavioral adaptivity, encompassing a range of behavioral outcomes from better cooperation to increased liking or bonding. Since the link between interpersonal synchrony and behavioral adaptivity is a general mechanism that arises automatically in humans, building it into human-computer interaction as realistically as possible is advisable to achieve a human feel. Our agent models, based on an adaptive network-oriented modeling approach, offer an adequate tool to simulate and analyse these emergent processes, and can therefore provide a good basis for adaptive human-like virtual agents in various contexts.
Period: 20 Oct 2022 – 22 Oct 2022
Event title: The 14th International Conference on Intelligent Human Computer Interaction, IHCI'22
Event type: Conference
Location: Tashkent, Uzbekistan
Degree of recognition: International