Description
This joint Keynote Speech focuses on agent modeling for multimodal interactions of both humans and artificial agents. Human interaction often takes place through different modalities such as movements, facial expressions, and verbal utterances. These multimodal interactions often become attuned to each other; for instance, partial mimicry of movements and facial expressions emerges. Such forms of attunement are also referred to as interpersonal synchrony, which can occur across different modalities. Interpersonal synchrony usually leads to increased behavioral adaptivity, encompassing a range of outcomes from better cooperation to increased liking or bonding. Since the link between interpersonal synchrony and behavioral adaptivity is a general mechanism that arises automatically, it is advisable to build it into human-computer interaction as realistically as possible in order to achieve a human feel. Our agent models, based on an adaptive network-oriented modeling approach, offer an adequate tool to simulate and analyse these emergent processes, and can therefore provide a good basis for adaptive human-like virtual agents in various contexts.

| Period | 20 Oct 2022 → 22 Oct 2022 |
|---|---|
| Event title | The 14th International Conference on Intelligent Human Computer Interaction, IHCI'22 |
| Event type | Conference |
| Location | Tashkent, Uzbekistan |
| Degree of Recognition | International |