Abstract
Enabling computer-based applications to display intelligent behavior in complex social settings requires them to relate to important aspects of how humans experience and understand such situations. One crucial driver of people's social behavior during an interaction is the interdependence they perceive, i.e., how the outcome of an interaction is determined by their own and others' actions. According to psychological studies, the nonverbal behavior people display during an interaction can carry information about such interdependence perceptions. Motivated by this, we present a series of experiments to automatically recognize interdependence perceptions in dyadic face-to-face negotiations from observed nonverbal behavior. Concretely, our approach draws on a combination of features describing individuals' Facial, Upper Body, and Vocal Behavior with state-of-the-art algorithms for multivariate time series classification. Our findings demonstrate that differences in some types of interdependence perceptions can be detected through the automatic analysis of nonverbal behaviors. We discuss implications for developing socially intelligent systems and opportunities for future research.
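To make the data setup described above more concrete, the sketch below shows what classifying interdependence perceptions from per-person multivariate nonverbal time series could look like. It is an illustrative baseline, not the paper's pipeline: the feature channels, array shapes, and labels are hypothetical stand-ins, the data is synthetic, and a simple summary-statistics plus logistic regression model is used in place of the state-of-the-art multivariate time series classifiers the abstract refers to (as offered, e.g., by dedicated libraries such as sktime).

```python
# Illustrative sketch only -- not the authors' implementation.
# Assumes per-person multivariate time series of nonverbal features
# (e.g., facial, upper-body, and vocal channels) and a binary label
# derived from self-reported interdependence perceptions; all names
# and shapes are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def summarize(series: np.ndarray) -> np.ndarray:
    """Collapse (n_persons, n_channels, n_timesteps) into simple
    per-channel summary statistics (mean, std, min, max)."""
    return np.concatenate(
        [series.mean(-1), series.std(-1), series.min(-1), series.max(-1)],
        axis=1,
    )


rng = np.random.default_rng(0)
n_persons, n_channels, n_timesteps = 60, 12, 300      # hypothetical sizes
X = rng.normal(size=(n_persons, n_channels, n_timesteps))  # stand-in for facial/body/vocal features
y = rng.integers(0, 2, size=n_persons)                      # stand-in perception labels (e.g., low/high)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, summarize(X), y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real features extracted per time step, only the feature-loading step and the classifier choice would change; the (persons × channels × timesteps) layout is the part the abstract's description implies.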
Original language | English |
---|---|
Title of host publication | ICMI 2021 |
Subtitle of host publication | Proceedings of the 2021 International Conference on Multimodal Interaction |
Editors | Zakia Hammal, Carlos Busso |
Publisher | Association for Computing Machinery, Inc |
Pages | 121-130 |
Number of pages | 10 |
ISBN (Electronic) | 9781450384810 |
DOIs | |
Publication status | Published - Oct 2021 |
Event | 23rd ACM International Conference on Multimodal Interaction, ICMI 2021 - Virtual, Online, Canada. Duration: 18 Oct 2021 → 22 Oct 2021 |
Conference
Conference | 23rd ACM International Conference on Multimodal Interaction, ICMI 2021 |
---|---|
Country/Territory | Canada |
City | Virtual, Online |
Period | 18/10/21 → 22/10/21 |
Bibliographical note
Publisher Copyright:
© 2021 Owner/Author.
Funding
This research was (partially) funded by the Hybrid Intelligence Center, a 10-year programme funded by the Dutch Ministry of Education, Culture and Science through the Netherlands Organisation for Scientific Research, https://hybrid-intelligence-centre.nl, grant number 024.004.022 and the MINGLE project number 639.022.606. Data collection was funded by an ERC Starting Grant (#635356) awarded to Daniel Balliet.
Funders | Funder number |
---|---|
Horizon 2020 Framework Programme | 635356 |
European Research Council | |
Ministerie van Onderwijs, Cultuur en Wetenschap | |
Nederlandse Organisatie voor Wetenschappelijk Onderzoek | 024.004.022, 639.022.606 |
Keywords
- Situation Perception
- Social Signal Processing
- User-Modeling