Towards Situated AMR: Creating a Corpus of Gesture AMR

Lucia Donatelli, Kenneth Lai, Richard Brutti, James Pustejovsky

Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review

Abstract

© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.

In this paper, we extend Abstract Meaning Representation (AMR) in order to represent situated multimodal dialogue, with a focus on the modality of gesture. AMR is a general-purpose meaning representation that has become popular for its transparent structure, its ease of annotation and available corpora, and its overall expressiveness. While AMR was designed to represent meaning in language as text or speech, gesture accompanying speech conveys a number of novel communicative dimensions, including situational reference, spatial locations, manner, attitude, orientation, backchanneling, and others. In this paper, we explore how to combine multimodal elements into a single representation for alignment and grounded meaning, using gesture as a case study. As a platform for multimodal situated dialogue annotation, we believe that Gesture AMR has several attractive properties. It is adequately expressive at both utterance and dialogue levels, while easily accommodating the structures inherent in gestural expressions. Further, the native reentrancy facilitates both the linking between modalities and the eventual situational grounding to contextual bindings.
Original language: English
Title of host publication: Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. Health, Operations Management, and Design - 13th International Conference, DHM 2022, Held as Part of the 24th HCI International Conference, HCII 2022, Proceedings
Editors: V.G. Duffy
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 293–312
ISBN (Print): 9783031060175
DOIs
Publication status: Published - 2022
Externally published: Yes
Event: 13th International Conference on Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management, DHM 2022, held as part of the 24th HCI International Conference, HCII 2022 - Virtual, Online
Duration: 26 Jun 2022 – 1 Jul 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th International Conference on Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management, DHM 2022, held as part of the 24th HCI International Conference, HCII 2022
City: Virtual, Online
Period: 26/06/22 – 1/07/22

Funding

This work was supported in part by NSF grant DRL 2019805, to Dr. Pustejovsky at Brandeis University, and an NSF Student Grant to Kenneth Lai, Richard Brutti, and Lucia Donatelli, also funded by NSF grant DRL 2019805. We would like to express our thanks to Nikhil Krishnaswamy for his comments on the multimodal framework motivating the development of Gesture AMR. The views expressed herein are ours alone.

Funders (funder number):
National Science Foundation (DRL 2019805)
Brandeis University
