Multimodal dance style transfer

Wenjie Yin*, Hang Yin, Kim Baraka, Danica Kragic, Mårten Björkman

*Corresponding author for this work

Research output: Contribution to Journal › Article › Academic › peer review

Abstract

This paper first presents CycleDance, a novel dance style transfer system that transforms an existing motion clip in one dance style into a motion clip in another dance style while preserving the motion context of the dance. CycleDance extends existing CycleGAN architectures with multimodal transformer encoders to account for the music context, and adopts a sequence-length-based curriculum learning strategy to stabilize training. Our approach captures rich, long-term intra-relations between motion frames, addressing a common challenge in motion transfer and synthesis work. Building upon CycleDance, we further propose StarDance, which enables many-to-many mappings across different styles using a single generator network. Additionally, we introduce new metrics for gauging transfer strength and content preservation in the context of dance movements. To evaluate our approach, we perform an extensive ablation study and a human study with 30 participants, each with five or more years of dance experience. Our experimental results show that our approach can generate realistic movements in the target style, outperforming the baseline CycleGAN and its variants on naturalness, transfer strength, and content preservation. The proposed approach has potential applications in choreography, gaming, animation, and tool development for artistic and scientific innovation in the field of dance.

Original language: English
Article number: 48
Pages (from-to): 1-14
Number of pages: 14
Journal: Machine Vision and Applications
Volume: 34
Issue number: 4
Early online date: 9 May 2023
DOIs
Publication status: Published - Jul 2023

Bibliographical note

Funding Information:
This study has received funding from the European Commission Horizon 2020 research and innovation program under Grant Agreement Number 824160 (EnTimeMent). This work benefited from access to the HPC resources provided by the Swedish National Infrastructure for Computing (SNIC), partially funded by the Swedish Research Council through Grant Agreement No. 2018-05973. We thank the dancers from Stockholm University of the Arts, KTH Dance Club, and others. We are also grateful to the reviewers for their thoughtful comments.

Publisher Copyright:
© 2023, The Author(s).

Keywords

  • Dance motion
  • Generative models
  • Multimodal learning
  • Style transfer
