
Music-Driven Group Choreography

Nhat Le, Thang Pham, Tuong Do, Erman Tjiputra, Quang D. Tran, Anh Nguyen

2023-03-22 · CVPR 2023 · Motion Synthesis
Paper · PDF · Code (official)

Abstract

Music-driven choreography is a challenging problem with a wide variety of industrial applications. Recently, many methods have been proposed to synthesize dance motions from music for a single dancer. However, generating dance motion for a group remains an open problem. In this paper, we present $\rm AIOZ-GDANCE$, a new large-scale dataset for music-driven group dance generation. Unlike existing datasets that only support single-dancer choreography, our new dataset contains group dance videos, hence supporting the study of group choreography. We propose a semi-autonomous labeling method with humans in the loop to obtain the 3D ground truth for our dataset. The proposed dataset consists of 16.7 hours of paired music and 3D motion from in-the-wild videos, covering 7 dance styles and 16 music genres. We show that naively applying a single-dance generation technique to create group dance motion may lead to unsatisfactory results, such as inconsistent movements and collisions between dancers. Based on our new dataset, we propose a new method that takes an input music sequence and a set of 3D positions of dancers to efficiently produce multiple group-coherent choreographies. We propose new evaluation metrics for measuring group dance quality and perform intensive experiments to demonstrate the effectiveness of our method. Our project facilitates future research on group dance generation and is available at: https://aioz-ai.github.io/AIOZ-GDANCE/
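The abstract describes the method's interface: it consumes a music sequence plus the 3D positions of the dancers and emits one coherent motion sequence per dancer. A minimal sketch of that interface is below; all names, shapes, and the placeholder logic are assumptions for illustration, not the paper's actual API.

```python
# Hypothetical sketch of the group-choreography interface described in the
# abstract: music features + initial 3D dancer positions -> one pose
# trajectory per dancer. Names and shapes are assumptions, not the real API.

from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def generate_group_dance(
    music_features: List[List[float]],   # T frames x D audio features
    init_positions: List[Vec3],          # one 3D root position per dancer
) -> List[List[Vec3]]:
    """Return a T-frame root trajectory for each dancer.

    Placeholder logic: each dancer simply holds its initial position,
    standing in for a learned generator conditioned on the music.
    """
    n_frames = len(music_features)
    return [[pos for _ in range(n_frames)] for pos in init_positions]

# Two dancers, eight frames of 4-dim music features.
motion = generate_group_dance([[0.0] * 4] * 8,
                              [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)])
```

The key design point the paper argues for is that the generator sees all dancers jointly, so it can avoid the collisions and inconsistent movements that arise from running a single-dancer model per performer.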

Results

All results are for the GDANCER model on the AIOZ-GDANCE dataset (Motion Synthesis task):

Metric   Value
FID      43.9
GMC      79.01
GMR      51.27
GenDiv   9.23
MMC      0.25
PFC      3.05
TIF      0.217
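The FID row above reports a Fréchet distance between Gaussian fits of feature distributions (real vs. generated motion), where lower is better. As an illustration only, the one-dimensional closed form is sketched below; actual motion-quality evaluations fit multivariate Gaussians to learned motion features, which this sketch does not attempt.

```python
# Illustration of what an FID-style score measures: the Frechet distance
# between Gaussian fits of two sample sets. One-dimensional case for clarity;
# real evaluations use multivariate Gaussians over learned features.

import math
from statistics import mean, pvariance

def fid_1d(real, gen):
    """Frechet distance between 1-D Gaussian fits of two samples:
    (m1 - m2)^2 + v1 + v2 - 2*sqrt(v1*v2)."""
    m1, m2 = mean(real), mean(gen)
    v1, v2 = pvariance(real), pvariance(gen)
    return (m1 - m2) ** 2 + v1 + v2 - 2.0 * math.sqrt(v1 * v2)

print(fid_1d([0.0, 1.0, 2.0], [0.0, 1.0, 2.0]))  # identical samples -> 0.0
```

Identical distributions score 0; the score grows as the generated distribution drifts from the real one in mean or variance.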

Related Papers

DeepGesture: A conversational gesture synthesis system based on emotions and semantics (2025-07-03)
VolumetricSMPL: A Neural Volumetric Body Model for Efficient Interactions, Contacts, and Collisions (2025-06-29)
DuetGen: Music Driven Two-Person Dance Generation via Hierarchical Masked Modeling (2025-06-23)
PlanMoGPT: Flow-Enhanced Progressive Planning for Text to Motion Synthesis (2025-06-22)
Motion-R1: Chain-of-Thought Reasoning and Reinforcement Learning for Human Motion Generation (2025-06-12)
DanceChat: Large Language Model-Guided Music-to-Dance Generation (2025-06-12)
MotionRAG-Diff: A Retrieval-Augmented Diffusion Framework for Long-Term Music-to-Dance Generation (2025-06-03)
MotionPro: A Precise Motion Controller for Image-to-Video Generation (2025-05-26)