Dance Generation
Blending Creativity and Generative AI for Next-Gen Choreography
Our Dance Generation initiative aims to transform how dance is created, curated, and imagined. Building on the foundation of Editable Dance Generation from Music (EDGE) [1], we are developing EDGE++, a diffusion-based generative model capable of producing high-fidelity, context-aware dance movements for diverse applications. By sourcing a dataset significantly larger than AIST++ [2], we are exploring scaling laws in dance generation and enabling richer expression of multi-modal artistic intent. Integrating concepts from Bailando [3] further helps maintain stylistic continuity in extended sequences. Together, these efforts aim to democratize access to professional choreography tools, empowering both the dance and game developer communities.
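To make the modeling direction concrete, here is a minimal, non-authoritative sketch of the kind of music-conditioned diffusion denoiser that EDGE-style systems build on: a transformer that takes noised pose sequences, a diffusion timestep, and per-frame music features and predicts the clean motion. It assumes PyTorch; the class name MotionDenoiser, the feature dimensions, and the architecture details are illustrative placeholders, not the EDGE++ implementation.

```python
# Hedged sketch of a music-conditioned motion denoiser (assumes PyTorch).
# All names and dimensions are illustrative, not MVNT/EDGE++ internals.
import torch
import torch.nn as nn


class MotionDenoiser(nn.Module):
    """Transformer that predicts clean poses from noisy poses, a diffusion
    timestep, and per-frame music features."""

    def __init__(self, pose_dim=151, music_dim=35, d_model=256, n_layers=4):
        super().__init__()
        self.pose_in = nn.Linear(pose_dim, d_model)
        self.music_in = nn.Linear(music_dim, d_model)
        self.t_embed = nn.Sequential(
            nn.Linear(1, d_model), nn.SiLU(), nn.Linear(d_model, d_model)
        )
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.pose_out = nn.Linear(d_model, pose_dim)

    def forward(self, noisy_poses, t, music):
        # Fuse pose, music, and timestep embeddings per frame, then denoise jointly.
        h = self.pose_in(noisy_poses) + self.music_in(music)
        h = h + self.t_embed(t.float().view(-1, 1, 1) / 1000.0)
        return self.pose_out(self.encoder(h))


# Toy forward pass: a batch of 2 clips, 150 frames each (~5 s at 30 FPS).
model = MotionDenoiser()
noisy = torch.randn(2, 150, 151)   # noised pose vectors (placeholder dimensionality)
music = torch.randn(2, 150, 35)    # per-frame audio features (placeholder dimensionality)
t = torch.randint(0, 1000, (2,))   # diffusion timesteps
x0_pred = model(noisy, t, music)   # predicted clean motion, same shape as `noisy`
```

Conditioning on music at every frame, rather than on a single clip-level embedding, is what lets the generated motion follow beat and phrasing; scaling the dataset and model is then largely a matter of widening this backbone.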
Innovation and Integrity
Our priority is to protect choreographers and foster a sustainable creative ecosystem for dancers. With a deep understanding of AI’s potential impact, we are committed to balancing technical innovation with artistic integrity. All datasets used in our generative training are ethically sourced, with consent from choreographers, ensuring that our tools uplift and respect the dance community. By prioritizing these values, we aim to build technology that enhances creativity without compromising the artistry or rights of its creators.
Research Vision
EDGE++ Development: Extend the capabilities of music-conditioned dance generation by building broader datasets and applying scaling laws and improved optimization to yield higher-quality dance sequences.
Editable Choreography Synthesis: Enable creators to refine and customize AI-generated or existing dance sequences, tailoring movements to specific styles or artistic preferences (a rough sketch of this editing mechanism follows this list). Incorporating mechanisms for long-range coherence, inspired by Bailando [3], helps ensure stylistic continuity in extended sequences.
Multimodal Ideation: Investigate the role of music, text, and video in enhancing stylistic coherence and intent in generative dance AI. Recognizing the opportunities highlighted by DanceGen [4], we aim to provide AI-assisted tools for ideation and prototyping, supporting professional choreographers in their creative processes.
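As noted in the Editable Choreography Synthesis item above, the editing workflow can be sketched as diffusion inpainting: the creator marks which frames or joints to keep, and the sampler re-imposes that region at every denoising step while the model regenerates the rest. The sketch below reuses the illustrative MotionDenoiser defined earlier; edit_sequence, its mask convention, and the simplified update rule are assumptions, not the EDGE++ sampler.

```python
# Hedged sketch of edit-by-inpainting during diffusion sampling.
# `edit_sequence` and its noise handling are placeholders, not MVNT/EDGE++ code.
import torch


def edit_sequence(model, reference_motion, keep_mask, music, n_steps=50):
    """Regenerate only the unmasked region of an existing dance.

    reference_motion: (B, T, D) existing choreography.
    keep_mask:        (B, T, D) with 1 where the original must be preserved,
                      0 where the model may propose new movement.
    """
    x = torch.randn_like(reference_motion)  # start the edit from pure noise
    for step in reversed(range(n_steps)):
        t = torch.full((reference_motion.shape[0],), step)
        x0_pred = model(x, t, music)
        # Placeholder update rule: blend toward the prediction as noise decreases.
        # A real sampler would follow its own noise schedule (DDPM/DDIM-style).
        noise_level = step / n_steps
        x = noise_level * torch.randn_like(x) + (1.0 - noise_level) * x0_pred
        # Inpainting constraint: overwrite the kept region with a matching-noise
        # version of the reference so edits blend smoothly at mask boundaries.
        ref_noised = noise_level * torch.randn_like(x) + (1.0 - noise_level) * reference_motion
        x = keep_mask * ref_noised + (1.0 - keep_mask) * x
    return x


# Example: keep part of the pose vector of an existing clip, regenerate the rest.
ref = torch.randn(1, 150, 151)
mask = torch.zeros_like(ref)
mask[:, :, :75] = 1.0              # illustrative split; real joint indexing differs
music = torch.randn(1, 150, 35)
edited = edit_sequence(MotionDenoiser(), ref, mask, music)
```

The same masking trick supports joint-wise edits (for example, regenerating only the lower body) and temporal in-betweening, which is the style of editability EDGE demonstrates [1].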
Planned Integrations
MVNT STUDIO Compose: A core feature of MVNT's creative suite, allowing developers and game studios to generate high-quality dance movements for virtual environments using music or other inputs.
MVNT STUDIO Customization: Enables users to modify the style of existing dance sequences, adjusting the mood or aligning them with specific artistic influences, such as K-pop groups or styles associated with entertainment companies like HYBE, JYP Entertainment, and SM Entertainment.
Choreography Assistant: An interactive tool for professional choreographers to ideate, refine, and visualize dance sequences with AI-suggested movements. Incorporates inputs such as music, text, and images to tailor choreography to specific artistic intent.
References
[1] Tseng, J., Castellon, R., & Liu, C. K. (2023). EDGE: Editable Dance Generation From Music. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[2] Li, R., Yang, S., Ross, D. A., & Kanazawa, A. (2021). AI Choreographer: Music Conditioned 3D Dance Generation with AIST++. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV).
[3] Li, S., Yu, W., Gu, T., Lin, C., Wang, Q., Qian, C., Loy, C. C., & Liu, Z. (2022). Bailando: 3D Dance Generation by Actor-Critic GPT with Choreographic Memory. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[4] Liu, Y., Sra, M., et al. (2024). DanceGen: Supporting Choreography Ideation and Prototyping with Generative AI. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST).
Own your movement.
MVNT