Dyads (Work in Progress)

I am working with choreographer and particle physicist Dr. Mariel Pettee on a project that uses AI to create dance choreography. The approach trains machine learning models on motion-capture recordings of a dancer's performance so that the model can learn from and generate new dance movements, as described in our paper Beyond Imitation: Generative and Variational Choreography via Machine Learning.

Building on that work, this project takes a step further by focusing on duets. We are exploring how AI can choreograph dances involving complex interactions between two dancers, reflecting a more dynamic and authentic expression of human connection and creativity. The original model used LSTM units to learn from motion-captured dance movements; this project proposes a new approach that incorporates self-attention mechanisms and Graph Neural Networks (GNNs). Together, these techniques are expected to capture the relationships within and between dancers' movements over time more deeply. Specifically, the project will replace the LSTM sequence model with self-attention, enabling the model to weight significant connections across frames of dance movement more effectively. It will also use GNNs for pre- and post-processing, exploiting the spatial relationships between dancers' joints.

The deliverables of the project consist of the developed dataset and its visualizations, the implemented models with their trained weights, and a report together with a repository containing the full documentation and code for each step.
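To make the self-attention idea concrete: each pose frame attends to every other frame in the sequence, so the model can emphasize connections between distant moments of a phrase rather than passing information step by step as an LSTM does. The sketch below is a minimal single-head scaled dot-product attention in NumPy; the shapes, weight matrices, and the tiny 6-dimensional "pose" vectors are all illustrative assumptions, not the project's actual architecture.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a motion sequence.

    x: (T, D) array of T pose frames, each a flattened D-dim joint vector.
    Returns (T, D) frames, each a weighted mix of all frames in the sequence.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) frame-to-frame affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over frames
    return weights @ v                               # context-mixed frames

rng = np.random.default_rng(0)
T, D = 8, 6                                          # 8 frames of toy 6-dim poses
x = rng.normal(size=(T, D))
w_q, w_k, w_v = (rng.normal(size=(D, D)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (8, 6)
```

In a real model this would be one head of a multi-head attention layer stacked with feed-forward blocks, but the core operation, every frame scoring its relevance to every other frame, is the same.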
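The GNN side treats a dancer's skeleton as a graph whose nodes are joints and whose edges are bones, so each joint's features are updated from its skeletal neighbors. Below is a minimal graph-convolution layer in NumPy in the spirit of that pre-/post-processing step; the five-joint toy skeleton, edge list, and feature sizes are hypothetical stand-ins for the project's real motion-capture rig.

```python
import numpy as np

# Toy 5-joint skeleton: joint 1 is a hub connected to joints 0, 2, 3, 4.
edges = [(0, 1), (1, 2), (1, 3), (1, 4)]
N, F = 5, 3                                  # 5 joints, 3 features per joint
A = np.zeros((N, N))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0                  # undirected bones
A += np.eye(N)                               # self-loops so a joint keeps its own signal
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))          # symmetric normalization D^{-1/2} A D^{-1/2}

def gcn_layer(h, w):
    """One graph-convolution layer: average neighbor features, project, ReLU."""
    return np.maximum(A_hat @ h @ w, 0.0)

rng = np.random.default_rng(0)
h = rng.normal(size=(N, F))                  # per-joint features for a single frame
w = rng.normal(size=(F, F))                  # learnable projection
out = gcn_layer(h, w)
print(out.shape)  # (5, 3)
```

Extending this to a duet would mean adding edges between the two dancers' graphs (for example between hands in contact), which is exactly the kind of inter-dancer relationship the project aims to model.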