
Google Summer of Code (2024)

ChoreoAI


Project Overview

The animating premise of ChoreoAI is that AI, at its most artistically meaningful, can deepen live experience. When an audience watches two dancers move together, they perceive surfaces: bodies in space, shapes in time. What they cannot easily see are the invisible threads of influence pulling between those bodies — the way one dancer's shift of weight redirects their partner's momentum, the tension held between hands not yet touching, the mutual attunement that makes a duet feel alive rather than merely coordinated. ChoreoAI asks whether computational tools can make those invisible dynamics perceptible, and in doing so, whether they can change what it means to be present in the room with live performance.

The central question driving the work was therefore not a technological one: What can AI reveal about the dynamics between dancing partners that embodied experience alone cannot fully articulate? Most existing AI dance research treats dance as a solo activity — a single body moving through space. ChoreoAI insisted on the duet as its unit of analysis, because partnering — the negotiation of weight, trust, momentum, and mutual attention between two bodies — is where the ethical dimensions of physical interaction live, and where the most compelling questions for both art and technology reside.

The artist-technologist collaboration was structured as a genuinely reciprocal inquiry. Artists helped define the scope of data collection, consented to and shaped how their movement would be used, co-developed vocabulary that could bridge choreographic and computational frameworks, and evaluated and responded to model outputs not as end-users but as co-investigators. The research explicitly rejected a "problem-solving" framing — AI was not being asked to optimize partnering or produce better choreography. It was being asked to make legible what bodies already know but cannot easily see.


This orientation extended beyond the research team into public-facing participatory work. Using the findings and frameworks developed through ChoreoAI as a foundation, I designed and facilitated workshops inviting general audiences — not trained dancers — to encounter the ethics of physical interaction directly through their own bodies. Participants moved through guided partnering studies drawn from the Partnering Lab's somatic practice: exercises in weight-sharing, mutual attunement, and responsive listening that asked them to notice what trust, care, and consent actually feel like in the body, not just as abstract concepts. The AI research served as both provocation and frame: when audiences learned that algorithms trained on partnered movement could detect tension, pull, and influence between bodies — connections invisible to the naked eye — it reoriented how they understood their own physical interactions with others.


The goal was not to teach dance technique but to use movement as an instrument of ethical inquiry accessible to anyone with a body. In this way, ChoreoAI was never only a research project: it was a participatory platform for asking, in public, what it means to be physically responsible to another person. Two papers were produced: "Invisible Strings: Revealing Latent Dancer-to-Dancer Interactions with Graph Neural Networks" and "Dyads: Artist-Centric, AI-Generated Dance Duets".


The underlying models allow for a deeper understanding of the relationships within and between dancers' movements over time. More specifically, the project explored replacing traditional LSTM sequence models with self-attention mechanisms, enabling the model to weight significant connections across frames of dance movement more effectively. It also used graph neural networks (GNNs) for pre- and post-processing, capturing the spatial relationships between dancers' joints. The deliverables of the project consisted of the developed dataset and its visualizations, the implemented agents with their respective weights, and a report, together with a repository containing the full documentation and code for each step.
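To make the two mechanisms above concrete, here is a minimal NumPy sketch, not the project's implementation: the shapes, joint counts, and function names are illustrative assumptions. It shows (1) scaled dot-product self-attention across time frames, which lets every frame of movement weight every other frame directly rather than through an LSTM's sequential memory, and (2) one round of neighbour averaging over a skeleton graph, the kind of spatial aggregation a GNN performs over joints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes (not from the project): T frames, J joints, 3D coords.
T, J, D = 16, 17, 3
poses = rng.normal(size=(T, J * D))  # one dancer's motion, flattened per frame

def self_attention(x):
    """Scaled dot-product self-attention across time frames.

    Each frame attends to every other frame, so influential moments
    anywhere in the sequence can be emphasized directly, instead of
    being filtered through an LSTM's step-by-step memory.
    """
    d_k = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_k)                  # (T, T) frame-to-frame affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ x, weights                      # attended frames, attention map

def graph_smooth(joints, adjacency):
    """One round of message passing over the skeleton graph: each joint
    is averaged with its neighbours, the spatial aggregation a GNN
    applies when pre- or post-processing pose data."""
    deg = adjacency.sum(axis=-1, keepdims=True)
    return (adjacency @ joints) / np.maximum(deg, 1)

# Temporal pass: every frame re-expressed as a mixture of all frames.
attended, weights = self_attention(poses)

# Spatial pass on a toy chain "skeleton": joint i linked to i+1, plus self-loops.
A = np.eye(J) + np.eye(J, k=1) + np.eye(J, k=-1)
smoothed = graph_smooth(poses[0].reshape(J, D), A)
```

In the actual project these two ideas are combined inside learned models; the sketch only isolates the core operations so the paragraph's LSTM-versus-attention and joint-graph claims are easy to inspect.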

[Image: The Art & Science of Partnering, School and Staff Class, 2019. Photo: Grace Kathryn Landefeld]

© 2025 Ilya Vidrin
