Discipline: Software Development, Exhibition Design, Generative Design
Location: Cambridge, Massachusetts
Date: April 2015
An undergraduate graduation project completed under the supervision of Professor Markus Wilczek of Tufts University.

In this dance performance, dancers perform on a stage with a large projection display behind them and a motion sensor in front. The motion sensor tracks and records the dancers' bodies as they dance. The captured data is run through an open-source algorithm that translates the recorded motion into an animation, which is projected on the display behind the dancers in real time. The goal of the project is to artfully visualize human-computer interaction. When the dancers dance in front of the sensor, they interact with the computer; when the computer records, interprets, and projects their dance on the display, it interacts back with the dancers. Seeing their live image on the display, the dancers reinterpret and change their mood and movements, and the conversation goes back and forth.
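The capture-to-projection loop can be sketched roughly as follows. This is only an illustrative sketch, not the project's actual code: the grid size, function names, and the idea of rasterizing tracked joints onto a coarse grid are all assumptions standing in for the open-source algorithm described above.

```python
# Hypothetical sketch: map tracked body joints onto a low-resolution
# grid, producing one "pixelated" frame for projection. The Kinect
# integration and joint coordinates here are illustrative assumptions.

GRID_W, GRID_H = 64, 48  # coarse resolution gives the pixelated look

def rasterize_joints(joints, w=GRID_W, h=GRID_H):
    """Map normalized joint positions (x, y in [0, 1]) onto a coarse
    binary grid: each occupied cell becomes one lit 'pixel'."""
    grid = [[0] * w for _ in range(h)]
    for x, y in joints:
        col = min(int(x * w), w - 1)   # clamp so x == 1.0 stays in range
        row = min(int(y * h), h - 1)
        grid[row][col] = 1
    return grid

# Example frame: a few joints from one tracked dancer
# (head, torso, left hand, right hand).
frame = rasterize_joints([(0.5, 0.1), (0.5, 0.4), (0.3, 0.5), (0.7, 0.5)])
```

In a live setup, a loop like this would run once per sensor frame, with the resulting grid drawn to the projection display, so the animation follows the dancers with minimal delay.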
I wanted the visual aesthetic of the animation to be pixelated and digital-looking to support the idea behind the project. Everything looked very raw, like the visual aesthetic of the Terminal application on the Mac. Since the performance was meant to visualize human-computer interaction, there was no need to polish the graphics. I also filmed the first performance and edited the footage.
Even though the dancers had a loosely defined choreography and learned how the Kinect works during rehearsals, at the final show the choreography was mostly improvisational. The goal was to generate a different performance, and a new conversation between the dancers and the computer, at each show.