I’ve been meaning to document my work on this project for about a year now. TBA 2014 reminded me it’s time to actually do it! So here it is:
For the past year and a half I’ve been working with a dance company called bobbevy. I’ve been creating graphics that go along with the dance performance called “This is how we disappear”. Here’s a review at Portland Monthly.
More behind the scenes information after the break!
Jesse Meija was doing music and got me involved with this project; I’m very grateful!
Effects for the first set of performances:
- A forest of trees created from drawings by David Stein.
- Particle effects that mimicked the dancers’ movement.
- Particle effects that just fly across the screen.
In order to accomplish this, I wrote a piece of software that would do the animation and handle tracking the dancers. There were two versions; the first was used to perform a few times, notably at Dance+ in 2012 and as part of Experimental Half Hour XXXVII. It consisted of the following pieces:
- Cinder, used as a framework
- Freenect, used to interface to the Microsoft Kinect.
- Control, used to control the software from an iPad
This version worked OK. Before I started using Control, I had been triggering all of the sequences with keyboard commands. That worked fine, but I had to have a cheat sheet that told me which keys did what. Also, each command just mutated the state of the program, so if you triggered things in a different order you’d end up in a different state. This made some rehearsals hard, because it was difficult to return the graphics to a previous state. With Control, however, the software became easier to use. bobbevy performed in Milwaukee without me and was able to use the software just fine! For Dance+ the Kinect refused to work in the studio, I think because the temperature in the room was so high, so I ended up “drawing” the dancers with a multi-touch interface in Control.
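The order-dependence problem above comes down to commands that mutate one piece of state at a time versus triggers that set the whole state at once. Here’s a minimal Python sketch of the difference; all names are hypothetical, not from the actual bobbevy software:

```python
# Hypothetical sketch: order-dependent mutations vs. full-state presets.
# None of these names come from the real (Cinder/C++) software.

class VisualState:
    def __init__(self):
        self.effect = "none"
        self.particle_speed = 0.0
        self.tree_opacity = 0.0

# Order-dependent: each command mutates one field, so the result
# depends on the sequence of triggers that came before it.
def toggle_trees(state):
    state.tree_opacity = 1.0 - state.tree_opacity

# Preset-based: each trigger specifies the complete state, so any
# cue is reachable from anywhere -- handy when a rehearsal jumps around.
PRESETS = {
    "forest":   {"effect": "trees", "particle_speed": 0.0, "tree_opacity": 1.0},
    "swarm":    {"effect": "swarm", "particle_speed": 2.5, "tree_opacity": 0.0},
    "blackout": {"effect": "none",  "particle_speed": 0.0, "tree_opacity": 0.0},
}

def apply_preset(state, name):
    for key, value in PRESETS[name].items():
        setattr(state, key, value)
```

With presets, returning the graphics to a previous cue during rehearsal is just one trigger, no matter what was fired in between.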
For the particle effects that followed the dancers, I ended up using blob tracking and distinguishing blobs based on their distance from the Kinect. I liked the stateless design because the dancers would move in and out of the Kinect’s view, and I felt that keeping track of them properly would have been a nightmare. It had some surprising benefits, too. The swarms move between the dancers when their relationship to the Kinect changes, which created some really nice animations. Also, this piece has a lot of tension between the dancers, and the particles ended up expressing some of that tension when the dancers were about the same distance from the Kinect.
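The stateless assignment could look something like the following Python sketch (the real software was Cinder/C++; the names and the two-swarm setup here are assumptions for illustration):

```python
# Stateless per-frame assignment of particle swarms to blobs, keyed only
# on distance from the Kinect. Illustrative sketch, not the actual code.

def assign_swarms(blobs):
    """blobs: list of (blob_id, depth_mm) tuples for one frame.

    Returns {swarm_name: blob_id}. No state is carried between frames:
    each frame, the nearest blob gets the 'near' swarm and the farthest
    gets the 'far' swarm. When dancers swap distances, the swarms swap
    dancers -- the surprising crossing behavior described above.
    """
    if not blobs:
        return {}
    ordered = sorted(blobs, key=lambda b: b[1])
    assignment = {"near": ordered[0][0]}
    if len(ordered) > 1:
        assignment["far"] = ordered[-1][0]
    return assignment
```

Because nothing persists across frames, a dancer leaving and re-entering the Kinect’s view needs no special handling, at the cost of the swarm-swapping ambiguity when two dancers sit at nearly the same depth.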
Additional effects for the second set of performances:
- Static/simple projection mapped screens (similar to my Party House project).
- Realtime projection mapped patterns on the dancers’ bodies
For the second version of the software, I used the following new pieces:
- QTimeline, a timeline that lets you control tweens of variables
- QuNeo, as much as I liked Control, using a touchscreen while not looking at it (I had to watch the dancers for my cues!) is not ideal. A physical controller lets you rest your finger on the button or slider you need to push without triggering it. (The cool kids pronounce it keeen-wah.)
- LabMidi, used to interface to the QuNeo
- For TBA, three (yes, three!) projectors. Two were used to cover the wide background, and one was used to project onto the dancers themselves.
The timeline solved many problems for me. It took what I used to hard-code in the application (fade times, animation speed, etc.) and moved it into a data format, and the editing GUI was nice to have as well. A new version of Cinder made using multiple displays easier: I no longer needed to mirror my desktop screen, which meant I could show debug and other helpful info on my own display. The QuNeo also let me directly control ramp parameters, so I didn’t need to rely on predetermined fades as much. That let me be more engaged with the visuals, which was really fun. I think the trick will be finding the right balance between direct control and triggering presets. It is probably the same balance electronic musicians search for.
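The idea of moving fade times and speeds out of code and into data can be sketched like this. This is just an illustration of the concept, not QTimeline’s actual format or API:

```python
# Minimal sketch of a tween timeline: parameters live in data, and the
# code only interpolates. Hypothetical format, not QTimeline's.

TIMELINE = [
    # (start_time_s, end_time_s, parameter, start_value, end_value)
    (0.0,  5.0, "tree_opacity",   0.0, 1.0),   # fade the forest in
    (5.0, 20.0, "particle_speed", 0.5, 2.0),   # ramp up the swarm
]

def evaluate(timeline, t):
    """Linearly tween every parameter for time t (in seconds)."""
    values = {}
    for start, end, param, v0, v1 in timeline:
        if t <= start:
            values.setdefault(param, v0)
        elif t >= end:
            values[param] = v1
        else:
            frac = (t - start) / (end - start)
            values[param] = v0 + (v1 - v0) * frac
    return values
```

Changing a fade now means editing a row of data (or dragging a keyframe in a GUI) instead of recompiling the application.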
The newest effect for the second run was the projection mapped dancers. To accomplish this, I had to find the dancers with the Kinect and then project onto them as closely as possible. I used the vvvv patch from here as a starting point to learn how to calibrate my projector with the Kinect. In the end, I wrote my own calibration code because it fit the setup workflow a bit better.
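At the core of this kind of calibration, once you have estimated a 3x4 projection matrix (by matching known Kinect-space points to projector pixels), mapping any depth point onto the projector is a matrix multiply plus a perspective divide. A small Python sketch of that last step, with a made-up matrix purely for illustration:

```python
# Sketch of applying a projector calibration: P is a 3x4 projection
# matrix mapping Kinect-space points to projector pixels. The matrix
# values used in practice come from a calibration step; nothing here
# reflects the actual numbers or code from the performance.

def project(P, point3d):
    """Map a Kinect-space point (x, y, z) to projector pixels (u, v)."""
    x, y, z = point3d
    u = P[0][0]*x + P[0][1]*y + P[0][2]*z + P[0][3]
    v = P[1][0]*x + P[1][1]*y + P[1][2]*z + P[1][3]
    w = P[2][0]*x + P[2][1]*y + P[2][2]*z + P[2][3]
    return (u / w, v / w)
```

With this mapping in hand, drawing a pattern at the dancer’s blob position in Kinect space puts it (close to) on their body in projector space.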
The projection mapped dancers worked pretty well. I was really excited to see them turn into an indistinguishable mass one moment and then turn back into dancers the next. I think this is what projection mapping should do: transform objects and confuse you, then bring you back to reality. I hope to do more of this in the future!
Cross posted from my blog.