I went to the DorkbotPDX Open Lab today (Feb 26th, 2012) at ADX and sat down to play with the Kinect I got a while ago. The result is a "Cat Sequencer" that uses the Kinect to trigger samples. There are two looping samples and one non-looping sample. The trigger areas for the samples have been placed around the plates. They are visible on the computer as cubes. The more solid a cube appears, the more filled its trigger area is. Once a trigger is filled enough, it plays its sample. Read on for more details and code.
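The trigger logic described above boils down to counting how many depth points from the Kinect land inside an axis-aligned cube and firing once the count crosses a threshold. Here's a minimal Java sketch of that idea (the real project is a Processing sketch built on Borenstein's Virtual Drum Kit code; the class names, threshold, and coordinates below are my own invention for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class TriggerDemo {
    // Stand-in for Processing's PVector: a point in Kinect space.
    static class PVector3 {
        final float x, y, z;
        PVector3(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }
    }

    static class Trigger {
        final PVector3 center;
        final float size;          // edge length of the cube, in Kinect units
        final int fillThreshold;   // how many points inside before it fires

        Trigger(PVector3 center, float size, int fillThreshold) {
            this.center = center;
            this.size = size;
            this.fillThreshold = fillThreshold;
        }

        // True if the point lies inside this cube.
        boolean contains(PVector3 p) {
            float h = size / 2;
            return Math.abs(p.x - center.x) <= h
                && Math.abs(p.y - center.y) <= h
                && Math.abs(p.z - center.z) <= h;
        }

        // Count the depth points inside the cube; fire when filled enough.
        // The fill ratio is also what drives how "solid" the cube is drawn.
        boolean isFired(List<PVector3> depthPoints) {
            int count = 0;
            for (PVector3 p : depthPoints) {
                if (contains(p)) count++;
            }
            return count >= fillThreshold;
        }
    }

    public static void main(String[] args) {
        Trigger t = new Trigger(new PVector3(0, 0, 1000), 200, 3);
        List<PVector3> cloud = new ArrayList<>();
        cloud.add(new PVector3(10, 10, 1010));   // inside
        cloud.add(new PVector3(-20, 5, 990));    // inside
        cloud.add(new PVector3(0, 0, 1005));     // inside
        cloud.add(new PVector3(500, 500, 2000)); // outside
        System.out.println(t.isFired(cloud)); // prints "true"
    }
}
```

In the actual sketch this check runs every frame against the full depth image, which is why rendering the cubes with fill-proportional opacity makes such a handy debugging view.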
I used the Virtual Drum Kit sample from Making Things See by Greg Borenstein, an old DorkbotPDX hacker guy. I added a point cloud for debugging and the ability to move the triggers at runtime, so I could get the kit set up the way I wanted. I can also save/restore the trigger positions to a simple text file. After I did all of that, I needed a video to show it off, so I decided to get my felines involved. Here's the result.
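The save/restore of trigger positions could be as simple as one "x y z" line per trigger. A sketch of that approach in plain Java (the file format, method names, and values here are assumptions for illustration, not the original code; a Processing sketch would more likely use `saveStrings()`/`loadStrings()`):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class TriggerStore {
    // Write each trigger position as "x y z" on its own line.
    static void save(Path file, List<float[]> positions) throws IOException {
        List<String> lines = new ArrayList<>();
        for (float[] p : positions) {
            lines.add(p[0] + " " + p[1] + " " + p[2]);
        }
        Files.write(file, lines);
    }

    // Parse the same format back into a list of xyz triples.
    static List<float[]> load(Path file) throws IOException {
        List<float[]> positions = new ArrayList<>();
        for (String line : Files.readAllLines(file)) {
            String[] parts = line.trim().split("\\s+");
            positions.add(new float[] {
                Float.parseFloat(parts[0]),
                Float.parseFloat(parts[1]),
                Float.parseFloat(parts[2]) });
        }
        return positions;
    }

    public static void main(String[] args) throws IOException {
        Path f = Files.createTempFile("triggers", ".txt");
        List<float[]> saved = List.of(
            new float[] { 0f, 100f, 1000f },
            new float[] { -50f, 80f, 900f });
        save(f, saved);
        List<float[]> loaded = load(f);
        System.out.println(loaded.size());    // prints "2"
        System.out.println(loaded.get(1)[2]); // prints "900.0"
        Files.delete(f);
    }
}
```

A plain-text format like this is easy to hand-edit between sessions, which matters when you're nudging trigger cubes around a physical setup.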
Making Things See is a good book. If you're already a software guy like me, it's a really fast read and a good intro to the OpenNI API. I had tried playing with OpenNI a while ago while learning OpenFrameworks, but I was never able to get OpenNI to find my Kinect. Making Things See pointed me at fast/easy installers to get OpenNI working with Processing, which meant I could just start hacking and not worry about matching up the exact versions of various libraries. What a relief! The book's projects are great springboards for all kinds of cool things. I'm excited to keep exploring.
Next up: I plan to play with the skeletal data.