I started with the Virtual Drum Kit sample from Making Things See by Greg Borenstein, an old DorkbotPDX hacker guy. I added a point cloud for debugging, plus the ability to move the triggers at runtime so I could get the kit set up exactly the way I wanted. I can also save and restore the trigger positions to a simple text file. After all of that, I needed a video to show it off, so I decided to get my felines involved; here's the result.
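For the curious, the save/restore part is pretty simple: each trigger just becomes one "x,y,z" line in a text file. Here's a rough plain-Java sketch of the idea, not my actual code; the `Trigger` class and the millimeter coordinates are made up for illustration, and in the real Processing sketch you'd more likely lean on `saveStrings()` and `loadStrings()` for the file I/O.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of persisting drum-trigger positions as one "x,y,z" line per trigger.
// Trigger is a hypothetical stand-in for whatever the real sketch uses.
public class TriggerStore {
    static class Trigger {
        float x, y, z;
        Trigger(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }
    }

    // Write each trigger as a comma-separated line.
    static void save(List<Trigger> triggers, Path file) throws IOException {
        List<String> lines = new ArrayList<>();
        for (Trigger t : triggers) {
            lines.add(t.x + "," + t.y + "," + t.z);
        }
        Files.write(file, lines);
    }

    // Parse each line back into a trigger.
    static List<Trigger> load(Path file) throws IOException {
        List<Trigger> triggers = new ArrayList<>();
        for (String line : Files.readAllLines(file)) {
            String[] p = line.split(",");
            triggers.add(new Trigger(Float.parseFloat(p[0]),
                                     Float.parseFloat(p[1]),
                                     Float.parseFloat(p[2])));
        }
        return triggers;
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("triggers", ".txt");
        List<Trigger> kit = Arrays.asList(
            new Trigger(100f, 200f, 1500f),    // e.g. snare, in mm from the Kinect
            new Trigger(-150f, 180f, 1600f));  // e.g. hi-hat
        save(kit, file);
        List<Trigger> restored = load(file);
        System.out.println(restored.size() + " triggers restored; first x = " + restored.get(0).x);
        Files.delete(file);
    }
}
```

The nice thing about a plain text format is you can hand-tweak a trigger position in any editor between runs.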
Making Things See is a good book. If you're already a software guy like me, it's a fast read and a good intro to the OpenNI API. I had tried OpenNI a while ago when I was learning OpenFrameworks, but I was never able to get it to find my Kinect. Making Things See pointed me at fast, easy installers that get OpenNI working with Processing, which meant I could just start hacking instead of worrying about matching up exact versions of various libraries. What a relief! The book's projects are great springboards for all kinds of cool things, and I'm excited to keep exploring.
Next up: I plan to play with the skeletal data.