Today we finished our latest micro project, a multi-touch MIDI sequencer for OS X made with openFrameworks.
This was a very short project, but the sequencer still works pretty well. Any software that can receive MIDI can be used together with it. In the video we're using reacTIVision for blob tracking and Reason for generating audio. Good stuff!
Here are the results of the "fast prototyping" we did the other day. We made two conceptual drum kit apps for our multi-touch screen, but at the moment the latency is far too high. While making these we came up with the winning idea for our next project, an intuitive MIDI sequencer for multi-touch systems. More about that later...
Junhee appears to be the worst drummer ever, but it's mostly because of the high latency, right? : )