I’ve uploaded a video to Vimeo that shows what Miko is all about. The demo was recorded last week and ran at about 60 to 150 fps.
To view the demo in HD, make sure to follow this link to the Vimeo website, or click on the video itself.
Just a quick recap of what Miko is about:
Every image is decomposed into key elements and recreated using various fields of data.
The sampled pixel data is used to drive a fluid simulation, with image recognition layered on top. Custom vector maps are derived to let elements `search` for each other. The results can be displayed using fields, particles, geometry and image data.
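To give a rough idea of what a derived vector map might look like, here is a minimal sketch: it builds a vector field from the brightness gradient of a grayscale image, so that elements following the field drift toward bright features. The struct and function names are my own illustration and don't come from Miko's actual source.

```cpp
#include <vector>
#include <cassert>

struct Vec2 { float x, y; };

// Hypothetical sketch: central-difference brightness gradient of a
// w*h grayscale buffer (row-major, values in 0..1). Each cell of the
// resulting field points toward brighter neighbouring pixels, which
// elements can then use to "search" for features.
std::vector<Vec2> vectorMapFromBrightness(const std::vector<float>& img, int w, int h) {
    std::vector<Vec2> field(w * h, Vec2{0.0f, 0.0f});
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float dx = (img[y * w + (x + 1)] - img[y * w + (x - 1)]) * 0.5f;
            float dy = (img[(y + 1) * w + x] - img[(y - 1) * w + x]) * 0.5f;
            field[y * w + x] = Vec2{dx, dy};
        }
    }
    return field;
}
```

In practice the same idea extends to other channels (edges from the image-recognition pass, optical flow from the fluid sim), which is presumably where the different selectable vector maps come from.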
A MIDI device can be used to control parts of the simulation and application. Various controls can be mapped, such as particle decay rate, feature selection, image selection, vector map selection, particle count, etc. A BPM counter is included to sync images and triggers to a music feed.
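The mapping itself can be as simple as a table from MIDI control-change numbers to parameter handlers that rescale the raw 0&ndash;127 value. The sketch below assumes that shape; the parameter names mirror this post, not Miko's actual API.

```cpp
#include <map>
#include <functional>
#include <cassert>

// Hypothetical simulation parameters named after the controls in the post.
struct SimParams {
    float particleDecay = 0.5f;   // normalized 0..1
    int   particleCount = 10000;
};

// Routes a raw MIDI CC value (0..127) to whichever handler is mapped
// to that controller number; unmapped controllers are ignored.
class MidiMapper {
public:
    using Handler = std::function<void(SimParams&, int)>;
    void map(int cc, Handler h) { handlers_[cc] = std::move(h); }
    void onControlChange(SimParams& p, int cc, int value) {
        auto it = handlers_.find(cc);
        if (it != handlers_.end()) it->second(p, value);
    }
private:
    std::map<int, Handler> handlers_;
};
```

A handler for decay rate would then just divide by 127 to get a 0..1 value, and a BPM-synced trigger would fire the mapped handlers on beat boundaries instead of on incoming CC messages.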
My plan is to build an interface around the various elements that control the simulation and compositor. After that I’d like to start looking into OpenCL (or CUDA) and other forms of image control. If you’d like to help me out with the code or get a build, send me an email: firstname.lastname@example.org
This version is compiled using MSVC 2010 and runs stably on Windows 7, but I’d like to compile a version for OS X as well. Again, if you’re interested in helping me out here, let me know! I’d love to support multiple platforms.
Other thoughts? Send them over!
And for the non-believers out there, a couple of quick pictures taken at Nachtwerk, where I had my first test run: