Juggle Music

I've made some progress on my Juggle Music project; there is a public repo over on GitHub where I'm managing the code.

After more research I decided to skip SimpleCV and just dive directly into learning OpenCV.

I made this call based on the lessons I learned while developing the GUI for my Python-powered iRacing Stats project. Initially I used gui2py, which wraps the wxPython library, because wxPython looked very scary to begin with. As I progressed and my GUI became more complex I started hitting issues with gui2py, and rather than hack on it to make it interface with wxPython the way I wanted, I realized I was better off doing it all in wxPython directly.

I've got the basics of the object tracking completed. It's color based and pleasingly simple: you fire up the app and it prompts you to select the three colored objects. I do this rather than hard-coding the color ranges because sampling at runtime copes with different lighting conditions and lets me change props.

For the music playback I'm using the Mingus library along with FluidSynth. I've got a lot of learning to do on this side of things; my knowledge of all things MIDI is very basic at this stage. Currently I've got it rigged up to simply play back notes based on how high the objects are thrown, so right now it really is just a proof of concept.

Next I'll be working on configuring the "juggle space": working out how many segments I can cut the webcam image into while still triggering samples accurately. I can also see that drawing these segments on the output window would be helpful.

Contents © 2017