I worked with Aaron Arntz to produce this little experiment. The idea is that the program divides the camera feed into blocks, captures a color value for each block, and plays music whenever those colors shift… It’s a bit of a mess right now but it works!
I dealt with image processing and display, and Aaron used the minim library to produce the sounds.
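The block-and-shift idea can be sketched roughly like this: average the color of each block of the frame, then compare those averages against the previous frame and flag any block that moved past a threshold. This is just an illustrative sketch, not the actual project code — the class and method names (`BlockColors`, `blockAverages`, `shiftedBlocks`) and the max-channel-difference threshold are my own assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the block/color-shift idea; names and the
// thresholding scheme are illustrative, not from the actual project.
public class BlockColors {
    // Average RGB per block for a flat row-major ARGB pixel array,
    // split into a blocks-by-blocks grid. Returns one {r,g,b} per block.
    public static int[][] blockAverages(int[] pixels, int w, int h, int blocks) {
        int bw = w / blocks, bh = h / blocks;
        int[][] avg = new int[blocks * blocks][3];
        for (int by = 0; by < blocks; by++) {
            for (int bx = 0; bx < blocks; bx++) {
                long r = 0, g = 0, b = 0;
                for (int y = by * bh; y < (by + 1) * bh; y++) {
                    for (int x = bx * bw; x < (bx + 1) * bw; x++) {
                        int c = pixels[y * w + x];
                        r += (c >> 16) & 0xFF;
                        g += (c >> 8) & 0xFF;
                        b += c & 0xFF;
                    }
                }
                int n = bw * bh;
                avg[by * blocks + bx] = new int[] {
                    (int) (r / n), (int) (g / n), (int) (b / n)
                };
            }
        }
        return avg;
    }

    // Indices of blocks whose average color changed by more than
    // `threshold` (largest per-channel difference) between two frames.
    public static List<Integer> shiftedBlocks(int[][] prev, int[][] cur, int threshold) {
        List<Integer> shifted = new ArrayList<>();
        for (int i = 0; i < cur.length; i++) {
            int d = Math.max(Math.abs(cur[i][0] - prev[i][0]),
                    Math.max(Math.abs(cur[i][1] - prev[i][1]),
                             Math.abs(cur[i][2] - prev[i][2])));
            if (d > threshold) shifted.add(i);
        }
        return shifted;
    }
}
```

In a Processing sketch, each index returned by something like `shiftedBlocks` would then be mapped to a minim sound trigger.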
The code’s over on GitHub.
And here’s a video of the project in action:
As you can see something went terribly wrong/right with my recording setup…