Turning a camera into an instrument (of sorts).

I worked with Aaron Arntz to produce this little experiment. The idea is that the camera image is split into a grid of blocks, a color value is captured for each block, and a sound plays whenever a block's color shifts… It's a bit of a mess right now, but it works!

I handled the image processing and display, and Aaron used the Minim library to produce the sounds.
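The block-capture step can be sketched in plain Java. To be clear: the real project is a Processing sketch using its camera API, and everything below (class names, the grid size, the change threshold) is illustrative, not our actual code. The idea is just to average each block's pixels and flag blocks whose average color moved far enough between frames:

```java
// Illustrative sketch of the block-averaging + change-detection idea,
// assuming a frame arrives as a flat int[] of packed 0xRRGGBB pixels.
public class BlockColors {

    // Average color of each block in a cols x rows grid, returned row-major.
    static int[] blockAverages(int[] pixels, int w, int h, int cols, int rows) {
        int[] out = new int[cols * rows];
        int bw = w / cols, bh = h / rows;
        for (int by = 0; by < rows; by++) {
            for (int bx = 0; bx < cols; bx++) {
                long r = 0, g = 0, b = 0;
                for (int y = by * bh; y < (by + 1) * bh; y++) {
                    for (int x = bx * bw; x < (bx + 1) * bw; x++) {
                        int p = pixels[y * w + x];
                        r += (p >> 16) & 0xFF;
                        g += (p >> 8) & 0xFF;
                        b += p & 0xFF;
                    }
                }
                int n = bw * bh;
                out[by * cols + bx] =
                    ((int) (r / n) << 16) | ((int) (g / n) << 8) | (int) (b / n);
            }
        }
        return out;
    }

    // A block "shifts" when its average color moves more than threshold
    // in summed per-channel distance; those blocks would trigger a sound.
    static boolean shifted(int a, int b, int threshold) {
        int dr = Math.abs(((a >> 16) & 0xFF) - ((b >> 16) & 0xFF));
        int dg = Math.abs(((a >> 8) & 0xFF) - ((b >> 8) & 0xFF));
        int db = Math.abs((a & 0xFF) - (b & 0xFF));
        return dr + dg + db > threshold;
    }

    public static void main(String[] args) {
        // Two tiny 4x2 "frames" split into a 2x1 grid: the left block
        // stays black, the right block jumps from black to red.
        int[] f0 = new int[8];
        int[] f1 = new int[8];
        for (int i = 0; i < 8; i++) f1[i] = (i % 4 >= 2) ? 0xFF0000 : 0;
        int[] a0 = blockAverages(f0, 4, 2, 2, 1);
        int[] a1 = blockAverages(f1, 4, 2, 2, 1);
        System.out.println(shifted(a0[0], a1[0], 30)); // left block: false
        System.out.println(shifted(a0[1], a1[1], 30)); // right block: true
    }
}
```

In the actual sketch the "shifted" blocks get handed off to Minim, which plays back the corresponding sounds.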

The code's over on GitHub.

And here’s a video of the project in action:
[vimeo https://vimeo.com/75279998 w=640]

As you can see, something went terribly wrong/right with my recording setup…

“I can always make more”

Here are the results of my collaboration with Billy Dang and Yiyang Liang, tentatively called “I can always make more”. The audio clips used are from field recordings we made at MOMA (including recordings of audio installation pieces), interviews with strangers and friends, and sounds of the city.

[soundcloud url="http://api.soundcloud.com/tracks/111100701" params="" width="100%" height="166" iframe="true" /]

Here are some pics of the recording process:

Photo Sep 13, 2 05 59 PM

Photo Sep 13, 2 17 21 PM

Photo Sep 13, 2 25 46 PM

Photo Sep 13, 2 13 16 PM


For my second computational media project I worked with Brian Clifton to make this prism extruding thingy. We were inspired by the code we found here.

Click here to see the sketch on openprocessing.org. You can mouse over the triangles to extrude them, click them to change their color, or move the slider (or scroll the mouse wheel) to change their maximum height.
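The hover interaction comes down to knowing which triangle the mouse is inside. One common way to do that is a point-in-triangle test using the sign of three cross products; here's a sketch of that idea in plain Java (the real project is a Processing sketch, and the names and coordinates below are made up for illustration, not taken from our code):

```java
// Illustrative point-in-triangle hover test: the point is inside the
// triangle when it sits on the same side of all three edges, i.e. the
// three cross products all share a sign.
public class TriangleHover {

    // 2D cross product of edge (a -> b) with vector (a -> p).
    static float cross(float ax, float ay, float bx, float by,
                       float px, float py) {
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
    }

    // tri holds the vertices as {x1, y1, x2, y2, x3, y3}.
    static boolean contains(float[] tri, float px, float py) {
        float d1 = cross(tri[0], tri[1], tri[2], tri[3], px, py);
        float d2 = cross(tri[2], tri[3], tri[4], tri[5], px, py);
        float d3 = cross(tri[4], tri[5], tri[0], tri[1], px, py);
        boolean hasNeg = d1 < 0 || d2 < 0 || d3 < 0;
        boolean hasPos = d1 > 0 || d2 > 0 || d3 > 0;
        return !(hasNeg && hasPos); // mixed signs means outside
    }

    public static void main(String[] args) {
        float[] tri = {0, 0, 100, 0, 50, 80};
        System.out.println(contains(tri, 50, 30)); // inside  -> true
        System.out.println(contains(tri, 5, 70));  // outside -> false
    }
}
```

Each frame, the sketch would run a check like this against the mouse position and raise the extrusion height of whichever triangle contains it.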

Here’s a vid:

[vimeo https://vimeo.com/74787712 w=640]

And here’s the code: