Friday, April 15, 2016

Working Demo

This week I put together a working demo that successfully maps gestures performed on my spandex wall to a filter of ambient noise in MaxMSP.  Getting to this point required finding the right interaction fabric so that a user could push into it without touching the wall, rigging provisional lighting and a camera, and placing white walls behind the fabric for contrast.  It also took me a while to navigate Jaime Oliver's patch and successfully route the data to MaxMSP via a network port.  Here is a video documenting the progress so far, with a simple right-to-left gesture mapped onto a bandpass filter of some cafe noise:




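For the curious, the routing is conceptually simple, even though the actual tracking happens inside Jaime Oliver's patch.  Here's a minimal sketch of the same idea in Python, assuming opencv-python and python-osc are installed, the camera is device 0, and Max is listening on a [udpreceive 7400] object; the port number and the /gesture/x address are placeholders for the example, not what the patch actually uses:

# Illustrative sketch only -- in my setup Jaime Oliver's patch does the
# tracking. This shows the general idea: find the push point in the camera
# image and route its position to MaxMSP over a network port (OSC via UDP).
import cv2
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # Max listening on [udpreceive 7400]
cap = cv2.VideoCapture(0)                    # the camera watching the fabric

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)   # smooth out fabric texture
    # With side lighting against the white wall, the deepest push reads as
    # the brightest spot; track it and send its normalized x position to Max.
    _, _, _, (x, y) = cv2.minMaxLoc(gray)
    client.send_message("/gesture/x", x / gray.shape[1])  # 0..1 across frame
    cv2.imshow("tracking", gray)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

On the Max side, something like [udpreceive 7400] into [route /gesture/x] into [scale 0. 1. 200. 4000.] feeding the center frequency of a [reson~] on the cafe recording would complete the chain; the frequency range there is just an example.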
I had to take down the installation for this weekend's ECM concert, but I can get back to work on Sunday, when I'll install a stronger light and set up the second camera for the vertical axis.  From there I'll scale up to build my second wall and elaborate the sound design.

Saturday, April 2, 2016

A quickie this week:

I've been working hard at using water as an interface, and this afternoon I finally made some positive steps in that direction.  The first thing is a video of a mock-up of the actual patch, which uses two video sources.  The real patch uses a webcam on one side, but it was hard to make the video and move the water at the same time, so I made a quick recording of the water and used that recording to show the processing.  The patch takes the two sources and blends them with a lumakey.  A lumakey essentially takes all the pixels within a certain luminance range in one video and makes them transparent, revealing the second video behind it.  So here's the screen-capture video:



And the second is a video of the result projected onto a 4x8 piece of black burlap from about ten feet away.  It makes the weird masking decision in the previous video make more sense:
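Since a lumakey is easy to describe in code, here's a rough sketch of the logic in Python with OpenCV.  This is just an illustration of the technique, not how the actual patch is built, and the two file names and the 0-80 luminance range are made up for the example:

# Lumakey sketch: pixels of the foreground clip whose luminance falls in a
# chosen range are keyed out, revealing the background clip behind them.
# Assumes two same-sized clips; "water.mov" stands in for my webcam feed.
import cv2
import numpy as np

water = cv2.VideoCapture("water.mov")
background = cv2.VideoCapture("background.mov")

LO, HI = 0, 80  # luminance range (0-255) that becomes "transparent"

while True:
    ok_a, fg = water.read()
    ok_b, bg = background.read()
    if not (ok_a and ok_b):
        break
    luma = cv2.cvtColor(fg, cv2.COLOR_BGR2GRAY)
    key = cv2.inRange(luma, LO, HI)        # 255 where the pixel is keyed out
    mask = cv2.merge([key, key, key])      # expand to 3 channels
    out = np.where(mask == 255, bg, fg)    # keyed pixels show the background
    cv2.imshow("lumakey", out)
    if cv2.waitKey(30) == 27:              # Esc to quit
        break

water.release()
background.release()
cv2.destroyAllWindows()

A real patch would typically feather the edge of the key rather than hard-switching each pixel; the sketch keeps the hard cut for clarity.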