Friday, April 15, 2016

Working Demo

For this week I put together a working demo that successfully maps gestures performed on my spandex wall to a filter of ambient noise in MaxMSP.  Getting to this point required finding the right interaction fabric so that a user can push into it without touching the wall, rigging a provisional lighting and camera setup, and placing white walls behind the fabric for contrast.  It also took me a while to navigate Jaime Oliver's patch and route its data to MaxMSP over a network port.  Here is a video documenting the progress so far, with a simple right-to-left gesture mapped onto a bandpass filter of some cafe noise:




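For anyone curious about the network routing step, here is a minimal sketch of the kind of bridge involved: tracking data for a horizontal gesture gets forwarded to Max as OSC messages over UDP, which a [udpreceive] object in the patch can pick up and scale onto a bandpass filter's center frequency.  To be clear, this is not Jaime Oliver's patch or its actual message format; the Python client, OSC addresses, port number, and frequency mapping are all illustrative assumptions.

# Sketch only: forward a normalized horizontal gesture position to MaxMSP
# over UDP as OSC messages. Assumes python-osc is installed and that the
# Max patch listens with [udpreceive 7400]; the address names, port, and
# frequency range below are made up for illustration.

import time
import math
from pythonosc.udp_client import SimpleUDPClient

MAX_HOST = "127.0.0.1"   # machine running the Max patch
MAX_PORT = 7400          # port the Max patch listens on (assumed)

client = SimpleUDPClient(MAX_HOST, MAX_PORT)

# Simulate a right-to-left sweep: x goes from 1.0 down to 0.0 over ~2 seconds.
for step in range(100):
    x = 1.0 - step / 99.0
    # Map x (0..1) onto a bandpass center frequency, e.g. 200 Hz to 4 kHz,
    # on an exponential curve so the sweep sounds even across the range.
    center_hz = 200.0 * math.exp(x * math.log(4000.0 / 200.0))
    client.send_message("/wall/x", x)
    client.send_message("/wall/bp_freq", center_hz)
    time.sleep(0.02)

In the actual installation the x position comes from the camera tracking rather than a simulated loop, but the forwarding and mapping idea is the same.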
I had to take down the installation for this weekend's ECM concert, but I can get back to work on Sunday, when I will install a stronger light and set up the second camera for the vertical axis.  From there I'll scale up to build the second wall and elaborate the sound design.
