I've been working on this project for a while, and it's finally taking shape. I hope to post more soon about how it looks, but in the meantime the code is available here on GitHub.
The central premise is that visuals at concerts usually concentrate on the music or the band, but rarely offer the audience an opportunity to feed back into the music and the aesthetics. The mural functions as a mirror for the audience, offering them a new way to engage with the performance and, through the visuals, with the performers. It creates a feedback mechanism tuned to both the audience's motion and energy and the music.
The program uses the Kinect to capture data about the space and the people in it, and augments that with audio data computed by FFT algorithms.
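The post doesn't show the audio pipeline, but the FFT side can be sketched as follows. This is a naive DFT for clarity (a real build would use an FFT library, such as the one bundled with openFrameworks); the function names and the mean-magnitude "energy" measure are my own illustrative choices, not the project's actual code.

```cpp
#include <cmath>
#include <complex>
#include <vector>

const double PI = 3.14159265358979323846;

// Naive DFT: returns the magnitude of each frequency bin for a block
// of audio samples. O(n^2), but identical in result to an FFT.
std::vector<double> binMagnitudes(const std::vector<double>& samples) {
    const size_t n = samples.size();
    std::vector<double> mags(n / 2);
    for (size_t k = 0; k < n / 2; ++k) {
        std::complex<double> sum(0.0, 0.0);
        for (size_t t = 0; t < n; ++t) {
            double angle = -2.0 * PI * k * t / n;
            sum += samples[t] * std::complex<double>(std::cos(angle), std::sin(angle));
        }
        mags[k] = std::abs(sum) / n;
    }
    return mags;
}

// Overall "energy" of the block: mean magnitude across bins, the kind
// of scalar you might map onto a visual parameter of the mural.
double blockEnergy(const std::vector<double>& mags) {
    double total = 0.0;
    for (double m : mags) total += m;
    return mags.empty() ? 0.0 : total / mags.size();
}
```

A pure tone at bin k shows up as a spike at `mags[k]`, which is what makes per-band reactivity possible.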
Computer vision isn't necessarily new; in fact, I've posted appropriations of this technology on this blog before. But there's a lot of room to explore, and here's an exploration using OpenFrameworks and OpenCV to recognize faces, flip them, and attach the flipped copy roughly where the chin is, bringing new meaning to the term "double chin."
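The sketch's source isn't included in the post, but the core trick, flip the detected face and place it under the chin, reduces to two small pieces of geometry. This is a hypothetical sketch: the struct and function names are mine, and in the real project the flip would be `cv::flip` on the face region detected by OpenCV.

```cpp
#include <vector>

// A face bounding box as a cascade detector would report it.
struct Rect { int x, y, w, h; };

// Vertically flip an image stored as rows of pixels, mimicking
// cv::flip(src, dst, 0): the last row becomes the first.
std::vector<std::vector<int>> flipVertical(const std::vector<std::vector<int>>& img) {
    return std::vector<std::vector<int>>(img.rbegin(), img.rend());
}

// Where to draw the flipped copy so its (now top-most) chin lines up
// with the original chin at the bottom of the face box. A small
// overlap lets the two chins blend instead of leaving a seam.
Rect doubleChinRect(const Rect& face, int overlap = 0) {
    return Rect{ face.x, face.y + face.h - overlap, face.w, face.h };
}
```

The flipped face's top edge is the original face's bottom edge, so the chins meet and the mirrored face hangs beneath the real one.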
Maybe it's because I'm in Big Games, or maybe I'm just on a gaming kick, but I've been working on developing another game as well. For our Spatial Media class, my friend and I decided to build a spatially aware installation called the Waiting Game to help pass the time for people stuck waiting in line.
During the design process we considered the different types of places where you might wait in line and the experiences you might have there. My friend Mark Kleback and I were originally inspired by the long lines at the Post Office near NYU, but we quickly decided against designing any interaction for it: the space was already so charged that an interactive game might inflame passions rather than provide a fun diversion.
So instead we set our eyes on the waiting lines at airport check-in counters, specifically JFK's JetBlue check-in terminals. Mind you, we love JetBlue, but we could all do with less waiting in line anywhere. With this in mind, we designed a simple interactive piece in which little airplanes take off at the beginning of the line, fly along set flight paths through it, and land at the end, steering around obstacles along the way.
Using the Kinect, we tracked people as they walked through the line and had the planes divert around them until they could return to their flight plans. Travelers can interact with the planes by sticking out their hands and feet and gently guiding them through the line. The code still has a few bugs to fix, but for the most part it works wonderfully. Unfortunately, due to a poor projector choice we had to scale down our design, as you can see in the video below. With the right equipment, however, we don't see any reason why we couldn't implement this at JFK.
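The divert-then-return behaviour can be sketched as a simple steering rule: each frame, a plane heads toward its next waypoint, but anything the Kinect is tracking inside an avoidance radius pushes it off course; once clear, the waypoint pull takes over again. This is a minimal illustration under my own assumptions, not the installation's actual code, and it handles a single obstacle for brevity.

```cpp
#include <cmath>

struct Vec2 { double x, y; };

// One steering step: head toward the next waypoint, but push away
// from a tracked obstacle closer than avoidRadius. Returns the new
// heading as a unit vector.
Vec2 steer(const Vec2& pos, const Vec2& waypoint,
           const Vec2& obstacle, double avoidRadius) {
    // Unit vector toward the waypoint (the flight plan's pull).
    Vec2 heading{ waypoint.x - pos.x, waypoint.y - pos.y };
    double gLen = std::sqrt(heading.x * heading.x + heading.y * heading.y);
    if (gLen > 0) { heading.x /= gLen; heading.y /= gLen; }

    // Repulsion away from the obstacle, stronger the closer it is.
    Vec2 fromObs{ pos.x - obstacle.x, pos.y - obstacle.y };
    double d = std::sqrt(fromObs.x * fromObs.x + fromObs.y * fromObs.y);
    if (d < avoidRadius && d > 0) {
        double push = (avoidRadius - d) / avoidRadius;  // 0 at the edge, 1 at contact
        heading.x += fromObs.x / d * push;
        heading.y += fromObs.y / d * push;
    }

    // Re-normalize so the plane flies at constant speed.
    double len = std::sqrt(heading.x * heading.x + heading.y * heading.y);
    if (len > 0) { heading.x /= len; heading.y /= len; }
    return heading;
}
```

Because the repulsion fades to zero at the radius edge, the plane's course bends smoothly around a person and then converges back onto the original flight path, which is the behaviour described above.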
The code was written with OpenFrameworks and OpenCV. You can find it on GitHub.