Smart IxD Lab


Thoughts, observations and experimentation on interaction by: Smart Design


Last night was the opening event for the show “Gimme More” at Eyebeam Art+Technology Center, featuring a collection of experimental projects that use augmented reality digital-layering techniques, along with a panel discussion organized by Laetitia Wolff.

We captured short videos of some of our favorite pieces: “Tatooar”, “Last Year” and “Beatvox”.

“Tatooar” (shown in the video above) displays an animated tattoo that drips and crawls over a participant’s skin when viewed in a mirror.

In “Last Year”, by Liron Kroll, memorabilia such as postcards and personal notes come alive when viewed through the window of an iPad screen.

“Beatvox” by Yuri Suzuki is a drum set with robotic drumsticks attached to each piece. A microphone serves as the main input, allowing viewers to control the drums using only their voices as they make beatbox-style sounds.

Smart Interaction Lab was invited to join the panel discussion, where the question “Is augmented reality the next medium?” was posed. Here’s what we riffed on:

Our view of augmented reality in our future is different from the one we usually hear about, which involves an all-purpose prosthetic like a pair of glasses or contact lenses. What we think about as product designers is how our experience of an environment can be enhanced by augmented reality. Consider a surface that turns into a computer workspace that you manipulate by gesture, like our vision for PSFK’s Future of Work report. A keyboard layer lets you enter text; a Photoshop layer lets you manipulate images with your fingers or a paintbrush. Take this one step further and we can consider dynamic interfaces on other surfaces that morph and change depending on the context. A camera that recognizes a cutting board can project information about a recipe or the nutritional content of the food. Microsoft recently published a vision piece about a project they are calling IllumiRoom, where projected images fill an entire space, so your TV is no longer within the frame of a rectangle but on every surface. And the design firm Berg London recently compiled a fascinating blog post on their experiments for Google Creative Lab, in which they used Kinect to create interactive projection mapping on physical desktop objects.

Another aspect of this is what we might call “remote augmented reality”. The technology guru Kevin Kelly talks about a “planetary membrane” comparable in complexity to the human brain. He describes how cameras are everywhere (“three billion artificial eyes”) that see our world and can provide data to an augmented projected layer. In addition to these static cameras, we can think about moving camera-vision entities, essentially robots, that we can tap into to get an augmented view of a space we’re not even in. At the moment, the most direct application of this is security surveillance, but it’s fun to think of less fear-driven applications of remote augmented reality as well: for example, telepresence for remote conversations, or for spectator situations such as being virtually present at a performance. The Media Lab’s Opera of the Future group is experimenting to see if “you can take a live experience, whether it’s a concert or a theater show or hanging out with people you care about, and experience that somewhere else” — not only observe it, but feel as if you’re participating in it as well.


Posted by: Carla Diana
