Smart IxD Lab

Menu

Thoughts, observations and experimentation on interaction by: Smart Design

Check us out

We’ve been keeping an eye on technologies being developed to enable cloud robotics and recently stumbled upon the Robot App Store, a really accessible and quick way to download and share apps for common commercially available robots using just a PC and standard connectors. For example, apps for Roomba let you drive it remotely from a screen interface or by waving an Android phone around. Apps for the Aibo can let you upload disco dancing behaviors or even whole personality overhauls.

In an interview, Mario Tremblay, CEO of online robot retailer RobotShop, described cloud robotics as, “[what] happens when we connect robots to the Internet and then, by doing so, robots become augmented with more capacity and intelligence”. He goes on to explain that, “Connected robots equal augmented robots. By collaborating with other machines and humans, robots transcend their physical limitations and become more useful and capable, since they can delegate parts of their tasks to more suitable parties.”

Although the current Robot App Store offerings (and for the most part the robots themselves) are best described as playthings, the service is a powerful proof of concept and an indication of what the future holds for intelligent, connected devices.

We’re getting our Roomba ready to strut its stuff by singing and dancing for the holiday party in the NY Smart Design studio.

At Smart Interaction Lab we’ve done a lot of exploration around ways that objects communicate through light. As electronics are getting more affordable, it’s become possible to have a full spectrum of colors and patterns, allowing even the simplest of objects to be expressive in new ways. With network-connected Internet of Things components becoming a reality, we’re starting to see light animations emerging not only as expressions of preprogrammed behaviors, but as reflections of live, networked data.

Years ago, the Ambient Orb was a product that was perhaps before its time, letting people hook up a glowing ball to a data feed such as the weather or a stock price. Today, some interesting stand-alone products are taking existing object archetypes and imbuing them with “smarts”, recreating some of the Ambient Orb’s function but inserting it into everyday life. For example, the Spark Socket (currently under development) lets you plug in an ordinary light bulb and connect it to the internet so that you can control it from a smartphone, tablet or computer. You can customize it based on automated behaviors, like setting up a light to turn on to wake you up in the morning, or one that turns off at a certain time every day, or another that flashes when you’ve received a text message.
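The behaviors described above boil down to simple trigger/action rules. Here’s a minimal sketch of that idea in Python; to be clear, this is our own illustration of rule-based lighting, not the actual Spark Socket API, and all the names are invented:

```python
from datetime import time

# Hypothetical rule-based lighting sketch; not the Spark Socket API.
class LightRule:
    def __init__(self, trigger, action):
        self.trigger = trigger  # callable(event) -> bool
        self.action = action    # "on", "off", or "flash"

def evaluate(rules, event):
    """Return the actions fired by an event (a clock tick, a text message...)."""
    return [r.action for r in rules if r.trigger(event)]

rules = [
    LightRule(lambda e: e == ("clock", time(7, 0)), "on"),    # wake-up light
    LightRule(lambda e: e == ("clock", time(23, 0)), "off"),  # nightly shut-off
    LightRule(lambda e: e == ("sms", "received"), "flash"),   # text-message alert
]

print(evaluate(rules, ("clock", time(7, 0))))  # ['on']
```

The appeal of this pattern is that the bulb itself stays dumb; all the cleverness lives in a rules layer you can edit from a phone.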

Greenwave Reality in Denmark is creating a similar technology with a system of connected light bulbs, controllers and apps.

The LIFX (shown in the video below and in the image above) connected LED light bulb adds the dimension of color into the mix, letting people control not only the light intensity, but also the color temperature. It’s got a wifi receiver built in, so people can use their existing light bulb sockets. Joining the ranks of the Kickstarter success stories (they’ve raised $1.3M), their product is also in development for a target date some time next year.

The HUE wifi LED light bulb is a similar product, from consumer electronics giant Philips. It uses a bridge, or hub, that connects to a WiFi router. Unlike the products in development, this one is available at the Apple Store now. Though the SDK for the HUE app is closed, one developer hacked it to work with Siri.

http://vimeo.com/51921782

The Blink(1) “super status light”, currently under development by our Sketching in Hardware friends at ThingM, is an indicator light that plugs into a computer USB port. It is ultra-hackable, being designed with an enclosure that’s easy to open and with sample code that’s accessible to anyone who works with off-the-shelf microcontrollers like Arduino. The product is also designed to interface with existing services like Twitter, Facebook, Pinterest, IFTTT.com and Boxcar.io.
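A status light like this is essentially a lookup from service events to colors. Here’s a toy sketch of that mapping in Python; the event names and color table are our own inventions for illustration, not ThingM’s sample code or API:

```python
# Hypothetical event-to-color mapping in the spirit of a status light;
# these names are invented, not Blink(1)'s actual interface.
STATUS_COLORS = {
    "twitter_mention": (0, 128, 255),  # blue when someone mentions you
    "build_failed":    (255, 0, 0),    # red for trouble
    "all_clear":       (0, 255, 0),    # green when nothing needs attention
}

def color_for(event, default=(0, 0, 0)):
    """Return the RGB triple to display for an incoming event (off if unknown)."""
    return STATUS_COLORS.get(event, default)

print(color_for("build_failed"))  # (255, 0, 0)
```

Part of what makes the real product so hackable is exactly this simplicity: one ambient pixel, driven by whatever events you care to wire up.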

_____________________________________________________________

This post is part of our IoT snapshot series. Join us at the Interaction 13 Conference in Toronto in January where we’ll be discussing “Making Meaning in an Internet of Things”.

This fall Smart Interaction Lab visited Ars Electronica, a global festival for Art, Technology and Society which takes place every year in Linz, Austria. The theme for this year’s festival was “The Big Picture”, and projects ranged from “Occupy George”, dollar bills with graphics illustrating the disparity between the 1% and everyone else, to “tweetscapes”, a sonification of live Twitter data.

Free Universal Construction Kit

Other projects that caught our eye included the stunning display of bare scanners from UdK-Berlin University (pictured above), wearable music from The Interactive Technologies Group at the University Pompeu Fabra in Barcelona, and the robotic abstractions of artist Seiko Mikami’s “Desire of Codes”. The festival also featured many digitally-physical hybrid explorations such as Golan Levin and Shawn Sims’ “Free Universal Construction Kit” and “Protei”, an open-source sailing drone with robotic behaviors “programmed to sense and clean the oceans” (see video, below).

Desire of Codes

Eric Paulos’ Energy Parasites

For a full review, check out this summary from Barcelona-based curator and journalist Barbara Sansone: http://interartive.org/2012/10/ars-electronica-2012-eng/


A while ago we were looking at Disney’s Appmates, and loved how they blended the virtual world of an app with actual physical toys. While those were exciting, it’s nice to see a new product in development that explores mixed reality for kids in a fresh way, without all the baggage of licensed characters and pre-fixed story lines.

Tiggly is one such platform, with chunky, rugged, toddler-friendly toys in simple shapes: square, circle, triangle and star. When placed on the iPad with the Tiggly apps, they unlock a corresponding virtual world, offering all the benefits of tangible play while adding the magic and dynamic content that touch-screen interactivity has to offer. There are currently three apps in development: one for drawing, another for shape matching, and one that brings animated characters into a safari.

The iPad is a “toy” that we already know can mesmerize kids as young as 18 months old. Coupling it with physical shapes can help develop motor skills and keep kids playing in a social, creative way in the real 3D world.

Currently a Kickstarter project, you can get a good sense of it from the “Watch the Tiggly story” video link on their webpage.

We’re looking forward to getting a set in our hands as soon as they are ready.

The chill in the air reminds us that winter is around the corner, and this year we’re excited to see how sensor systems and the internet of things will creep their way into winter sports. One exciting project we’ve been tracking is the Push Snowboarding platform, a collaboration that brings together snowboarding freaks from Burton with data geeks from Nokia. The result is a collection of sensors that snowboarders can wear to track five key metrics: heart rate, foot pressure, board orientation, overall speed, and something they call “rush”, which is a measure of galvanic skin response, or how riders are “holding their nerve”. Each sensor communicates wirelessly via Bluetooth, so data can be sliced, diced and visualized (through a Nokia Symbian app, of course).
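To get a feel for the data, here’s a hypothetical sketch of a single reading across the five metrics, expressed as a Python dataclass; the field names and units are our own guesses for illustration and are not Push Snowboarding’s actual data format:

```python
from dataclasses import dataclass

# Invented field names/units for illustration; not the project's real schema.
@dataclass
class RideSample:
    heart_rate: int      # beats per minute
    foot_pressure: tuple # (heel, toe), normalized 0.0-1.0
    orientation: float   # board rotation in degrees
    speed: float         # km/h
    rush: float          # galvanic skin response, normalized 0.0-1.0

sample = RideSample(heart_rate=148, foot_pressure=(0.8, 0.3),
                    orientation=180.0, speed=42.5, rush=0.9)
print(sample.rush)  # 0.9
```

Streaming a record like this over Bluetooth a few times a second is all it takes to reconstruct a run afterward, or to drive a live display for a crowd.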

In addition to letting boarders compete with each other for speed, the data shows things like how the board has been flipped, or how many times it turned during a spin. It can be used to give feedback on form by comparing foot pressure and board position, or to create an emotional connection between a viewing crowd and an athlete. Imagine a feedback loop where an audience can see a snowboarder’s “nerve” response after a tricky maneuver and then cheer him or her on for the next one. The project is very much a work in progress, where developers worldwide are encouraged to riff on the existing code and hardware setup, and snowboarders can offer feedback along the way.

The entire system is open source and documented online, complete with notebook sketches of Arduino circuit schematics. They’ve also been collecting data from competitive snowboarders, and have offered it up for download, so we can all get a sense of what’s happening out there on the slopes even if we’re inside sipping hot toddies.

Check out the video here:

And the full website here: http://www.pushsnowboarding.com/

And some groovy visualizations here:

At Smart Interaction Lab we are always on the lookout for new tools that make it easy for objects, sensors and people to communicate with one another. A while back we featured breakout.js by our pal and former lab-mate Jeff Hoefs, and last week we visited our neighbors at the Rockwell Group Lab to check out their new toolkit called Spacebrew.

Spacebrew is “a dynamically re-routable software toolkit for choreographing interactive spaces”, or in other words, an awesome and simple way to connect interactive things to one another. Every element that you hook up to the system is identified as either a subscriber (reading data in) or a publisher (pushing data out). Data is in one of three standardized formats: a boolean (true/false), a number range (0-1023) or a string. Once these elements are set up, you can use a visual switchboard of sorts to connect or disconnect publishers and subscribers.
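The switchboard idea is easy to sketch. Here’s a minimal Python illustration of the publisher/subscriber routing concept; the real Spacebrew speaks WebSockets between machines, so this mirrors only the idea, not its actual API:

```python
# Toy switchboard illustrating Spacebrew-style routing; not the real toolkit.
class Switchboard:
    def __init__(self):
        self.routes = {}  # publisher name -> list of subscriber callables

    def connect(self, publisher, subscriber):
        self.routes.setdefault(publisher, []).append(subscriber)

    def disconnect(self, publisher, subscriber):
        self.routes.get(publisher, []).remove(subscriber)

    def publish(self, publisher, value):
        # Values are one of three types: boolean, 0-1023 range, or string.
        for subscriber in self.routes.get(publisher, []):
            subscriber(value)

board = Switchboard()
lamp_log = []                                # stand-in for a table lamp
board.connect("heart_rate", lamp_log.append) # heart-rate monitor -> lamp
board.publish("heart_rate", 72)
print(lamp_log)  # [72]
```

The nice part is that re-routing is just `disconnect` and `connect` again with a different subscriber, which is exactly the kind of live rewiring the visual switchboard makes so much fun.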

We checked out the system during a recent Friday-night hackathon where we used things like heart-rate monitors and browser buttons as inputs (publishers) and light strips, table lamps, motors and a robotic hand (affectionately called “Clawdia”) as outputs (subscribers). With Spacebrew, we could easily hook up the heart rate monitor to the table lamp, and then disconnect it to see what would happen if we had the same input controlling a motor. Fun!

Here’s a short video of Clawdia being controlled by a mobile phone interface.

Sign up for news about Spacebrew releases at http://www.spacebrew.cc

Smart Interaction Lab was honored to be on the program at this year’s Maker Faire event in New York City, which took place on September 28-29 at the New York Hall of Science. We were part of a group presentation, “Design and DIY: How Makers Are Influencing Product Design” put together by Allan Chochinov of SVA and Core77.

During the talk, we shared the story of how Smart Interaction Lab came to be, emphasizing the importance of hands-on tinkering and experimentation as a way to build expertise and maintain momentum alongside our client projects.

Also presenting was Jared Ficklin from frog design, who talked about where interaction design is headed and shared this video of a Kinect hack that combines gesture and voice commands to give people a natural way to communicate with technology.

In the same session, Tad Toulis of Teague shared some behind-the-scenes stories of the creation of their Teagueduino project, and Gadi Amit of New Deal Design talked about the role of hacking in the design process.

Live sketch-note artist ImageThink captured the whole thing while it was taking place, and left us with the image above as an amazing souvenir.

Thanks, Allan, Sabrina Merlo and Maker Faire for making us a part of this amazing weekend.

We’re enjoying sharing this new booklet by Jan Borchers, Head of the Media Computing Group at RWTH Aachen University. Written in a clear step-by-step fashion and illustrated with helpful diagrams from fritzing.org (an open-source hardware initiative to encourage creative work with electronics), the text starts with the classic “blink an LED” project, progresses to controlling servo motors, and closes with an excellent list of resources in the “Some Pointers” section.

It’s specifically aimed at people who have done “a bit of programming at some point in their lives, but are new to electronics”, and the author has challenged himself to a one-page-per-experiment limit, so the content feels manageable. He plans to keep it updated as new boards and IDEs emerge.

It’s a free download here: http://hci.rwth-aachen.de/arduino


In the explosion of nifty DIY/Arduino products appearing on the market (through crowd-sourced funding or small-run manufacturing strategies), this one makes us smile for so many reasons. The ReaDIY system features a collection of designer toy paper robots that not only look cool when they are just sitting there, but are outfitted with small motors and are compatible with Arduino so they can actually move around and respond to the world around them. In addition to being composed of actual robot parts, they can communicate with the Internet, making them accessible for customized robotic behaviors that can be set up with rules using a cloud-based interface.

We’re impressed with the remarkable networked tech that these toys have, but what makes them even better is that they are customizable in the physical as well as the virtual: the kit comes with templates and instructions that encourage you to create your own paper robot designs, so you can customize the behaviors while rethinking the forms themselves.

The project, created by the folks who brought us the one-of-a-kind, before-its-time Nabaztag, completed a successful Kickstarter campaign and advertises a ship date of fall 2012.

Move over, kids, there are a few adult geeks in line to play with these.

Designer, author, and visionary Bill Moggridge passed away this past weekend. Let us take a moment to remember the man who coined the phrase that kicked off the discipline we practice.

Bill was not only our hero, but a friend of Smart and one of the first visitors to Smart Interaction Lab when he came to the NY studio to give a talk in 2011.