Smart IxD Lab

Thoughts, observations and experimentation on interaction by: Smart Design

At Smart, we’re interested in building the relationship between people and the products they use in their everyday lives. When we bring electronics into the mix, we can create product behaviors that communicate through light, sound and movement. Our LUMI project is an exploration of light behaviors. It contains a series of color-changing lights that can be programmed to move and glow in any pattern, allowing us to create a non-verbal way for the product to communicate.

We use these experiments when designing products such as vacuum cleaners and washing machines.

As an experiment, we installed the LUMI system in one of our conference rooms so that people could use it as a way to communicate the mood of the space. Here’s a video of what happened:

We did another experiment where a LUMI was used as a way for a person to communicate his or her status, such as “I’m busy”, “stressed”, or “come talk to me”. The glowing lights can be seen by others across the room:

For more information about LUMI, check out our previous post describing the project.

As part of our LUMI exploration we implemented a system that maps Flash animations to BlinkM LEDs. This allows any designer to create lighting sequences simply by animating the tint property. Check out the LUMI in action below (coupled with an RFID tagging system):

LUMI_sample

If you would like to make a light sequence, you can download the authoring file here. Send what you’re working on to lab@smartdesignworldwide.com and we’ll send back a video of your light sequence in action.

If you’re interested in using the BlinkM and Flash system in full glory, you can download the framework here and read about it in this previous post.

Happy Flashing!

Smart Design’s latest article for Fast Company’s Co.Design series is a hybrid Femme Den and Smart Interaction Lab effort.

Entitled “How Women Are Leading the Effort to Make Robots More Humane”, it covers Arduino, MAKE Magazine, the LilyPad board, Simon the Robot and high school Fab Labs. Check it out:
http://www.fastcodesign.com/1665597/how-women-are-leading-the-effort-to-make-robots-more-humane

Image 1: One of our experiments: using LUMI as a desktop device to display a person’s mood. A series of discs representing emotional states can be selected to change the light patterns.

Smart Interaction Lab is in the midst of exploring how our everyday products become expressive through dynamic electronic behaviors. For example, the color of LED lighting on a robotic vacuum cleaner has the potential to alert people to its status: a slow green pulse indicates “All systems go”; rapid red flashing pleads, “Help! Something is amiss here.” Sound and movement can enhance and reinforce these messages. A jubilant melody at the end of a washing machine cycle says, “Everything went well and your clothes are ready!” A video conference camera repositioning its head expresses, “Bye! Going to sleep now.”

Image 2: A great example of communicative design: the Aerohive wireless LAN router designed by Smart Design. The corner is “cut away” to offer a peek at what’s inside. The LED-light strips inform an IT specialist about the status of each line and signal when there is an error.

As part of this exploration, we’ve created an easy way for our designers to create and examine light behaviors in products. With more affordable options for LED lights, we can give our products a beautiful glow that can change color and show animated lights based on what’s happening between the product and the user.

Image 3: The LUMI system for creating expressive light behaviors in objects. No coding needed.

The LUMI platform contains a round box with a ring of 8 LED lights embedded just below a translucent surface. We’ve programmed it in a few experiments to try to show how we can express people’s emotions through light and will be posting our videos shortly.

In order to make it quick and easy for designers to make changes and create new patterns, the lights in the LUMI system can be programmed just by changing a few colors in a simple animation in Adobe Flash. Changing the colors on the screen will change the color, intensity and movement of the light. No coding needed! We can make behaviors that show all the lights glowing, turn some lights on or off, or coordinate dimming and glowing to create animations like “chase” sequences where the light appears to be moving around the object surface.

Images 4 and 5: An example scenario: Using LUMI to express the “mood” of a room. Different light behaviors let passers-by know if they should be quiet and serious or if they can be friendly and join in.

If you’re a designer and would like to try this out for yourself, check out our “From Flash to BlinkM” post and download the source files. We’re using BlinkM, the Smart LED, created by Mike Kuniavsky and the fine folks at ThingM.

Stay tuned for the videos!

The Smart Interaction Lab is in the midst of exploring the link between design and emotion as it relates to dynamic behaviors, or expressions that can happen through light, sound and movement. In a series of blog posts and experiments we’ll be sharing some of our discoveries along the way.

Mac Laptop: A simple pulsing from dim to bright captures the essence of breathing

Objects can express emotion
We all know that objects can communicate a lot of very human values through things like form and color. At Smart Design, we often talk about “form semantics”, or how the shape of things can communicate something very emotional, such as a drill that looks “aggressive”, a skate that feels “serious”, or a toaster that is “friendly”. In the film Objectified, Chris Bangle talks about automotive design and how the front of a car can look like a smile. The cognitive psychologist Don Norman even has a book titled Turn Signals Are the Facial Expressions of Automobiles. Emotion can be similarly communicated through more dynamic behaviors such as light, sound and movement, which are elements we encounter every day in our product design explorations. Since our interaction lab was created to explore the intersection of the physical and the digital through technology, we’ve become really excited about the possibilities that embedded electronics offer when it comes to crafting emotion.

Look! It’s alive!
Talk to anyone who has a Mac laptop and they can tell you all about how the little blue light in the corner is “breathing” and is a way that the machine lets you know it’s alive. During the design process, we collect examples of how products feel animated and come to life through similar behaviors. Things that sing, glow and pose themselves have the ability to enchant people while immersing them in the interaction needed at a given moment, enhancing an overall emotional connection between the person and the product. We’ve noticed that these kinds of electronic changes give products a sense of personality, so we have been developing methods for defining collections of these behaviors so that they work together. This allows us to essentially craft communicative personalities that are appropriate to the vision and brand of a given product and allow the product to seem animated and “alive”.

Customers post videos of the products they love. This YouTube home video has a caption that reads, “Our happy magical LG Electronics washer dryer”

Building a rich emotional connection
When people feel that a product has a personality, they build a deeper emotional connection with it than with something that feels anonymous. A quick look at Amazon.com reviews for the Roomba confirms that people project human personalities onto things because of their animated behaviors. Here are some of the reviews we found:

“We have named our new Roomba Rosie. She is my new best friend. I vacumed with my fifteen year old Kirby before letting Rosie do her thing… She is wonderful. Most of my furniture is too heavy to move each week. She cleaned under those pieces and I was amazed at what she picked up.”

“The Roomba is higher maintenance than a regular vacuum. I don’t mind since we love both of our little hard working buddies (named ‘Romba’ and ‘Murry’).”

“I have a one bedroom apartment and ‘Marvin’ went through and cleaned the whole thing… It was definitely entertaining to watch him buzzing around.”

“Now all I have to do is hit a large glowing button and my robot butler ‘Mr. Belvadere’ takes care of the cleaning.”

An image from an “I love my Roomba” blog. These folks posed on the floor with their Roomba.

Some folks at Georgia Tech have been studying the phenomenon of product personality with the Roomba [1][2], so if you’d like to learn more, check out the references at the end of this post.

A new palette for design through tangible interaction
Until recently, the design of common household products was often limited by the use of off-the-shelf components that were affordable choices for implementation. This meant that we could only use very standard-shaped knobs and switches, and lights were often limited to one color that could be on or off, but not dimmed. In the last few years, costs have dropped, and the ability to create custom dynamic behaviors with electronic components has offered us a new and rich palette for interactions that includes light, sound and movement.
When crafting such product behaviors, we often build physical prototypes of the light systems in order to experience them in real time and create a vocabulary of responses for user testing as well as final product design and specifications. In future blog posts, we’ll share some of the experiments we’ve been creating in order to have a flexible, easy-to-use system available during the design process, one that allows designers to craft a wide vocabulary of light behaviors for almost any product.

Send us your product behaviors!
Have you seen a product that has a particularly expressive light or sound, like a webcam that nods its head or a wristband that glows? If so, send a link, video or image our way and we’ll add it to our collection at Smart Design and mention your name in the Smart Interaction Lab blog.

You can post a link in the comments, or email us at lab@smartinteractionlab.com

References:

[1] Sung, J.Y., Grinter, R.E., Christensen, H.I., Guo, L. (2008). Housewives or Technophiles?: Understanding Domestic Robot Owners. Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction (pp. 129-136).
[2] Sung, J.Y., Guo, L., Grinter, R.E., Christensen, H.I. (2007). “My Roomba is Rambo”: Intimate Home Appliances. UbiComp 2007: Ubiquitous Computing, Volume 4717/2007 (pp. 145-162).

Smart Interaction Lab has been up to some digital tricks and treats. On Friday, we collaborated with the newly launched Fab Lab at the Marymount School of New York (http://marymountnyc.org/) to design a groovy but spooky pumpkin face and try our hand at laser cutting the squash.

The result? More cute than spooky, but definitely promising. All our pumpkins were so large that they bumped against the laser when we put them in sideways, so we ran our first few tests with the pumpkin turned on its face.

The laser didn’t quite cut through the pumpkin the way we wanted, though it did singe through the outer shell. Next year we’ll try again, bumping up the laser intensity and using a slightly smaller squash!

At Smart we’ve been working with Arduino since the beginning. In fact, we bought the first Arduino in the US for one of my projects (to make usability prototypes for HP). More recently I’ve joined forces with Chris Anderson of Wired magazine and a group of amazing developers to do a side project around aerial robotics. Chris developed a community site called DIYDrones.com to evangelize open-source and open-hardware innovation.

My current project is the Arducopter, which is a fully autonomous thrust-vectoring helicopter. I’m team lead and main coder for the flight-control software. We were able to present the work at the Oakland Maker Faire this weekend. Over 100 people came to see the talk and demos of the helicopter, which included GPS-enabled autonomous flight.

Here’s a quick video demo after the crowds dispersed and we could play around.

CNN is also taking notice. They visited Chris’s hardware team in San Diego, called 3DRobotics, to see firsthand how the Arduino platform can be leveraged for aerial innovation.

The DIYDrones developers are also working on the Arduino Core Dev team. I’ll be helping on the IDE, but that’s about all I can talk about!

Photo: Chris Anderson at the Wired offices

This past weekend the Smart Interaction Lab made a special appearance in Chicago to explore the physical/digital intersection with students from IIT Chicago’s Institute of Design. Smart’s Carla Diana teamed up with the Rockwell Group Lab’s Jeff Hoefs (formerly of Smart Design’s Interaction team) and presented a Friday evening talk to the Chicago design community entitled “Interaction Design at the Intersection of Objects, Information and Spaces”. Saturday’s events consisted of an all-day, intensive hands-on workshop where students learned about RFID and NFC technologies and then worked in teams to develop concepts and create real working prototypes of products.

The resulting concepts included:

  • A “smart bag” that reminds you when you’ve forgotten to pack something
  • A Jenga-based, digitally enhanced math-learning game system
  • A walking stick and wayfinding system for the visually impaired
  • A weather game played by bringing different objects into a room (featuring awesome paper-cutout stop-motion animation)
  • A “Portfolio in a Chip” system designed to let employers collect and compare candidates’ portfolios

The Smart Interaction Lab wants to thank everyone at IIT for their energy, creativity and hospitality.

Here’s a link to the post from the IIT faculty blog with more photos and information about the workshop:

http://instituteofdesign.typepad.com/blog/2011/09/prototyping-workshop-at-id….

One of the things that I’ve sorely missed in After Effects is the ability to add and delete frames mid-layer. Extending a composition by a few seconds is a multi-step process of extending certain layers, shifting others, and moving keyframes. It’s okay if the composition is simple, but if it has scores of layers it becomes very tedious. Flash has the F5 key, which does a great job, but somehow such a feature was never available in AE.

For a recent project this was exactly the function I needed to fine-tune an animation. A few scripts on aescripts.com (which, by the way, is a great place to look for AE scripts) did parts of the job but not everything. So after a few hours of scripting, Add Delete Frames.jsx was born.

This script makes it super easy to add or delete frames. To extend an entire composition, place the current time indicator where you need to edit (make sure no layers are selected), run the script, and start adding or removing frames. If you only need to edit a few layers, just select the ones that need to be changed.

To install and run:

  1. Download and unzip.
  2. Place Add Delete Frames.jsx into /Applications/Adobe After Effects CS5.5/Scripts/ScriptUI Panels.  
  3. With After Effects open, select Window > Add Delete Frames.jsx
  4. In the Add/Delete Frames field, enter a positive number to add frames or a negative number to delete them.
  5. Press the Do it! button.

Note: this script has only been tested on After Effects CS5.5. If you use it with any other version of AE, I’d love to know whether it works.

Exporting Quicktime or individual PNGs out of Adobe Flash can be a pain when scripts fail, embedded video doesn’t play, or visual artifacts clutter the output. In a recent video camera project, we needed to show a fully animated mockup of an advanced UI which included some really beautiful slow-motion video clips. The mockup would need to be exported into AfterEffects and output as a video presentation.

We tried every route to get Flash to export decent video and ended up having to roll our own solution: a Perl script that would communicate with Flash directly and grab stills with OS X’s screencapture tool. I had about an hour to write these scripts and get the project moving forward again before the deadline, so there is definitely room for improvement and advanced features.

How it works:
A Perl script opens a network socket and begins to communicate with a Flash SWF file. Each time a new frame is drawn to the screen, the SWF sends the Perl script a message. The script then calls the Unix screencapture command to save a screen grab with the appropriate name and frame number. The ImageMagick toolset is then used to trim the screen grabs to the correct proportion and size.

Example project: Download

First the Perl Script (flash_ripper.pl):
Perl is very similar to C and is tightly integrated with OS X’s Unix underpinnings. Perl also has a lot of extras that do things like parse text or dynamically handle memory, so for simple tasks it’s fast and very forgiving. Modules can extend Perl to do just about anything.

The Perl script is based on a simple socket-server example I found on the web almost 10 years ago; it’s been so long that I can’t find the reference. It takes a folder name as an argument and opens the socket. To invoke it, go to the Unix terminal, navigate to the folder containing the script, and type “./flash_ripper.pl myNewFolderName”; a new folder will be created. The script will now wait for Flash to connect on 127.0.0.1:7070.
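
For reference, here is a minimal sketch of the kind of listener flash_ripper.pl implements. This is not the original script: the newline-delimited message per frame and the “ok” acknowledgement back to the SWF are assumptions for illustration.

#!/usr/bin/perl
# Minimal sketch of a frame-capture listener (not the original
# flash_ripper.pl). Assumes one newline-delimited message per drawn
# frame and an "ok" acknowledgement back to the SWF.
use strict;
use warnings;
use IO::Socket::INET;

my $folder = shift or die "usage: $0 capture_folder\n";
mkdir $folder unless -d $folder;

# Listen where the SWF expects to connect: 127.0.0.1:7070.
my $server = IO::Socket::INET->new(
    LocalHost => '127.0.0.1',
    LocalPort => 7070,
    Listen    => 1,
    Reuse     => 1,
) or die "Can't open socket: $!\n";

print "Waiting for Flash on 127.0.0.1:7070...\n";
my $client = $server->accept();

my $frame = 0;
while (my $msg = <$client>) {
    # One message per drawn frame; -x captures the screen silently.
    my $file = sprintf("%s/frame_%05d.png", $folder, $frame++);
    system("screencapture", "-x", $file);
    print $client "ok\n";    # let the SWF advance to the next frame
}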

Flash:
Add the following enclosed classes to your Flash project. You may need to rename them if you are already using external scripts in your project. The Class field under Publish is where you will enter the class name. Lower your frame rate to 2 or 3 so the screen capture utility can keep up. (I actually lower my screen resolution when capturing to avoid lots of big image files.)

When you run Flash, the SWF will attempt to connect to the running perl script and wait until you press <Space> to start the capture process.

Cropping:
When your movie is finished, you can see the PNGs in the capture folder, ready to be cropped. I use ImageMagick, which can be found at http://www.imagemagick.org. A simple Perl script automates ImageMagick (a sketch appears below).

The key command to know is this: convert image.png -crop 480x800+0+44 image.png

The numbers refer to width, height, X offset, Y offset from the top left.
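
A minimal version of the crop pass might look like the following; the “captures” default folder name is an assumption, and the geometry matches the example above.

#!/usr/bin/perl
# Sketch of a crop pass over the captured PNGs with ImageMagick.
# The default folder name is an assumption; adjust the geometry
# (width x height + X offset + Y offset) to suit your capture.
use strict;
use warnings;

my $folder   = shift // 'captures';
my $geometry = '480x800+0+44';

for my $png (sort glob("$folder/*.png")) {
    # +repage discards the leftover virtual canvas after cropping.
    system("convert", $png, "-crop", $geometry, "+repage", $png) == 0
        or warn "convert failed on $png\n";
}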

Dealing with missing frames:
If you are missing frames, Adobe After Effects will complain loudly. This script finds the missing frames, reports them, and then duplicates the preceding image files to close the gaps.
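
The original isn’t reproduced here, but the core idea fits in a few lines of Perl. This sketch assumes the frame_00000.png naming used in the listener sketch above.

#!/usr/bin/perl
# Sketch of the gap-filling idea: walk the expected frame numbers,
# report any missing PNGs, and copy the previous frame into each hole
# so After Effects sees a continuous image sequence.
use strict;
use warnings;
use File::Copy qw(copy);

my $folder = shift // 'captures';
my @frames = sort glob("$folder/frame_*.png");
die "no frames found in $folder\n" unless @frames;

# The highest frame number present tells us how long the sequence runs.
my ($last) = $frames[-1] =~ /frame_(\d+)\.png$/;

my $prev;
for my $n (0 .. $last) {
    my $file = sprintf("%s/frame_%05d.png", $folder, $n);
    if (-e $file) {
        $prev = $file;
    } elsif (defined $prev) {
        print "missing frame $n, duplicating previous frame\n";
        copy($prev, $file) or warn "copy failed: $!\n";
    }
}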

I hope this helps someone out there who has run into the same issues. -Jason