Smart IxD Lab


Thoughts, observations and experimentation on interaction by: Smart Design


Displays are a big part of how we build expression into the design exploration work we do at the Smart Interaction Lab, and LED matrices are particularly compelling because they let lights glow beneath a surface and disappear when not in use. We are currently working with a 32 x 32 RGB LED matrix that’s about 5″ square, and in this post we share our experience with displaying type on it.

For starters, Adafruit offers a library that provides drawing functions and text display for the matrix. The library comes preloaded with a generic font, but if you want custom typography, or even iconography mixed in with the type, you’ll need to create your own font. We ran into this challenge and decided to reverse engineer the library files.
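To give a sense of scale, here is roughly what putting a character on the panel looks like with Adafruit’s RGBmatrixPanel library (which sits on top of Adafruit-GFX). The pin assignments below follow Adafruit’s 32 x 32 example wiring for an Uno and may not match our exact setup; treat this as a minimal sketch rather than our production code.

    // Minimal sketch: print one character on a 32x32 RGB LED matrix.
    #include <Adafruit_GFX.h>     // core graphics and font handling
    #include <RGBmatrixPanel.h>   // driver for the RGB LED matrix panel

    #define CLK 8    // clock pin (Adafruit's example wiring for an Uno)
    #define OE  9    // output enable
    #define LAT 10   // latch
    #define A  A0    // row-select lines; a 32x32 panel needs A through D
    #define B  A1
    #define C  A2
    #define D  A3

    RGBmatrixPanel matrix(A, B, C, D, CLK, LAT, OE, false);

    void setup() {
      matrix.begin();
      matrix.setTextSize(1);                         // native 5x7 glyphs
      matrix.setTextColor(matrix.Color333(7, 7, 7)); // white
      matrix.setCursor(1, 1);
      matrix.print('A');                             // glyph data comes from glcdfont.c
    }

    void loop() { }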

In order to create our own font we modified the glcdfont.c file inside the Adafruit-GFX library folder. This file holds the bitmap data for every character in the ASCII table, stored in program memory. The first thing to notice is that the font is 5 x 7 pixels. When you open glcdfont.c, you can see that the font is a series of hex values, separated by commas, inside a font[] array; the PROGMEM keyword tells the compiler to keep that array in the chip’s program (flash) memory rather than in RAM. We commented the ASCII value next to each series of hex values that makes up a character (shown in green in the screenshot below). If you do the same, make sure you use C comment syntax: “// comment” or “/* comment text */”.

[Image: glcdfont.c with the ASCII value annotated in green next to each character’s hex values]


In the image above you can see that the capital letter “A” has a value of 65. Since every ASCII symbol is represented in this file, and the characters appear in numerical order, “A” is entry number 65. Each character is defined by hex values, which are just a compact way of writing binary values. Each glyph is encoded as five hex bytes; each byte holds a line of seven 1’s or 0’s, each one an “on” or “off” pixel, but the lines are rotated 90° to the right, so each byte ends up describing one column of the glyph.

We used Excel to redraw each of the glyphs and extract the hex value of each line using its handy conversion function “=BIN2HEX(1111110)”, which returns “7E”. Add “0x” for the correct formatting, “0x7E”, and you have the first byte of the capital letter “A”. The second line is “=BIN2HEX(1001)”, which returns “9”; this time we add an extra zero in front, giving “0x09”, to keep the format uniform. Each of these hex values is then separated by a comma, and each character or glyph is five pixels (five bytes) wide, as defined earlier. The size of each character is 5 x 7 pixels, and it can be enlarged proportionally from the Arduino code with functions such as matrix.setTextSize(1).
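Putting those pieces together, the entry for our capital “A” ends up looking like the excerpt below. The first two bytes are the ones computed above; the remaining three follow the same column-by-column conversion, so read this as an illustration of the format rather than a verbatim copy of the NeatoFont file.

    // Excerpt of a modified glcdfont.c: each glyph is five bytes, one per column.
    // Bit 0 of a byte is the top pixel of that column, bit 6 the bottom pixel.
    //
    //   col0 col1 col2 col3 col4
    //    .    X    X    X    .     <- bit 0 (top row)
    //    X    .    .    .    X
    //    X    .    .    .    X
    //    X    X    X    X    X     <- bit 3 (crossbar)
    //    X    .    .    .    X
    //    X    .    .    .    X
    //    X    .    .    .    X     <- bit 6 (bottom row)
    //
    // 0b1111110 -> 0x7E, 0b0001001 -> 0x09, and so on.
    #include <avr/pgmspace.h>   // provides PROGMEM so the table lives in flash

    static const unsigned char font[] PROGMEM = {
      // ... five bytes for each of ASCII 0-64 ...
      0x7E, 0x09, 0x09, 0x09, 0x7E,   // 65: 'A'
      // ... remaining characters ...
    };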

[Image: Excel sheet used to redraw the glyphs and convert each column to hex]

 

If you are interested in creating your own fonts, here is the Smart IxD modified glcdfont.c file: glcdfont NeatoFont

Here is the Excel file that shows how to arrive at the hex code: NeatoFont

Most of us have heard from health experts that we’re supposed to consume at least eight 8-ounce glasses of water a day, but how can we know if we’re hitting this target? Smart Intern and Lab collaborator Simone Capano set out to explore a solution to this problem with Hydramate, a simple system that passively reminds us when it’s time to drink more water.


The project explores the notion of time and how it can be represented in an ambient way through a 3-dimensional interface. “I’ve always been very interested in finding a way to manage time in a physical way. Time is for sure an abstract element, and nowadays a lot of applications allow us to manage time effectively, but what if time were controlled through a tangible object, acting as a reminder, speaking to the user with a specific language/behavior?” asks Simone.


Hydramate is an electronic coaster that measures a one-hour cycle, divided into four quarters, each representing 15 minutes. As time passes, each quarter begins to gently glow, giving the user a visual cue of how much of the hour has gone by since they last raised their glass to drink. Once a whole hour has passed since the last sip, the device begins to blink, signaling that it is time to get hydrated. The blinking becomes stronger while the user is drinking, and once they set the glass back down it resets to the gentle glow of the first quarter.


Simone created a fully functioning prototype with an Arduino microcontroller. The shell is made of spray-painted polycarbonate, and the technology inside is very simple:

- A photocell senses when a glass has been placed on the coaster

- An Arduino Pro Mini, powered by a 3.3V lithium battery, receives the input from the photocell and controls the LEDs accordingly
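To make the behavior concrete, here is a rough sketch of how that logic could be written in Arduino code. The pin numbers, photocell threshold, and timing structure are our own illustration, not Simone’s actual firmware.

    // Illustrative Hydramate-style logic: a photocell detects the glass, four
    // LEDs mark the quarters of an hour, and everything blinks after a full
    // hour without a drink.
    const int PHOTOCELL_PIN = A0;              // photocell voltage divider (assumed wiring)
    const int QUARTER_LEDS[4] = {3, 4, 5, 6};  // one LED per 15-minute quarter
    const int GLASS_THRESHOLD = 300;           // lower reading = glass covering the sensor
    const unsigned long QUARTER_MS = 15UL * 60UL * 1000UL;

    unsigned long lastDrink = 0;

    void setup() {
      for (int i = 0; i < 4; i++) pinMode(QUARTER_LEDS[i], OUTPUT);
      lastDrink = millis();
    }

    void loop() {
      bool glassPresent = analogRead(PHOTOCELL_PIN) < GLASS_THRESHOLD;

      if (!glassPresent) {
        // Glass lifted: the user is (we assume) drinking, so restart the hour.
        lastDrink = millis();
      }

      unsigned long elapsed = millis() - lastDrink;
      int quartersPassed = min(4, (int)(elapsed / QUARTER_MS));

      if (quartersPassed < 4) {
        // Glow one more quarter for every 15 minutes since the last sip.
        for (int i = 0; i < 4; i++) {
          digitalWrite(QUARTER_LEDS[i], (i <= quartersPassed) ? HIGH : LOW);
        }
      } else {
        // A whole hour has passed: blink the ring as a reminder to drink.
        int state = ((millis() / 500) % 2) ? HIGH : LOW;
        for (int i = 0; i < 4; i++) digitalWrite(QUARTER_LEDS[i], state);
      }

      delay(50);
    }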

We look forward to hearing about how this project has developed since the prototype was constructed, and how well test users feel the device helps them to manage their hydration.

 


With Christmas right around the corner, it’s getting down to the wire to find that last-minute gift for everyone on your list. While this task is something we’re all familiar with (and stress over), for parents of kids with physical disabilities it can be even more stressful. For many such children playing with off-the-shelf toys is not possible, which is where Hacking for the Holidays comes in.

Last Saturday I took part in DIYAbility and the Adaptive Design Association’s Hacking for the Holidays, which invites makers, hackers, and occupational, music, and recreational therapists to come together and hack toys to make them switch accessible for children with physical disabilities. The term “switch accessible” refers to making a toy usable via simple switches that connect through a mono jack added to the toy. If a child can move their head, feet, arm, mouth or any other part of their body, it is possible to use a switch plugged into a mono jack to play with the toy. For example, rather than using a joystick to control an RC car, tact switches or momentary switches can be substituted. Adding these switch jacks does not affect the original toy; the existing buttons operate as normal, and kids who use accessibility switches can now operate the toy too, so it works for all users.

As part of the workshop, each participant brought a toy to hack, and together as a group we worked toward integrating simple mono jacks into the toys.

IMG_9298-sized

For my toy, I chose the Fast Lane Lightening Striker. Opening up the remote control revealed four simple spring-clip switches (forward, backward, left, right), which would serve as the soldering points for the mono jacks.

IMG_9295-markup

After isolating the PCB from the housing, I wired up four mono jacks and soldered them to the spring clips. A few holes in the housing to feed the mono jacks through, and the toy was back up and running, now with the ability for accessibility switches to be plugged in.

IMG_9299-sized

Overall, the Hacking for the Holidays event was a great time to geek out while doing something meaningful. I’m looking forward to doing it again next year.


One of our current projects in the lab is the StressBot, a friendly console that can read heart activity through the Pulse Sensor to understand whether or not a person is in the physiological state of stress, and then offer coaching exercises and feedback to reduce stress through breathing exercises. We’ve been continuously testing our setup with research participants to try to create an interface that’s clear and accessible to anyone in our studio who might approach the device.

Since the last time we posted about this project, we have learned much more about correlating stress to heart rate.  Joel Murphy, the creator of the Pulse Sensor, has helped us understand IBI (Interbeat Interval, or the time that passes between each heartbeat) and released some code that helped us grasp ways to map heart activity to stress. We have been using IBI measurements and the amplitude function Joel created to assign a specific value for stress, allowing us to measure whether it is relatively high or low.
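As a concrete illustration of the kind of math involved: one common way to turn a stream of IBI values into a single stress-related number is short-term heart-rate variability, such as RMSSD over the last several beats (lower variability generally goes hand in hand with higher stress). The helper below sketches that idea; it is not the exact amplitude-based formula from Joel’s code, and the window size is arbitrary.

    // RMSSD (root mean square of successive IBI differences) over a rolling
    // window of recent beats. Lower values are commonly read as higher stress.
    #include <math.h>

    const int WINDOW = 10;           // number of recent beats to keep
    int ibiBuffer[WINDOW];           // interbeat intervals in milliseconds
    unsigned long beatCount = 0;

    // Call once per detected beat with the latest IBI in milliseconds,
    // e.g. the IBI value the Pulse Sensor sample code produces.
    float updateStressMetric(int ibiMs) {
      ibiBuffer[beatCount % WINDOW] = ibiMs;
      beatCount++;
      if (beatCount < WINDOW) return 0;   // not enough beats yet

      float sumSq = 0;
      for (int i = 1; i < WINDOW; i++) {
        int newer = ibiBuffer[(beatCount - i) % WINDOW];
        int older = ibiBuffer[(beatCount - i - 1) % WINDOW];
        float d = newer - older;
        sumSq += d * d;
      }
      return sqrt(sumSq / (WINDOW - 1));  // RMSSD, in milliseconds
    }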

[Image: first trial mapping heart-rate amplitude and IBI to a sine wave, shown above the raw heart-rate trace]

Most of our previous prototypes focused on trying to match a visual pattern with the heart rate. This proved to be very complicated and, worst of all, stressful. We also found that having one’s eyes closed is often the best way to achieve a state of relaxation. After a few iterations, we discovered that audio feedback is the best way to provide helpful feedback to a person who is trying to relax. It allows the person to close his or her eyes and focus on finding a constant tone rather than something visual. The image above shows the first trial, which maps the amplitude of the heart rate to the amplitude of a sine wave, and the IBI to the frequency of the sound. The uppermost waves show the sound and the lowermost wave displays the heart-rate signature.
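The heart of that audio mapping can be surprisingly small. The sketch below assumes a beat detector (such as the Pulse Sensor interrupt code) that hands us an IBI in milliseconds and a pulse amplitude, plus a small speaker on pin 9; since Arduino’s tone() has no volume control, we fold the amplitude into the tone’s duration instead. The ranges are illustrative, not the values we settled on.

    // Map heart measurements onto sound: IBI -> pitch, pulse amplitude -> how
    // long the tone sounds on each beat. A steadier heart makes a steadier tone.
    const int SPEAKER_PIN = 9;

    // Call once per detected beat.
    void playHeartTone(int ibiMs, int amplitude) {
      // Typical resting IBI is roughly 600-1200 ms; slower beats get lower
      // pitches, so relaxing is audible as a falling, steadier tone.
      int freq = map(constrain(ibiMs, 600, 1200), 600, 1200, 880, 220);

      // Bigger pulse amplitude holds the tone a little longer.
      int durationMs = map(constrain(amplitude, 0, 1023), 0, 1023, 40, 200);

      tone(SPEAKER_PIN, freq, durationMs);
    }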

Below you can see the various explorations of mapping the same sound wave that is being altered by the user’s heart rate to another visual cue. The concentric circles show a rippling effect based on the IBI change, and the droplets are another way of visualizing the same sonic effect. The user in this case is trying to reach uniformity in pattern, either through the distance of the concentric circles or the distance and shape of the drops.

[Images: concentric-circle and droplet visualizations driven by the same heart-rate-modulated sound wave]

Below you can find the latest iteration of the interface. Using the screen and physical enclosure, the device acts as a coach character, helping people understand how to approach it and use the interface. It engages users with their bio signals while the bot interprets their IBI state and gives them a visual cue to assure them that they are on the right track. Although the project is not complete, we are getting close! Our next steps involve experimenting with parameters and doing more user testing.

[Images: latest StressBot interface and enclosure]

 

 


Earlier this year, Smart Intern Floriane Rousse-Marquet looked into ways to blend analog and digital in order to create impactful and meaningful experiences. She shares her discoveries in the post below:

We love designing around the physical world because it gives us more sensory experiences to play with, relate to, and personalize in order to trigger strong emotional responses. The digital world, on the other hand, offers the benefit of being efficient. It allows scalability and the possibility to share what’s been created and experienced. It also offers the potential to learn and update in the background. So how can we find balance between the analog and the digital to maximize the emotional value of a product?


To answer this question, I started by analyzing media consumption (text, sound, image), looking for the strengths and weaknesses of each category from an emotional point of view. For example, for sound, physical albums don’t let you carry your music around the way an MP3 does, but they allow for a richer gestural ritual. Albums are easily scratched, but because they are fragile you develop a deeper attachment to the object, since you want to protect it. Digital music, on the other hand, allows for sharing, portability and easy navigation among songs and albums.

I tried to identify patterns and similarities that lead to an anticipated emotional response and a cherished product by leveraging the meaningful parts of the analog while harnessing the power of the digital. These are summarized in the 6 principles below.


1. Create rituals through the senses. Use different emotional triggers from the experience such as materials and shape, and enable a step-by-step discovery. Gesture matters. Letting people connect positively to an activity in a physical way builds memorable interactions. 

2. Be conscious of the value of tangible things. Give objects character through unique forms and materials, and implement a sense of lifetime and age. Make the link between the medium and the content visible to create a feeling of responsibility and care over it.  

3. Craft personality through the look and feel and the details to help build product identity. 

In Emotional Design Don Norman said,

“I value my teapots not only for their function for brewing tea, but because they are a sculptural artwork. […] Beyond the design of an object, there is a personal component as well. […] We take pride in them […] because of the meaning they bring to our lives. […] Can I tell a story about it? Does it appeal to my self-image, to my pride?”

4. Offer the opportunity to create a personal mark. Take advantage of the tangible manifestation of a physical action on the content, and highlight the data through the display. The interface should be able to learn from you and help you empathize with the object.

5. Bring focus to the core experience. Define a limit to information coming in and the type of information given in order to offer a more curated experience.

6. Enable people to pass experiences on to one another by making the missing item visible when shared, giving a second life to the data and allowing people to share their experiences.

One key insight we gleaned is that the best digital experiences are enhanced by the platform they are in. Historically, so many digital experiences have been delivered through a single platform. The Internet of Things has the potential to release the tension between digital and analog by making digital more tangible. We believe that designers should break through the barriers of the screen. An object should be able to live on its own, not only through a virtual platform, and bring new forms of interaction between the user and her environment.

 

The diagram below summarizes the key moments of a physical book experience.

[Diagram: key moments of the physical book experience]

And this diagram has the key moments of the experience of an e-reader.

[Diagram: key moments of the e-reader experience]

OpenBCI stands for open-source Brain-Computer Interface, and it’s the first commercially available EEG platform that gives users access to raw brainwave data, without proprietary algorithms or signal tampering. The board is a new programmable EEG (electroencephalography) system that is completely open source. It uses the ADS1299 chip from Texas Instruments, an 8-channel, low-noise, 24-bit analog-to-digital converter designed specifically for measuring teeny-tiny electrical brain signals. It can read 8 channels simultaneously, with a daisy-chain option that gives users 16 EEG channels. Considering that the MindFlex has only one channel and the Emotiv has 5, OpenBCI is the best option for brainwave data available to creative technologists, hobbyists, and research institutions alike. Versions I and II are Arduino shields, but the new boards have an ATmega328P on board that is programmable over Bluetooth LE.


 

The possibilities for this technology are endless. Taking EEG data out of the lab might help neurologists better understand the brain while someone walks down the street or eats an ice cream, things that were previously impossible to measure in a neuroscience lab. Interaction design is facing a new challenge: creating systems that can adapt to our cognitive behavior. Brain plasticity and AI will take over from our buttons and digits. It’s already happening in the field of prosthetics; it’s only a matter of R&D before we see our lives being easily controlled by merely thinking.

 

[Image: raw OpenBCI brainwave capture, with the frequency spectrum on the left and the time-series channels on the right]

This is an image of my brainwaves. On the rightmost graph you can see a slight dip on channels 1 and 2 just over the 4-second mark; this undulation is a blink. Throughout the rest of the channels you can see a pattern emerging in the waves. My eyes are completely closed, and this pattern is activity in the occipital lobe, where the visual cortex lies. On the left graph, you can see a peak at the 10 Hz mark: these are the alpha waves, which fall between 8 and 13 Hz. These waves are most prominent when your eyes are closed; they inhibit the areas of the cortex that are not in use, while playing an important part in communication and coordination.

The minds behind this project are Joel Murphy, the creator of the Pulse Sensor, and Conor Russomanno, a recent graduate of the MFA Design and Technology program at Parsons. Together they have spearheaded the project as an initiative by Chip Audette, who has secured DARPA funding to create an open source EEG. For more information about the project, check out their Kickstarter page:

http://www.kickstarter.com/projects/openbci/openbci-an-open-source-brain-computer-interface-fo

 

 

 

Circuit Stickers from Jie Qi on Vimeo.

Stickers were the official currency of any kid who grew up in the ’80s and ’90s. These generations learned the concept of value one sticky trade at a time. Everyone had a carefully curated book with pages organized by the stickers’ value: some were to trade during recess and others to keep forever. Fuzzy, puffy, glittery, holographic, smelly, or lenticular, stickers were the object of everyone’s desire. Today, interest in stickers seems to have dwindled a bit, perhaps because more kids are captivated by digital content.

Jie Qi and Bunny are introducing a new kind of computational sticker system called Chibitronics. These component stickers can be placed on conductive lines to create a functional circuit, making them a basic circuit-design learning tool. These stickers are the glue between digital kids and the long-standing tradition of sticker trading.

Chibitronics stickers are built on a flexible polyimide substrate with anisotropic tape, or “Z-tape,” laminated on the back. This tape conducts electricity vertically through its thickness but not laterally across its surface, acting almost like a single-axis conductor. Below you can see a close-up of the metallic particles that are embedded in this conductive tape. For more technical information about 3M’s anisotropic tape, here is the data sheet. For more information about the project, and to find out how to get stickers of your own: http://www.crowdsupply.com/chibitronics/circuit-stickers

 

 

Smart Interaction Lab’s Erika Rossi recently gave a talk at the TEDxSalzburg event, where she presented her Master’s thesis project I Mirabilia (“The Wonders”), a set of interactive dolls for hospitalized children.

Here, Erika tells us more about the event and her presentation:

TEDxSalzburg was a great experience and an amazing opportunity to share my project while making new connections with inspiring and smart people. The conference theme, “I CAN HEAR YOU–YOU CAN HEAR ME” focused on people’s need to be heard, and the fact that while they’re struggling to gain attention, they tend to forget to listen.

In the words of the conference organizers,“Listening and being heard is a critical success factor if you want to give answers to question really asked. You don’t have to be a trendscout to come up with relevant (and innovative) products and solutions: you just have to watch and listen.”

The event was divided into four sections, I-CAN-HEAR-YOU, and I was assigned to the third one, “HEAR,” because my project is about letting hospitalized children be heard by providing them with additional tools to express their fears and emotions. Below is a description:

I Mirabilia (“The Wonders”) are a family of three interactive dolls for children who spend a long period of time in hospital due to terminal illnesses or periodic therapies. Drawing on interviews and observations in a children’s hospital, three dolls were designed to help overcome specific emotional difficulties faced by children in this situation. The different interactions and behaviors triggered by the dolls enable the children to improve their relationships and make new connections with the people within the hospital, such as doctors, psychologists and other hospitalized children.

The other talks covered a wide range of topics, from a new interactive platform for learning music by Albert Frantz (http://www.key-notes.com/) to how the latest sensor technology could be used in healthcare applications to improve patient monitoring by Fritz Höllerer (http://www.aeskulapp.com/).

Overall, it was a wonderful experience!

 - Erika Rossi

 

We’re super excited about two new robots named “Bo” and “Yana” that are part of a programmable system for kids called Play-i. We’ve been following their development for the last few weeks and are excited to see the amazing support and excitement around it.

The main concept behind the system is the combination of a very simple, graphical Bluetooth-based remote control app with a pair of moving robot toys, letting kids control them and set up programs for arm, eye, gesture, and wheel commands. The system is set up to encourage learning by introducing coding in the context of storytelling, and it is aimed at a variety of age groups so that even preschoolers can begin coding. (Different interfaces are geared toward different age groups.)

Below is a screenshot of the interface, and you can learn more about the project at the Play-i website or in this video.

[Sketch: Steve Faletti, by Carla Diana]

This summer Pratt Institute instructor and SVA IxD MFA candidate Steve Faletti worked closely with Smart Designers and the Lab on a range of prototypes and experiments as part of his internship. A big part of his process involved using microcontroller platforms such as Arduino, though lately he has become fond of some alternative boards. In this blog post, Steve shares the pros and cons of a variety of the tools and methods he’s been using, along with a summary of his typical workflow, from coding to compiling.

 

I LOVE ARDUINO, I REALLY DO

I feel obligated to begin this post by saying how much I love Arduino. It’s an amazing project that has put physical computing tools and understanding in the hands of many artists, designers, students, and hobbyists. It has changed the world and has become synonymous with microcontroller development and low-level computing. It was my first foray into electronics, providing not only a relatively painless path into playing with microcontrollers, but an immeasurable amount of information and learning along the way. I really do love Arduino and am infinitely thankful to the wonderful people who conceived of and develop it, yet lately I rarely use it in prototyping or development.

That’s not entirely true since I still use the language, libraries, and compiler chain extensively. Software capability and efficiency were the Achilles heel of the project in its early days. In those days—before the board even had a USB port—analogRead and PWM were convenient compared to setting up timers and bit-flipping ports, but used more than a few extra clock cycles to provide that. Now, nearly a decade later, those core libraries have been trimmed down and make much better use of the AVR resources. They’re fantastic.

The core language is great, and keeps growing and evolving for both functionality and speed, but the greatest value of the Arduino project is the thousands of available libraries. When I first started using Arduino, I needed to set up a servomotor for a project. The Servo library either didn’t exist, was unstable, or I just did not know about it, and it took over a week to figure out how to write my own—very buggy—code to control one. Now, while I could roll and debug my own servo code inside of an hour, it only takes about 15 seconds to grab the library and implement an object. Or two if I need it. The same goes for debouncing buttons, accessing EEPROM, communicating via SPI, or countless other tasks. This is the real power of the Arduino project: the huge community of developers that has created and refined simple-to-use, accessible code. (I like to think of Arduino more as an AVR framework than its own microcontroller, and I somewhat lament the fact that the two names have become synonymous.) So, with some regret, here are the reasons why I frequently avoid using the rest of the Arduino framework.
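For the record, here is what those 15 seconds buy you today: a minimal sketch using the stock Servo library (the pin numbers are arbitrary).

    #include <Servo.h>   // ships with the Arduino IDE

    Servo left;
    Servo right;         // "or two if I need it"

    void setup() {
      left.attach(9);    // Servo works on any digital pin of an Uno
      right.attach(10);
    }

    void loop() {
      left.write(0);     // angles in degrees, 0-180
      right.write(180);
      delay(1000);
      left.write(180);
      right.write(0);
      delay(1000);
    }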

[Sketch: Arduino Uno, by Carla Diana]

LARGE FOOTPRINT

The hardware is now too big, too expensive, and too limited for my needs. Originally built to be something of a standalone development tool, and based on components sourced in 2005, the standard Arduino footprint is massive. There is also the weird legacy error in the pin layout that will forever lock the Arduino to its unique footprint. I like to keep my circuits completely on the breadboard if possible, and the standard Arduino doesn’t allow that. They’re also $30 a pop at the time of this writing. I use a lot of microcontrollers and tend to leave them in projects. Most of the time the cost and size just don’t make an Uno a viable option.

**Note that I develop on OS X, and that the instructions here have only been tested on that OS. Some of the software I use, like Cornflake Terminal, is only available for the Mac. There are many Windows and Linux alternatives and equivalents.

[Sketch: Arduino Pro Mini, by Carla Diana]

 SMALLER ARDUINO-BASED SOLUTIONS

For simple projects I tend to either burn the Arduino bootloader straight onto a bare ATmega, or buy one pre-burned, and then essentially build an Arduino around it (it’s not hard, and a great learning experience—do it at least once). I’m also a huge fan of the ‘Pro’ line of Arduinos from SparkFun, especially the Minis. They’re just cheap enough that I don’t care if I lose or fry one, yet they save me 5 minutes of hooking up wires. Note that to program either of these stripped-down options you will need some kind of FTDI converter or cable, or you can use a standard-size Arduino.

[Sketch: Teensy, by Carla Diana]

THE TEENSY ALTERNATIVE 

I find that many of my projects require some kind of interface with a computer, and here the ATmega32U4 is my new favorite chip. It is slightly different from the 328 on the Uno: it offers more I/O connections, more analog pins, and, most importantly, built-in USB capability. This means that not only do you not need another chip to translate between it and your computer, it can also easily emulate keyboards, mice, and joysticks. Arduino has offered this chip for a while in the Leonardo package, and more recently in the Micro, but I greatly prefer the Teensy 2.0 from PJRC. It’s ridiculously small, fits on a breadboard, and only costs $16 if you’re willing to source your own header pins. There are usually a few rows of these lying around my studio and soldering them on takes a couple of minutes. While this is the same chip as in the Arduino options, I find that the bootloader and USB profiles (not open source) are a bit more reliable.
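To give a taste of that built-in USB, here is a minimal keyboard-emulation sketch using the stock Keyboard library for 32u4-based Arduinos (Leonardo/Micro). On a Teensy 2.0 the same idea applies through Teensyduino once the USB type is set to Keyboard, and Keyboard.begin() isn’t needed there. The button pin is our own choice.

    #include <Keyboard.h>   // lets a 32u4 board act as a USB HID keyboard

    const int BUTTON_PIN = 2;

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);   // pushbutton wired to ground
      Keyboard.begin();
    }

    void loop() {
      if (digitalRead(BUTTON_PIN) == LOW) {          // button pressed
        Keyboard.println("hello from a microcontroller");
        delay(500);                                  // crude debounce / rate limit
      }
    }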

Teensy’s developer, Paul Stoffregen, is a big fan of Arduino and maintains regular communication with the community. As such, he’s ported the loading protocol for Teensy into the Arduino chain by way of a convenient plugin, called Teensyduino. With this installed, there is almost no difference between working with Arduino or Teensy, and the same code can be uploaded to either platform over 95% of the time. Teensyduino also offers the option to add just about every working library upon install, which means I don’t need to go hunting for one later. Paul has also built an ARM-based micro board, called the Teensy 3.0. It’s also cheap, compatible with Teensyduino, easy to work with, and powerful. Paul recommends pairing it with a Raspberry Pi to get a nice, inexpensive, and powerful hardware setup that can handle advanced sound, video, and connectivity. I don’t have any experience with the ARM-based Arduino Due, and looking at the specs, I don’t know that I would try to compare them directly. The Due appears to have more features and pins while the Teensy 3.0 is less than half the price.

 

TEENSY DEVELOPMENT WORKFLOW

I want to wrap this up by talking about my choice of IDE. In addition to physical computing projects, I do a fair amount of screen and web work. My favorite editor right now is Sublime Text 2. It has some great features, and with the huge collection of packages available, it is very powerful. Anybody who writes more than a few lines of code a week will quickly grow to hate the Arduino IDE, which is based on Processing. There is nothing wrong with it per se, it’s just a very bare-bones editor, offering little more than clean-up and highlighting. Thankfully, the Stino project exists to essentially plug the Arduino IDE into Sublime Text 2. I’d suggest using the excellent Package Control plugin for Sublime, though if you’re resourceful you can do it manually. This will give you full Arduino functionality inside the Sublime editor: you can edit, choose a target board, and upload sketches. It has a serial monitor built in, though I usually use Cornflake. It also brings in Teensyduino, provided it’s already been installed. I use the ST2-Arduino snippet set for completion. This unfortunately needs to be installed manually, but it’s not that hard to do. I just git-cloned the repository into my “Packages” folder. I honestly don’t know if this is the preferred method, but it seems to work just fine. This article may help.

So, Arduino is great. Really, really great. But I’ve found that as my development skills grow and I look for more flexibility and convenience, some of the tools offered by the project no longer fit into my workflow. That’s fine. The intent of Arduino is to help people learn about electronics and physical computing. The fact that parts of it are amazing for rapid prototyping and development is a bonus.

- Steve Faletti, October 10, 2013

________________________________________________________________

Have any tips of your own to share? Is there a new development board that you’re enjoying using? Let us know!