

Googly Design Process


1. Concept Background

 

“What if everyday objects really are alive and interactive?”

“What kind of experience can be achieved when the user feels objects are alive?”

We consciously and unconsciously assign characteristics and personalities to objects. It is not rare to see people name their cars, computers, or other objects they hold dear and treat them as if they were alive. We often see ‘faces’ in a car’s headlights and grille and imagine the character the car might have. People ‘eye-bomb’ public places with eye stickers, and the characterization of Seoul’s public buses as Tayo buses went viral among kids. The idea of designing an interactive eye module that anthropomorphizes everyday objects started from reflecting on such human behavior.


2. Research

Interaction

Interaction based on proxemics influences how users interact and communicate with objects [1]. A media player application built on proxemic interaction shows how different information can be presented depending on the user's distance from the product. Proxemic interaction considers four factors: position, identity, movement, and orientation.

Interaction based on proxemics: from awareness to direct interaction [1]

Experience

The movement of objects such as a faucet [2], a toaster [3], and a television [4] that follow the user around engages the user and increases sympathy toward everyday objects. Robot-like devices can also influence human behavior: robots can lead people to contribute more to the public good [5], and their presence can make people more honest [6].

Thrifty Faucet in different postures: seeking, curious, and rejecting [2]

Form Factor

Pinoky [7] is a device that can be attached to any plush toy to generate movement. Pinoky is significant in that it attaches to existing toys without modifying their exterior form, and it is scalable enough to be applied almost anywhere. Nikodama [8] is a non-interactive device that anthropomorphizes objects by attaching eyes onto them.

Pinoky [7] (left) and Nikodama [8] (right)


3. Concept Development

I defined Googly as "a proxemic-based interactive eye module that further engages the user by anthropomorphizing everyday objects" and set the design goals.

  • Responsiveness - As an interactive device that responds to the user, Googly should be responsive enough for quality interactions.
  • Scalability - Googly should be attachable to almost any everyday object.
  • Aesthetically pleasing - Googly should have aesthetic value. It should be pleasing to keep turned on, even as an ambient object.
  • Appropriately realistic - While maintaining the overall look and motion of eyes, Googly should avoid falling into the uncanny valley.
  • Compactness - Googly should be as compact as possible so that it integrates naturally into the object it is attached to.

Once the concept was defined, I started to make the idea tangible by rough prototyping. Through many cycles of iteration and refinement, I solidified the concept.

Concept test 01 was carried out to check the validity of the design concept. A Wizard of Oz method was used with one servo motor.

Concept test 02 was carried out to test proximity sensors. 4 proximity sensors and 2 servo motors were used.

Concept test 03 was done to test angry eyes. Proximity sensors and servo motors were used. 

Concept test 04 was to test face detection with a MacBook camera.
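
For reference, a minimal Arduino-style sketch in the spirit of concept test 02 is shown below; the pin numbers, the active-low sensor behavior, and the angle mapping are assumptions for illustration, not the actual test code.

    // Rough reconstruction of concept test 02 (pin numbers are illustrative).
    // Four digital proximity sensors steer two servo-driven eyes toward
    // whichever sensor reports a nearby user.
    #include <Servo.h>

    const int SENSOR_PINS[4] = {2, 3, 4, 5};       // proximity sensors, left to right
    const int EYE_ANGLES[4]  = {30, 70, 110, 150}; // servo angle facing each sensor

    Servo leftEye, rightEye;

    void setup() {
      for (int i = 0; i < 4; i++) pinMode(SENSOR_PINS[i], INPUT);
      leftEye.attach(9);
      rightEye.attach(10);
    }

    void loop() {
      for (int i = 0; i < 4; i++) {
        // Assumes sensors that pull the line LOW when something is in range.
        if (digitalRead(SENSOR_PINS[i]) == LOW) {
          leftEye.write(EYE_ANGLES[i]);
          rightEye.write(EYE_ANGLES[i]);
          break;
        }
      }
      delay(50);
    }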


4.1 Design & Prototyping - Interaction Design

After this series of rough prototypes, I defined the interaction phases for Googly (the table on the right). For each interaction phase based on proxemics, I defined what to detect and what to give as feedback. Then I explored and tested more sensors to find the right ones to realize the interaction.
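
As a rough sketch of that phase logic (the signal names and structure below are placeholders, not Googly's actual implementation):

    // Illustrative decision logic for the three proxemic phases.
    // faceDetected comes from the camera, irNear from the 10 cm IR sensor,
    // and touched from the capacitive touch sensor; all names are placeholders.
    enum Phase { IDLE, FAR_PHASE, CLOSE_PHASE, CONTACT_PHASE };

    Phase classify(bool faceDetected, bool irNear, bool touched) {
      if (touched)      return CONTACT_PHASE;  // user is touching Googly
      if (irNear)       return CLOSE_PHASE;    // user within ~10 cm
      if (faceDetected) return FAR_PHASE;      // user visible to the camera
      return IDLE;
    }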

Far Phase - Camera Testing 

(1) USB Camera (Low Resolution)

(2) OjOcamFace

(3) 1080p 30fps USB Camera

     
A low resolution USB camera was tested. Face detection worked well; however, the USB camera always had to be connected to a computer, weakening Googly's concept of a stand-alone module.

The OjOcamFace module by Withrobot is a camera module with an embedded face detection program. The module could make Googly cable-free; however, it took too long to process the data, lowering Googly's responsiveness.

The high resolution USB camera module showed a better face detection rate than the low resolution one. It still had the same problem of a cable that must be connected to a computer.

After testing the three camera modules, a trade-off had to be made between the OjOcamFace module, which would make Googly cable-free but slow to respond, and a USB camera, which would make Googly respond quickly but require a cable. Since a slow response could degrade the whole experience of Googly, the high resolution 1080p 30fps USB camera was chosen.
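
A minimal sketch of what the Far Phase detection could look like with the chosen USB camera is shown below; it assumes OpenCV with a standard Haar cascade and the camera as device 0, so it is an illustration rather than the exact pipeline used in Googly.

    // Minimal Far Phase face detection with OpenCV (illustrative).
    // Assumes the 1080p USB camera is device 0 and a Haar cascade file
    // is available locally.
    #include <opencv2/opencv.hpp>
    #include <vector>

    int main() {
      cv::VideoCapture cam(0);
      cam.set(cv::CAP_PROP_FRAME_WIDTH, 1920);
      cam.set(cv::CAP_PROP_FRAME_HEIGHT, 1080);

      cv::CascadeClassifier face;
      face.load("haarcascade_frontalface_default.xml");

      cv::Mat frame, gray;
      while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> faces;
        face.detectMultiScale(gray, faces, 1.2, 3, 0, cv::Size(80, 80));
        if (!faces.empty()) {
          // Horizontal center of the first face, normalized to [0, 1];
          // this value can be mapped to the eyes' gaze direction.
          cv::Rect f = faces[0];
          double gazeX = (f.x + f.width / 2.0) / frame.cols;
          // send gazeX to the eye controller (e.g. over serial) ...
          (void)gazeX;
        }
      }
      return 0;
    }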

 

Close Phase - IR Sensors Testing

For Close Phase interaction, Sharp digital distance sensors were tested and used. The IR sensor can detect objects within a 10cm range.

Sharp digital distance sensor (10cm)
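
Reading the sensor is straightforward; the sketch below is illustrative, assuming a breakout whose output goes LOW when an object is within range and an arbitrary pin assignment.

    // Close Phase: read a Sharp digital distance sensor (~10 cm range).
    // Many breakouts of this sensor drive the output LOW when an object
    // is in range; the pin number is illustrative.
    const int IR_PIN = 7;

    void setup() {
      pinMode(IR_PIN, INPUT);
      Serial.begin(9600);
    }

    void loop() {
      bool userIsClose = (digitalRead(IR_PIN) == LOW);
      Serial.println(userIsClose ? "close" : "far");
      delay(100);
    }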

Contact Phase - Touch Sensors & Sound Recording/Playing Testing

For the Contact Phase, the MPR121 touch sensor module and a capacitive touch sensor were tested. The capacitive touch sensor could present touch input as analog values, and its sensitivity threshold could be controlled easily, so capacitive touch was used. For sound feedback, the DFPlayer MP3 module was tested, but it required pre-recorded sounds on a micro SD card. To make Googly truly customizable in any context, I added a recording function that lets users record their own voices. I embedded a microphone inside and made the sound come out of an external speaker. The played-back voice has a sound effect applied to give Googly character.
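
A minimal sketch of the touch sensing with an adjustable threshold is shown below, using the common Arduino CapacitiveSensor library; the pins and threshold value are assumptions, and the playback call is left as a placeholder.

    // Contact Phase: capacitive touch with an adjustable threshold,
    // using the Arduino CapacitiveSensor library (pins and threshold
    // are illustrative, not Googly's actual values).
    #include <CapacitiveSensor.h>

    CapacitiveSensor touchPad(4, 2);      // send pin 4, receive pin 2
    const long TOUCH_THRESHOLD = 1000;    // tune per object and environment

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      long reading = touchPad.capacitiveSensor(30);  // 30 samples per reading
      if (reading > TOUCH_THRESHOLD) {
        // Contact Phase: trigger the recorded-voice playback here.
        Serial.println("touched");
      }
      delay(50);
    }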

 


4.2 Design & Prototyping - Prototype v.1

The first prototype of Googly, prototype v.1, was significant in that it integrated all the features and functions Googly needed. Prototype v.1 presented ideas of what Googly can do and how it should be developed further.

A user study was carried out with four kindergarten and elementary school kids to observe their reactions. In the interviews, all four kids expressed interest in keeping Googly as a toy that gives eyes to inanimate objects.

Googly has two emotional modes. One is a neutral mode, designed to express an inviting atmosphere and attract users for further interaction. The other is an angry mode, designed to keep users at a distance and prevent further engagement. Both modes consist of three interaction phases based on proxemics: Far Phase, Close Phase, and Contact Phase. The sensory feedback from Googly in angry mode differs from that in neutral mode; angry Googly was designed to become even angrier when a user is detected, frowning and growling.

Prototype v.1 neutral eyes

Prototype v.1 angry eyes 


4.3 Design & Prototyping - Prototype v.2

The second prototype, prototype v.2, focused on the inner structure of Googly. The goal of prototype v.2 was to make Googly as compact as possible. To fit all the electronic components into minimal space, all parts were 3D modelled in Rhinoceros and the exact space for the electronic components was calculated. The inner structures that hold the sensors and parts were 3D printed.

Study of Eyes Representation

Eye representations studied: paper eyes (A), E-ink panel (B), LED matrices in yellow (C, D), and LED matrix in white (E)

While making prototype v.2, various methods of representing the eyes were studied. An analog feel was considered the top priority, to keep Googly from falling into the uncanny valley and causing repulsion. In prototype v.1, Googly's eyes were drawn on paper (A) to give a purposely unfinished look and feel. However, this method had a major drawback: the servo motors attached to the paper eyes only allowed 180 degree 2D rotational movement, and the eyes on the paper were static, unable to blink or change emotion. For a better experience, a digital panel was unavoidable. An E-ink panel (B) has only two states: an on-state with black ink and an off-state without any ink. Being digital, it could represent blinking eyes and express various emotions through the shape of the pupil. E-ink, however, was not visible in the dark, and the panel's refresh rate was significantly slow. An LED matrix (C, D, E), although it seems highly digital, has its own analog feeling from its pixelated look. An LED matrix could also deliver various eye movements, including blinking and eye-rolling, without delay. For the final design, white LED matrices with the pupils rendered as off pixels (E) were used.
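
As an illustration of the final approach, the sketch below draws a fully lit 8x8 eye with the pupil as off pixels and adds a simple blink; it assumes a MAX7219-driven matrix and the LedControl library, which may differ from the actual driver used in Googly.

    // Sketch of the white LED matrix eye (E): all pixels lit, with the
    // pupil drawn as off pixels, plus a simple blink. The MAX7219/LedControl
    // driver and pin numbers are assumptions.
    #include <LedControl.h>

    LedControl matrix(12, 11, 10, 1);  // DIN, CLK, CS, one 8x8 device

    void drawEye(int pupilRow, int pupilCol) {
      for (int row = 0; row < 8; row++) {
        byte bits = B11111111;                 // fully lit row
        if (row == pupilRow || row == pupilRow + 1) {
          bits &= ~(B11000000 >> pupilCol);    // carve a 2x2 dark pupil
        }
        matrix.setRow(0, row, bits);
      }
    }

    void blinkEye() {
      matrix.clearDisplay(0);                  // all off reads as a closed eye
      delay(120);
    }

    void setup() {
      matrix.shutdown(0, false);
      matrix.setIntensity(0, 8);
      drawEye(3, 3);                           // pupil roughly centered
    }

    void loop() {
      drawEye(3, 3);
      delay(3000);
      blinkEye();
    }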

 

I also explored ways to represent Googly's angry emotion with the LED matrices. Below are some sketches for the angry eyes.
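
As a data-format illustration only (not one of the actual sketches), an angry eye frame could be stored as an 8-byte bitmap and pushed out row by row, as in the blinking example above.

    // Illustrative 8x8 bitmap for an angry eye frame: the top rows slant
    // like a frowning brow and the pupil stays dark (off pixels). This is
    // not an actual Googly design, just an example of the data format.
    const byte ANGRY_EYE[8] = {
      B00000001,   // brow slanting downward
      B00000111,
      B00011111,
      B01111111,
      B11100111,   // dark 2-pixel pupil in the middle rows
      B11100111,
      B11111111,
      B01111110
    };
    // Each byte is one matrix row and can be sent with setRow().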


5. Final Design

The final prototype has a working inner structure and electronic components inside an acrylic casing. The casing material was important: it had to be translucent enough for all the sensors inside to work properly, yet dark enough to hide them from the outside. After testing several materials and production methods, the Googly casing was made of black acrylic with some translucency and produced by Computer Numerical Control (CNC) machining. Googly has a button interface that switches its emotional state between neutral mode and angry mode. By pressing the record button, the user can record his or her own voice, which will be played back with a voice effect when Googly is touched.
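
A minimal sketch of how the two buttons could be handled is shown below; the pin numbers, debounce delays, and the commented-out recording call are assumptions for illustration.

    // Sketch of the two-button interface on the final prototype: one button
    // toggles between neutral and angry mode, the other starts a voice
    // recording. Pin numbers and placeholder calls are illustrative.
    const int MODE_BUTTON_PIN   = 6;
    const int RECORD_BUTTON_PIN = 8;

    bool angryMode = false;

    void setup() {
      pinMode(MODE_BUTTON_PIN, INPUT_PULLUP);
      pinMode(RECORD_BUTTON_PIN, INPUT_PULLUP);
    }

    void loop() {
      if (digitalRead(MODE_BUTTON_PIN) == LOW) {     // button pressed
        angryMode = !angryMode;                      // switch emotional state
        delay(300);                                  // crude debounce
      }
      if (digitalRead(RECORD_BUTTON_PIN) == LOW) {
        // startRecording() would capture the user's voice from the embedded
        // microphone; the recording module is not specified in the text.
        // startRecording();
        delay(300);
      }
      // drawEyes(angryMode); playback is triggered in the Contact Phase.
      delay(20);
    }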

 

 

Interaction Phases

Applications

Promotional Purpose

Googly can grab the user's attention with eyes that follow the user. It can also give affordance, because people naturally tend to follow where Googly is glancing. Therefore, it can be used effectively for promotion. For instance, at the exhibition, people took more of my business cards when Googly was set to glance at the cards than when it was looking straight ahead.

Giving Effective Warning

Angry eyes can be effective at keeping the user at a distance. For example, if an angry Googly is placed on a dangerous object such as a stove, it can warn children not to touch it and to stay away.

 

Inducing Honesty

Googly can be effective in places where honesty is needed, for instance where fee payment is unmonitored. Since people tend to be more honest when they feel they are being watched, Googly can encourage them to pay even when no one is attending.


Poster Design

Googly at Tokyo Design Week 2015


Reference

[1] Till Ballendat, Nicolai Marquardt, and Saul Greenberg. 2010. Proxemic interaction: designing for a proximity and orientation-aware environment. In ACM International Conference on Interactive Tabletops and Surfaces (ITS '10). ACM, New York, NY, USA, 121-130. DOI=http://dx.doi.org/10.1145/1936652.1936676

[2] Jonas Togler, Fabian Hemmert, and Reto Wettach. 2009. Living interfaces: the thrifty faucet. In Proceedings of the 3rd International Conference on Tangible and Embedded Interaction (TEI '09). ACM, New York, NY, USA, 43-44. DOI=http://dx.doi.org/10.1145/1517664.1517680

[3] Eva Burneleit, Fabian Hemmert, and Reto Wettach. 2009. Living interfaces: the impatient toaster. In Proceedings of the 3rd International Conference on Tangible and Embedded Interaction (TEI '09). ACM, New York, NY, USA, 21-22. DOI=http://dx.doi.org/10.1145/1517664.1517673

[4] Majken Kirkegaard Rasmussen, Erik Grönvall, Sofie Kinch, and Marianne Graves Petersen. 2013. "It's alive, it's magic, it's in love with you": opportunities, challenges and open questions for actuated interfaces. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration (OzCHI '13), Haifeng Shen, Ross Smith, Jeni Paay, Paul Calder, and Theodor Wyeld (Eds.). ACM, New York, NY, USA, 63-72. DOI=http://dx.doi.org/10.1145/2541016.2541033

[5] T. C. Burnham and B. Hare. 2007. Engineering Human Cooperation. Human Nature 18, 2 (July 2007), 88-108.

[6] Guy Hoffman, Jodi Forlizzi, Shahar Ayal, Aaron Steinfeld, John Antanitis, Guy Hochman, Eric Hochendoner, and Justin Finkenaur. 2015. Robot Presence and Human Honesty: Experimental Evidence. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI '15). ACM, New York, NY, USA, 181-188. DOI=http://dx.doi.org/10.1145/2696454.2696487

[7] Yuta Sugiura, Calista Lee, Masayasu Ogata, Anusha Withana, Yasutoshi Makino, Daisuke Sakamoto, Masahiko Inami, and Takeo Igarashi. 2012. PINOKY: a ring-like device that gives movement to any plush toy. In CHI '12 Extended Abstracts on Human Factors in Computing Systems (CHI EA '12). ACM, New York, NY, USA, 1443-1444. DOI=http://dx.doi.org/10.1145/2212776.2212477

[8] R. Kuwakubo. 2010. Nikodama. New Media Scotland. [Online]. Available: http://www.mediascot.org/lefttomyowndevices/ryotakuwakubo/. [Accessed 22 December 2015].