Monday, September 27, 2010

Analysis on Topobo

As defined by Van Dam, post-WIMP interfaces are interfaces "containing at least one interaction technique not dependent on classical 2D widgets such as menus and icons" [1]. I guess that makes Topobo a post-WIMP user interface. Topobo is a 3D constructive assembly system with kinetic memory developed by the Tangible Media Group at the MIT Media Lab. As a modeling system, Topobo involves both physical inputs and outputs. It allows users to "directly manipulate objects rather than instructing the computer to do so by typing commands" [2], thus incorporating one of the post-WIMP interaction styles: direct manipulation. A user's ability to interact with Topobo is not hindered by any technical thresholds associated with programming physical models to move. Instead of writing code to make the Topobo blocks move, users can apply pre-existing naive physics concepts. By drawing upon "some of humans' most basic knowledge about the behavior of the physical world" [2], users can manipulate a Topobo structure into recreating its previous movements. Instead of being limited by their ability to express desired movements in a programming language, users can directly interact with the pieces themselves to generate those movements. 
However, certain tradeoffs come along with implementing an interface that relies upon direct manipulation. The active blocks of Topobo, with their embedded kinetic memory, only allow the duplication of movements physically created by the user. The designers at the Tangible Media Group made a tradeoff between reality and versatility. The range of motion of a Topobo piece and the speed at which it operates are solely dependent upon the physical input provided by the user. If the user wishes to have one section of the model spin continuously for a set amount of time, he or she would have to physically rotate that section of the model for that amount of time in order to achieve the desired movement. However, if the user were using a traditional GUI where all the movements were programmed into the Topobo blocks, then movements that are physically difficult to achieve, such as continuous spinning or extremely fast rotation, could easily be accomplished with a simple line of code. The designers have favored a reliance on direct manipulation of the Topobo pieces, using naive physics, over programmatic versatility. 
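The record-and-playback idea behind kinetic memory can be made concrete with a tiny sketch. This is not Topobo's actual firmware — the class and method names below are hypothetical — but it captures the tradeoff: the joint can only replay angles the user physically demonstrated, nothing more.

```python
class KineticMemoryJoint:
    """Minimal sketch of a record-and-playback joint, loosely modeled on
    Topobo's "active" blocks. All names here are hypothetical."""

    def __init__(self):
        self.samples = []  # recorded (timestamp, angle) pairs
        self.angle = 0.0

    def record(self, angle, t):
        # While the user physically moves the joint, sample its angle.
        self.samples.append((t, angle))
        self.angle = angle

    def playback(self):
        # Replay the recorded motion: the output is exactly the input.
        return [angle for _, angle in self.samples]

# The joint can only reproduce what the user physically demonstrated:
joint = KineticMemoryJoint()
for t, angle in enumerate([0, 10, 20, 30, 20, 10, 0]):
    joint.record(angle, t)
print(joint.playback())  # → [0, 10, 20, 30, 20, 10, 0]
```

Note what is missing: there is no way to say "spin forever" or "rotate at 500 rpm" — exactly the versatility that a programmed GUI would offer and that direct manipulation gives up.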

According to Kenneth P. Fishkin in his A Taxonomy for and Analysis of Tangible Interfaces, for TUIs a "2D taxonomy is fruitful, one that uses as its dimensions embodiment and metaphor" [3]. Along the embodiment dimension, Topobo can be categorized as a full embodiment user interface, where "the output device is the input device; the state of the device is fully embodied in the device" [3]. As an assembly system with kinetic memory, any changes or interactions done on the Topobo blocks are reenacted on the blocks themselves. If you rotate one section of the structure by 30 degrees, the blocks will rotate that same section by the same amount. Both the input and the output in this case are physical. 
In the Topobo assembly system, users manipulate and change the blocks on the structure, and those motions are recreated right back at them with the same blocks. This is a type of "really direct manipulation". No metaphor is needed to describe the input and output process of using this TUI, and thus it falls under full metaphor along the metaphor dimension. 

References:
1. Van Dam, A. Post-WIMP User Interfaces. Commun. ACM, 40 (2), 63-67.
2. Jacob, R., Girouard, A., Hirshfield, L., Horn, M., Shaer, O., Solovey, E., and Zigelbaum, J. Reality-Based Interaction: A Framework for Post-WIMP Interfaces. Proc. CHI '08, ACM Press, 2008.
3. Fishkin, K. A Taxonomy for and Analysis of Tangible Interfaces. Pers Ubiquit Comput 2004, 8: 347-358.

Superhero Proposal

Chameleon Suit


So for the TUI project, our group (Team BAMFness) decided to do a Chameleon Suit. 
We live in such a fast-paced world right now, and every second is extremely valuable. There's also an endless set of social rules and expectations that we have to follow in our everyday lives, especially when it comes to attire and appearance. Wasting even a couple of minutes changing outfits can no longer be seen as acceptable. 



The Chameleon Suit can address this problem. 
By using either an LED display or a projection system, the team decided to have the suit change its appearance on command or in response to its environment. 
~While on the way to an important business meeting, the unitard suit receives a notification about the upcoming meeting from the subject's mobile phone's calendar/synced schedule program. It will then project/display a corresponding pattern onto the suit to allow the subject to look presentable for the meeting on time. 
~At a party, sensors within the unitard suit detect a high noise level (i.e. music), increased temperature from people's bodies pressed close together while dancing, and other types of environmental cues. Upon receiving this information from the sensors, the suit will alter itself into a fashionable party outfit. 
~By using a specific hand gesture/body posture, the subject can activate a color/pattern picking function on the suit. Placing the palm of his/her hand upon a specific color/pattern will allow the camera in the suit to pick up that design and transfer it onto the suit. BAM! Instant fashion statement!
~There can also be a specific number of pre-loaded outfits stored in the suit itself. By making a swiping motion across the body while the sensor is activated, the subject can quickly change his/her outfit on demand. 
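The four trigger modes above could all feed into one simple priority-based controller. Here's a minimal Python sketch of that idea — the event fields, thresholds, and outfit names are all made up for illustration, not an actual design spec:

```python
def select_outfit(event):
    """Hypothetical Chameleon Suit controller: map sensed events to
    outfit changes, checked in priority order."""
    # 1. Calendar notification trumps everything else.
    if event.get("calendar") == "business_meeting":
        return "business_suit_pattern"
    # 2. Environmental cues: loud music + body heat suggests a party.
    if event.get("noise_db", 0) > 85 and event.get("temp_c", 20) > 27:
        return "party_outfit"
    # 3. Palm gesture: camera samples the color/pattern under the palm.
    if event.get("gesture") == "palm_sample":
        return "sampled:" + event.get("sampled_pattern", "unknown")
    # 4. Swipe gesture: cycle through the pre-loaded outfits.
    if event.get("gesture") == "swipe":
        return "next_preloaded_outfit"
    return "current_outfit"  # no trigger: keep what's displayed

print(select_outfit({"noise_db": 95, "temp_c": 30}))  # → party_outfit
```

The priority ordering is itself a design decision we'd have to work out — e.g. should a calendar event really override a deliberate hand gesture?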


The Bubelle Emotion Sensing Dress developed by Philips Design gave us the idea that our Chameleon Suit could spontaneously alter its appearance based on different types of input. 

The Beat Dress is also very cool. A project created by students at Malmö University in Sweden, it has LED lights woven into the dress that respond to the beats in music. 
There are just so many different possibilities that can be employed for the Chameleon Suit. We are also exploring a method of reproducing invisibility. So far we've come across similar effects achieved by researchers at the University of Tokyo, who used retro-reflective clothing coupled with a camera-projection system to trick the eyes into seeing "invisibility". 



Team BAMFness believes that the existing technology will enable us to successfully duplicate the proposed "superpower" of the Chameleon Suit. 
Tiny pico projectors such as the 7mm Explay projector engine can be both discreet and powerful at the same time, and products similar to it could definitely be embedded into the suit. 


There are still a lot of technical issues to be worked out, but hopefully Team BAMFness will be able to come up with a comprehensive draft plan for tackling this project soon! XD