Monday, April 16, 2012

Duality by ART+COM


“Duality” is a reactive environmental installation in the city center of Tokyo, created by the Berlin-based media designers ART+COM. The artwork was realised in January 2007 as a permanent installation. The boundary between a walkway and an adjacent artificial pond was chosen as the location for the work. This interface between “liquid” (water) and “solid” (land) was used thematically and augmented by the question of the “real” (water ripples) versus the “virtual” (artificial light waves). Passersby trigger the installation, which plays on the interplay between solid and liquid, virtual and real, light and water:



Their footsteps generate virtual waves that transform into real water waves in the pond. Intended as a playful moment to enrich the commute, or to surprise the unsuspecting, the installation proposes a different way of integrating media in public space. The installation is located outside an office building complex in central Tokyo, which is linked to a highly frequented subway station. The objective of the artwork was to evoke a stronger identification of commuters and accidental visitors with the place. By using a translucent glass floor to diffuse a monochromatic LED matrix, the ART+COM designers defined a unique aesthetic, different from standard displays. By making the installation interactive, reacting to passersby's footsteps, they challenged expectations of how public displays behave. They took a step further by extending the waves as physical motion in the adjacent pond. The original concept was inspired by the dual nature of light, the so-called "Wave-Particle Duality," but through the development process, the immediate playfulness and the challenge to expectations became at least equally important to the final realization.
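To make the mechanism concrete, here is a minimal sketch of the footstep-to-ripple idea, assuming a footstep position detected on a square LED matrix; the grid size, wave speed and decay values are illustrative assumptions, not ART+COM's actual implementation.

# Minimal sketch: a detected footstep seeds an expanding ring of brightness
# on a 2D LED grid. Grid size, wave speed and decay are assumptions for
# illustration, not ART+COM's implementation.
import numpy as np

GRID = 64        # LED matrix is GRID x GRID
SPEED = 4.0      # cells the ring front travels per frame
WIDTH = 2.0      # thickness of the bright ring
DECAY = 0.05     # brightness lost per frame

def ripple_frame(step_x, step_y, frames_since_step):
    """Brightness (0..1) of each LED, a few frames after one footstep."""
    ys, xs = np.mgrid[0:GRID, 0:GRID]
    dist = np.hypot(xs - step_x, ys - step_y)
    radius = SPEED * frames_since_step
    ring = np.exp(-((dist - radius) ** 2) / (2 * WIDTH ** 2))
    return np.clip(ring - DECAY * frames_since_step, 0.0, 1.0)

frame = ripple_frame(step_x=20, step_y=32, frames_since_step=5)
print(frame.shape, frame.max())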


The installation aims at creating an identity for the space. Pedestrians become aware of a space they would usually cross without paying much attention. It’s a beautiful example of how spaces can adapt depending on the people within them. Most such concepts so far have had an impact on the visceral and behavioral levels of processing (Emotional Design: Why We Love (or Hate) Everyday Things, Donald Norman). Think about changes in temperature or lighting to make someone feel more comfortable. Duality has an impact on the reflective level: it gives people a moment of contemplation when they don’t expect it.

New research on emotion and cognition has shown that attractive things really do work better. In recent years, the design community has focused on making products easier to use. Design experts have vastly underestimated the role of emotion in our experience of everyday objects. Emotional Design analyzes the profound influence of this deceptively simple idea.

Donald Norman lists the following 3 levels of design based on emotion:
  • Visceral Design (evolutionary responses)
  • Behavioral Design (bodily activity)
  • Reflective Design (mental activity)
Duality probably utilizes the Behavioral Design level, detecting body motion that is non-stylized (ordinary footsteps rather than deliberate gestures).

In the future, will inanimate objects respond to human emotions? Is it possible to create emotional robots?


Sunday, April 15, 2012

Gaming User Interface
In the past decade or so, most gamers would agree that the gaming experience has improved vastly. It is not only graphics and gameplay; the gaming interface has seen big changes as well. I'm sure many have seen, or at least tried playing, one of these before:
Nintendo Game Boy
Nintendo Game Boy Color
Nintendo Game Boy Advance
Notice how much the Game Boy series changed over time: from a non-colour device to a coloured one, to a differently shaped device. The change in shape caters to a better grip for users; the addition of colour is straightforward.
Sony Playstation
Sony Playstation 2
Sony Playstation 3
As you can see, the controller design remains largely the same, a sign of consistency. The difference, however, is that the latest instalment of the PlayStation series has a wireless controller for the convenience of users. Speaking of convenience, the Xbox Kinect doesn't even require a controller.



It works through hand gestures and voice recognition, which makes interacting with the interface much easier for users. Slick? Absolutely. Judging from the changes seen so far, one would not be surprised if the next generation of gaming interfaces involved some form of brain-computer interface.

Sunday, April 8, 2012

Yet Another Bad Interface

On one of my recent visits to Tan Tock Seng Hospital, I noticed that there were electronic gates, like the ones we see at train stations, to restrict the number of visitors during visiting hours. Each patient is allowed at most 4 visitors. Hence, to ensure that no more than 4 visitors are at a patient's ward at any time, visitors first have to register manually at a touch-screen computer.

Problems
The user first has to scan their NRIC using the barcode scanner. Being a user myself, I noticed many people having trouble scanning their NRIC. Firstly, it was not clear how and where we should position our identity cards under the scanner. Although there was a picture indicating how it should be done, the majority of first-timers still failed the NRIC scanning process.

Next, the user has to input the ward number, bed number and patient's name. This is done using an on-screen touch keyboard, as seen below:


The 3 input fields were clear and concise; there were even examples beside each text field. However, the main problem here is the keyboard. Notice that it is not the normal keyboard we usually use at home: this touch-screen keyboard is arranged in alphabetical order. My family and I had trouble typing the patient's name into the text field. Coupled with the scanning problems, the time taken to register was simply not ideal. If every user stays at the machine this long, there will be unwanted long queues just to register.

Thoughts & Reflections
I was amazed that even hospitals are starting to use technology to tackle their problems. However, more can be done to improve the current human-computer interface. The scanning problem could be alleviated by playing a short video clip demonstrating how to scan the NRIC. The touch-screen keyboard just has to be replaced with the standard QWERTY layout most people are used to.

When I was in the ward, I noticed there was another touch-screen interface right next to the patient's bed. As it is meant for nurses and doctors, I had no idea exactly what it was for. However, something caught my eye: the keyboard on the screen. It was completely different from the one I saw earlier on, as seen below:


The first thing that came to my mind was: where is the consistency? It makes no sense to have one keyboard on one machine and a different keyboard on another machine in the same hospital. Perhaps they were different machines made by different companies? Perhaps the hospital feels that visitors who do not use computers will find it easier to key in names with a keyboard arranged in alphabetical order? The list of questions goes on and on...

Monday, April 2, 2012

CLI versus GUI

What is CLI?
CLI is short for Command Line Interface. It is an interface or dialogue between the user and a program, or between two programs, where a line of text (a command line) is passed between the two. (In graphical shells, by contrast, the equivalent launch commands are stored in configuration files such as the Windows registry or the OS/2 os2user.ini file.) A CLI is used whenever a large vocabulary of commands or queries, coupled with a wide (or arbitrary) range of options, can be entered more rapidly as text than with a pure GUI. This is typically the case with operating system command shells. CLIs are also used by systems with insufficient resources to support a graphical user interface.

Screenshot of a sample Bash session


What is GUI?
GUI is short for Graphical User Interface. It is a type of user interface that allows users to interact with electronic devices through images rather than text commands. A GUI represents the information and actions available to a user through graphical icons and visual indicators such as secondary notation (visual cues which are not part of the formal notation, such as position, indentation, colour and symmetry), as opposed to text-based interfaces, typed command labels or text navigation. GUIs have greatly benefited from the concept of direct manipulation.

Screenshot of a sample GUI system


CLI versus GUI
Many experts claim that the CLI is much faster and easier to use than the GUI, and an equal number claim otherwise. Below is a comparison between the two types of interface:
1. Ease:
  • CLI - New users find it a lot more difficult due to the need for familiarity and memorization of the commands
  • GUI - Although new users may find it difficult to navigate using the mouse in the initial stages, they generally pick it up a lot faster
2. Control:
  • CLI - Users have much more control over their file system and operating system
  • GUI - Advanced or experienced users who need to perform a specific task often have to fall back on the CLI, because a GUI offers only limited control
3. Multitasking:
  • CLI - Capable of multitasking, but does not offer the same ease or ability to view multiple things at once on one screen
  • GUI - The concept of having windows allows users to easily view, control and manipulate multiple things at once and is usually much faster than CLI
4. Speed:
  • CLI - Because input is limited to the keyboard and a terse set of commands, a proficient user can usually complete a specific task faster than with a GUI
  • GUI - Using a mouse and keyboard to navigate through several steps to control the operating system is usually slower for such tasks
5. Resources:
  • CLI - A computer that uses only a CLI takes up far fewer resources
  • GUI - Requires many more system resources because of all the elements that need to be loaded, such as icons and fonts. In addition, video drivers, mouse drivers and other drivers that need to be loaded also take up system resources
6. Scripting:
  • CLI - Users can easily script a sequence of commands to perform a task or execute a program (see the sketch after this list)
  • GUI - Enables users to create shortcuts, tasks or other similar actions to complete a task or run a program, but this does not come close to what the CLI offers.
7. Remote Access:
  • CLI - When accessing another computer or networking device over a network, a user is often only able to manipulate the device or its files through a CLI or other text-only means
  • GUI - Although remote graphical access is possible and becoming popular, not all computers, and especially network equipment, have this ability.
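As a small illustration of the scripting point above, here is a minimal sketch in Python of automating a job that would take many repetitive clicks in a GUI file manager: renaming every .txt file in a folder with a date prefix. The folder name and naming scheme are made up for the example.

# Minimal sketch of the CLI/scripting advantage: batch-rename every .txt file
# in a folder with today's date as a prefix. Folder name and naming scheme
# are illustrative assumptions.
from datetime import date
from pathlib import Path

folder = Path("reports")             # hypothetical folder
prefix = date.today().isoformat()    # e.g. "2012-04-02"

for f in sorted(folder.glob("*.txt")):
    f.rename(f.with_name(f"{prefix}_{f.name}"))
    print("renamed", f.name)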

In spite of its many merits, the CLI has to this day been side-lined by the GUI. However, the CLI still has much to offer, and many of its benefits simply cannot be emulated or replaced by graphical equivalents. CLI and GUI have come to co-exist and can be used side by side to complete tasks with varying requirements.


Sunday, April 1, 2012

Muscle Computer Interface

MUCI


A combined effort between Microsoft Research, the University of Washington and the University of Toronto has made interacting with computers with nothing but your muscles a reality. In 2008, the researchers unveiled their muscle-computer interface, abbreviated MUCI. The hardware component of MUCI is an armband that the user attaches to their forearm. The armband uses six electromyography (EMG) sensors and two ground electrodes arranged in a ring around the upper right forearm for sensing finger movement, and two sensors on the upper left forearm for recognizing hand squeezes.

Example of a gesture recognized by MUCI

MUCI allows users to interact with computers and other devices without requiring the use of their hands. Though there are alternative hands-free interaction systems such as voice control and camera-based systems, these can be inaccurate and raise privacy issues.

There are existing products, such as prosthetics, that rely on detecting muscle activity, but MUCI is the first commercial application. Unlike the electrodes used with prosthetics, users do not have to worry about placing MUCI's electrodes at an exact position on their arms. After slipping the armband on, MUCI's software guides the user through a set of calibration exercises to recognize the position of the electrodes and to understand the user's movements. The calibration relies on machine learning algorithms that improve in accuracy over time. The algorithms use three main components of the electrode data: the magnitude of muscle activity, the rate of muscle activity, and the wave-like patterns that occur across sensors. These three components provide sufficient data to discern the type of muscle movement the user is exerting. Preliminary testing on 10 subjects revealed that, after calibration, the system has accuracy rates of up to 95% in recognizing movement of all 10 fingers.
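To make the calibration step more concrete, here is a minimal sketch of the kind of pipeline described above, assuming short windows of samples from the six EMG channels; the specific features (per-channel RMS amplitude, rate of change, cross-channel ratios) and the SVM classifier are illustrative assumptions, not the researchers' exact method.

# Minimal sketch of an EMG gesture classifier in the spirit of the pipeline
# described above. Feature set and SVM model are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC

def extract_features(window):
    """window: array of shape (n_samples, 6), one short window of 6 EMG channels."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))               # magnitude of muscle activity
    rate = np.mean(np.abs(np.diff(window, axis=0)), axis=0)   # rate of muscle activity
    ratios = rms / (rms.sum() + 1e-9)                         # rough cross-sensor pattern
    return np.concatenate([rms, rate, ratios])

def train_classifier(windows, labels):
    """windows: list of (n_samples, 6) arrays; labels: which finger moved."""
    X = np.array([extract_features(w) for w in windows])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf

# Usage: clf = train_classifier(calibration_windows, calibration_labels)
#        finger = clf.predict([extract_features(new_window)])[0]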

Potential Applications
There are a number of potential applications for this technology, including the following:
  • Opening car trunk with groceries in hand - When holding grocery bags in both hands, it can be extremely difficult to access the car keys and open the trunk. MUCI can alleviate this problem by allowing the user to open their trunk by completing a simple gesture such as touching two fingers. 
  • Controlling an MP3 player while jogging - It can be awkward and time consuming for a user to take an MP3 player out of their pocket and change the song, increase the volume, etc. These actions can often force the user to stop jogging and stand stationary, something that is undesired. MUCI can allow the user to easily control their MP3 player while remaining in motion. 
  • Accepting/ending phone calls when driving - Having to reach for a phone and fumble for the small accept button is an inconvenient and dangerous task while driving. MUCI can allow the user to accept and end calls without lifting their hands off the steering wheel.
  • Playing video games such as Guitar Hero - As demonstrated in the video below, MUCI also has entertainment applications. Users could use MUCI as a controller in games such as Guitar Hero, where their actions on an imaginary air guitar would be interpreted by the system. 

It should be noted that muscle-computer interfaces are still very much in the research phase. Researchers are testing how well they work in real-world scenarios, such as when people walk and run while wearing them. Future plans include creating armbands that are easier to wear and that can be camouflaged as jewellery or an article of clothing. 

Sources
http://www.newscientist.com/article/dn13770-hightech-armband-puts-your-fingers-in-control.html
http://www.technologyreview.com/computing/23813/page1/
http://www.popsci.com/technology/article/2009-10/muscle-based-interface-lets-you-literally-point-and-click-no-mouse-required

Thursday, March 29, 2012

Tangible User Interfaces

What are Tangible User Interfaces?
Tangible user interfaces are user interfaces that allow a user to interact with digital information through physical objects. They share four characteristics:

1. Physical representations are computationally coupled to underlying digital information

2. Physical representations embody mechanisms for interactive control

3. Physical representations are perceptually coupled to actively mediated digital representations

4. Physical state of tangibles embodies key aspects of the digital state of a system.

Examples of Tangible User Interfaces

Computer mouse
Although we use a computer mouse every day, many of us do not realise that this device is actually an example of a tangible user interface. The user drags the mouse on a flat surface to move the pointer on the computer screen. The direct relationship between the movement of the mouse and the pointer on the screen allows the user to operate the computer easily.



Microsoft Surface
Microsoft Surface is a system that is designed to look like a table and has a multi-touch display that allows many users to use it at the same time. It can detect objects placed on it and provides users with many ways to manipulate these objects, such as transferring photos between different devices. The video below shows how Microsoft Surface can be used.


Reactable
Reactable is a musical instrument designed for creating and performing music. It is a translucent, glowing round table with pucks placed on its surface. Users can turn the pucks and connect them to other pucks to create music from different elements such as synthesizers, effects, sample loops and control elements. When a puck is placed on the surface, it lights up and interacts with other pucks. Music becomes tangible with Reactable, as the user is able to see these interactions on the surface. The video below shows Reactable in use.
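To give a feel for how a tangible object's physical state can drive the digital state, here is a minimal sketch, loosely in the spirit of Reactable, in which a puck's position and rotation on the table are mapped to sound parameters; the class, parameter names and mapping rules are all illustrative assumptions.

# Minimal sketch: mapping a puck's pose on the tabletop to sound parameters,
# loosely in the spirit of Reactable. Names and mappings are assumptions.
from dataclasses import dataclass
import math

@dataclass
class Puck:
    kind: str     # e.g. "oscillator" or "filter"
    x: float      # position on the table, 0..1
    y: float      # position on the table, 0..1
    angle: float  # rotation in radians

def puck_to_params(puck):
    """Distance from the table centre sets volume; rotation sets pitch or cutoff."""
    distance = math.hypot(puck.x - 0.5, puck.y - 0.5)
    volume = max(0.0, 1.0 - 2.0 * distance)               # louder near the centre
    turn = (puck.angle % (2 * math.pi)) / (2 * math.pi)   # rotation as a 0..1 value
    if puck.kind == "oscillator":
        return {"volume": volume, "frequency_hz": 110 + turn * 880}
    return {"volume": volume, "cutoff_hz": 200 + turn * 5000}

print(puck_to_params(Puck("oscillator", 0.6, 0.5, math.pi / 2)))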



Tangible User Interface Alarm Clock (TUI-AC)
TUI-AC is an innovative alarm clock that consists of a ball and a pull-ring. The user sets the alarm by pulling the ring out of the ball and throwing the ball like a grenade. The pull-ring contains a sensor that measures the distance between the ball and itself; the alarm sounds louder the further the ball is thrown from the pull-ring. When the alarm rings, the user needs to get out of bed, find the ball and insert the ring back into it to switch off the alarm. TUI-AC is very useful for people who have difficulty waking up in the morning.
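As a small sketch of the distance-to-loudness idea, the mapping below assumes the ring's sensor reports the ball's distance in metres and scales it linearly to a volume between 0 and 1; the range limit and the linear scaling are assumptions for illustration.

# Minimal sketch: the further the ball lands from the pull-ring, the louder
# the alarm. Range limit and linear mapping are illustrative assumptions.
def alarm_volume(distance_m, max_distance_m=5.0):
    """Map the measured ball-to-ring distance (metres) to a volume in 0..1."""
    clamped = min(max(distance_m, 0.0), max_distance_m)
    return clamped / max_distance_m

for d in (0.5, 2.0, 5.0):
    print(d, "m ->", round(alarm_volume(d), 2))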



Monday, March 26, 2012

Understanding Brain Computer Interface

Have you ever wanted to communicate with others or move objects using only your mind, just like characters you have seen in the movies? Wouldn't it be great to be able to do that? This is now becoming a reality with the development of the Brain-Computer Interface (BCI). A BCI is defined as a system of interaction between the brain and a device.

How BCI works?
A set of electrodes, an electroencephalograph (EEG), is attached to the scalp, or implanted directly onto the brain surface to receive a stronger and more accurate signal. The EEG measures the tiny voltage differences produced by brain activity. This signal is then amplified, filtered and read by software. A BCI can also work in reverse, providing input to the brain: a signal such as video is converted into voltages that are sent through the electrodes to activate brain cells, and the person then perceives the signal.
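Here is a minimal sketch of the "amplify, filter, read" step described above, assuming raw single-channel EEG samples in a NumPy array at 256 Hz; the gain and the 8-30 Hz band are illustrative assumptions, not any specific BCI's settings.

# Minimal sketch: amplify the microvolt-level EEG signal and band-pass filter
# it before software reads it. Gain and band limits are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_eeg(raw, fs=256, gain=1000.0, low_hz=8.0, high_hz=30.0):
    """Amplify and band-pass one channel of raw EEG samples."""
    amplified = raw * gain
    b, a = butter(4, [low_hz, high_hz], btype="band", fs=fs)
    return filtfilt(b, a, amplified)

# Usage with one second of fake data:
raw = np.random.randn(256) * 1e-5
filtered = preprocess_eeg(raw)
print(filtered.shape)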

Application of BCI
One application of BCI is entertainment. A user's mind can act as a controller for a video game or replace a remote control for changing television channels. One example is a toy called the "Star Wars Force Trainer". It uses a headset to detect a concentration signal from the user's brain; when the user concentrates, the headset picks up the signal and transmits it to a microchip that switches on a fan and lifts the ball inside the clear tube.



Another application of BCI is devices that help disabled people live more normally. A disabled person can use his mind to control such a device and overcome his physical difficulties. He first visualises an action while wearing the headset so that the software can learn his brain signals. After a few tries, the user only needs to think about the action for the brain signals to be transmitted to the device, which reads them and executes the action. Examples of such devices include a robotic arm and a mouse cursor.

Here is a video of Tan Le, co-founder and president of Emotiv Systems, showing how a user controls the computer with his mind using BCI.


Limitation of BCI
There are still challenges when implementing BCI.

1) Complexity of the brain
The electrical signals from the brain do not fully determine a person's thoughts and actions. There are also chemical processes that the EEG cannot read.

2) Weak signals received by the EEG
These brain signals are so weak that they are easily interfered with by signals generated by other activity.

3) Inconvenience of BCI equipment
Some BCIs need a wired connection to their equipment. Although there are wireless BCIs, they still require the user to carry a computer around. 

These challenges may be overcome with further research and development of BCI. The EEG can be improved to pick up cleaner brain signals, and BCI equipment can become wireless and lighter. With these challenges overcome, I believe this technology will be able to make our lives more convenient in future.
