There are many exciting new research directions in the field of Human Computer Interaction (HCI) that aim to actively enhance the user’s experience. These often culminate in nifty interfaces, both on screen and as input devices. There are radical proposals rethinking computer concepts as standard as keyboard + mouse or menu bars and windows. Many of these new ideas involve changes to the experience that are inherently drastic or obvious.
I’ve been thinking lately about more subtle changes to the computer experience. What if your PC could recognize that you are angry, sad, or happy? How should it change its behavior or presentation? Could your PC even suggest what you should be doing with it?
An example I keep thinking about is the computer in a standard elevator. The user pushes the up or down button and waits for the elevator to arrive.
An easy way for the elevator’s computer to recognize that it has a hurried or stressed-out user is to watch whether the rider hits the button over and over until the elevator arrives.
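As a rough illustration, the repeated-press heuristic could be sketched as below. The class name, the ten-second window, and the three-press threshold are all my own invented placeholders, not values from any real elevator controller:

```python
import time

# Assumed thresholds, purely for illustration.
IMPATIENCE_WINDOW = 10.0   # seconds of button history to consider
IMPATIENCE_PRESSES = 3     # presses within the window that suggest a hurried rider

class CallButton:
    """Hypothetical elevator call button that tracks recent presses."""

    def __init__(self):
        self.presses = []  # timestamps of recent presses

    def press(self, now=None):
        # Record a press; an explicit timestamp can be passed in for testing.
        now = time.monotonic() if now is None else now
        self.presses.append(now)
        # Forget presses that fall outside the look-back window.
        self.presses = [t for t in self.presses if now - t <= IMPATIENCE_WINDOW]

    def rider_seems_hurried(self):
        # Several presses in a short span reads as impatience.
        return len(self.presses) >= IMPATIENCE_PRESSES
```

With this sketch, three presses within a few seconds flag the rider as hurried, while a single press does not.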
How could the elevator use this information to improve the overall user experience?
The change in such a case should probably be subtle. People entering a new building probably do not want to learn an entirely different elevator system. Instead, the elevator could change its behavior slightly depending on the current users and their input.
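One subtle adjustment along these lines might be trimming how long the doors linger open when a rider seems hurried, while everything else stays familiar. This is a speculative sketch; the function, the baseline hold time, and the safety minimum are all assumptions of mine, not real elevator parameters:

```python
# Invented values, for illustration only.
BASELINE_DOOR_HOLD = 5.0  # seconds the doors normally stay open
MIN_DOOR_HOLD = 2.0       # never close faster than this, for safety

def door_hold_time(rider_seems_hurried: bool) -> float:
    """Shave a few seconds off the door-open time for a hurried rider.

    The interface the rider sees is unchanged; only the timing adapts.
    """
    return MIN_DOOR_HOLD if rider_seems_hurried else BASELINE_DOOR_HOLD
```

The point of the sketch is that the adaptation is invisible as an interface change: a hurried rider simply finds the elevator a little quicker.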