Next-Generation Human-Computer Interfaces (HCIs)
The keyboard and mouse have served as the computer interface of choice for several decades. Although in some applications they may be difficult to replace, for example in document typesetting or graphic design, in other situations their usefulness is limited. Consider a wall-size information display in an emergency command center, urban surveillance headquarters, or scientific visualization facility. Effective interaction with such a display requires proximity, which largely precludes the use of a keyboard and mouse. Imagine, however, the convenience of interacting with the displayed content by merely waving a hand or leaning one's whole body. Can such interaction be accurate and robust for any user?
In addition to convenience, next-generation HCIs are expected to play a role in cybersecurity. Today, most users choose a short, and therefore convenient, 4-digit passcode on their smartphones and tablets, or a simple password on their laptops. This, however, can easily lead to security breaches, as short passcodes are relatively easy to decipher, especially given the smudges left on a smartphone or tablet screen. If the human body and its movements are to be used for interaction with computers, can they also be used for secure user authentication?
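To see why short passcodes are weak, it helps to count their possibilities. The sketch below (illustrative only; the `keyspace` helper is not part of any project listed here) compares the search space of a 4-digit PIN with that of a longer password:

```python
from math import log2

def keyspace(symbols: int, length: int) -> int:
    """Number of distinct codes of a given length over an alphabet of `symbols` characters."""
    return symbols ** length

# A 4-digit PIN offers only 10^4 = 10,000 possibilities, and screen
# smudges can reveal which digits were used, shrinking the search further.
pin_space = keyspace(10, 4)
print(pin_space)                              # 10000
print(f"{log2(pin_space):.1f} bits")          # 13.3 bits of entropy

# Even a modest 8-character lowercase password is vastly larger:
pwd_space = keyspace(26, 8)
print(f"{log2(pwd_space):.1f} bits")          # 37.6 bits of entropy
```

This gap motivates authentication factors, such as body or hand gestures, that are harder to observe and replay than a few taps on a screen.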
Current research in the VIP group aims to answer these questions. Please see the list of projects and a sample video below for details.
- GestureMouse: A gesture-based human-computer interface with Kinect
- Virtual Keyboard: Tactile alphanumeric data entry using Kinect
- BodyLogin: Hybrid user authentication using full-body gestures
- HandLogin: Hybrid user authentication using hand gestures
- DeepLogin: Deep neural network-based user authentication using full-body or hand gestures
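A core building block in a gesture-based pointer such as GestureMouse is mapping a tracked hand position to a screen location. The sketch below is a hypothetical illustration, not code from any of the projects above: it assumes the sensor (e.g., a Kinect skeletal tracker) reports the hand position in meters, and maps a rectangular "interaction box" in front of the user onto the display, clamping at the edges.

```python
def hand_to_cursor(hand_x: float, hand_y: float,
                   box: tuple[float, float, float, float],
                   screen_w: int = 1920, screen_h: int = 1080) -> tuple[int, int]:
    """Map a hand position (sensor coordinates, e.g. meters) inside a
    rectangular interaction box (left, top, right, bottom) to screen pixels."""
    left, top, right, bottom = box
    # Normalize to [0, 1] within the box, clamping so the cursor
    # stays on screen when the hand leaves the box.
    nx = min(max((hand_x - left) / (right - left), 0.0), 1.0)
    ny = min(max((hand_y - top) / (bottom - top), 0.0), 1.0)
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))

# A hand at the center of a 1 m x 0.6 m box maps to the screen center.
print(hand_to_cursor(0.0, 0.0, box=(-0.5, -0.3, 0.5, 0.3)))  # (959, 539)
```

In practice such a mapping would be combined with temporal smoothing and a click gesture (e.g., a push or grab), but the box-to-screen projection above captures the basic idea.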