Next-Generation Human-Computer Interfaces (HCIs)

Virtual Keyboard developed by MS student Feitong Yi using a Kinect camera.

The keyboard and mouse have served as the computer interface of choice for decades. In some applications they may be difficult to replace, for example in document typesetting or graphic design, but in others their usefulness is limited. Consider a wall-size information display in an emergency command center or scientific visualization facility. Effective interaction with such a large display requires proximity, which largely precludes the use of a keyboard and mouse. Imagine, however, the convenience of interacting with the displayed content by merely waving a hand.
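As a rough illustration of how hand-waving interaction can drive such a display, a common approach is to map the tracked hand position to screen coordinates through a smaller "interaction box," so the user does not have to sweep a full arm span to reach the screen corners. The sketch below is hypothetical: the function name, the box bounds, and the assumption of normalized (0..1) hand coordinates from a skeleton tracker are illustrative, not taken from any of the projects described here.

```python
def hand_to_cursor(hand_x, hand_y, screen_w=1920, screen_h=1080,
                   box=(0.2, 0.2, 0.8, 0.8)):
    """Map a normalized hand position (0..1, e.g. from a skeleton
    tracker) to screen pixels via an interaction box, so small
    hand motions cover the whole display."""
    x0, y0, x1, y1 = box
    # Clamp the hand position to the box, then rescale to [0, 1].
    u = min(max((hand_x - x0) / (x1 - x0), 0.0), 1.0)
    v = min(max((hand_y - y0) / (y1 - y0), 0.0), 1.0)
    # Scale to pixel coordinates on the target display.
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))
```

Positions outside the box clamp to the screen edges, which keeps the cursor stable when the hand drifts out of the tracked region.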

In addition to convenience, next-generation HCIs are expected to play a role in cybersecurity. Today, most users choose a short, 4-digit passcode on their smartphones or a simple password on their laptops. This, however, can easily lead to a security breach, as short passcodes are relatively easy to guess, especially given smudges left on the smartphone screen. If the human body and its movements are to be used for interaction with computers, can they also be used for secure user authentication?
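A back-of-the-envelope calculation shows why smudges matter so much. The digits "1739" below are just an example code; the sketch assumes the attacker can tell from the smudge pattern which four distinct digits were pressed, but not in what order.

```python
from itertools import permutations

# Full search space for a 4-digit passcode: 10 choices per digit.
full_space = 10 ** 4  # 10,000 possible codes

# If screen smudges reveal the four distinct digits used,
# the attacker only needs to try their orderings.
smudge_space = len(set(permutations("1739")))  # 4! = 24 orderings

print(full_space, smudge_space)
```

Smudge information shrinks the search space from 10,000 candidates to just 24, which a patient attacker can exhaust by hand.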

Current research in the VIP lab aims to answer such questions through several projects:

  • GestureMouse: A gesture-based human-computer interface with Kinect
  • Virtual Keyboard: Tactile alphanumeric data entry using Kinect
  • BodyLogin: Hybrid user authentication using full-body gestures
  • HandLogin: Hybrid user authentication using hand gestures
  • DeepLogin: Deep-neural-network-based user authentication using full-body or hand gestures