The EyeTracking Blog

Written by: EyeTracking Inc.
10/15/2012 2:05 PM

Gaze-controlled systems have been in the news quite a bit lately. Fujitsu made headlines at CEATEC (the Consumer Electronics Show of Japan) last week when they demoed a prototype tablet that uses eye movements for navigation. In September, CNN Tech ran a story about a $30 pair of eye tracking glasses that "opens the door to a new era of hands-free computers, allowing us to use them without a mouse, keyboard or touch screen." Such innovations are certainly impressive, but before we all throw away our antiquated hand-controlled devices and start practicing eye-clicks, let's get some perspective on this application.

History: The technology to control systems with our eyes has been around for about three decades. Since the early 1980s, disabled users have benefited greatly from gaze-controlled systems as a means of clicking and typing. As eye tracking has improved, these systems have grown more accurate, less invasive, easier to calibrate and more broadly applied. It's been a life-changing advancement for users with cerebral palsy, spinal injuries, Parkinson's disease, muscular dystrophy and a variety of other disabilities.

So why hasn't gaze control been implemented in all computing platforms? Well, one reason is that the technology is not small enough, fast enough or cheap enough for a standard computer. Evidently, that barrier is on the verge of being eliminated. There is, however, another reason that you navigated to this blog using your fingers instead of your eyeballs - it's easier that way. In a digital environment, clicking, swiping and typing are currently the best ways to get from pixel A to pixel B. Why complicate things? Our hands are well-suited to fine motor tasks. Although the idea of controlling the world using only your eyes may appeal to your inner Jedi, it really isn't the most practical option for able-bodied users.

Obstacles: King Midas thought it would be great if everything he touched turned to gold, but that didn't work out so well for him. Eye tracking researchers have adopted this legend to describe a fundamental obstacle of gaze-controlled systems: the Midas Touch Problem. Here it is in a nutshell: the eye has evolved over millions of years to view the environment, not to manipulate it. In a gaze-controlled interface, the eye needs to do both of those things. Thus, the system is required to distinguish between (1) gaze intended to gather visual information and (2) gaze intended to activate a specific command. Otherwise, the user finds that everywhere he or she looks - voluntarily or involuntarily - a new function is activated (just like King Midas - get it?).

To combat the Midas Touch Problem, many gaze-controlled systems use dwell time and blinks as clicking modalities, but that doesn't really solve the issue. How many times did you blink while reading this paragraph? How many times did you stare at one part of the screen for more than 500 milliseconds? You probably don't know, because these actions often occur unconsciously. So now King Midas has some gloves, but they have a few pretty big holes in them.

And Midas Touch isn't the only problem. You also have to worry about head box constraints, calibration drift and the mechanical issues inherent in all practical applications of eye tracking. Plus, in the computer age our visual system is already over-strained. How will the eye respond to repetitive selection tasks? How long until we have a disorder called Pupil Tunnel Syndrome? All of these factors must be considered when evaluating this technology.
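To see why dwell-time clicking leaks, consider a minimal sketch of how such a selector might work. This is a hypothetical illustration, not any vendor's actual algorithm: we assume a stream of (timestamp, x, y) gaze samples and invented threshold values, and we fire a "click" whenever the gaze lingers inside a small radius long enough.

```python
import math

# Hypothetical parameters -- real systems tune these empirically.
DWELL_MS = 500   # how long the gaze must linger to count as a "click"
RADIUS_PX = 40   # how far the gaze may wander within one fixation

def detect_dwell_clicks(samples):
    """Scan (timestamp_ms, x, y) gaze samples and return points where
    the gaze lingered within RADIUS_PX for at least DWELL_MS.
    Note the Midas Touch flaw: ANY steady look, intentional or not,
    fires a click."""
    clicks = []
    anchor = None  # (t, x, y) where the current fixation began
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) <= RADIUS_PX:
            if t - t0 >= DWELL_MS:
                clicks.append((x0, y0))
                anchor = None  # reset so one long stare fires only once
        else:
            anchor = (t, x, y)  # gaze moved on: start a new fixation

    return clicks

# A gaze sweeping across a line of text never dwells, so no clicks;
# an absent-minded 600 ms stare at one spot fires a click anyway.
sweep = [(i * 50, 100 + i * 30, 200) for i in range(12)]
stare = [(i * 100, 400, 300) for i in range(7)]
print(detect_dwell_clicks(sweep))  # []
print(detect_dwell_clicks(stare))  # [(400, 300)]
```

The second case is the Midas Touch Problem in miniature: the detector has no way to know whether that 600 ms stare was a deliberate selection or an unconscious pause, so it clicks either way.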

Conclusions: Gaze-controlled systems provide a wonderful benefit to the disabled. They offer people who would otherwise be excluded from these activities the opportunity to read, write, communicate and use the internet. The smaller/faster/cheaper gaze-controlled systems in today's news definitely represent important breakthroughs for eye tracking as an assistive technology. That said, it's hard to imagine gaze-controlled systems replacing hand-controlled systems for the population at large. Maybe there is a hybrid arrangement (hands + eyes) that could improve digital interactions, but the eye alone does not seem to be the best option. There are just too many complications (accidental clicks, slower dwell-based navigation, accuracy issues, camera problems, eye stress, etc.). If we are indeed on the precipice of "a new era of hands-free computing," we might end up learning the same lesson that King Midas did - be careful what you wish for.