A system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time has been developed by researchers at NASA's Jet Propulsion Laboratory in Pasadena, California.
The system illuminates the eye with a low-power infrared LED, acquires video images of the pupil, iris, and cornea in the reflected infrared light, digitises the images, and processes the digital image data to determine the direction of gaze from the centroids of the pupil and of the corneal reflection in the images.
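The article does not publish JPL's algorithm, but the pupil-centre/corneal-reflection approach it describes can be sketched roughly as follows. This is a minimal illustration in Python; the thresholds, calibration step, and function names are assumptions for the sketch, not the JPL implementation.

    # Illustrative sketch of centroid-based gaze estimation from one IR eye image.
    # Thresholds and names are assumptions, not the JPL design.
    import numpy as np

    def centroid(mask):
        """Return the (row, col) centroid of the True pixels in a boolean mask."""
        rows, cols = np.nonzero(mask)
        if rows.size == 0:
            return None
        return rows.mean(), cols.mean()

    def gaze_vector(ir_frame, pupil_thresh=40, glint_thresh=220):
        """Estimate gaze from an 8-bit IR frame.

        Under infrared illumination the pupil appears as a dark region and the
        corneal reflection of the LED (the 'glint') as a small bright spot, so
        simple intensity thresholds separate the two.
        """
        pupil = centroid(ir_frame < pupil_thresh)    # dark pupil region
        glint = centroid(ir_frame > glint_thresh)    # bright corneal reflection
        if pupil is None or glint is None:
            return None
        # The pupil-to-glint offset varies approximately with gaze angle and is
        # largely insensitive to small head translations.
        return (pupil[0] - glint[0], pupil[1] - glint[1])

    # Example with a synthetic frame: mid-grey background, dark pupil, bright glint.
    frame = np.full((120, 160), 128, dtype=np.uint8)
    frame[50:70, 70:90] = 10      # pupil
    frame[58:61, 78:81] = 255     # glint
    print(gaze_vector(frame))

In practice the offset would be mapped to screen coordinates through a short per-user calibration, but the core measurement is the pair of centroids shown here.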
Compared with existing commercial systems, the NASA system is claimed to operate at much higher speed, making it well suited for applications that involve human-computer interaction, including typing, computer command and control by handicapped individuals, and eye-based diagnosis of physiological disorders that affect gaze responses.
Systems that rely on standard video cameras are limited to slow, full-frame operation in which the burden of processing the full-frame image data is placed on the host computer. In the JPL system, most control functions and processing of image data are performed by firmware on an onboard field-programmable gate array (FPGA).
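To illustrate that division of labour: if the FPGA reduces each frame to a handful of centroid coordinates, the host's remaining work is a lightweight calibration mapping rather than image processing. The sketch below assumes a serial interface, a comma-separated line format, and the calibration constants shown; none of these details come from the published description.

    # Hypothetical host-side consumer of FPGA output. Port name, baud rate,
    # line format and calibration constants are assumptions for illustration.
    import serial  # pyserial

    CAL_GAIN_X, CAL_GAIN_Y = 35.0, 28.0    # screen pixels per unit of offset (assumed)
    CAL_OFFSET_X, CAL_OFFSET_Y = 640, 400  # screen centre in pixels (assumed)

    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                pupil_x, pupil_y, glint_x, glint_y = map(float, line.split(","))
            except ValueError:
                continue  # skip malformed records
            # Map the pupil-to-glint offset to an on-screen point of gaze.
            gaze_x = CAL_OFFSET_X + CAL_GAIN_X * (pupil_x - glint_x)
            gaze_y = CAL_OFFSET_Y + CAL_GAIN_Y * (pupil_y - glint_y)
            print(f"gaze at ({gaze_x:.0f}, {gaze_y:.0f})")

Because the host never touches raw frames, the same loop could run on a modest embedded processor, which is what makes the mouse-like peripheral described below plausible.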
The architecture of the system could be used in an affordable, portable, standalone computer-peripheral unit similar to an optical mouse.