A German research project has developed a system that could allow hands-free devices to be controlled more accurately by users' eye movements.
The Eye-Controlled Interaction (EYCIN) system, developed at the Fraunhofer Institute for Industrial Engineering in Stuttgart, could lead to software applications for disabled people, or for professionals such as maintenance technicians or chefs who need both hands free to carry out their work while accessing information.
Eye tracking systems for operating hands-free devices have been available for some time, but integrating them with a computer graphical user interface (GUI) has proved problematic. Involuntary eye movements can be translated into mouse pointer movements, while a fraction of a second too long spent reading the label of an on-screen button can be interpreted as a mouse click.
But by combining engineering and psychological knowledge, the Fraunhofer Institute researchers have produced software that accurately interprets pupil movements through an easy-to-use, Windows-style interface.
EYCIN consists of a monitor linked to a camera that tracks the eye, using optical recognition software to detect the pupil. An on-screen pointer follows eye movements and when the user's gaze rests on a designated area for half a second, EYCIN interprets this as a mouse click.
Dr Fabian Hermann, a human-computer interface researcher with a background in psychology, is a usability engineer on the project. 'The project originated from a German industrial company that funded research into a visor and controls to operate a GUI with the eyes,' he said. 'It has since been developed into other areas.'
'Although the first papers we produced concerned hands-free applications for maintenance engineers, EYCIN could also be used for interaction in the home,' Hermann added.
'One future application would be controlling multimedia home entertainment, or a room's lights. You could even control a kitchen display showing recipes or other information when your hands are busy or covered in ingredients.'
EYCIN computes where the eye is pointing by triangulation and uses the co-ordinates as mouse input on the screen. The user selects a control by looking at a functionally enabled on-screen button area for half a second. The button then changes colour twice before 'clicking', showing the user that a selection has been made.
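The selection mechanism amounts to a dwell-time click. Below is a minimal sketch of that idea in Python; it is not the Fraunhofer code, and the Button class, gaze_samples source and on_click callback are hypothetical. Only the half-second dwell threshold is taken from the article.

# Minimal sketch of dwell-time selection (hypothetical names; not the EYCIN implementation).
from dataclasses import dataclass

DWELL_SECONDS = 0.5  # gaze must rest on a button this long to count as a click

@dataclass
class Button:
    x: int
    y: int
    width: int
    height: int
    on_click: callable

    def contains(self, gx: float, gy: float) -> bool:
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

def run_dwell_selection(gaze_samples, buttons):
    """gaze_samples yields (timestamp, x, y) tuples from the eye tracker."""
    dwell_start = None
    current = None
    for t, gx, gy in gaze_samples:
        target = next((b for b in buttons if b.contains(gx, gy)), None)
        if target is None or target is not current:
            current, dwell_start = target, t      # gaze moved: restart the timer
        elif t - dwell_start >= DWELL_SECONDS:
            current.on_click()                    # treated as a mouse click
            dwell_start = t                       # prevent an immediate repeat click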
One problem the EYCIN team encountered was that standard Windows interfaces tend to have the text for a hotlink or button on the selection area, so if the user spends a little too long reading that text, it is automatically selected.
'We call it the Midas problem,' said Hermann. 'With the mythical Greek king, everything he touched turned to gold. With early versions of our system, everything you looked at was selected.
'We needed to develop guidelines for GUI design to avoid such errors: for example, do not produce click areas with information on them. By separating information, such as labels, from selection areas, you avoid this problem.'
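To illustrate that guideline, a control might be built from two regions: a label area that can be read freely, and a separate, text-free area that actually accumulates dwell time. The names and dimensions below are illustrative, not taken from the EYCIN design rules.

# Hypothetical sketch of separating labels from selection areas.
def make_control(x, y, text):
    label_area  = {"x": x, "y": y, "w": 160, "h": 40, "text": text}              # information only
    select_area = {"x": x + 170, "y": y, "w": 60, "h": 40, "selectable": True}   # dwell here to activate
    return label_area, select_area

def accumulates_dwell(region, gx, gy):
    # Only gaze inside a selectable, text-free region counts towards a click.
    inside = (region["x"] <= gx <= region["x"] + region["w"]
              and region["y"] <= gy <= region["y"] + region["h"])
    return inside and region.get("selectable", False)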
The entire screen layout had to be considered. Eye tracking is more accurate at the centre of the screen than towards the edges, so 'we have a guideline to design screens with the smaller buttons in the centre,' said Hermann.
Another interface problem the researchers had to address was microsaccades: miniature, jerk-like movements made by the eye. They used a statistical averaging filter, which buffers co-ordinates to give smoother pointer movement. 'We tried several thresholds and latencies and settled on the correct one by statistical aggregation,' said Hermann.
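One plausible reading of such a filter is a moving average over a short buffer of recent gaze samples. The sketch below assumes raw (x, y) co-ordinates arriving at a fixed rate; the window length is an illustrative choice, not the threshold the team settled on.

# Sketch of a moving-average smoothing filter for gaze co-ordinates (illustrative window size).
from collections import deque

class GazeSmoother:
    def __init__(self, window: int = 10):
        self.xs = deque(maxlen=window)
        self.ys = deque(maxlen=window)

    def update(self, x: float, y: float):
        """Add the latest raw sample and return the smoothed pointer position."""
        self.xs.append(x)
        self.ys.append(y)
        return sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys)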
A lack of available hardware has limited EYCIN's commercial applications. 'We use a standard commercial eye-tracking system with an infrared camera limited to a range of 1-1.5 metres,' said Hermann. 'But we hope to get the camera tracking a wider field for domestic use.'
The team investigated the possibility of combining the eye-operated control with speech recognition. 'You could look at an object and say "Open",' said Hermann. 'It's interesting, but there are problems with latency [the delay between computing visual and audio information], which introduces too many errors.'
Hermann said a marketable version of EYCIN is one or two years away. Applications for the final version will depend on how much funding the team receives, but he believes EYCIN could have important uses for people with paraplegia and other conditions that limit computer use.
'One major European project is focussing on technology interfaces for disabled people. We hope in the course of this project to bring in eye-based interaction,' he said.