John Elvesjö, founder and chief technology officer, Tobii Technology
Education
1997-1999 Royal Institute of Technology, Stockholm
Career
- 1998-1999 Researcher/Project Manager Institute for Surface Chemistry, Stockholm
- 2001 Founds Tobii Technology with Mårten Skogö and Henrik Eskilsson
- 1999-2003 Founder and CEO, Jenser Technology, which developed instruments and sensors to monitor surfactant levels in processes
- 2011 Board member at consumer electronics firm Mutewatch, saw first product through final development
- 2011 Board member at cloud computing firm Xcerion; sold iCloud product to Apple
- 2010-13 Advisory board member at Stockholm Innovation and Growth
- 2015 Board member at Resolution Games
- 2015 Finalist in the SME category of the EPO European Inventor Award
Mårten Skogö, cofounder and chief science officer, Tobii Technology
Education
1997-2001 Royal Institute of Technology, Stockholm
Career
- 1999 Cofounder of Jenser Technology with Elvesjö
- 2001 Cofounder of Tobii Technology with Elvesjö and Eskilsson
- 2015 Finalist in the SME category of the EPO European Inventor Award
The eyes, as the saying goes, are the windows to the soul. It’s not a statement that would stand up to any scientific scrutiny, but nonetheless studying the eyes can tell you a great deal. Notably, being able to tell exactly what somebody is looking at gives vital information about their thoughts at that particular moment.
This is part of the thinking behind the technology of eye-tracking, the technique that uses optics and processing to determine how somebody’s gaze is moving over images on a screen, or over any other part of their field of view. Its Swedish inventors, John Elvesjö and Mårten Skogö, believe it could be only a few years from replacing touchscreens, trackpads or mouse control as the main way that users navigate their computer displays; it could also help with the control of heavy machinery, and greatly assist people with disabilities that affect their hands and dexterity.
“I realised that if we could get a computer to record how my eyes were moving, it could be huge” - John Elvesjö
As with many technologies, luck played a role in the development of eye-tracking. Elvesjö, a physicist, was working on a project that involved following the movements of fruit pulp particles in water as part of an engineering physics programme at the Royal Institute of Technology in Stockholm. He noticed that a sensor designed to follow the particles was, in fact, capable of doing other things: “I accidentally turned the camera around and it picked up the orientation of my eyes. I realised that if we could get a computer to record how my eyes were moving, it could be huge,” he said.
This insight led to him quitting his course and eventually setting up a company to develop and commercialise the technology with two friends: Skogö, who is also an engineering physicist, and computer hardware specialist Henrik Eskilsson. Sixteen years after Elvesjö’s first insight, their company, Tobii Technology, holds key patents for eye-tracking and now has 570 employees, with offices in six countries.
The Tobii eye-tracking system works by combining a set of different technologies. It uses a camera to locate the eyes, then projects a pair of infrared beams onto the face, which create patterns on and around the eye. Infrared sensors in the tracking device detect the reflections of these projections, and a series of computational algorithms translates the information on the moving reflections into where the user’s eyes are focused. “We look at the face and the eyes, taking account of corneal curvature and things like that which tell us which way the eyes are pointing in relation to the position of the head, then we have a mathematical model that describes how the person’s face and eyes map onto the computer screen. It gives us that information continuously, even when the person is moving around. That gives us information on the user’s attention, and that in turn gives us insight into their intention,” Skogö said.
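Tobii has not published the details of its algorithms, but the principle Skogö describes can be illustrated with a simplified pupil-centre/corneal-reflection mapping: a short calibration on known on-screen targets fits a polynomial that turns the pupil-to-glint vector seen by the camera into a screen coordinate. The sketch below is a minimal, assumption-laden illustration, not Tobii’s method.

```python
# Illustrative sketch only, not Tobii's algorithm: a simplified
# pupil-centre/corneal-reflection gaze mapper. A second-order polynomial
# maps the pupil-minus-glint vector (camera pixels) to screen coordinates,
# fitted from a short calibration on known on-screen targets.
import numpy as np

def _features(pupil, glint):
    # Difference vector between pupil centre and IR glint reflection.
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    # Second-order terms give the mapping some tolerance to head movement.
    return np.array([1.0, dx, dy, dx * dy, dx * dx, dy * dy])

def calibrate(samples):
    """samples: list of ((pupil_xy, glint_xy), screen_xy) calibration pairs."""
    A = np.array([_features(p, g) for (p, g), _ in samples])
    S = np.array([s for _, s in samples])           # known screen positions
    coeffs, *_ = np.linalg.lstsq(A, S, rcond=None)  # least-squares fit
    return coeffs                                   # shape (6, 2)

def gaze_point(coeffs, pupil, glint):
    """Map one detected pupil/glint pair to an estimated screen coordinate."""
    return _features(pupil, glint) @ coeffs

# Hypothetical usage: calibrate on nine targets, then track continuously.
# coeffs = calibrate(calibration_samples)
# x, y = gaze_point(coeffs, pupil_px, glint_px)
```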
“It’s actually very difficult for a computer to track eye movements, and part of the reason for that is that the movements are actually very small,” he added. “Bear in mind that if you’re tracking someone looking at objects on a screen, those objects are themselves small and close together. You also need to compensate for lighting conditions, whether the observer is wearing glasses, and even if they have facial piercings that are in the detector’s field of view.”
Even though most laptops and smartphones, and even desktop monitors, are now equipped with built-in cameras, these are not sufficient; dedicated technology is needed, Elvesjö explained. “In theory you can do eye-tracking with the camera on a laptop but it’s not reliable,” he said. “It could only be relevant if a hundred people use the computer, all looking at the same thing in a controlled environment.”
The Tobii equipment is streamlined for simplicity. “We added the optical specs to design a pattern that it could pick up easily, and elaborate about what is unique about the human eye, so we pick up that and nothing else,” Elvesjö said. “The less we pick up, the easier it is to process the images. Ideally we want to do the calculation on one tiny chip.”
Skogö and Elvesjö believe that giving a device information on gaze fundamentally and instantly makes the device more intelligent. “You always look at something before you interact with it,” Skogö explained. This might allow a computer to tailor how its display is used. “For a start, it can tell when somebody is looking at it at all,” Elvesjö said. “If nobody’s there, it doesn’t need to be switched on; it can conserve energy and switch itself off.” It can also detect whether attention is being paid to any pop-ups on the screen, and automatically close any that are being ignored.
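As an illustration of the energy-saving and pop-up behaviour Elvesjö describes, a minimal sketch might track the last time any gaze was detected and test whether a pop-up’s rectangle ever received attention; the tracker callback, timeout and rectangle format here are hypothetical assumptions.

```python
# Hedged sketch of attention-based power saving and pop-up handling; the
# tracker callback, 30 s timeout and rectangle format are assumptions.
import time

IDLE_TIMEOUT_S = 30.0  # assumed: dim/switch off after 30 s with no gaze

def _inside(rect, point):
    x0, y0, x1, y1 = rect
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

class AttentionMonitor:
    def __init__(self):
        self.last_seen = time.monotonic()

    def on_gaze_sample(self, gaze):
        # gaze is None when no eyes are detected in front of the screen.
        if gaze is not None:
            self.last_seen = time.monotonic()

    def display_should_sleep(self):
        return time.monotonic() - self.last_seen > IDLE_TIMEOUT_S

    def popup_ignored(self, popup_rect, gaze_history, shown_for_s):
        # Close a pop-up only if it has been visible for a while and no
        # recorded gaze point ever fell inside its rectangle.
        looked_at = any(_inside(popup_rect, p) for p in gaze_history)
        return shown_for_s > 5.0 and not looked_at
```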
It can also help to prioritise processing power. “You can render graphics with higher resolution where you’re looking, it can refresh information more quickly where you’re paying attention, for example at stock price information; the price you’re looking at would update more often.”
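That idea, often called foveated rendering, can be sketched as choosing a resolution scale for each screen tile from its distance to the gaze point; the tile thresholds below are illustrative assumptions rather than anything Tobii has specified.

```python
# Illustrative foveated-rendering sketch: choose a resolution scale for each
# screen tile from its distance to the gaze point. Thresholds are assumptions.
def render_scale(tile_centre, gaze, fovea_px=200, mid_px=600):
    """Return a resolution multiplier for a tile given the current gaze point."""
    dx, dy = tile_centre[0] - gaze[0], tile_centre[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < fovea_px:
        return 1.0   # full resolution where the eye is focused
    if dist < mid_px:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution elsewhere; refresh less often too
```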
One of the challenges of developing the system was how vision itself works. “There’s an optical eye and a mental eye and they’re not aligned,” Elvesjö said. “If you look at something your eye actually moves around all the time; if the eye stops you stop seeing. The eye is more sensitive to changing conditions. We can’t measure the mental eye, but with a bit of understanding of the way vision works, we can derive the intention.” The gaze-tracking hardware, with the camera, IR projector and detector, also houses the processor that detects the gaze point, but the gaze engine, which relates that to what’s on the display, runs on the host computer. “That’s the part which knows whether you’re looking at a hyperlink, for example.”
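The division of labour Elvesjö describes can be illustrated with a minimal host-side sketch that hit-tests the reported gaze point against on-screen elements to decide, for example, whether the user is looking at a hyperlink; the element list and rectangles are hypothetical.

```python
# Sketch of the host-side "gaze engine" role: relate the gaze point to what
# is on the display. The UIElement list and rectangles are hypothetical; a
# real engine would query the window system or browser for live layout.
from dataclasses import dataclass

@dataclass
class UIElement:
    name: str
    kind: str    # e.g. "hyperlink", "button", "image"
    rect: tuple  # (x0, y0, x1, y1) in screen pixels

def element_under_gaze(elements, gaze):
    x, y = gaze
    for el in elements:
        x0, y0, x1, y1 = el.rect
        if x0 <= x <= x1 and y0 <= y <= y1:
            return el
    return None

# e.g. element_under_gaze(page_elements, (812, 430)) might return a
# hyperlink, letting the application react to the attention it receives.
```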
But computer displays aren’t the only target for Tobii technology. The team has mounted the technology into a wearable pair of glasses, which can be worn by shoppers or museum visitors, for example, to detect what catches their eye as they look at a display; it can also evaluate the effectiveness of the placement and design of display labels, or of signage in airports.
Another potential location for the system is in a car. Here, the approximate location of the driver’s head is always known, but gaze can be used to detect when the driver’s attention is straying or they are getting sleepy, and to check when they are looking at dashboard instruments. Defining sensitive gaze areas, for example on the windscreen, can also be used to control in-car features such as communications or entertainment. “That’s an example of applications where the gaze acts like a third hand, when both of your hands are occupied,” Elvesjö said. “We’ve done things with radiology, where the user is looking at a scan or X-ray; a surgeon might need to do that with both hands occupied, so looking at the scan can give them access to more information or call up a surgical plan. And with professional musicians, for example, it can follow their eyes down a score and turn the page when they get to the end.” Similar systems are already in use to help people with disabilities use computer systems.
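A common way to make gaze act as a “third hand” without false triggers is a dwell-time zone: an action fires only after the eyes have rested on a defined area for a moment. The sketch below assumes a hypothetical zone layout and an 800 ms dwell; it is not a Tobii API.

```python
# Hedged sketch of a dwell-activated gaze zone, e.g. a page-turn area on a
# music stand or a windscreen control region; the 800 ms dwell is an assumption.
DWELL_MS = 800

class GazeZone:
    def __init__(self, name, rect, action):
        self.name, self.rect, self.action = name, rect, action
        self.entered_at = None

    def update(self, gaze, now_ms):
        x0, y0, x1, y1 = self.rect
        inside = gaze is not None and x0 <= gaze[0] <= x1 and y0 <= gaze[1] <= y1
        if not inside:
            self.entered_at = None          # glancing past triggers nothing
            return
        if self.entered_at is None:
            self.entered_at = now_ms
        elif now_ms - self.entered_at >= DWELL_MS:
            self.action()                   # e.g. turn the page, call up a scan
            self.entered_at = None          # require leaving and re-dwelling
```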
Elvesjö thinks the system is already at a point where it could replace a touchscreen or mouse as a way of interacting with a computer. “We could steal a key on the keyboard to use as an eye confirmation button. You could look and click, look and scroll; and if you wanted to use a trackpad for gestures, for example, we can provide that gesture with precision. We can anchor a gesture by pointing with the eyes; you gesture where your hands are and that affects the display. So look at a picture and perform the pinch or expand gesture and it’ll zoom in or out.”
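A minimal sketch of that interaction model, assuming hypothetical callbacks for clicking and zooming, might route a spare “confirm” key and a trackpad pinch through the current gaze point:

```python
# Minimal sketch of "look and click" and a gaze-anchored pinch; the callbacks
# and event names are hypothetical, not an existing API.
class LookAndClick:
    def __init__(self, click_at, zoom_at):
        self.click_at = click_at   # callback: click at a screen point
        self.zoom_at = zoom_at     # callback: zoom centred on a point
        self.gaze = None

    def on_gaze(self, point):
        self.gaze = point

    def on_confirm_key(self):
        # The "stolen" keyboard key acts as the eyes' mouse button.
        if self.gaze is not None:
            self.click_at(self.gaze)

    def on_pinch(self, scale):
        # Gesture performed on a trackpad, but applied where the user looks.
        if self.gaze is not None:
            self.zoom_at(self.gaze, scale)
```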
Another possible use is in image or video compression. “If I show you a video or a photo, it can help you compress a movie; you can compress the parts where the viewer is less likely to be looking. If I show a movie to 20 people, that gives a good heat map of where attention is being placed. You compress the other bits. You can even do it in real time with gaming, with 4K screens, and wirelessly. That tells you where on the screen you need high resolution and a high refresh rate, and frees up processing power for the game itself.”
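The heat-map idea can be sketched as pooling gaze samples from many viewers into a per-block map and letting the encoder spend fewer bits on blocks nobody looks at; the block size and quantiser range below are illustrative assumptions, not a real codec integration.

```python
# Illustrative sketch of a gaze heat map guiding bit allocation: pool gaze
# samples from many viewers into a per-block map, then give cold blocks a
# coarser quantiser. Block size and quality range are assumptions.
import numpy as np

def gaze_heatmap(gaze_samples, frame_w, frame_h, block=64):
    """gaze_samples: (x, y) points pooled over viewers and frames."""
    bw, bh = frame_w // block, frame_h // block
    heat = np.zeros((bh, bw))
    for x, y in gaze_samples:
        bx = min(int(x) // block, bw - 1)
        by = min(int(y) // block, bh - 1)
        heat[by, bx] += 1
    return heat / max(heat.max(), 1)  # normalise to 0..1

def quantiser_for_block(heat_value, q_min=20, q_max=45):
    # Lower quantiser = more bits. Hot blocks get q_min, ignored blocks q_max.
    return round(q_max - heat_value * (q_max - q_min))
```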
Tobii holds 60-70 patent families describing the technology. “It’s been a huge task over the past three to five years to figure out what eye-tracking would best be used for. Now sensors are small and cheap enough that they can be used on a standard laptop or a smartphone, but there are all sorts of things you could do; we need to figure out what is actually best to do,” Elvesjö said.