Toyota, manufacturer of the popular Yaris, Prius and Verso, plans to install mind-reading technology in its vehicles by 2010 so that key mental states of the driver, such as drowsiness, anger or distraction, can be detected and acted upon.
Drawing on work at Cambridge University's Computer Laboratory, the technology identifies 24 feature points on the face and tracks them in real time. Movement, shape and skin colour are then analysed to identify gestures such as a smile, a frown or a raised eyebrow. Taken cumulatively, these gestures allow the driver's mental state to be predicted.
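The cumulative approach described above can be sketched very loosely in code. The gesture names, the mapping to mental states and the simple vote-counting below are all hypothetical illustrations, not the Cambridge system's actual model, which the article does not detail:

```python
# A minimal sketch of cumulative gesture-to-state inference.
# All gesture and state names here are invented for illustration.
from collections import Counter

# Hypothetical mapping from detected facial gestures to the mental
# states they most often accompany.
GESTURE_TO_STATES = {
    "eyelid_droop": ["drowsiness"],
    "frown": ["anger"],
    "head_turn": ["distraction"],
}

def infer_mental_state(gesture_stream):
    """Tally evidence across a sequence of frames and return the
    state with the most cumulative support."""
    votes = Counter()
    for gesture in gesture_stream:
        for state in GESTURE_TO_STATES.get(gesture, []):
            votes[state] += 1
    return votes.most_common(1)[0][0] if votes else "neutral"

# A run of drooping eyelids outweighs a single head turn.
frames = ["eyelid_droop", "head_turn", "eyelid_droop", "eyelid_droop"]
print(infer_mental_state(frames))  # drowsiness
```

The point of accumulating over many frames, rather than classifying each frame in isolation, is that a single expression is noisy; a sustained pattern is far stronger evidence.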
Dr Rana el Kaliouby of project partner MIT's Media Laboratory, said: 'My specific work with Cambridge was to develop face spatial analysis that analyses in real-time to associate different facial combinations and configurations to establish relevant underlying states.'
The key mental states the manufacturer is concerned with are drowsiness and cognitive distraction, which can take four forms: being lost; driving on a monotonous road; auditory distraction, such as mobile phone use; and visual distraction, as when the driver turns to talk to a passenger.
Crucially, the vehicle's response to the various mental states is still being considered, but the theory is that the car will interpret these non-verbal cues and tailor its response accordingly. It is important for the driver, said Kaliouby, that the response is suggestive rather than instructive.
Realistic traffic behaviour
According to Kaliouby, Cambridge is due to start the second phase of testing at the Transport Research Laboratory (TRL) in Berkshire. The six-month study, sponsored by Toyota, which she said had been impressed by the initial tests, will use a simulator to analyse more realistic traffic behaviour rather than using image libraries to train the computer. Following this, a long-term research project with the car maker is planned, in which the team hopes to combine facial-recognition technology with physiological sensors, such as skin sensors and heart-rate monitors, to gain a greater understanding of driver emotion.
Previous programs have detected six basic emotional states - happiness, sadness, anger, fear, surprise and disgust. But the one developed at Cambridge recognises complex states that, although appearing more frequently, are harder to detect as they are conveyed in a sequence of movements rather than a single expression. Most other systems assume a direct mapping between facial expressions and emotion, whereas Cambridge's interprets the facial and head gestures in the context of the person's most recent mental state. This means that the same facial expression may imply different mental states according to the context.
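The context-dependence described above can be illustrated with a toy lookup: the same gesture maps to different states depending on the previously inferred state. The state names, the gesture and the table entries below are hypothetical examples, not the actual Cambridge model:

```python
# A minimal sketch of context-dependent gesture interpretation.
# All state and gesture names are invented for illustration.

# Hypothetical table: (previous mental state, gesture) -> new state.
CONTEXT_MODEL = {
    ("concentrating", "raised_eyebrow"): "interested",
    ("confused", "raised_eyebrow"): "unsure",
}

def interpret(previous_state, gesture):
    """Interpret a gesture in the context of the most recent state;
    keep the previous state if the gesture carries no new evidence."""
    return CONTEXT_MODEL.get((previous_state, gesture), previous_state)

# The same raised eyebrow implies different states in different contexts.
print(interpret("concentrating", "raised_eyebrow"))  # interested
print(interpret("confused", "raised_eyebrow"))       # unsure
```

This is the contrast with the direct-mapping systems the article mentions, which would always assign a raised eyebrow the same meaning regardless of what came before.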
The technology has also interested global oil and gas company Schlumberger, which has approached MIT about incorporating it into its vehicle fleet. And it isn't just in transport where the technology is relevant: MIT has been working closely with an autism day centre in Rhode Island to see if the technique can improve human-to-human interaction.