A team at Bristol University is working to develop technology that would enable disabled people to control a wheelchair using tongue movements.
A US company called Think-A-Move (TAM) has made some headway in this area, but its former director of engineering, Dr Ravi Vaidyanathan, now a biodynamics lecturer at Bristol, hopes to advance the technology by improving signal-processing algorithms, making it easier to use.
Tongue motion is often one of the only movements available to seriously disabled people, such as those who have suffered spinal injuries or severe strokes.
In existing TAM technology, a small microphone inserted into the ear picks up vibrations created by air-pressure changes caused by tongue movement. These signals are processed and converted into computer commands.
'The technology is not quite ready for clinical use without a lot of work with the patient,' said Vaidyanathan. 'The major issues are that, first, it has to be more universally accessible. Ideally, it should be as easy to calibrate as speech recognition, for example, so that you can use it without having a person sitting there programming the signals.
'Second, there has got to be something more interactive in terms of teaching you how to use it. Third, we will probably develop a lot of new algorithms to understand the signals.'
To customise the existing technology, Vaidyanathan said a scientist would have to sit down with a person for a few hours while they practised making signals, before going away to program the software. He hopes to develop self-tuning calibration algorithms that will be able to distinguish tongue movement signals from other sounds heard in the ear.
'A huge part of the work is going to focus on algorithms for disturbance rejection,' he said. 'Your ear canal is a very noise-ridden environment. We need to separate what a signal from your tongue looks like as opposed to other signals in the body. Ideally, you could just put the earpiece in one ear, maybe go on to the web, download the program, hit a few buttons and it will figure out the difference between your signals.'
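As a rough illustration of the disturbance-rejection idea Vaidyanathan describes, a detector might flag candidate tongue events by how far their short-time energy stands out from the ear canal's background noise. This is a minimal sketch; the function name, threshold and noise estimate are assumptions for illustration, not details of the TAM or Bristol algorithms.

```python
import numpy as np

def detect_candidate_events(signal, sample_rate, window_s=0.2, energy_ratio=4.0):
    """Flag 0.2 s windows whose energy stands out from background noise.

    Hypothetical sketch: a real system would need far more robust
    statistics to separate tongue movements from heartbeat, speech
    and other in-ear sounds.
    """
    win = int(window_s * sample_rate)
    # Per-window energies over non-overlapping windows of the recording
    # (signal is assumed to be a float array of audio samples).
    energies = np.array([
        np.mean(signal[i:i + win] ** 2)
        for i in range(0, len(signal) - win, win)
    ])
    # Median of window energies is a robust estimate of the noise floor,
    # since a few loud tongue events barely shift it.
    noise_floor = np.median(energies)
    # A window is a candidate tongue event if its energy clearly
    # exceeds the background level.
    return [i for i, e in enumerate(energies) if e > energy_ratio * noise_floor]
```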
Tongue signals have been found to last 0.2 seconds. Working with Prof Lalit Gupta of Southern Illinois University Carbondale, Vaidyanathan has developed algorithms that identify the control tongue movements by examining 0.2-second time windows and picking out the specific instants at which they differ from other tongue movements.
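A minimal sketch of the windowed-classification idea: each 0.2-second window is compared against stored examples of the control movements and assigned to the nearest one. The template store, normalisation and Euclidean distance measure here are illustrative assumptions, not the published algorithms.

```python
import numpy as np

def classify_window(window, templates):
    """Assign a 0.2 s signal window to the nearest stored template.

    `templates` maps a move name (e.g. 'left-flick') to a reference
    waveform of the same length as `window`. Nearest-template matching
    by Euclidean distance is a stand-in for the real classifiers.
    """
    # Normalise so that overall loudness does not dominate the match.
    w = (window - window.mean()) / (window.std() + 1e-9)
    best_move, best_dist = None, np.inf
    for move, ref in templates.items():
        r = (ref - ref.mean()) / (ref.std() + 1e-9)
        dist = np.linalg.norm(w - r)
        if dist < best_dist:
            best_move, best_dist = move, dist
    return best_move
```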
Four distinctive movements have been found to work for most people: touching the tongue to the lower left side of the mouth and gently flicking it up; touching the lower right side of the mouth and flicking it up; touching the centre bottom of the mouth and flicking it up; and then flicking the tongue gently across the top palate.
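For a wheelchair, one can imagine binding the four movements to basic driving commands, for example as below; the article does not specify how the movements are actually assigned, so this mapping is purely hypothetical.

```python
# Hypothetical mapping of the four reliable tongue movements to
# wheelchair commands; the assignments are assumptions for illustration.
COMMANDS = {
    'left-flick':   'turn_left',
    'right-flick':  'turn_right',
    'centre-flick': 'forward',
    'palate-sweep': 'stop',
}
```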
'Rather than come up with four more moves, which might make it confusing to learn, what we can do is repeat any one of the four in quick succession, giving you eight moves,' said Vaidyanathan. 'Any two of those eight together will provide 64 possible combinations. At that point, we have subsumed the alphabet and you could even type.'
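The arithmetic in the quote works as follows: doubling each of the four moves gives eight distinguishable gestures, and ordered pairs of those eight give 8 × 8 = 64 codes, more than enough to cover a 26-letter alphabet. A hypothetical encoding (the gesture names and letter assignments are assumptions, not part of the described system) might look like this:

```python
import itertools
import string

MOVES = ['left-flick', 'right-flick', 'centre-flick', 'palate-sweep']
# Repeating any move in quick succession doubles the vocabulary: 8 gestures.
GESTURES = MOVES + [m + '-double' for m in MOVES]

# Ordered pairs of gestures give 8 * 8 = 64 codes -- enough to cover the
# alphabet with room to spare for digits and punctuation.
PAIRS = list(itertools.product(GESTURES, repeat=2))
CODEBOOK = dict(zip(PAIRS, string.ascii_lowercase))  # first 26 pairs -> a..z

assert len(PAIRS) == 64
print(CODEBOOK[('left-flick', 'left-flick')])  # -> 'a'
```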
As well as developing mathematical algorithms, the Bristol researchers will make hardware improvements, such as a more sophisticated earpiece to get better signals, and a device that will allow for wireless communication.
The team will also explore the relative merits of generic and customised earpieces.
'Another task is to design some hardware for wireless transmission, because right now you have an earpiece with a wire, which you plug into a computer. The chief reason for this is that we might be able to differentiate signals better. If you have a wire coming out of your ear and that wire gets hit, you get interference, so it might be a cleaner signal if it were wireless.'
Vaidyanathan plans to follow up the project by adapting the system to manipulate a prosthetic arm.
'If you are trying to control a prosthetic arm using devices that are commercially available, you flex some muscles in your back,' he said. 'This causes some movement, which moves the arm.
'Maybe we will have a system to form a hybrid with an existing system — for example, a prosthetic arm that takes some electrical impulses from your shoulder to direct the arm, but lets your hand actually pick up a glass of water. To do that, you just move your tongue a little bit.'