Scientists at the Fraunhofer Institute for Applied Information Technology have developed a system that detects hand and finger positions in real time and translates them into corresponding interaction commands.
The prototype system tracks a user’s hand in front of a 3D camera that works on the time-of-flight principle. For each pixel, the camera measures the time light takes to travel to the tracked object and back, from which the distance between the camera and the object can be calculated.
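The distance calculation itself is simple: because the measured time covers the round trip, the depth is half the light travel distance. As a minimal sketch, assuming per-pixel round-trip times are available as an array (the names and values below are illustrative, not details of the Fraunhofer system):

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def depth_from_round_trip(round_trip_times: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (seconds) to distances (metres).

    Light travels to the object and back, so the one-way distance is
    half the round-trip distance: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_times / 2.0


# Example: a 2x2 "image" of round-trip times around 6.67 ns,
# corresponding to objects roughly one metre from the camera.
times = np.array([[6.67e-9, 6.70e-9],
                  [6.65e-9, 6.72e-9]])
print(depth_from_round_trip(times))  # per-pixel distances in metres, ~1.0
```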
Georg Hackenberg, who developed the system as part of his Master’s thesis, said: ‘A special image analysis algorithm was developed that filters out the positions of the hands and fingers. This is achieved in real time through intelligent filtering of the incoming data.’
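The article does not describe Hackenberg’s algorithm in detail, but one common way to filter hand positions out of depth data is to treat the nearest large connected region in the depth image as the hand. The sketch below illustrates that generic idea only; the depth-band heuristic and function names are assumptions, not the institute’s method:

```python
import numpy as np
from scipy import ndimage


def nearest_blob_mask(depth: np.ndarray, band: float = 0.10) -> np.ndarray:
    """Return a boolean mask of pixels within `band` metres of the
    closest valid depth reading (a crude stand-in for hand detection)."""
    valid = depth > 0  # zero often marks missing time-of-flight readings
    if not valid.any():
        return valid
    nearest = depth[valid].min()
    candidates = valid & (depth < nearest + band)
    # Keep only the largest connected component of the candidate pixels,
    # discarding isolated noise at a similar depth.
    labels, count = ndimage.label(candidates)
    if count == 0:
        return candidates
    sizes = ndimage.sum(candidates, labels, range(1, count + 1))
    return labels == (int(np.argmax(sizes)) + 1)
```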
Alex Deeg, a spokesman for the Fraunhofer Institute for Applied Information Technology, said there are a number of other systems on the market that have some, but not all, of the same properties.
‘The Microsoft Surface table allows people to interact with objects and content; however, this is only two-dimensional,’ he said.
Deeg added that Microsoft’s upcoming controller-less gaming technology, code-named Project Natal, detects only full-body movements and does not track finger positions.
‘Therefore, it does not allow for such fine-grained interactions as the system that has been developed here,’ he said. ‘It should also be noted that while our system only supports hand-gesture-based interaction, it is entirely possible that it could be extended to cover full-body interaction.’