By attaching a microphone to a touchscreen, the CMU scientists are said to have shown that they can tell the difference between taps made with the tip of a finger, the pad of a finger, a fingernail and a knuckle.
According to a statement, this technology, dubbed TapSense, enables richer touchscreen interactions.
While typing on a virtual keyboard, users might capitalise letters by tapping with a fingernail instead of a fingertip, or switch to numerals by using the pad of a finger, rather than toggling to a different set of keys.
Another possible use would be a painting app that uses a variety of tapping modes and finger motions to control a palette of colours or to switch between drawing and erasing without having to press buttons.
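To illustrate the idea, the sketch below shows how an app might route input differently depending on which part of the finger was detected. It is purely hypothetical: the TapType names and handler function are illustrative and are not part of the TapSense system described here.

```python
# Hypothetical sketch: routing touchscreen input by detected tap type.
# TapType and handle_keyboard_tap are illustrative names, not TapSense APIs.
from enum import Enum, auto


class TapType(Enum):
    TIP = auto()       # fingertip
    PAD = auto()       # pad of the finger
    NAIL = auto()      # fingernail
    KNUCKLE = auto()   # knuckle


def handle_keyboard_tap(key: str, tap: TapType) -> str:
    """Return the character a virtual keyboard might emit for a given tap type."""
    if tap is TapType.NAIL:
        return key.upper()                         # fingernail capitalises the letter
    if tap is TapType.PAD:
        return {"a": "1", "s": "2"}.get(key, key)  # pad could switch to numerals
    return key                                     # default: plain lower-case letter


if __name__ == "__main__":
    print(handle_keyboard_tap("a", TapType.TIP))   # -> 'a'
    print(handle_keyboard_tap("a", TapType.NAIL))  # -> 'A'
    print(handle_keyboard_tap("a", TapType.PAD))   # -> '1'
```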
‘TapSense basically doubles the input bandwidth for a touchscreen,’ said Chris Harrison, a PhD student in CMU’s Human-Computer Interaction Institute (HCII). ‘This is particularly important for smaller touchscreens, where screen real estate is limited. If we can remove mode buttons from the screen, we can make room for more content or can make the remaining buttons larger.’
TapSense was developed by Harrison, fellow PhD student Julia Schwarz and Scott Hudson, a professor in the HCII.
‘TapSense can tell the difference between different parts of the finger by classifying the sounds they make when they strike the touchscreen,’ Schwarz said. An inexpensive microphone could be readily attached to a touchscreen for this purpose.
The researchers found that their proof-of-concept system was able to distinguish between the four types of finger inputs with 95 per cent accuracy and could distinguish between a pen and a finger with 99 per cent accuracy.
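As a rough illustration of the general approach, the sketch below classifies short audio clips of taps using coarse spectral features and a support-vector machine. It assumes the clips are available as NumPy arrays and is a generic pipeline, not CMU's actual TapSense classifier; the accuracy figures above are the researchers' reported results, not something this sketch would reproduce.

```python
# Minimal sketch of sound-based tap classification (generic pipeline,
# not the TapSense implementation). Assumes short recordings of each
# strike are available as 1-D NumPy arrays.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split


def spectral_features(clip: np.ndarray, n_bins: int = 32) -> np.ndarray:
    """Summarise a short tap recording as a coarse magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(clip))
    # Average the spectrum into a fixed number of bins so every clip
    # yields a feature vector of the same length.
    bins = np.array_split(spectrum, n_bins)
    return np.array([b.mean() for b in bins])


def train_tap_classifier(clips, labels):
    """Fit an SVM on spectral features; labels are e.g. 'tip', 'pad', 'nail', 'knuckle'."""
    X = np.vstack([spectral_features(c) for c in clips])
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.25, random_state=0
    )
    clf = SVC(kernel="rbf")
    clf.fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
    return clf


if __name__ == "__main__":
    # Synthetic demo data: each tap type is modelled as a tone of a
    # different dominant frequency plus noise (purely for illustration).
    rng = np.random.default_rng(0)
    freqs = {"tip": 400, "pad": 200, "nail": 900, "knuckle": 120}
    clips, labels = [], []
    for label, f in freqs.items():
        for _ in range(40):
            t = np.arange(1024) / 44100.0
            clips.append(np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(1024))
            labels.append(label)
    train_tap_classifier(clips, labels)
```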