Haptic technologies are advancing rapidly, in tandem with virtual reality and as standalone devices. Andrew Wade reports.
Our sense of touch is so integral to our existence that it’s difficult to imagine a world without it. Unlike vision and hearing, we can’t easily mask it with blindfolds or earplugs, or dull it as we can our sense of smell by holding our nose. Touch is an omnipresent function that completely envelops us via our skin, providing a layer of protection from our surroundings while at the same time enabling us to interact with them.
For centuries, sensory technology has focused largely on sight and hearing, the twin pillars that form the basis of communication. But the science of haptics, or kinaesthetic communication, is undergoing something of a revolution. The emergence of smartphones and proliferation of touchscreens have brought the technology into the mainstream. Now, the rising popularity of virtual reality (VR) and augmented reality (AR) is fuelling rapid advances, with sectors including healthcare and robotics discovering its potential across a range of applications.
“Our sense of touch is absolutely fundamental,” Dr Alastair Barrow, director of Generic Robotics, told the audience at a recent RAEng haptics seminar. “It’s the first sense to develop in the womb and it’s an ever-present always-on protector.”
Barrow, who has a PhD in cybernetics, co-founded Generic Robotics in 2013. He has more than a decade of experience in VR, haptics and robotics, and has collaborated on a number of medical training simulators for different branches of surgery, as well as for procedures such as catheterisation and hernia repair.
“It’s no surprise that haptics in healthcare is a huge topic,” he said. “However, the number of applications where haptics is being beneficially used in healthcare right now is very, very small.”
As Barrow’s body of work suggests, surgery simulation is one of those areas. Currently, surgeons learn predominantly through theory, observation, cadaver work and closely monitored practice on patients, with senior colleagues overseeing their work. A refined sense of touch and hand-eye coordination are obviously vital.
“When you’re practising to be a surgeon you need to develop an incredible array of abilities,” said Barrow. “You need to be able to have great academic knowledge, decision making, you need to look good in blue! One really important aspect is this close association of feedback between the sense of touch and dexterous motion…so we can use haptic devices to simulate doing procedures.”
“We’re starting to look at using actual scans of real patients in simulation, so that a surgeon can practise doing a real procedure in simulation before they do it on a real person.”
There’s a clear incentive for innovation here: the more accurate simulation can become, the more likely it is we wake up from real-life surgery. And while major strides are being made, haptic surgical simulation is not yet commonplace. However, according to Chris Scattergood, co-founder of Fundamental VR, many surgeons are improving their skills using more orthodox technology.
“Surprisingly, a lot of surgeons will actually refine their skills using YouTube,” he said. “If you go to YouTube there’s about 170,000 [surgery] videos on there, and we’ve met senior consultants who have learned an entire procedure by watching YouTube. Once they are then confident that they can perform it, they say they’re confident, and they go and do it.”
Like Barrow, Scattergood is operating at the intersection of VR and haptics, with a particular focus on healthcare. Fundamental VR works with medical device manufacturers, pharmaceutical companies and hospitals in the UK and the US, and is an official development partner for Microsoft’s HoloLens AR device.
The company’s FeelReal VR platform uses headsets such as the HTC Vive and Oculus Rift to simulate surgical environments. Once the user is immersed, haptic feedback mimics incisions, injections and other procedures, while proprietary software maps and calibrates more than 20 different tissue types, such as tight and loose skin, subcutaneous fat, cartilage and bone.
"For each one of those there are different values,” Scattergood explained. “Whether that’s initial resistance, the feel across the top of it, the pop – the amount of pressure you need to put through. We’ve mapped all of those into a system.”
Tracking the incisions
A surgeon overseeing multiple juniors practising on cadavers has to physically monitor each procedure being carried out. But software can not only simulate the feel of various tissues, it can also track exactly what the scalpels and syringes are doing. In combination with VR and haptics, the technology can help sort the Dr Christiaan Barnards from the Dr Nick Rivieras.
“For the first time we’ve got measurable feedback that allows us to see how well somebody’s doing and how fast they’re learning,” said Scattergood.
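One plausible way such measurable feedback could be derived is by comparing a trainee's tracked instrument path against an expert reference trajectory. The sketch below assumes a simulator that logs the scalpel tip position each frame; the metric is an illustrative assumption, not Fundamental VR's actual scoring method.

```python
# Minimal sketch: score a tracked incision against a reference path.
import math

def mean_path_deviation(trainee_path, reference_path):
    """Average distance (mm) from each trainee sample to the nearest
    point on the reference incision path."""
    total = 0.0
    for p in trainee_path:
        total += min(math.dist(p, r) for r in reference_path)
    return total / len(trainee_path)

# Example: two short 3D traces (x, y, z in mm)
reference = [(0, 0, 0), (5, 0, 0), (10, 0, 0)]
attempt   = [(0, 0.4, 0), (5, 1.1, 0), (10, 0.7, 0)]
print(f"mean deviation: {mean_path_deviation(attempt, reference):.2f} mm")
```

Logged over repeated attempts, a metric like this gives an objective view of how quickly a trainee's accuracy is improving.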
It’s not just in surgical procedures that haptics is making an impact on healthcare. Tactile experience is closely linked to emotional development and wellbeing, and haptic treatments for mental health, elderly care and neonatal care are also being explored.
“There’s a big body of research looking at it generally,” said Dr Barrow. “If a new parent can’t touch their child, obviously that’s really distressing, but it also has potentially hugely detrimental effects on the development of the child.
“So we can imagine, potentially, future cots being lined with non-contact haptic interfaces...whereby a child could be stimulated physically. If we take that a little bit further, you can think about parents at home, potentially physically interacting with their child remotely, and being able to bond with them if they can’t get to hospital, or if they can’t get into the ward for some reason.”
Haptics in healthcare clearly holds much promise, but the technology has been around in different guises in other sectors for a long time. Recent advances in VR have helped spur innovation in haptics, with the two technologies enjoying a natural synergy. But applications are also in development where haptics substitutes for other senses rather than accompanying them. At Goldsmiths, University of London, Prof Atau Tanaka and Dr Adam Parkinson have been working on a haptic interface for visually impaired audio producers. According to the researchers, the need for such a device has been driven by the digitisation of audio practices.
“Audio production and editing is something that, back in the analogue days, you could think of as being a haptic activity,” said Tanaka, a professor of media computing at Goldsmiths. “So you rock tape back and forth to scrub the sound, and identify a point in the tape that you would literally splice with a razor blade. It was a very physical, material activity.”
However, the advent of digital technologies means audio is now represented on computer screens via graphical user interfaces, with sound displayed as a waveform. Cutting and editing cues can be executed with extreme accuracy, but the process has moved away from the tactile to become almost entirely visual.
“All this is fantastic, but what if you’re visually impaired? You don’t get to see the visual waveform representation,” said Tanaka.
Tactile translation
The challenge for the Goldsmiths team became finding another medium with which to map the audio content. According to Tanaka, there is a large community of visually impaired people working in radio stations, recording studios and editing suites. In collaboration with this community, the researchers developed the Haptic Wave, a device that provides a tactile translation of visual soundwaves.
Resembling an oversized crossfader, the tool allows users to ‘scroll’ through the waveform. Peaks and dips are fed back to the hand via a motorised copper button, which communicates the amplitude of the waveform at a given point, facilitating precise edits.
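In essence, the device maps local waveform amplitude to the height of a motorised fader as the user scrubs. The sketch below illustrates that mapping under simple assumptions: audio samples lie in the range -1 to 1, the scroll position selects a window, and the fader accepts a position between 0.0 and 1.0. It is not the Goldsmiths implementation, just a way of seeing the idea.

```python
# Illustrative sketch: translate the waveform amplitude at the scrub position
# into a motorised-fader height, as the Haptic Wave does in principle.
import math

def fader_position(samples, scroll_index, window=512):
    """Return a 0.0-1.0 fader height representing the local peak amplitude
    of the waveform around the user's scroll position."""
    start = max(0, scroll_index - window // 2)
    chunk = samples[start:start + window]
    if not chunk:
        return 0.0
    peak = max(abs(s) for s in chunk)   # local peak amplitude
    return min(1.0, peak)               # clamp to the fader's travel

# Example: scrubbing through one second of a synthetic tone that fades in
tone = [math.sin(2 * math.pi * 440 * n / 44100) * (n / 44100) for n in range(44100)]
for idx in (1000, 22050, 44000):
    print(idx, round(fader_position(tone, idx), 2))
```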
In total, 11 visually impaired people helped develop the Haptic Wave, including a country-music producer, an e-book editor, and a heavy-metal musician. Various prototypes were tested over three years, with the cohort feeding back regularly. From the outset, the audio professionals indicated a preference for some type of tactile representation of the screen, despite being largely unaware of haptics.
“Somehow – even without knowledge of haptic technologies – our users were starting to get this idea that maybe this kind of technology would be useful for them,” said Tanaka.
That almost primal, instinctive connection to our tactile senses is one of the things that makes haptics such an exciting area of engineering development. It’s a relatively nascent technology, but one that has the potential to resonate with us on a deep level. Whether in combination with VR or in standalone devices, haptics is opening up a world of new sensory possibilities. And we’ve only just begun to scratch the surface.