But thanks to recent technological advances and reductions in cost driven by an emerging consumer market, the capabilities of augmented reality (AR) and virtual reality (VR) technology are increasingly catching up with the Hollywood vision.
And in the world of defence, where an uncertain world is driving a host of complex operational and training requirements, VR and AR tools are becoming increasingly useful and important.
Virtual reality technology has actually been used by the defence industry as a training tool for some time. Indeed, back in 1980 the US Army enlisted experts from gaming firm Atari to develop a training system for its Bradley fighting vehicle based on the firm’s vintage Battlezone shoot-em-up.
But while such early systems were fairly primitive, and not widely used, advances in processing power and display technologies over the course of the last decade or so have ushered the technology into the military mainstream.
Here in the UK, the RAF has been using an advanced VR training system to train helicopter crews for a number of years now.
Developed by UK firm Virtalis, the system consists of a head-mounted display, a wooden surround that mimics the inside of a helicopter, and a camera-based tracking system that monitors the position of the user’s head and enables them to be placed precisely in a virtual world.
The technology, which is currently being used by the RAF at its bases at Valley on Anglesey and Shawbury in Shropshire, enables trainees to practise a range of different scenarios before setting foot in a helicopter for the first time. And according to Virtalis MD David Cockburn-Price, it has helped the RAF make far more efficient use of its training resources. ‘They don’t waste time in the air going through the basics with the student because the student’s already hitting the ground running,’ he explained, adding that since deploying the system the RAF has seen a discernible drop in the failure rate of trainees.
The Virtalis system is high end, and although it is said to quickly pay for itself when measured against the flight cost of a real helicopter (in the region of £10–15k per hour) it’s still expensive: according to Cockburn-Price, the displays alone cost between £15k and £30k.
But at the other end of the market, researchers are increasingly looking at how lower-cost, off-the-shelf technologies developed for other sectors could help provide a solution.
One of the main organisations driving this push in the UK is the government’s Defence Science & Technology Laboratory (DSTL), which has been funding a number of projects in the field.
Most recently, UK electronics consultancy Plextek unveiled a VR system for training soldiers in medical emergencies based on the commercially available Oculus Rift VR headset. Developed with funding from the DSTL’s Centre for Defence Enterprise (CDE), the system enables trainees to experience what it’s like to make tough clinical decisions while under fire.
According to Collette Johnson, Plextek’s medical business manager, the technology (expected to cost just a few hundred pounds once mass production begins) is currently being trialled by the MoD and could be in the hands of troops within the next few years.
Exploring the potential use of commercial off-the-shelf (COTS) components is also one of the primary aims of the DSTL’s Synthetic Environments (SE) Tower of Excellence programme which, among other things, is helping to develop a range of VR-based training systems underpinned by existing technologies. One notable example is an experimental driving simulator for the latest variant of the Warrior infantry fighting vehicle (which is expected to enter service by the end of the decade).
Caroline Shawl, a technical partner on the project, explained that with a new version of the Warrior being brought in, DSTL wanted to try to understand whether simulation could be used to train handling and control skills for tracked vehicles. A simulator based on an off-the-shelf motion platform and image-generation system was duly developed and, according to Shawl, successfully demonstrated that troops could learn just as well in a simulator as in a live vehicle.
Not only has the exercise apparently helped shape the MoD’s requirement for future training systems, but Shawl said it has also changed perceptions of what simulators can be used for. Her colleague, group technical scientist Ian Greig, added that there’s a growing awareness now that the falling cost of simulator technology is increasing its utility. ‘In the past,’ he said, ‘we might have spent £50m buying two very high-fidelity fast jet simulators and used them to train individual pilots. We can now buy 25 comparable devices for £1m each and link them together so that when the trainees go and fly in the real aircraft they are much better prepared.’
Of all the sectors that have helped drive down the cost of the technology, the one that’s had the biggest impact is arguably the games industry, although, as Shawl stressed, the spin-offs are not necessarily the ones you might expect.
‘The level of realism is a bit of a red herring,’ she said. ‘What you’re trying to achieve through training is a training effect. It’s quite easy to be seduced by the graphics on the screen, but the bits that are most relevant to us are the development of things such as GPUs, consoles, PCs and the power that these devices have.’
These kinds of advances are also expected to prove useful in the SE Tower’s latest project, which will be looking at ways of improving the performance of the computer-generated ‘entities’ — such as ground troops or civilians — used in a number of virtual training tools.
One of the problems with existing tools, said Greig, is that the entities — whether they represent ground troops or fighter jets — currently possess limited levels of autonomy. ‘We’ve got tools that will generate several thousand entities but we need lots of people to control them,’ he said. ‘You can leave them running on their own for about five minutes but then you’d have to go in and make a correction, and you might need one controller per 50 entities.’
One of the key aims, therefore, of the DSTL’s forthcoming SCORE (Simulation Composition and Representation of Natural and Physical Environments) project is to get the numbers of entities up and the number of human controllers down.
It is, said Greig, a hugely challenging problem. ‘How do we replicate a city’s worth of people going around doing their business? How do we replicate civilians going around doing civilian things and reacting to the military activities that are going on around them? How do we represent explicit enemy forces? And how do we represent the insurgents, those people that are within the civilian population but are acting against our interests?’
With the project yet to formally begin, Greig and Shawl can only speculate on how this might be achieved, but Shawl said that it will look at a range of areas, from improving interfaces so that one person can control more of what’s going on in the background to embedding greater levels of artificial intelligence.
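To make the idea concrete, here is a deliberately simple sketch of how even a basic state machine can let entities run unattended: each civilian wanders until military activity comes within a threshold radius, then flees and shelters with no human controller in the loop. This is an illustration of the general approach only, not DSTL’s actual tooling, and every name and number in it is a made-up placeholder.

```python
import random

# Illustrative only: a minimal scripted "entity" of the kind Greig
# describes. A simple state machine (routine -> fleeing -> sheltering)
# lets thousands of civilians react to nearby military activity with
# no human controller in the loop.

class Civilian:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.state = "routine"

    def tick(self, threats, flee_radius=50.0):
        """Advance the entity by one simulation step."""
        near = [t for t in threats
                if (self.x - t[0]) ** 2 + (self.y - t[1]) ** 2
                < flee_radius ** 2]
        if self.state == "routine":
            if near:
                self.state = "fleeing"
            else:
                # everyday wandering
                self.x += random.uniform(-1, 1)
                self.y += random.uniform(-1, 1)
        if self.state == "fleeing":
            if not near:
                self.state = "sheltering"
            else:
                # step directly away from the nearest threat
                tx, ty = near[0]
                self.x += 1.0 if self.x >= tx else -1.0
                self.y += 1.0 if self.y >= ty else -1.0

# A crowd of entities ticks along with zero controllers in the loop.
crowd = [Civilian(random.uniform(0, 1000), random.uniform(0, 1000))
         for _ in range(5000)]
for _ in range(10):
    for c in crowd:
        c.tick(threats=[(500.0, 500.0)])
```

Scaling this to a ‘city’s worth’ of believable behaviour is, of course, precisely the hard part SCORE is intended to tackle; the point of the sketch is only that autonomy embedded in the entity replaces a human correction loop.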
But, while tools such as those developed through SCORE could have a major influence on the way armed forces conduct training, it’s the emerging use of augmented reality (AR) technology in real-life combat situations that many believe could have the biggest impact on the military world.
To date, defence applications of augmented reality have largely been limited to the highly sophisticated head-mounted displays worn by fighter pilots.
Perhaps the most notable examples of this are the BAE Systems Striker helmets worn by Eurofighter pilots, and the recently launched F-35 helmet, which was developed jointly by Rockwell Collins and Elbit, and is thought to be the most advanced augmented reality helmet ever built.
Although there’s some variation in how these helmets work, they all pull off the same key trick of monitoring the precise position of the pilot’s head so that relevant information can be displayed directly in his or her line of sight.
One particularly neat feature of both the latest Striker helmet and the F-35 system is the way they can display footage from cameras on the outside of the aircraft, enabling the pilot to see through the aircraft’s structure and have an uninterrupted 360° view.
But while the engineering challenges of developing technologies such as this are almost unimaginably complex, engineers are now attempting to take AR into new, arguably even more technically demanding territory, with the development of head-mounted systems for ground troops.
Dr David Roberts is group leader for military operations and sensing systems at US firm Applied Research Associates, one of the companies at the forefront of this emerging area. ‘Augmented reality has been around in cockpits and aircraft for a long time,’ he said, ‘but the transition of taking it to the dismounted soldier is just happening now. On a plane you’ve got a lot of resources: you can plug the display into a large power source, you’ve got computing and you can leverage the sensors. Once you go to the dismounted soldier, everything is about reducing size, weight and power.’
Working with funding from US defence research agency DARPA, Roberts’ team has developed a software and sensing package that it hopes will underpin this new wave of soldier technology.
Dubbed ARC4 (augmented reality command, control, communicate, coordinate), the system combines inertial sensors, GPS and camera data to record and track the position and orientation of the user’s head. Armed with this data, the system software is then able to take information from whatever communication networks it’s attached to and render it directly into the soldier’s field of view.
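The geo-registration step at the heart of such a system can be illustrated with some simple geometry. The sketch below is a toy example, not ARC4’s actual algorithm: given the user’s position, the head’s yaw and pitch from the tracking package, and a target’s world coordinates, it works out where, if anywhere, within the display’s field of view a symbol for that target should be drawn. The field-of-view figures are arbitrary placeholders.

```python
import math

# Illustrative sketch (not ARC4's actual algorithm): geo-registration
# reduces to expressing a target's world position as angles relative
# to where the head is pointing, then mapping those angles onto the
# display.

def geo_register(user_pos, head_yaw_deg, head_pitch_deg, target_pos,
                 fov_h_deg=40.0, fov_v_deg=30.0):
    """Return normalised display coordinates (-1..1) for a world-space
    target, or None when it falls outside the display's field of view."""
    dx = target_pos[0] - user_pos[0]   # east
    dy = target_pos[1] - user_pos[1]   # north
    dz = target_pos[2] - user_pos[2]   # up
    bearing = math.degrees(math.atan2(dx, dy))               # compass bearing
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    # angular offsets relative to the head's current yaw and pitch,
    # wrapped into the range -180..180 degrees
    az = (bearing - head_yaw_deg + 180) % 360 - 180
    el = elevation - head_pitch_deg
    if abs(az) > fov_h_deg / 2 or abs(el) > fov_v_deg / 2:
        return None                    # outside the field of view
    return (az / (fov_h_deg / 2), el / (fov_v_deg / 2))
```

With the wearer at the origin looking due north, a blue-force marker 100m to the north lands in the centre of the display, while one due east falls outside the field of view until the wearer turns their head.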
Roberts said that it has primarily been developed for JTACs (joint terminal attack controllers), who could, for instance, use it to monitor the location of blue forces (friendly troops) who might be out of line of sight.
He added that one of the key advantages over existing tablet-based systems is that troops can keep their eyes on the battlefield. ‘When you look down you’re not paying attention to all the stuff that’s happening in real time around you,’ he said, ‘then your brain’s got to relate the 2D information you see when you look down at a map with the 3D information of the real world when you look back up. If you present that information in a heads-up way you’re always aware of what’s going on and you’ve got the information in front of you.’
Roberts stressed that the technology, which could potentially be integrated with any head-mounted display, is considerably more sophisticated than commercially available systems such as Google Glass. ‘Those sorts of devices are not providing what we would call true augmented reality, in the sense that there’s information that’s able to cover a good portion of your front view. A lot of those devices are look-away systems and are not geo-registered to your real world.’
Meanwhile, here in the UK, BAE Systems has used funding from the same DARPA programme to develop the Q-Warrior system, a clip-on full-colour AR display designed for ground troops.
Chris Colston, BAE’s director of business development for advanced displays, said that the lightweight device was made possible by advances in wave-guide displays.
He explained that the system effectively replaces the complex lens configurations found on conventional optics with a waveguide consisting of parallel sheets of glass. An image is injected into this guide and bounces along until it reaches a diffraction grating between the sheets, which controls the way in which the image is emitted into an ‘eye motion box’, the region within which the user can see it.
Q-Warrior also includes a tracking package which, like ARC4, uses a combination of GPS and inertial technology to calculate the precise position of the user’s head.
Colston said that the system has been designed primarily for command-and-control specialists who could use it to view data including video, images, navigational cues, and information on the location of friendly troops. He added that a wearer could even use it during targeting to calculate the blast radius of particular weapons, and minimise the chances of collateral damage.
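As a rough illustration of the kind of calculation involved, the sketch below applies cube-root (Hopkinson–Cranz) blast scaling, under which hazard distance grows with the cube root of the charge mass, to flag entities inside a hazard ring around a target point. The scaling constant and all values are made-up placeholders, not figures from BAE or Q-Warrior.

```python
# Illustrative sketch, not BAE's implementation: a simple collateral-
# damage check using cube-root (Hopkinson-Cranz) blast scaling. The
# scaling constant k is an arbitrary placeholder, not a real weapon
# parameter.

def hazard_radius(charge_kg, k=15.0):
    """Hazard radius in metres for a given net explosive quantity."""
    return k * charge_kg ** (1.0 / 3.0)

def entities_at_risk(target, entities, charge_kg):
    """Return the (x, y) entities inside the hazard ring around target."""
    r = hazard_radius(charge_kg)
    return [e for e in entities
            if ((e[0] - target[0]) ** 2 + (e[1] - target[1]) ** 2) ** 0.5 <= r]
```

With these placeholder numbers an 8kg charge gives a 30m ring, so a friendly position 25m from the aim point would be flagged in the wearer’s display while one 40m away would not.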
Both Q-Warrior and ARC4 are currently in trials with DARPA, and although the two systems are already close to the finished article (Colston envisages a production version of Q-Warrior within 18 months), BAE and ARA are continuing to develop their capabilities.
The next challenge for Roberts is developing a system that works in areas where there is no GPS, and although he wouldn’t be drawn on how he intends to solve the problem, he claimed his team is close. ‘We have a path that will incorporate techniques for having this system carry you forward as you move into a building and are in “GPS-denied” areas — and that’s a really big deal right now.’