Current surveillance systems in the field generate vast amounts of data that can not only overwhelm operators but also slow the rate at which imagery can be analysed.
Now, the Office of Naval Research (ONR) has developed a multi-sensor motion-tracking system that won’t require constant monitoring from a human operator.
The sensors will automatically find moving objects and then send high-resolution images of them to ground or airborne operators.
During field tests, researchers used an airborne network to pair multiple real-time tracks from a wide-area surveillance sensor with high-resolution, narrow field-of-view sensors.
Using geo-projection of the wide-area imagery, the system tracked all moving vehicle-sized objects in real time and automatically converted their positions into geodetic co-ordinates.
The co-ordinates were sent to the airborne network, where the zoomed-in sensors could then search their imagery to identify each vehicle.
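This wide-area-to-narrow-field-of-view cross-cueing chain can be pictured as a short sketch. The code below is purely illustrative: it assumes a simple nadir-pointing, flat-terrain geo-projection and hypothetical names such as `CameraPose`, `geo_project`, `cross_cue` and `NarrowFovSensor`; the real system's interfaces and projection model are not described in the article.

```python
import math
from dataclasses import dataclass

# All class and function names here are illustrative assumptions, not the ONR system's API.

EARTH_RADIUS_M = 6_371_000.0  # spherical-earth approximation


@dataclass
class CameraPose:
    """Wide-area sensor position, modelled as looking straight down (nadir)."""
    lat_deg: float
    lon_deg: float
    metres_per_pixel: float   # ground sample distance of the wide-area imagery


@dataclass
class Detection:
    """A moving, vehicle-sized object in the wide-area imagery, given as pixel
    offsets from the image centre (positive x east, positive y north)."""
    dx_px: float
    dy_px: float


def geo_project(det: Detection, pose: CameraPose) -> tuple[float, float]:
    """Flat-terrain, nadir-view geo-projection of a pixel offset into
    geodetic latitude/longitude (degrees)."""
    north_m = det.dy_px * pose.metres_per_pixel
    east_m = det.dx_px * pose.metres_per_pixel
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(pose.lat_deg))))
    return pose.lat_deg + dlat, pose.lon_deg + dlon


class NarrowFovSensor:
    """Stand-in for a zoomed-in, narrow field-of-view sensor such as EyePod."""

    def __init__(self, name: str):
        self.name = name

    def inspect(self, track_id: int, lat: float, lon: float) -> None:
        # A real sensor would slew to the co-ordinates and capture a close-up image.
        print(f"{self.name}: inspecting track {track_id} at ({lat:.5f}, {lon:.5f})")


def cross_cue(detections, pose, network_publish, sensor: NarrowFovSensor) -> None:
    """Convert wide-area detections into geodetic tracks, publish them on the
    airborne network, and cue the narrow field-of-view sensor to inspect each."""
    for track_id, det in enumerate(detections):
        lat, lon = geo_project(det, pose)
        network_publish({"track_id": track_id, "lat": lat, "lon": lon})
        sensor.inspect(track_id, lat, lon)


# Minimal usage example with made-up numbers
if __name__ == "__main__":
    pose = CameraPose(lat_deg=36.60, lon_deg=-121.89, metres_per_pixel=0.5)
    detections = [Detection(dx_px=420, dy_px=-130), Detection(dx_px=-800, dy_px=55)]
    cross_cue(detections, pose, network_publish=print, sensor=NarrowFovSensor("EyePod-1"))
```

In practice the geo-projection would account for sensor attitude, lens distortion and terrain elevation rather than the flat-earth, straight-down approximation used here.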
‘The demonstration was a complete success,’ said Dr Michael Duncan, ONR programme manager. ‘Not only did the network-sensing demonstration achieve simultaneous real-time tracking, sensor cross cueing and inspection of multiple vehicle-sized objects, but we also showed an ability to follow smaller human-sized objects under specialised conditions.’
The demonstration used precision motion-stabilising sensors developed by other ONR programmes for close-up images. Dubbed ‘EyePod’, this dual-band infrared sensor is designed to be attached to smaller, man-portable UAV platforms.
The wide-area images came from a 16-megapixel camera that captures four frames per second and has a one-frame-per-second step-stare capability.
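The step-stare mode can be pictured as a simple capture loop: the camera steps to a new pointing angle, settles, and captures a single frame, repeating at one frame per second. The sketch below assumes a hypothetical `camera` object with `point`, `settle` and `capture` methods; only the frame rate comes from the article.

```python
import time

STEP_STARE_PERIOD_S = 1.0   # one frame per second in step-stare mode


def step_stare(camera, stare_points):
    """Step through a sequence of pointing angles, settling at each one and
    capturing a single wide-area frame before moving on."""
    for azimuth_deg, elevation_deg in stare_points:
        start = time.monotonic()
        camera.point(azimuth_deg, elevation_deg)   # step to the next stare position
        camera.settle()                            # wait for motion to damp out
        yield camera.capture()                     # one 16-megapixel frame
        # Pace the loop so the overall rate matches one frame per second.
        elapsed = time.monotonic() - start
        if elapsed < STEP_STARE_PERIOD_S:
            time.sleep(STEP_STARE_PERIOD_S - elapsed)
```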