Research aimed at teaching robots to “see” may soon make it possible to bag speeding motorists, track enemy planes, and automatically safeguard the nation’s borders and resources without any chance of detection.
That’s because, instead of painting a target with radar waves or laser beams, a camera needs only to capture an image or series of images of the target.
“If it can view the object moving, that’s all it needs. The computer figures out everything else,” said Warren Dixon, a UF assistant professor of mechanical and aerospace engineering. “We’re trying to use both regular and infrared cameras, so night or adverse weather conditions don’t present a problem.”
Achieving computerised speed and motion detection requires overcoming several challenges. One is figuring out how to get a computer to understand the surrounding environment by interpreting images recorded by a video or still camera.
“The information from a camera is just a flat, two-dimensional image,” Dixon said.
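To see why a single picture is not enough, consider the standard pinhole-camera model. The short Python sketch below uses illustrative numbers, not parameters from the UF system; it shows that points at different distances along the same line of sight land on the same pixel, so depth cannot be read from one image alone.

```python
# Illustrative sketch only (not the UF system): under a pinhole-camera model,
# points at different distances along the same ray project to the same pixel,
# which is why a single image discards depth.

def project(point_3d, focal_length_px=800.0):
    """Project a 3-D point (x, y, z) in camera coordinates to pixel offsets."""
    x, y, z = point_3d
    return (focal_length_px * x / z, focal_length_px * y / z)

near_point = (1.0, 0.5, 5.0)    # 5 metres away
far_point = (2.0, 1.0, 10.0)    # 10 metres away, scaled along the same ray

print(project(near_point))  # (160.0, 80.0)
print(project(far_point))   # (160.0, 80.0) -- identical pixel, depth is lost
```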
People and animals can perceive depth because their brains combine each eye’s snapshots. Two cameras can also achieve stereo vision, but computers can make sense of it only if they know the exact position of each camera. That allows them to triangulate the target and learn its position relative to the cameras. Part of Dixon’s work removes that requirement.
“With my work, you don’t need to know that specific location information,” he said. “You could have one camera taking an image from an airplane and another mounted on a car taking a picture of the same image – and not know how the airplane and car are related to each other – and through this new mathematics you can understand how they’re related to the target.”
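For contrast, the sketch below shows the classical stereo triangulation that does require knowing the cameras’ exact relative positions – the assumption Dixon’s mathematics is designed to remove. The rectified two-camera setup, baseline and focal length here are illustrative values only.

```python
# Minimal sketch of classical stereo triangulation: the computation that
# needs the exact relative position of the two cameras (the baseline).
# Values are illustrative, not taken from the UF research.

def depth_from_disparity(x_left_px, x_right_px, baseline_m, focal_length_px):
    """Depth of a point seen by two rectified cameras a known baseline apart."""
    disparity = x_left_px - x_right_px          # pixel shift between the views
    if disparity <= 0:
        raise ValueError("point must appear shifted between the two views")
    return focal_length_px * baseline_m / disparity

# Example: cameras 0.5 m apart, 800 px focal length, 20 px disparity
print(depth_from_disparity(420.0, 400.0, baseline_m=0.5, focal_length_px=800.0))
# -> 20.0 metres from the cameras
```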
The technology has law enforcement and military applications. Police in moving or parked squad cars could use the computer-camera systems much as they do radar and laser guns to track and ticket suspected speeders.
The target would have to be within the line of sight, with the range varying according to the power of the lenses in the camera.
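As a rough illustration of how such a speed reading could be derived, the hypothetical sketch below converts a target’s frame-to-frame pixel motion into ground speed; the ground-plane scale, frame rate and tracking step are assumptions made for the example, not details of the UF system.

```python
# Hypothetical sketch of camera-based speed estimation: track how far the
# target moves in the image between frames and convert pixels to metres
# using an assumed ground-plane scale. Illustrative only.

def estimate_speed_kmh(pixel_displacement, metres_per_pixel, frame_interval_s):
    """Ground speed implied by the target's image motion between two frames."""
    distance_m = pixel_displacement * metres_per_pixel
    return distance_m / frame_interval_s * 3.6   # m/s -> km/h

# Example: target shifts 45 px between frames at 30 fps, 0.02 m per pixel
print(estimate_speed_kmh(45, metres_per_pixel=0.02, frame_interval_s=1 / 30))
# -> 97.2 km/h
```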
Soldiers, meanwhile, could mount the cameras on airborne drones or truck convoys and set them to look for and automatically report potentially hostile objects moving toward the convoys – again, without any fear of giving away the convoys’ locations.
Robotic drones or remote camera-based monitoring posts outfitted with the technology also could be used for applications ranging from private security in warehouses and shopping centres to continuous remote monitoring of borders to protecting water supply reservoirs.
In addition to the robotic applications, the technique is being refined for a project led by Andy Kurdila, a UF professor of mechanical and aerospace engineering, to provide vision systems for tiny airborne surveillance drones called micro air vehicles.
That five-year project is supported in part by a $5 million grant from Eglin Air Force Base in Florida.