In a demonstration aboard a former US Navy ship, a small quadcopter developed by researchers at Carnegie Mellon University’s Robotics Institute and spin-off company Sensible Machines flew autonomously through dark, smoke-filled compartments to map fires and locate victims.
Last autumn’s demonstration, part of a US Office of Naval Research (ONR) project called Damage Control Technologies for the 21st Century (DC-21), showed that a small drone can operate in the confined spaces inside a ship to gather situational information to guide firefighting and rescue efforts.
‘With the micro-flyer, we wanted to show that it could autonomously navigate through the narrow hallways and doors – even in dense fire smoke – and locate fires,’ said Thomas McKenna, ONR’s DC-21 program manager. ‘It succeeded at all those tasks.’
As part of the DC-21 concept, information gathered by the micro-flyer would be relayed to a large humanoid robot, the Shipboard Autonomous Firefighting Robot (SAFFiR), that would work with human firefighters to suppress fires and evacuate casualties.
‘Flying autonomously through narrow doorways in darkness and smoke poses a number of technical challenges for these small drones,’ said Sebastian Scherer, systems scientist at CMU’s Robotics Institute. ‘But this capability, known as “fast lightweight autonomy”, will have numerous applications beyond shipboard fires, such as investigation of building fires and inspection of hazardous chemical tanks and power plant cooling towers.’
Sensible Machines built a quadrotor measuring 23 inches wide and 12 inches high to fit through the 26-inch-wide hatches of the ex-USS Shadwell, which is being used to test firefighting techniques.
The drone was able to negotiate the tight spaces, but its smaller rotors reduced its efficiency, limiting its flight time to approximately five minutes.
Sensible Machines is now building a drone that is 16 inches wide but replaces the four rotors with a single ducted fan containing two larger, counter-rotating propellers. Scherer said in a statement that the larger rotors work more efficiently and are anticipated to increase flight time to 30 minutes.
https://www.youtube.com/watch?v=g3dWQCECwlY
The primary sensor used by the drone to build its map of fire areas is an RGB-D camera, or depth camera, similar to that found in a Kinect game controller.
Scherer said it works better in the dark because there’s less ambient light to interfere with the infrared light the camera projects. Other researchers have tried using depth cameras to do mapping, he said, but have had limited success because they still rely primarily on visual features, with depth information used supplementally.
‘We flipped it around, using mainly the depth camera to build our maps,’ Scherer said. In addition to the RGB-D camera, the drone uses a forward-looking infrared (FLIR) camera to detect fires and people and a downward-facing optical flow camera to monitor the motion of the drone itself.
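To illustrate the kind of depth-first mapping Scherer describes, the sketch below back-projects a depth image into a 3D point cloud and accumulates the points into a coarse voxel map using the drone’s estimated pose. It is a minimal, hypothetical example: the camera intrinsics, pose source and voxel size are assumptions for illustration, not details of the CMU/Sensible Machines system.

```python
# Minimal sketch of depth-first mapping: back-project a depth image into a
# 3D point cloud and accumulate points into a coarse voxel map. Illustrative
# only -- intrinsics, pose and voxel size are assumed values, not the actual
# parameters of the CMU/Sensible Machines micro-flyer.
import numpy as np

# Assumed pinhole intrinsics for a Kinect-class depth camera (illustrative).
FX, FY = 570.0, 570.0      # focal lengths in pixels
CX, CY = 320.0, 240.0      # principal point
VOXEL = 0.10               # map resolution in metres (assumed)

def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (metres) into Nx3 camera-frame points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    valid = z > 0                      # zero depth means no return at that pixel
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)

def accumulate(voxel_map: set, points_cam: np.ndarray,
               R_wc: np.ndarray, t_wc: np.ndarray) -> None:
    """Transform camera-frame points into the world frame using the drone's
    estimated pose (rotation R_wc, translation t_wc) and mark occupied voxels."""
    points_world = points_cam @ R_wc.T + t_wc
    for key in map(tuple, np.floor(points_world / VOXEL).astype(int)):
        voxel_map.add(key)

# Usage with synthetic data: one flat "wall" two metres in front of the camera.
occupied = set()
fake_depth = np.full((480, 640), 2.0)
accumulate(occupied, depth_to_points(fake_depth), np.eye(3), np.zeros(3))
print(f"occupied voxels: {len(occupied)}")
```

In a real system of this kind the pose would come from the drone’s state estimator, fed by sensors such as the downward-facing optical flow camera mentioned above, and the resulting map would guide the planner that steers the vehicle through hatches and passageways.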
The work for ONR was supported by a Small Business Innovation Research grant to Sensible Machines for which the Robotics Institute is a subcontractor.