This is the aim of engineers at Sheffield University's Department of Automatic Control and Systems Engineering (ACSE) who envision robots working together in hostile environments.
To achieve this, robotic systems will need in-built architecture that helps them make sense of their surroundings and prevents them from colliding.
To this end, the team has developed software that enables quadcopters to learn about their surroundings using a forward-facing camera, and has also demonstrated mid-air collision avoidance through the use of game theory.
The robot starts with no prior information about its environment or the objects within it. According to the university, by overlaying different frames from the camera and selecting key reference points within the scene, it builds up a 3D map of its surroundings.
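The article does not describe the team's implementation, but the general idea of recovering 3D structure by matching reference points across overlapping frames can be sketched in a few lines. The example below is a minimal, illustrative Python sketch using OpenCV; the function name, the camera matrix K and the input frames are assumptions, not the Sheffield software.

```python
# Illustrative sketch only (not the ACSE team's code): match key points between
# two camera frames, estimate the relative camera motion, and triangulate the
# matched points into a sparse 3D map. Assumes a known intrinsic matrix K.
import cv2
import numpy as np

def triangulate_from_frames(frame_a, frame_b, K):
    # Detect and describe key reference points in each frame.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, desc_a = orb.detectAndCompute(frame_a, None)
    kp_b, desc_b = orb.detectAndCompute(frame_b, None)

    # Match descriptors between the overlaid frames, keeping the strongest matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)[:200]
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Estimate relative camera motion, then triangulate matched points into 3D.
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    P_a = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_b = K @ np.hstack([R, t])
    pts_4d = cv2.triangulatePoints(P_a, P_b, pts_a.T, pts_b.T)
    return (pts_4d[:3] / pts_4d[3]).T   # N x 3 map points
```

Repeating this over many frames, and merging the resulting points, is what lets the vehicle accumulate a map of its surroundings as it flies.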
Other sensors pick up barometric and ultrasonic data, which give the robot additional clues about its environment. This information is then fed into autopilot software, allowing the robot not only to navigate safely but also to learn about nearby objects and navigate to specific items.
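As a rough illustration of how a short-range ultrasonic reading might supplement barometric altitude before being handed to an autopilot, here is a minimal, hypothetical blend; the function, its parameters and the weighting are assumptions, not the team's autopilot code.

```python
# Toy complementary blend of barometric and ultrasonic altitude (illustrative only).
def fuse_altitude(baro_alt_m, ultrasonic_alt_m, ultrasonic_valid, alpha=0.8):
    """Prefer the short-range ultrasonic sensor near the ground; fall back to
    the barometer when the ultrasonic return is out of range or invalid."""
    if ultrasonic_valid:
        # Ultrasonic dominates close to the ground; the barometer smooths noise.
        return alpha * ultrasonic_alt_m + (1 - alpha) * baro_alt_m
    return baro_alt_m
```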
‘The crossing demonstration is something we’ve developed in-house based on game theory,’ said Dr Jonathan Aitken, a research fellow at ACSE. ‘[The quadcopters are] trying to optimise decisions they make based on the information they’re getting from their partner. In this case, the goal for them is to be at different altitudes so that they’re able to cross successfully without crashing into one another.’
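The team's game-theoretic formulation is not given in the article, but the crossing problem Dr Aitken describes resembles a simple anti-coordination game: each quadcopter does well only if the pair end up at different altitudes. The sketch below is purely illustrative, with made-up payoffs and action names.

```python
# Toy anti-coordination game for the altitude-crossing idea (illustrative only).
# Each quadcopter picks CLIMB or DESCEND; crossing succeeds only if they differ.
from itertools import product

ACTIONS = ("CLIMB", "DESCEND")

def payoff(a1, a2):
    # Different altitudes are rewarded; matching altitudes risks a collision.
    return (1, 1) if a1 != a2 else (-10, -10)

def pure_nash_equilibria():
    """Enumerate joint actions where neither quadcopter gains by deviating alone."""
    eqs = []
    for a1, a2 in product(ACTIONS, repeat=2):
        p1, p2 = payoff(a1, a2)
        best1 = all(p1 >= payoff(alt, a2)[0] for alt in ACTIONS)
        best2 = all(p2 >= payoff(a1, alt)[1] for alt in ACTIONS)
        if best1 and best2:
            eqs.append((a1, a2))
    return eqs

print(pure_nash_equilibria())   # [('CLIMB', 'DESCEND'), ('DESCEND', 'CLIMB')]
```

The two equilibria correspond to the two safe ways of crossing: one vehicle climbs while the other descends, which is the outcome the quadcopters negotiate using the information shared by their partner.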
Dr Aitken is attached to a Reconfigurable Autonomy project that aims to develop the underlying architectures for robotics that will enable such machines to be operated at long distances.
‘When we can’t get up close and actually control the robotics – whatever the robotics system is – we can actually put an architecture on board that’s going to be able to manage problems and faults intelligently,’ he said.
Potential applications for the quadcopters and similar robotic systems include assisting in nuclear or deep-space scenarios.
‘The quadcopter or robot that goes in is very much going to be working under its own steam, it’s going to make its own decisions on how it’s going to operate safely and it’s going to have to make decisions about if there are problems on board,’ said Dr Aitken. ‘For example, if there are failures on board, how do we adjust and compensate for those failures? It’s going to have to be the vehicle itself that makes that decision because we’re not in a position to control it.’
The next step is to extend the programming capability so that multiple robots can collaborate, enabling fleets of machines to interact and tackle more complex tasks together.