Reverse engineering the insect brain

UK technology start-up Opteran is on a mission to transform the world of autonomy with advanced “natural intelligence” technology that mimics the brains of insects. Jon Excell reports.

Insects like honeybees have about a million neurons, compared with around 86 billion in a human being - stock.adobe.com

Until now, most efforts to create machines able to operate autonomously in the world have drawn on so-called deep learning techniques: a vastly expensive, computationally intensive suite of approaches that effectively attempt to replicate aspects of the human brain.

But UK technology start-up Opteran is taking a different route, tapping into 600 million years of evolution to unravel and mimic the highly efficient navigational and decision-making abilities of insects.

Spun out from the University of Sheffield in 2020, Opteran - named after the Hymenoptera order of insects, which includes wasps, bees and ants - has now developed a commercially available product, the Opteran Mind, which it claims could help usher in a new era of higher-performance autonomy at a fraction of the cost of existing approaches.

If you really want to see state-of-the-art autonomy, don’t go to California… look at a garden

David Rajan - CEO, Opteran

Now employing 45 people, the company is growing rapidly, and last month announced a major partnership that will see its technology embedded in advanced warehouse robots developed by Safelog, a German manufacturer of autonomous picking and transportation robots.

But as Opteran CEO David Rajan recently told The Engineer, this is just the beginning, and the company is already exploring and developing further commercial applications in sectors ranging from logistics and automotive to mining, security and beyond.

The firm has its origins in research carried out at the University of Sheffield’s Department of Computer Science by co-founders Prof James Marshall and Dr Alex Cope.

Over the course of a decade, their group set about advancing the understanding of the structure and function of insect brains, using a host of techniques to figure out exactly how insects see the world, localise themselves in space, navigate over huge distances, and respond to all the uncertainty and chaos of the world around them.

By distilling these capabilities into a series of algorithms, the team arrived at a concept it refers to as “natural intelligence”: a potentially game-changing idea that it claims represents a vastly more sensible and efficient way of solving the autonomy challenge.

“Insects like honeybees have about a million neurons compared with around 86 billion in a human being, but the central systems are there,” said Rajan. “They see the world, localise themselves in space and can even navigate up to 10 kilometres consuming just microwatts of power. If you really want to see state-of-the-art autonomy, don’t go to California… look at a garden.”

All of these capabilities are now in the process of being embedded in an actual product - the Opteran Mind - an edge computing solution composed of a series of insect-inspired algorithms. Crucially - and in stark contrast to most other efforts to crack the autonomy nut - this technology runs on low-cost cameras and chips and doesn’t require training, infrastructure or connectivity to work.

The technology currently includes algorithms that enable machines to navigate spatially and know where they are in the world. Over the coming months, the team plans to add further capabilities including collision avoidance and - at some point next year - decision-making algorithms that will allow machines to prioritise tasks.

Explaining how Opteran Mind works in practice, the company’s chief product officer, Charlie Rance, contrasted it with the so-called SLAM (simultaneous localisation and mapping) approach, which is the most widely used solution to the problem of robot navigation.

Opteran's technology is now being used on autonomous warehouse robots produced by German company Safelog

Using SLAM, a machine equipped with advanced cameras and huge amounts of processing power builds a highly accurate map and localises itself on that map at the same time. It’s a sophisticated technique that has been at the heart of autonomy development in recent years, but it’s not without its limitations: it’s expensive, it needs to be trained on huge amounts of data, and the system can fail or mislocalise if there are sudden, unexpected changes to the environment, such as the lights being turned off or something being moved.
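To make that contrast concrete, the following is a minimal, purely illustrative Python sketch of the SLAM idea: a robot refines its own position estimate and a map of landmark positions at the same time, from noisy odometry and noisy range readings. All names and numbers here are invented for illustration; this is not Opteran's, Safelog's or any vendor's implementation.

import random

random.seed(0)

true_landmarks = {"A": 2.0, "B": 6.0, "C": 9.0}  # ground-truth landmark positions (metres)
mapped_landmarks = {}                            # the map being built as the robot moves
true_pose, est_pose = 0.0, 0.0

for step in range(18):
    # Predict: advance the pose estimate using noisy odometry
    true_pose += 0.5 + random.gauss(0, 0.05)
    est_pose += 0.5

    # Update: range readings to nearby landmarks refine both the map and the pose
    for name, pos in true_landmarks.items():
        rng = pos - true_pose
        if 0.0 < rng < 3.0:
            reading = rng + random.gauss(0, 0.05)
            if name not in mapped_landmarks:
                # First sighting: place the landmark on the map relative to the pose estimate
                mapped_landmarks[name] = est_pose + reading
            else:
                # Re-sighting: blend the pose estimate towards what the map implies
                est_pose = 0.7 * est_pose + 0.3 * (mapped_landmarks[name] - reading)

print(f"true pose {true_pose:.2f} m, estimated pose {est_pose:.2f} m")
print("map:", {k: round(v, 2) for k, v in mapped_landmarks.items()})

Even this toy version hints at where the cost comes from: a production SLAM system carries dense visual features and uncertainty estimates for every element of the map, which is where the heavy compute and data requirements arise.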

Opteran’s approach is fundamentally different. By mimicking the way an insect finds its way around using a brain the size of a pinhead and fewer than a million neurons, Opteran’s technology is able to solve the problem with, said Rance, “the tiniest amount of compute and the lowest data footprint you can possibly think of.”
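For a sense of just how lightweight insect navigation strategies can be, consider path integration, a strategy ants and bees are well known to use: the animal keeps a running “home vector” by summing the direction and distance of each leg of its outbound trip, so it can head straight back to the nest without ever building a map. The sketch below is a generic textbook illustration of that idea, not Opteran’s algorithm.

import math

# Running home-vector components (metres), updated as the "insect" wanders
x, y = 0.0, 0.0

# Outbound trip as (heading in degrees, distance in metres) segments
outbound = [(0, 2.0), (45, 1.5), (90, 3.0), (200, 1.0), (310, 2.5)]

for heading_deg, dist in outbound:
    theta = math.radians(heading_deg)
    x += dist * math.cos(theta)  # two multiply-accumulates per segment is
    y += dist * math.sin(theta)  # essentially all the compute this strategy needs

home_bearing = math.degrees(math.atan2(-y, -x)) % 360
home_distance = math.hypot(x, y)
print(f"To head home: bearing {home_bearing:.0f} degrees, distance {home_distance:.1f} m")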

Alongside all of this, added Rajan, it’s also inherently better suited to dealing with the unpredictability of the real world: “Things happen, things change: the weather is horrible, the lighting is bad, in a warehouse everything gets moved around all the time. We’re operating as nature can, instead of in this kind of engineered way of using sensors and compute to try and solve 80 per cent of the problem.”

We want to be the autonomy company… we’re building the brain for everybody else’s brawn

What’s more, unlike existing systems, Opteran’s technology doesn’t need to be trained on vast amounts of data before being deployed. “We’re not gathering data to train a system in a data centre,” said Rance. “The algorithms are innate, so they know how to move around the world on their own, they can respond to the dynamic variability, they adapt to the world as it’s happening around them.”

The Safelog application is a good illustration of this. Typically, deploying an automated guided vehicle (AGV) into a warehouse takes a fair bit of setting up, with operators having to painstakingly scan the entire facility and process huge amounts of data before the robots can be let loose on the warehouse floor. The set-up using Opteran’s technology is, said Rance, orders of magnitude less onerous: “All it takes is to drive one robot at the speed that robot operates and off it goes. We’re essentially creating a solution that allows their AGVs to autonomously map in one shot, so it’s a very quick set-up time and you can share that with the rest of your AGVs. We can remove fixed infrastructure, so we’re not using any kind of reflectors or QR codes or anything. And we’re keeping the system at a lower cost.”
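As a rough mental model of that one-shot deployment flow - purely hypothetical, with every name invented rather than taken from Opteran’s or Safelog’s actual software - the process described above amounts to recording a single driven traversal and handing the result to the rest of the fleet:

from dataclasses import dataclass, field

@dataclass
class FleetMap:
    """Route representation captured during a single teach drive (hypothetical)."""
    waypoints: list = field(default_factory=list)  # poses logged along the driven route

def teach_run(driven_poses):
    """Record one manually driven traversal; no reflectors or QR codes to install."""
    return FleetMap(waypoints=list(driven_poses))

def deploy(fleet_size, shared_map):
    """Every AGV in the fleet reuses the same map from the single teach run."""
    return [{"agv_id": i, "map": shared_map} for i in range(fleet_size)]

# Poses (x, y) logged while one robot is driven through the warehouse once
route = [(0, 0), (5, 0), (5, 8), (12, 8)]
fleet = deploy(fleet_size=20, shared_map=teach_run(route))
print(f"{len(fleet)} AGVs share one map of {len(fleet[0]['map'].waypoints)} waypoints")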

Alongside the Safelog deal, the company also has a number of other ongoing commercial applications which, for the time being, are subject to client confidentiality agreements. However, Rajan told The Engineer that these include an application on an indoor security drone for the consumer market, as well as use of the technology in the mining and automotive sectors. Whilst the range of potential application areas is almost unlimited, Rajan stressed that there are some applications the company is keen to avoid. “We are not in favour of kinetic applications,” he said. “We’ve already turned down projects in the drone industry that would have been kinetic.”

The company is now very consciously harvesting the low-hanging fruit, but its longer-term ambitions are huge: “The real opportunity here is every single machine that moves could have an Opteran mind inside,” said Rajan. “That means all the humanoid robots, all the vacuum cleaners, all the lawn mowers, all the warehouse robots, all the drones. That’s the opportunity in front of us: full general purpose autonomy inside every machine that moves on the planet. We want to be the autonomy company, the company that enables machines to move around the whole world with their own brain. That’s our ambition. We’re building the brain for everybody else’s brawn.”