According to the University of Oxford, the breakthrough could be employed on remote-sensing satellites to enable real-time monitoring and decision making for a range of applications.
Data collected by remote-sensing satellites enable aerial mapping, weather prediction, and deforestation monitoring. Most satellites, however, only collect data passively: they are not equipped to make decisions or detect changes onboard. Instead, the data are relayed to Earth for processing, which is time-consuming and limits the ability to identify and respond to rapidly emerging events, such as a natural disaster.
To overcome these restrictions, a group of researchers led by DPhil student Vít Růžička from Oxford’s Department of Computer Science took on the challenge of training the first machine learning program in outer space. In 2022, the team pitched their idea to the Dashing through the Stars mission, which had issued an open call for project proposals to be carried out on board the ION SCV004 satellite, launched in January 2022. In the autumn of 2022, the team uplinked the program’s code to the satellite, which was already in orbit.
The researchers trained a simple model onboard the satellite to detect changes in cloud cover from aerial images, rather than training it on the ground. The model is based on few-shot learning, an approach that enables a model to identify the most important features to look for when it has only a few samples to train from. A key advantage is that the data can be compressed into smaller representations, making the model faster and more efficient.
In a statement, Vít Růžička said: "The model we developed, called RaVAEn, first compresses the large image files into vectors of 128 numbers. During the training phase, the model learns to keep only the informative values in this vector; the ones that relate to the change it is trying to detect. This results in extremely fast training due to having only a very small classification model to train."
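The compression step described above can be illustrated with a minimal sketch. The real RaVAEn model uses a variational autoencoder trained on the ground; here a fixed random linear projection stands in for that trained encoder, purely to show the shape of the operation: a large image tile is mapped to a vector of 128 numbers. The tile size and projection matrix are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 128  # RaVAEn compresses each tile to a vector of 128 numbers

# Stand-in for the trained encoder: a fixed linear projection.
# (The real encoder is a variational autoencoder; this matrix is
# a hypothetical placeholder used only to illustrate the idea.)
TILE_PIXELS = 32 * 32
W = rng.standard_normal((TILE_PIXELS, LATENT_DIM)) / np.sqrt(TILE_PIXELS)

def encode(tile: np.ndarray) -> np.ndarray:
    """Compress a 32x32 grayscale tile into a 128-dimensional latent vector."""
    return tile.reshape(-1) @ W

tile = rng.random((32, 32))  # a toy grayscale tile
z = encode(tile)
print(z.shape)  # the compressed representation: 128 values per tile
```

Downstream tasks, such as the onboard cloud classifier, then operate on these small vectors instead of the raw imagery, which is what makes the rest of the pipeline so lightweight.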
While the first part of the model, which compresses newly seen images, was trained on the ground, the second part, which decides whether or not an image contains clouds, was trained onboard the satellite.
Developing a machine learning model usually requires several rounds of training on a cluster of linked computers. In contrast, the team’s tiny model completed its training phase, using over 1,300 images, in around one and a half seconds.
When the team tested the model’s performance on novel data, it automatically detected whether a cloud was present in around a tenth of a second. This involved encoding and analysing a scene covering an area of roughly 4.8 × 4.8 km.
According to the researchers, the model could be adapted to carry out different tasks and to use other forms of data.
Vít Růžička said: “Having achieved this demonstration, we now intend to develop more advanced models that can automatically differentiate between changes of interest - for instance flooding, fires, and deforestation - and natural changes. Another aim is to develop models for more complex data, including images from hyperspectral satellites. This could allow, for instance, the detection of methane leaks, and would have key implications for combatting climate change.”
Performing machine learning in outer space could also help compensate for the harsh environmental conditions that degrade onboard satellite sensors and would otherwise require regular recalibration.
This project was conducted in collaboration with the European Space Agency (ESA) Φ-lab via the Cognitive Cloud Computing in Space (3CS) campaign and the Trillium Technologies initiative Networked Intelligence in Space (NIO.space) and partners at D-Orbit and Unibap. The team’s findings are detailed in Scientific Reports.