AI tool will scan wildlife photos for climate change clues

An international team of researchers has developed a tool to help AI systems sift through wildlife images and look for clues about how nature is responding to climate change.


Known as INQUIRE, the tool will test how well AI algorithms can mine image banks for details such as what animals are eating, how healthy they are, and with which other species they are interacting. It will measure AI’s ability to draw conclusions from an image bank of five million wildlife photos uploaded to the iNaturalist citizen science website.
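To illustrate the kind of task INQUIRE evaluates, the sketch below matches a natural-language query against a handful of wildlife photos using an off-the-shelf vision-language model (CLIP, via the Hugging Face transformers library). This is not the INQUIRE benchmark itself; the model choice, the example query and the file names are assumptions made purely for illustration.

```python
# Minimal sketch of natural-language image retrieval with CLIP,
# the kind of text-to-image matching INQUIRE is designed to benchmark.
# Model choice, query and file names are illustrative assumptions,
# not part of the INQUIRE benchmark.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
model.eval()

# Hypothetical local copies of citizen-science wildlife photos.
image_paths = ["photo_001.jpg", "photo_002.jpg", "photo_003.jpg"]
images = [Image.open(p).convert("RGB") for p in image_paths]

# An ecologist-style query about behaviour, not just species identity.
query = "a hummingbird feeding on a red flower"

inputs = processor(text=[query], images=images, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_text holds the similarity of the query to each image;
# ranking by it surfaces the photos most relevant to the question.
scores = outputs.logits_per_text[0]
ranked = sorted(zip(image_paths, scores.tolist()), key=lambda x: -x[1])
for path, score in ranked:
    print(f"{path}: {score:.2f}")
```

As the researchers note later in this article, queries involving technical terminology or subtle visual distinctions remain difficult for current models of this kind, which is exactly the gap INQUIRE is intended to measure.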


INQUIRE was developed by researchers from organisations including the University of Edinburgh, University College London, UMass Amherst, iNaturalist and the Massachusetts Institute of Technology (MIT). It’s hoped the tool will help reveal key insights into the impacts of climate change, pollution, habitat loss and other pressures on tens of thousands of animal and plant species.

“The thousands of wildlife photos uploaded to the internet each day provide scientists with valuable insights into where different species can be found on Earth,” said Dr Oisin Mac Aodha, Reader in Machine Learning at the University of Edinburgh. “However, knowing what species is in a photo is just the tip of the iceberg.

“These images are potentially a hugely rich resource that remains largely untapped. Being able to quickly and accurately comb through the wealth of information they contain could offer vital clues about how species are responding to multi-faceted challenges like climate change.”

According to the researchers, INQUIRE’s findings to date highlight opportunities to develop new AI algorithms that can better help scientists efficiently explore vast image collections.

“This careful curation of data, with a focus on capturing real examples of scientific inquiries across research areas in ecology and environmental science, has proven vital to expanding our understanding of the current capabilities of AI methods in these potentially impactful scientific settings,” said Dr Sara Beery, assistant professor at MIT.

“It has also outlined gaps in current research that we can now work to address, particularly for complex compositional queries, technical terminology, and the fine-grained, subtle differences that delineate categories of interest for our collaborators.”

The team’s peer-reviewed findings will be presented at NeurIPS, a leading machine learning conference.