AI has the power to be a global force for good. It’s reducing conflict between humans and snow leopards. It’s increasing chili yields in India by 21 per cent. And it’s helping to make mental health conditions easier to diagnose and supporting tough decisions about conservation projects.
However, AI also has a substantial carbon footprint if not handled correctly. A standard internet search uses approximately 0.3 watt-hours of energy; an AI search uses ten times more. Alongside this power usage comes water consumption. An AI-powered search uses approximately a single shot of water (25ml), whereas its non-AI-powered equivalent uses only half a millilitre. In short, AI is substantially hungrier and thirstier than ‘conventional’ computing tasks.
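To make those per-query figures more tangible, here is a minimal Python sketch that scales them up to a hypothetical day of searches; the per-search numbers are the approximations quoted above, and the daily query volume is an assumption chosen purely for illustration.

```python
# Illustrative only: per-search figures are the approximate values quoted above.
STANDARD_ENERGY_WH = 0.3      # energy per conventional web search (watt-hours)
AI_ENERGY_WH = 0.3 * 10       # an AI-assisted search uses roughly ten times more
STANDARD_WATER_ML = 0.5       # water per conventional search (millilitres)
AI_WATER_ML = 25.0            # water per AI-assisted search (millilitres)

searches_per_day = 1_000_000  # hypothetical daily query volume (assumption)

def daily_footprint(energy_wh: float, water_ml: float, queries: int) -> tuple[float, float]:
    """Return (kilowatt-hours, litres) for a day's worth of queries."""
    return energy_wh * queries / 1000, water_ml * queries / 1000

print("Standard:", daily_footprint(STANDARD_ENERGY_WH, STANDARD_WATER_ML, searches_per_day))
print("AI:      ", daily_footprint(AI_ENERGY_WH, AI_WATER_ML, searches_per_day))
# For a million queries: roughly 300 kWh vs 3,000 kWh of energy,
# and roughly 500 litres vs 25,000 litres of water.
```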
On this basis, many people might opt to reduce their use of AI tools, but what if said tools could prevent human or snow leopard injuries, or help to improve the lives of chili farmers? It’s a valid question, and before we write off new technologies, it’s important to understand the bigger picture. After all, even a single pair of jeans takes just shy of 4,000 litres of water to make – and the lithium needed for 125 EV batteries will consume almost half a million litres, or roughly 4,000 litres per battery.
Understand impact
AI tools need to be developed responsibly from the get-go, and that means that they should be created within ethical parameters. Organisations like the Alan Turing Institute and the World Economic Forum have clear guidelines around factors such as bias and fairness, transparency, accountability, robustness, data protection, privacy and avoidance of harm, which are all aimed at guiding developers in creating AI for good.
And some impacts are not as pronounced as they seem at first pass. After all, high electricity consumption is not necessarily a bad thing if the power comes directly from solar, wind or hydroelectric sources. Clearly, in most industrial and residential settings, power comes from the grid, which is a mix of both renewable and non-renewable sources. This mix varies substantially by country: the UK’s grid is fuelled by at least 40 per cent renewable sources, with low-carbon sources such as nuclear contributing an additional 14 per cent or so in recent years.
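To show how the grid mix changes the picture, here is a rough Python sketch that blends emissions factors by generation share; the shares loosely follow the UK figures above, while the gCO2e/kWh values are placeholder assumptions rather than official statistics.

```python
# Illustrative sketch: blend assumed emissions factors by grid share.
# Shares loosely follow the UK figures quoted above; the gCO2e/kWh
# factors are placeholder assumptions, not official values.
grid_mix = {
    "renewables": {"share": 0.40, "gco2e_per_kwh": 30},
    "nuclear":    {"share": 0.14, "gco2e_per_kwh": 12},
    "fossil":     {"share": 0.46, "gco2e_per_kwh": 450},
}

blended = sum(src["share"] * src["gco2e_per_kwh"] for src in grid_mix.values())
print(f"Blended grid intensity: ~{blended:.0f} gCO2e per kWh")
# The same AI workload drawn from this mix has a very different footprint
# from one powered entirely by fossil generation (~450 gCO2e per kWh here).
```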
This is even more complex when we look at resources like water; although industry might consume significant amounts of water to cool servers, that water doesn’t simply vanish. However, its use can disrupt local ecosystems, and it may not return to the same place, or may not return for a long time.
Heat, water and code
As we alluded to, one of the first questions that organisations should ask when considering an AI strategy is whether a given application really needs AI – and if so, does it need high-powered GPUs? In many cases, the answer is yes, but in others, lower-powered GPUs can do the work just as well. For example, a typical Nvidia H100 GPU has a lifecycle carbon footprint of around 150 kgCO2e; an Nvidia L4 has a footprint roughly a third of that size.
As a further point of comparison, the carbon footprint of a traditional CPU for non-AI tasks is usually in the region of 5-25 kgCO2e, far less than that of a single pair of jeans (33-80 kgCO2e).
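Putting those figures side by side, the following Python sketch compares the lifecycle footprints mentioned above; the L4 and midpoint values are derived from the ranges quoted, so treat them as order-of-magnitude illustrations rather than measured data.

```python
# Rough lifecycle footprints (kgCO2e) drawn from the figures quoted above;
# order-of-magnitude assumptions, not measured values.
footprints_kgco2e = {
    "Nvidia H100 GPU": 150,
    "Nvidia L4 GPU": 150 / 3,          # "roughly a third" of the H100 figure
    "Typical server CPU": (5 + 25) / 2,   # midpoint of the 5-25 range
    "Pair of jeans": (33 + 80) / 2,        # midpoint of the 33-80 range
}

for item, kg in sorted(footprints_kgco2e.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item:20s} ~{kg:5.0f} kgCO2e over its lifecycle")
```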
One of the major concerns about AI is that its power usage puts more waste heat into the atmosphere. Although waste heat from datacentres can be used elsewhere, it’s not always simple. Many datacentre facilities are focused on removing heat from servers as quickly and efficiently as possible. Often, the warm water leaving a facility is only forty or fifty degrees centigrade – about as hot as the water from a domestic hot tap. However, with some re-engineering, this can be changed so that the heat is easier to reuse, for example in warming nearby buildings.
However, it’s important to look even more broadly to make sure that AI’s ‘supply chain’ is as green as possible. Above and beyond its power and water usage, we should also consider the manufacturing footprint of the datacentre and the servers within it, whether the equipment is reused or recycled further down the line, and what happens to equipment that can’t be recycled. Developments are being made in these areas all the time, and we have even seen startups emerge to help separate valuable metals from motherboards using electrolysis, reducing the amount of mining needed to extract new metals.
Finally, we should also consider AI software itself. Significant attention has been paid to the carbon impact of hardware and physical buildings, but we mustn’t forget about the code and AI applications. More efficient code and coding languages can dramatically decrease the amount of power needed to run AI, but there are also trade-offs: languages like Rust can be more powerful and more energy efficient, but security flaws in such powerful languages carry proportionally higher risk.
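As a toy illustration of how implementation choices affect energy use, the Python sketch below performs the same computation two ways; runtime on the same hardware is used here as a rough proxy for energy draw, and the speed-up will vary by machine.

```python
# Toy illustration: the same computation done two ways. Runtime is a rough
# proxy for energy use on the same hardware, so the faster path is also
# typically the lower-energy path. Figures will vary by machine.
import time
import numpy as np

data = np.random.rand(2_000_000)

start = time.perf_counter()
total_loop = 0.0
for x in data:                          # interpreted, element-by-element
    total_loop += x * x
loop_seconds = time.perf_counter() - start

start = time.perf_counter()
total_vec = float(np.dot(data, data))   # vectorised, runs in optimised native code
vec_seconds = time.perf_counter() - start

assert abs(total_loop - total_vec) / total_vec < 1e-6  # same result either way
print(f"Loop:       {loop_seconds:.2f} s")
print(f"Vectorised: {vec_seconds:.4f} s  (~{loop_seconds / vec_seconds:.0f}x faster)")
```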
Cultivating technological mindfulness
Our global environment director is fond of saying that a good sustainability strategy isn’t necessarily about reducing certain numbers to zero; it’s more about making sure that we can answer today’s needs without compromising the world of tomorrow.
AI is no exception to this, and the good news is that technology is getting more efficient all the time. For example, between 2010 and 2018, the amount of computing done in datacentres across the globe increased by more than 550 per cent, but energy usage increased by just six per cent.
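A quick back-of-the-envelope check shows what those two percentages imply about efficiency; the figures are taken directly from the paragraph above, and the calculation is only a rough illustration.

```python
# Back-of-the-envelope check on the figures above: if computing grew by
# 550 per cent while energy use grew by just 6 per cent, energy per unit
# of compute fell to roughly a sixth of its 2010 level.
compute_growth = 5.50   # +550 per cent
energy_growth = 0.06    # +6 per cent

energy_per_compute = (1 + energy_growth) / (1 + compute_growth)
print(f"Energy per unit of compute in 2018: ~{energy_per_compute:.0%} of the 2010 level")
# ~16 per cent, i.e. roughly a six-fold efficiency improvement.
```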
We’re clearly getting better at being efficient, and there is increasing awareness of this across the supply chain, from chip manufacturers to datacentres and users.
Emma Dennard, VP Northern Europe, OVHcloud