Comment: How photonics could save AI from its own energy appetite

The evolution of AI doesn’t have to come at the planet’s expense, and with the right technology, it won’t, says Lieven Levrau, senior director, Product Strategy, Nokia, and chair of the Technical Steering Committee at The IOWN Global Forum.

The training of GPT-3 was estimated to have used just under 1,300MWh of electricity - AdobeStock

Ever since ChatGPT launched in late 2022, AI has been shaking up industries in ways we never imagined. Unlike traditional AI, which mostly analyses and classifies data, Generative AI can create brand new content – whether it’s text, images, or even synthetic datasets. This revolution is fuelling innovation, but there’s a catch. Training these AI models requires a huge amount of energy. The training of GPT-3 was estimated to have used just under 1,300MWh of electricity. This is roughly equivalent to the annual power consumption of 130 homes. Training the more advanced GPT-4, meanwhile, is estimated to have used 50 times more electricity. As AI adoption grows exponentially, so does its environmental impact. The challenge now is finding a way to sustain AI’s rapid growth without pushing our planet to its limits.
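The homes comparison can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes an average annual household consumption of roughly 10MWh – close to US averages, but an assumption rather than a figure from this article.

```python
# Back-of-the-envelope check of the GPT-3 training figure quoted above.
# The ~10 MWh/year average household consumption is an assumption
# (roughly in line with US averages), not a number from the article.
gpt3_training_mwh = 1_300
avg_home_mwh_per_year = 10

homes_powered_for_a_year = gpt3_training_mwh / avg_home_mwh_per_year
print(f"{homes_powered_for_a_year:.0f} homes")  # 130 homes
```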

AI’s appetite for power

One of the biggest obstacles to greener AI is legacy infrastructure. Many data centres were built long before the AI boom and simply weren’t designed to handle the immense power demands of modern GPUs. We can’t just drop high-performance chips into old, inefficient systems and expect a sudden leap in efficiency. In the past, the approach has often been to apply patchwork fixes and incremental upgrades, but these can only go so far. As AI models grow larger and more complex, their training demands keep increasing, exposing the fundamental inefficiencies of these ageing digital networks. Reports suggest that the computational power needed to sustain AI’s growth is doubling roughly every 100 days, putting further pressure on an already strained power grid.
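To put that doubling rate in perspective, compounding it over a year is simple arithmetic; the sketch below works it out (the 100-day figure is the reports’, the annualisation is this author’s arithmetic).

```python
# Demand that doubles every 100 days compounds over a 365-day year to
# 2 ** (365 / 100), i.e. roughly a 12-13x increase per year.
doublings_per_year = 365 / 100
annual_growth_factor = 2 ** doublings_per_year
print(f"{annual_growth_factor:.1f}x per year")  # ~12.6x
```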

For businesses, this presents a difficult dilemma. AI is becoming essential for staying competitive, but running high-performance GPUs carries a large cost – not just in financial terms, but also in energy and space. While some enterprises can afford the infrastructure, many are left struggling to adopt AI without dramatically increasing their carbon footprint. So, how do organisations keep up with AI advancements efficiently and responsibly?

From electronics to photonics

All-Photonics Networks (APNs) could redefine AI training. Instead of relying on traditional electronic networks, APNs use ultra-fast, energy-efficient photonic data transmission. This shift allows AI workloads to be processed remotely in next-generation green data centres that leverage renewable energy sources and advanced cooling systems to minimise environmental impact. By reducing latency and power consumption, APNs allow AI models to be trained at speeds and scales previously unattainable, paving the way for more advanced and sustainable AI development.

With remote GPU services, companies no longer need to house energy-guzzling GPUs on-site. Instead, they can offload AI training to specialised, eco-friendly, high-performance computing centres – significantly reducing their power consumption, cutting operational costs, and making AI development more sustainable. As an added benefit, these remote centres are designed for efficiency, ensuring that AI models can be trained without the unnecessary energy waste typical of traditional infrastructure.

Discussions about AI’s environmental impact often focus on its electricity consumption, but its water footprint is equally concerning. NPR reports that the average data centre uses 300,000 gallons of water a day to keep cool, roughly equivalent to the water use of 100,000 homes. This often-overlooked issue adds another layer to AI’s sustainability challenge. APN-connected data centres could help mitigate it by incorporating more efficient cooling.

Speed meets sustainability

Switching to APN-powered Large Language Model (LLM) training isn’t just about going green – it also boosts efficiency. Traditional AI training can take weeks or even months, consuming massive amounts of energy in the process. But with APNs’ ultra-low latency and high-bandwidth connections, AI models can be trained in days instead of months, improving speed while cutting down on waste.

This shift is good for the environment and a game changer for businesses. Until now, only tech giants could afford top-tier AI computing, but with remote GPU services, smaller enterprises may soon be able to access powerful AI compute resources without investing in energy-hungry, on-premises infrastructure. By democratising access to cutting-edge AI training, APNs could level the playing field, increase market competition, and make AI development more sustainable.

Balancing sustainability and security

Of course, efficiency and sustainability aren’t the only concerns. AI models rely on massive datasets that often include sensitive information, making data security a top priority. Any new infrastructure – APNs included – must support end-to-end encryption, remote attestation, and other advanced security measures to ensure that training data remains protected.

The future of AI development

The rise of APN-based training shows that technological progress and sustainability don’t have to be at odds. By minimising energy waste, reducing costs, and democratising access to high-performance computing, photonic technology is shaping a future where AI can be both powerful and environmentally responsible.

The challenge ahead isn’t whether we should make AI greener – it’s how quickly we can make it happen.

Lieven Levrau, senior director, Product Strategy, Nokia, and chair of the Technical Steering Committee at The IOWN Global Forum