Is AI Sustainable? Five Ways To Reduce Its Carbon Footprint

Artificial intelligence (AI) can be part of the solution for tackling global warming – but it’s also a significant emitter of carbon itself. Can its net contribution be positive?


Forget adding bunny ears to your selfie; AI has long since grown up and begun tackling tough, environmental problems. Its data-crunching superpowers make it ideal for everything from ocean monitoring to climate change prediction modeling. But training AI models requires vast amounts of energy, so do the benefits outweigh the environmental cost? In short, is AI sustainable?

Sustainable AI: fact or fiction?

It’s no secret that the world needs to take swift and decisive action on greenhouse gas (GHG) emissions if we’re to avoid catastrophic climate change. And it’s easy to find research that extols the virtues of AI in achieving that. Business consultants BCG, for example, estimate that AI could reduce emissions by 5% to 10% by 2030.

But it’s also easy to find plenty of articles that compare the carbon footprint of training AI models to 125 New York-Beijing round-trip flights or, to cite one 2019 research paper, the lifetime carbon footprint of five cars. So, what’s the truth? Is AI a hero or a villain?

While such polarized narratives make great headlines, the reality, as in most things, is more nuanced. AI can have environmental benefits, but it’s a balancing act between the energy used and the energy saved. So, what can be done to maximize the benefits of AI without supersizing the environmental costs?

Choose renewable energy

As one research paper puts it: “Using renewable energy grids for training neural networks is the single biggest change that can be made. It can make emissions vary by a factor of 40 between a fully renewable grid and a fully coal grid.”
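To see where that factor of 40 comes from, here is a back-of-the-envelope sketch. The energy figure and the grid carbon intensities below are illustrative assumptions, not measured values; real intensities vary by country and hour.

```python
# Rough estimate of training emissions: energy used (kWh) multiplied by the
# grid's carbon intensity (grams of CO2e per kWh). All numbers illustrative.

def training_emissions_kg(energy_kwh: float, grid_gco2_per_kwh: float) -> float:
    """Return estimated emissions in kg of CO2e."""
    return energy_kwh * grid_gco2_per_kwh / 1000.0

ENERGY_KWH = 100_000  # hypothetical training run

coal_grid = training_emissions_kg(ENERGY_KWH, 820)      # coal-heavy grid
renewable_grid = training_emissions_kg(ENERGY_KWH, 20)  # mostly renewable grid

print(f"coal grid: {coal_grid:,.0f} kg CO2e")
print(f"renewable grid: {renewable_grid:,.0f} kg CO2e")
print(f"ratio: {coal_grid / renewable_grid:.0f}x")  # ~41x
```

The same training run, the same energy: only the grid changes, and emissions swing by roughly the factor of 40 quoted above.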

Renewable energy is one of the world’s core strategies for decarbonization, but whether or not it’s available to you depends largely on where you live and the suppliers you can choose from. And the fact remains that most sources of low-carbon electricity – such as solar or wind power – are variable. Grid operators can’t turn them on and off as needed.

Digitization of the grid can help with load balancing and demand management, while energy storage can cope with short-term variations in energy availability. And AI itself can help maximize distribution efficiency and drive predictive maintenance to avoid downtime. But ultimately, a significant increase in storage is required if renewable energy is to become available to all.

Place workloads effectively

Is it better to do AI in the cloud or at the endpoint? Surprise, surprise… the situation is nuanced and the only correct answer is: it depends. Shifting workloads from the cloud to the endpoint can cut the cost of data transmission, but for some workloads the cloud is imperative. The good news, however, is that companies such as Cloudflare are working to reduce the carbon footprint of cloud compute.

Cloudflare’s mission is to build an internet that’s safe, performant, reliable, and consumes less energy. Over 25 million internet properties run on its global network, which spans more than 250 cities in over 100 countries. Its 11th generation servers, powered by Arm Neoverse-based CPUs, process an incredible 57% more internet requests per watt than its previous generation servers based on traditional CPU architectures.
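Why does performance per watt matter so much? Because for a fixed workload, the energy consumed scales inversely with it. A quick sketch using Cloudflare’s published 57% figure (the baseline requests-per-watt number is a made-up placeholder; only the ratio matters):

```python
# Energy needed to serve a fixed workload scales inversely with perf/watt.
# Baseline value is hypothetical; the 57% improvement is the figure cited above.

old_requests_per_watt = 100.0                           # placeholder baseline
new_requests_per_watt = old_requests_per_watt * 1.57    # 57% improvement

workload = 1_000_000  # requests to serve

old_energy = workload / old_requests_per_watt
new_energy = workload / new_requests_per_watt

saving = 1 - new_energy / old_energy
print(f"energy saving for the same workload: {saving:.0%}")  # ~36%
```

In other words, a 57% gain in requests per watt translates into roughly a third less energy for the same traffic.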

Consider embedded emissions

Embedded emissions simply refer to the amount of GHGs generated in the production of an asset. The embedded carbon of AI can be tracked right along the line from device to algorithm, but in Arm’s case it means the engineering workflows required to develop our intellectual property (IP).

These workflows consume billions of compute hours per year and, of course, require a significant amount of energy to power them. The challenge is to increase workflow efficiency while reducing time and energy consumption, to achieve results of the same, or higher, quality.

And here’s an interesting thing: we can use AI to reduce the embedded carbon of AI – streamlining processes and spending compute hours more efficiently. How? Well, engineers may choose to use ‘good enough’ compute: paring down workloads to just enough cycles to get the job done accurately, without wasting energy and resources. By running full test suites at key milestones, for example, but minimizing the number of tests run in between, it’s possible to reduce compute hours and conserve energy without compromising on accuracy and quality.

Maximize performance per watt

As AI becomes ever more ubiquitous, a relentless focus on efficiency will become essential for reducing its environmental impact. Performance per watt will become the new measure of success.

But to stop climate change in its tracks, keeping power and energy numbers stable is not enough. We need to take a carbon-first approach, treating carbon as a vital statistic alongside power, performance and area.

By actively investigating new ways to tighten the power envelope, we can help AI stay on the right side of history as part of the climate solution and a more sustainable future.

Think! Does it need AI?

Perhaps one of the most important questions to consider is: does it need AI? Sure, it’s nice that your coffee machine recognizes your face and makes your morning cup of Joe accordingly. But if we truly want to avoid dangerous levels of global warming, we’re going to have to take a long, hard look at what we consider to be essential compute – and work to reduce or remove non-essential workloads. If you can just as easily tap in your coffee order and save some energy, for example, why complicate things?

Of course, there are heftier workloads than AI coffee machines, but the principle applies across the board. We can no longer afford to be profligate with our resources; we need to ensure that the benefit outweighs the cost. And if that means bye-bye AI coffee, then so be it.
