If you want proof of Microsoft’s progress toward its environmental goals, look closer to Earth: to a construction site on an industrial estate in west London.
The company’s Park Royal data center is part of its drive to expand artificial intelligence (AI), but that ambition runs counter to its goal of being carbon negative by 2030.
Microsoft says the center will run entirely on renewable energy. However, the construction of data centers and the servers that fill them means that the company’s scope 3 emissions – such as CO2 related to the materials in its buildings and the electricity people use when using products like Xbox – are more than 30% above their 2020 levels. As a result, the company is exceeding its overall emissions target by about the same percentage.
This week, Microsoft co-founder Bill Gates claimed that AI would help combat climate change, as big tech companies are “seriously willing” to pay extra to use clean electricity sources, so they can “say they use green energy.”
In the short term, AI has been problematic for Microsoft’s green goals. Brad Smith, the outspoken president of Microsoft, once called the company’s carbon ambitions a “moonshot.” In May, stretching that metaphor to breaking point, he admitted that because of the AI strategy “the moon has moved.” Microsoft plans to spend £2.5 billion over the next three years growing its AI data center infrastructure in the UK and has announced new data center projects around the world this year, including in the US, Japan, Spain and Germany.
Training and operating the AI models that underpin products like OpenAI’s ChatGPT and Google’s Gemini consumes a lot of electricity to power and cool the associated hardware, with additional carbon generated by manufacturing and transporting the associated equipment.
“It is a technology that increases energy consumption,” says Alex de Vries, the founder of Digiconomist, a website that monitors the environmental impact of new technologies.
The International Energy Agency estimates that total electricity consumption by data centers could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan. According to calculations by research firm SemiAnalysis, AI will result in data centers using 4.5% of global energy generation by 2030.
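To put those projections side by side, a back-of-envelope sketch helps. The global generation figure below is an assumption for illustration (roughly 30,000 TWh per year in recent years), not a number from the article:

```python
# Rough sanity check on the data center projections cited above.
# Assumed figure (not from the article): global electricity generation
# is on the order of 30,000 TWh per year.
GLOBAL_GENERATION_TWH = 30_000

# IEA projection cited in the article: data centers at 1,000 TWh in 2026.
datacenter_2026_twh = 1_000
share_2026 = datacenter_2026_twh / GLOBAL_GENERATION_TWH
print(f"Data centers in 2026: ~{share_2026:.1%} of global generation")

# SemiAnalysis figure cited in the article: AI alone at 4.5% by 2030.
ai_2030_twh = 0.045 * GLOBAL_GENERATION_TWH
print(f"Implied AI data center use in 2030: ~{ai_2030_twh:.0f} TWh")
```

On those assumed numbers, all data centers in 2026 would account for roughly 3% of global generation, with AI alone implying well over 1,000 TWh by 2030.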
It means that, alongside concerns about AI’s impact on jobs and even human survival, the technology’s environmental cost is now part of the debate. Last week, the International Monetary Fund said governments should consider making AI bear its environmental costs, whether through a blanket carbon tax that captures server emissions within its reach or through more targeted methods, such as a specific tax on the CO2 generated by that equipment.
All the big tech companies involved in AI – Meta, Google, Amazon, Microsoft – are looking to renewable energy sources to meet their climate goals. In January, Amazon, the world’s largest corporate buyer of renewable energy, announced it had bought more than half the output of an offshore wind farm in Scotland, while Microsoft said in May it was backing $10bn (£7.9bn) worth of renewable energy projects. Google aims to run its data centers entirely on carbon-free energy by 2030.
A Microsoft spokesperson said: “We remain steadfast in our commitment to meeting our climate goals.”
Microsoft co-founder Bill Gates, who stepped down from the company’s board in 2020 but retains a stake through the Gates Foundation Trust, has argued that AI can directly help fight climate change. The additional demand for electricity would be matched by new investments in green generation, he said on Thursday, which would more than compensate for AI’s consumption.
A recent UK government report agreed, stating that the “carbon intensity of the energy source is an important variable” when calculating AI-related emissions, though it added that “a significant proportion of AI training globally still relies on high-carbon sources such as coal or natural gas”. The water needed to cool servers is also an issue, with one study estimating that AI could be responsible for up to 6.6 billion cubic metres of water use by 2027 – almost two-thirds of England’s annual consumption.
De Vries argues that the hunt for sustainable computing power is putting pressure on the limited supply of renewable energy, pushing other parts of the global economy back onto fossil fuels.
“More energy consumption means we don’t have enough renewables to accommodate that increase,” he says.
NexGen Cloud, a UK-based provider of sustainable cloud computing – the data center-dependent industry that delivers IT services such as data storage and computing power over the internet – says renewable energy is available for AI-related computing if data centers are built away from cities and next to sources of hydropower or geothermal energy.
Youlian Tzanev, co-founder of NexGen Cloud, says: “The norm in the industry is to build around economic hubs rather than around sources of renewable energy.”
This makes it harder for any AI-focused tech company to meet carbon targets. Amazon, the world’s largest cloud computing provider, aims to be net zero by 2040 – removing as much CO2 as it emits – and to match its global electricity use with 100% renewable energy by 2025. Google and Meta are pursuing the same net zero goal by 2030. OpenAI, the developer of ChatGPT, uses Microsoft data centers to train and operate its products.
There are two main ways that large language models – the technology underlying chatbots like ChatGPT or Gemini – consume energy. The first is the training phase, in which a model is fed reams of data pulled from the internet and beyond and builds a statistical understanding of language itself, ultimately allowing it to produce convincing-looking answers to questions.
The initial energy cost of training AI is astronomical, which keeps smaller companies (and even smaller governments) out of the sector unless they have $100 million to spare for a training run. But it is small compared with the cost of actually running the resulting models, a process known as “inference.” According to analyst Brent Thill, at the investment firm Jefferies, 90% of AI’s energy costs come in that inference phase: the electricity used when people ask an AI system to answer factual questions, summarize a chunk of text or write an academic essay.
The electricity used for training and inference is routed through a vast and growing digital infrastructure. Data centers are filled with servers, built from the ground up for the specific slice of the AI workload they’re running. A single training server might have a central processing unit (CPU) barely more powerful than the one in your own computer, combined with dozens of specialized graphics processing units (GPUs) or tensor processing units (TPUs)—microchips designed to quickly crunch through the vast amounts of simple computations that make up AI models.
If you’re using a chatbot and watching it type out its answers word by word, a powerful GPU is drawing about a quarter of the power it would take to boil a kettle. All of this is hosted in a data center, whether it’s owned by the AI provider itself or by a third party – in which case it might be called “the cloud,” a fancy name for someone else’s computer.
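The kettle comparison can be made concrete with rough numbers. Both wattages below are assumptions for illustration, not figures from the article: a UK electric kettle draws around 2.8 kW, and a high-end data center GPU around 700 W under load.

```python
# Rough check on the "quarter of a kettle" comparison.
# Assumed values (not from the article):
KETTLE_W = 2800  # typical UK electric kettle element
GPU_W = 700      # high-end data center GPU under sustained load

ratio = GPU_W / KETTLE_W
print(f"GPU draw is ~{ratio:.0%} of a kettle")  # ~25%
```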
SemiAnalysis estimates that if generative AI were integrated into every Google search, this could translate into annual energy consumption of 29.2 TWh, similar to what Ireland consumes in a year, although the financial cost to the tech company would be prohibitive. That has led to speculation that the search company may start charging for some AI tools.
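A figure like 29.2 TWh is consistent with fleet-level arithmetic: number of servers, times power draw, times hours in a year. The server count and per-server wattage below are assumptions chosen for illustration, not figures from the article:

```python
# Illustrative reconstruction of a fleet-level energy estimate.
# Assumed inputs (not from the article):
servers = 512_000          # AI servers needed for generative search at scale
power_per_server_kw = 6.5  # draw per server, including cooling overhead
hours_per_year = 24 * 365

# kW * hours = kWh; 1 TWh = 1e9 kWh
energy_twh = servers * power_per_server_kw * hours_per_year / 1e9
print(f"~{energy_twh:.1f} TWh per year")  # ~29.2 TWh
```

The point of the sketch is that annual energy scales linearly with fleet size, which is why integrating generative AI into every search query moves the total so dramatically.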
But some argue that looking at the energy overhead for AI is the wrong lens. Instead, consider the energy the new tools can save. A provocative paper in Nature’s peer-reviewed journal Scientific Reports earlier this year argued that the carbon footprint of writing and illustrating is lower for AI than for humans.
Researchers at the University of California, Irvine estimate that AI systems emit “between 130 and 1,500 times” less carbon dioxide per page of text generated than human writers, and up to 2,900 times less per image.
Left unsaid, of course, is what those human writers and illustrators do instead. Refocusing and retraining their labor in another area – such as green jobs – could be another step in the right direction.