AI drive brings Microsoft’s ‘green moonshot’ down to earth in west London


An artist’s impression of the Microsoft Park Royal datacentre, now under construction in west London. Photograph: Microsoft

If you want evidence of Microsoft’s progress towards its environmental “moonshot” goal, then look closer to earth: at a building site on a west London industrial estate.

The company’s Park Royal datacentre is part of its commitment to drive the expansion of artificial intelligence (AI), but that ambition jars with its target of becoming carbon negative by 2030.

Microsoft says the centre will be run fully on renewable energy. However, constructing datacentres and the servers that fill them means the company’s scope 3 emissions – such as CO2 related to the materials in its buildings and the electricity people consume when using products such as the Xbox – are more than 30% above their 2020 level. As a result, the company is exceeding its overall emissions target by roughly the same margin.

This week, Microsoft’s co-founder, Bill Gates, claimed AI would help combat climate change because big tech is “seriously willing” to pay extra to use clean electricity sources in order “to say that they’re using green energy”.

In the short term, AI has been problematic for Microsoft’s green goals. Brad Smith, Microsoft’s outspoken president, once called its carbon ambitions a “moonshot”. In May, stretching that metaphor to breaking point, he admitted that because of its AI strategy, “the moon has moved”. Microsoft plans to spend £2.5bn over the next three years on growing its AI datacentre infrastructure in the UK, and this year has announced new datacentre projects around the world, including in the US, Japan, Spain and Germany.

Training and operating the AI models that underpin products such as OpenAI’s ChatGPT and Google’s Gemini uses a lot of electricity to power and cool the associated hardware, with additional carbon generated by making and transporting the related equipment.

“It is a technology that is driving up energy consumption,” says Alex de Vries, the founder of Digiconomist, a website monitoring the environmental impact of new technologies.

The International Energy Agency estimates that datacentres’ total electricity consumption could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan. AI will result in datacentres using 4.5% of global energy generation by 2030, according to calculations by research firm SemiAnalysis.

It means that amid the concerns about AI’s impact on jobs and humanity’s longevity, the environment is featuring, too. Last week, the International Monetary Fund said governments should consider imposing carbon taxes to capture the environmental cost of AI – either through a general carbon levy whose reach includes emissions from servers, or through other methods such as a specific tax on the CO2 generated by that equipment.

All the big tech firms involved in AI – Meta, Google, Amazon, Microsoft – are seeking renewable energy resources to meet their climate targets. In January, Amazon, the world’s largest corporate purchaser of renewable energy, announced it had bought more than half the output of an offshore windfarm in Scotland, while Microsoft said in May it was backing $10bn (£7.9bn) in renewable energy projects. Google aims to run its datacentres entirely on carbon-free energy by 2030.

A spokesperson for Microsoft said: “We remain resolute in our commitment to meet our climate goals.”

Microsoft co-founder Bill Gates, who left the company’s board in 2020 but retains a stake via the Gates Foundation Trust, has argued that AI can directly help fight climate change. The extra electricity demand would be matched by new investments in green generation, he said on Thursday, which would more than compensate for the increased use.

A recent UK government-backed report agreed, stating that the “carbon intensity of the energy source is a key variable” in calculating AI-related emissions, although it added that “a significant portion of AI training globally still relies on high-carbon sources such as coal or natural gas”. The water needed to cool servers is also an issue, with one study estimating that AI could account for up to 6.6bn cubic metres of water use by 2027 – nearly two-thirds of England’s annual consumption.

De Vries argues that the chase for sustainable computing power puts a strain on demand for renewable energy, which would result in fossil fuels picking up the slack in other parts of the global economy.

“More energy consumption means we don’t have enough renewables to feed that increase,” he says.

NexGen Cloud, a UK firm that provides sustainable cloud computing – the datacentre-reliant business of delivering IT services such as data storage and computing power over the internet – says renewable energy for AI-related computing is available to datacentres if they avoid cities and are sited next to sources of hydro or geothermal power.

Youlian Tzanev, NexGen Cloud’s co-founder, says: “The industry norm has been to build around economic hubs rather than sources of renewable energy.”

This makes it more difficult for any AI-focused tech company to hit carbon goals. Amazon, the world’s biggest cloud computing provider, aims to be net zero – removing as much carbon as it emits – by 2040 and to match its global electricity use with 100% renewable energy by 2025. Google and Meta are pursuing the same net zero goal by 2030. OpenAI, the developer of ChatGPT, uses Microsoft datacentres to train and operate its products.

There are two key ways in which large language models – the technology that underpins chatbots such as ChatGPT or Gemini – consume energy. The first is the training phase, where a model is fed reams of data culled from the internet and beyond, and builds a statistical understanding of language itself, which ultimately enables it to churn out convincing-looking answers to queries.

The upfront energy cost of training AI is astronomical. That keeps smaller companies (and even smaller governments) from competing in the sector, if they do not have a spare $100m to throw at a training run. But it is dwarfed by the cost of actually running the resulting models, a process known as “inference”. According to Brent Thill, an analyst at the investment firm Jefferies, 90% of the energy cost of AI sits in that inference phase: the electricity used when people ask an AI system to respond to factual queries, summarise a chunk of text or write an academic essay.
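A back-of-envelope sketch of that split, for illustration only: the 90% inference share is the Jefferies figure quoted above, but the annual total is a hypothetical placeholder, not a number from the article.

```python
# Illustrative split of AI electricity use between training and inference.
# The 90% inference share is the Jefferies estimate quoted in the text;
# the annual total is an assumed placeholder, not a reported figure.

ASSUMED_TOTAL_TWH = 100.0   # hypothetical annual AI electricity use, in TWh
INFERENCE_SHARE = 0.90      # share spent answering queries (Jefferies estimate)

inference_twh = ASSUMED_TOTAL_TWH * INFERENCE_SHARE
training_twh = ASSUMED_TOTAL_TWH - inference_twh

print(f"Inference: {inference_twh:.0f} TWh, training: {training_twh:.0f} TWh")
# On these assumptions, serving answers uses nine times more electricity a year
# than the headline-grabbing training runs.
```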

The electricity used for training and inference is funnelled through an enormous and growing digital infrastructure. The datacentres are filled with servers, which are built from the ground up for the specific part of the AI workload they sit in. A single training server may have a central processing unit (CPU) barely more powerful than the one in your own computer, paired with tens of specialised graphics processing units (GPUs) or tensor processing units (TPUs) – microchips designed to rapidly plough through the vast quantities of simple calculations that AI models are made of.

If you use a chatbot, as you watch it spit out answers word by word, a powerful GPU is using about a quarter of the power required to boil a kettle. All of this is being hosted by a datacentre, whether owned by the AI provider itself or a third party – in which case it might be called “the cloud”, a fancy name for someone else’s computer.
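To put the kettle comparison into rough numbers – the wattages below are rounded assumptions for illustration, not figures from the article – a UK kettle draws around 3 kW, so a quarter of that is roughly 750 W, in the range of a modern high-end AI accelerator.

```python
# Rough arithmetic behind the kettle comparison. Both wattages are rounded,
# assumed values used purely for illustration.

KETTLE_WATTS = 3000   # a typical UK kettle, assumed
GPU_WATTS = 750       # assumed board power of a high-end AI accelerator

print(f"GPU draw ≈ {GPU_WATTS / KETTLE_WATTS:.0%} of a kettle")   # ≈ 25%

# Energy used if a single answer keeps that GPU busy for one minute:
answer_wh = GPU_WATTS * (60 / 3600)   # watt-hours
print(f"≈ {answer_wh:.1f} Wh per minute of generation")           # ≈ 12.5 Wh
```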

SemiAnalysis estimates that if generative AI were integrated into every Google search, this could translate into annual energy consumption of 29.2 TWh, comparable with what Ireland consumes in a year, although the financial cost to the tech company would be prohibitive. That has led to speculation that the search company may start charging for some AI tools.
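As a rough illustration of how a small per-query cost compounds into a country-sized annual total, the sketch below multiplies an assumed per-search energy cost by an assumed daily search volume; both inputs are hypothetical, chosen only to land near the 29.2 TWh order of magnitude cited above, and are not figures from SemiAnalysis.

```python
# Minimal sketch: scaling a per-query energy cost to an annual, national-scale
# total. The per-search energy and daily search volume are assumptions chosen
# for illustration, not figures from SemiAnalysis or the article.

WH_PER_AI_SEARCH = 8.9             # assumed watt-hours per AI-assisted search
SEARCHES_PER_DAY = 9_000_000_000   # assumed daily Google searches

annual_twh = WH_PER_AI_SEARCH * SEARCHES_PER_DAY * 365 / 1e12  # 1 TWh = 1e12 Wh
print(f"≈ {annual_twh:.1f} TWh per year")  # ≈ 29.2 TWh on these assumptions
```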

But some argue that looking at the energy overhead for AI is the wrong lens. Instead, consider the energy the new tools can save. A provocative paper in Nature’s peer-reviewed Scientific Reports journal earlier this year argued that the carbon emissions of writing and illustrating are lower for AI than for humans.

AI systems emit “between 130 and 1,500 times” less carbon dioxide per page of text generated than human writers, the researchers from the University of California, Irvine estimated, and up to 2,900 times less per image.

Left unsaid, of course, is what those human writers and illustrators are doing instead. Redirecting and retraining their labour in another field – such as green jobs – could be another moonshot.
