The carbon quandary: AI, big data, and impending environmental crisis
Imagine this: skies tinged orange, air thick with a burnt odour, and trees resembling extras in a zombie movie, their skeletal branches stretching out. Cities that once bustled now echo as ghost towns, with only occasional tumbleweeds drifting through. It might seem like a scene from a post-apocalyptic film, but it could become our reality if we fail to address carbon emissions.
Enter Artificial Intelligence (AI)—our potential superhero or unexpected villain in the climate change saga. According to recent research published in the Nature journal Scientific Reports, AI systems like GPT-3 and BLOOM might be our secret weapon against carbon emissions. But hold your applause. Is it really as simple as letting robots save the day?
AI: The Unlikely Savior
Let's look at the figures. Training GPT-3, one of the most advanced AI systems, generates an astonishing 552 metric tons of CO2e (carbon dioxide equivalent). Alarming, right? However, spread across the millions of tasks it performs, each query emits only about 2.2 grams of CO2e. BLOOM, another AI system, is even more efficient, emitting around 1.6 grams of CO2e per query. Now, compare this to a human writer. A writer in the US produces approximately 1,400 grams of CO2e per page, while an Indian writer, working in a less energy-intensive environment, generates about 180 grams of CO2e per page. It is like contrasting a Prius with a gas-guzzling SUV: the AI is far more eco-friendly.
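To see how these per-query and per-page figures relate, here is a back-of-envelope sketch in Python. It assumes, purely for illustration, that one AI query yields one page of text; the study's own accounting (which amortises training emissions and measures actual output) differs, so the ratios below are of the same order as the study's, not a reproduction of them.

```python
# Back-of-envelope comparison using the figures quoted above.
# Assumption of this sketch (not a claim of the study): one query = one page.

AI_G_PER_QUERY = {"GPT-3": 2.2, "BLOOM": 1.6}                  # grams CO2e per query
HUMAN_G_PER_PAGE = {"US writer": 1400, "Indian writer": 180}   # grams CO2e per page

# Ratio of human to AI emissions for each pairing
ratios = {
    (ai, human): human_g / ai_g
    for ai, ai_g in AI_G_PER_QUERY.items()
    for human, human_g in HUMAN_G_PER_PAGE.items()
}

for (ai, human), r in sorted(ratios.items()):
    print(f"{ai} vs {human}: roughly {r:.0f}x less CO2e per page")
```

Even under this crude assumption, the gap spans two to three orders of magnitude, which is why the averaged figure per query looks so small despite the enormous one-off training cost.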
The study shows that AI systems emit 130 to 1,500 times less CO2e per page of text than human writers, and that AI illustration systems produce 310 to 2,900 times less CO2e per image than human artists. Despite the high initial emissions from training, AI's efficiency and scalability significantly reduce the overall environmental impact when deployed extensively. The research underscores the potential for substantial reductions—up to a hundred- or thousand-fold—in the environmental footprint of these everyday human activities.
The Devil is in the Details
However, before we celebrate AI as our climate saviour, we need to examine the claim more closely. What about AI's other use cases? And the energy and recycling costs of AI hardware, although lower over time than those of the equivalent human activities, still present challenges.
A 2019 study from the University of Massachusetts Amherst found that training a single large language model can produce carbon emissions equivalent to the lifetime footprint of five cars. The computational and environmental costs of training scale with model size and skyrocket when additional tuning steps are employed to enhance the model's final accuracy.
Specifically, the researchers discovered that a tuning process called neural architecture search, which optimises a model by gradually adjusting a neural network's design through extensive trial and error, had extremely high costs for minimal performance gains. Even without this process, training BERT had a carbon footprint of about 1,400 pounds of CO2 equivalent, comparable to a round-trip flight across the United States for one person.
As AI expands rapidly, so does its carbon footprint. Since the deep learning breakthrough in 2012, the computational demands of training the largest deep neural networks have grown by a factor of 300,000. Running a massive AI model like ChatGPT generates an annual carbon footprint equivalent to that of sixty people in the Western world.
Need for Innovation
An effective strategy for decreasing AI's energy consumption and carbon footprint is to replace the conventional von Neumann computing architecture, in which memory and processing units are separate, with an approach known as neuromorphic computing. This architecture emulates the structure and operation of the human brain by integrating memory and processing units, thereby significantly enhancing parallel processing. Several key advances in spiking neural networks (SNNs) have made neuromorphic computing feasible.
Theoretically, these breakthroughs suggest that numerous AI applications could become up to a hundred to a thousand times more energy-efficient.
A collaboration between Chinese and Swiss researchers has produced an energy-efficient neuromorphic chip that mimics human neurons and synapses. Known as "Speck," the chip has a resting power consumption of just 0.42 milliwatts, drawing almost no power when idle. For perspective, the human brain, which effortlessly runs intricate neural networks, operates on just 20 watts—far less than today's AI systems. Neuromorphic computing therefore shows great potential for energy-efficient machine intelligence.
This technology is well-suited to handle the dynamic computational demands of various algorithms, with real-time power consumption as low as 0.70 milliwatts. It offers AI applications a brain-inspired solution with excellent energy efficiency, minimal latency, and reduced power consumption.
The Data Centre Dilemma
Having said all this, while AI can theoretically cut global carbon emissions, this relies on responsible usage. These technologies power (and are powered by) massive data centres, which are significant energy consumers. According to a Wall Street Journal report, these data centres hinder the shift to clean energy due to their enormous power demands. The rapid growth of hyperscale data centres, particularly in Northern Virginia, USA, has disrupted efforts by electric utilities to reduce fossil fuel reliance.
At the heart of the data centre boom is Northern Virginia's famed "Data Center Alley," through which about 70 per cent of the world's internet traffic flows. Northern Virginia's rise to global prominence as a data centre hub dates back to the early days of the internet. The infrastructure established back then attracted major players in the dot-com and telecommunications sectors, such as AOL, Yahoo, and WorldCom. The advent of AI models like ChatGPT, which require massive computing power, has further spiked the demand for data centre infrastructure.
To meet this growing demand, utilities often extend the operational lifespan of coal-fired power plants and add natural gas power plants to balance the variability of renewable energy sources. Despite the US energy sector's significant achievement in steadily reducing coal-generated power over the past decade—retiring about 10 gigawatts of coal power capacity annually—this rate is expected to slow. S&P Global Commodity Insights forecasts a reduction to around 6 gigawatts per year until 2030 due to the increased energy demands. Companies like Google and Amazon promote their renewable energy commitments, but their extensive data operations still depend heavily on traditional power sources. We must recognise this double-edged nature of AI.
CPUs to GPUs
We can't keep running AI models on legacy, CPU-only hardware, and even simple changes help. Research indicates that accelerated computing—replacing CPU-only instances with GPUs, specialised hardware and software, and parallel computing techniques—has vastly enhanced both the performance and energy efficiency of data centres.
For instance, achieving equivalent performance, a GPU-accelerated cluster consumes 588 fewer megawatt hours per month, marking a fivefold increase in energy efficiency. Deploying GPUs allows climate models such as the IFS model from the European Centre for Medium-Range Weather Forecasts to operate up to 24 times faster, significantly cutting annual energy consumption by as much as 127 gigawatt hours compared to CPU-only systems.
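The 588 megawatt-hour monthly saving and the fivefold efficiency gain quoted above are enough, taken together, to back out the absolute consumption of each cluster. The sketch below does that arithmetic; the absolute figures it derives are inferred from those two numbers, not stated in the research.

```python
# Inferring each cluster's consumption from the two figures quoted above:
# a GPU-accelerated cluster saves 588 MWh/month and is 5x as energy-efficient.

SAVING_MWH = 588       # monthly energy saved by the GPU-accelerated cluster
EFFICIENCY_GAIN = 5    # same work done on one fifth of the energy

# If the GPU cluster draws E per month, the CPU-only cluster draws 5E,
# so the saving is 5E - E = 4E.
gpu_mwh = SAVING_MWH / (EFFICIENCY_GAIN - 1)
cpu_mwh = EFFICIENCY_GAIN * gpu_mwh

print(f"GPU cluster: ~{gpu_mwh:.0f} MWh/month")
print(f"CPU-only cluster: ~{cpu_mwh:.0f} MWh/month")
```

In other words, the quoted saving implies a cluster drawing roughly 147 MWh a month instead of roughly 735—the kind of difference that matters at data-centre scale.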
A Cautionary Tale
Until such innovations mature, we are left with an irony: using AI to combat deforestation in the Amazon rainforest. These initiatives use AI to monitor and analyse data to predict and prevent deforestation—a noble cause that paradoxically relies on the very technology contributing significantly to carbon emissions.
AI offers significant potential for improving energy efficiency, advancing predictive modelling, and guiding evidence-based policies to address climate change impacts. However, apart from the energy consumption associated with the required computing resources, the manufacture of AI hardware components, such as semiconductors and data servers, involves extracting rare earth metals and other resources. Irresponsible mining practices can result in habitat destruction, biodiversity loss, and water pollution, intensifying pressures on ecosystems already affected by climate change. The rapid turnover of AI hardware and short device lifecycles contribute to the mounting issue of electronic waste.
Moreover, AI systems are susceptible to biases and errors, which can distort climate-related decision-making. Flawed algorithms may generate inaccurate climate projections or ineffective policy recommendations, potentially leading to misguided actions or worsening environmental challenges.
Conclusion
Navigating the complex relationship between AI, energy consumption, and climate change requires a multidisciplinary approach that encourages collaboration among diverse stakeholders. By leveraging technological innovation, policy reforms, and collective action, we can pave the way for a more equitable, resilient, and sustainable future for generations to come.
Given the Indian government's considerable efforts to promote excellence in AI—through initiatives like the National AI Strategy, the IndiaAI mission, the establishment of AI research institutions, collaboration with industry stakeholders, investment in AI startups, and the promotion of AI education and training programs—this is an issue India cannot afford to overlook.
The author is a research scholar at Takshashila Institution, Bangalore. The views expressed in the above piece are personal and solely those of the author. They do not necessarily reflect Firstpost’s views.