Google’s emissions have increased 48% since 2019, largely due to AI
Google’s latest annual environmental report shows that its recent efforts in the field of artificial intelligence have had a real impact on greenhouse gas emissions.
The expansion of its data centers to support AI development contributed to the company producing 14.3 million metric tons of carbon dioxide equivalent in 2023, a 48% increase over the 2019 figure and a 13% increase from 2022.
“This result was primarily due to increases in data center energy consumption and supply chain emissions,” the report’s authors wrote.
“As we integrate AI more into our products, it may be challenging to reduce emissions due to the growing energy demands of more intense AI computing and emissions associated with expected growth in our technology infrastructure investments.”
See: How Microsoft, Google Cloud, IBM and Dell are working to reduce AI’s climate harm
Google says it cannot isolate AI’s share of its overall data center emissions
In 2021, Google pledged to reach net-zero emissions across all of its operations and value chain by 2030. The report said this goal is now considered “extremely ambitious” and “requires [Google] to deal with significant uncertainty.”
The report further states that the environmental impact of AI is “complex and difficult to predict,” so the company only publishes aggregated, data center-wide metrics that also cover cloud storage and other operations. This means the specific environmental cost of AI training and use in 2023 is not being disclosed for now.
That being said, in 2022, Google engineer David Patterson wrote in a blog post, “Our data shows that ML training and inference accounted for only 10%-15% of Google’s total energy use over the past three years.” However, that proportion has likely increased since then.
See: Everything you need to know about Greentech
Why is AI responsible for rising emissions from technology companies?
Like most of its competitors, Google has introduced a number of AI projects and features over the past year, including Gemini, Gemma, image generation in Overview and Search, and AI security tools.
AI systems, especially those used to train large language models, demand enormous amounts of computational power. This means higher power consumption and, as a result, more carbon emissions than normal online activity.
SEE ALSO: Artificial Intelligence Cheat Sheet
According to a study by Google and UC Berkeley, training OpenAI’s GPT-3 generated 552 metric tons of carbon dioxide equivalent – roughly the same as running 112 petrol cars for a year. Furthermore, studies estimate that a generative AI system uses about 33 times more energy than machines running task-specific software.
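As a rough sanity check on that comparison, the implied per-car figure works out to just under 5 metric tons of CO2e per year, which is in the range commonly cited for a typical petrol car. The short sketch below redoes the arithmetic; the per-car emissions value is an assumption chosen to match the quoted equivalence, not a figure from the study.

```python
# Back-of-envelope check of the GPT-3 training comparison above.
# The per-car figure is an assumption (roughly 4.9 tCO2e/year, chosen to
# match the quoted "112 cars" equivalence), not a number from the study.
training_emissions_t = 552        # metric tons CO2e to train GPT-3 (per the study)
car_emissions_t_per_year = 4.93   # assumed annual emissions of one petrol car

equivalent_cars = training_emissions_t / car_emissions_t_per_year
print(f"Equivalent to running {equivalent_cars:.0f} petrol cars for a year")
# -> Equivalent to running 112 petrol cars for a year
```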
Last year, Google’s total data center power consumption increased by 17%, and while we don’t know how much of that was due to AI-related activities, the company did admit it “expects this trend to continue in the future.”
Google isn’t the first big tech organization to reveal that AI developments are impacting its emissions and proving difficult to manage. In May, Microsoft announced that its emissions had increased by 29% from 2020, primarily as a result of building new data centers. “Our challenges are unique to our position as a leading cloud supplier that is expanding its data centers,” Microsoft’s environmental sustainability report said.
Leaked documents seen by Business Insider in April reportedly show that Microsoft has acquired more than 500 MW of additional data center capacity since July 2023, and that its GPU footprint now supports live “AI clusters” in 98 locations globally.
Four years ago, Microsoft president Brad Smith called the company’s pledge to be carbon-free by 2030 a “moonshot.” However, in May, he acknowledged via Bloomberg’s Zero podcast that “the moon has moved further” since then and is now “more than five times farther away.”
Alex de Vries, founder of digital trend analysis platform Digiconomist, which tracks AI sustainability, believes Google and Microsoft’s environmental reports prove that the tech giants are not taking sustainability as seriously as AI development. “On paper they may say so, but the reality is that they are currently clearly prioritizing growth over meeting climate goals,” he told TechRepublic in an email.
“Google is already struggling to meet its growing energy demand from renewable energy sources. The carbon intensity of each MWh that Google consumes is rising rapidly. Globally we have a limited supply of renewable energy sources available and the current trajectory of AI-related power demand is already too high. Something big has to change to make those climate goals achievable.”
Google’s rising emissions could also impact businesses that use its AI products, which have their own environmental goals and regulations they must comply with. “If Google is part of your value chain, then Google’s emissions going up also means your Scope 3 emissions are going up as well,” de Vries told TechRepublic.
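To illustrate de Vries’ point about value chains, here is a minimal, purely illustrative sketch of how a cloud customer might roll a provider’s allocated emissions into its own Scope 3 total under the GHG Protocol. All categories and figures are hypothetical.

```python
# Illustrative Scope 3 roll-up for a company that buys cloud/AI services.
# All figures are hypothetical; real reporting follows the GHG Protocol's
# Scope 3 categories and each supplier's own allocation methodology.
scope3_tco2e = {
    "purchased_goods_and_services": 1200.0,
    "business_travel": 150.0,
    "cloud_provider_allocated_share": 300.0,  # share of a vendor's data center emissions
}

def scope3_total(sources: dict) -> float:
    """Sum all upstream/downstream emissions attributed to the company (tCO2e)."""
    return sum(sources.values())

print(f"Scope 3 total: {scope3_total(scope3_tco2e):,.0f} tCO2e")

# If the cloud provider's allocated share rises 13% year on year (mirroring
# Google's reported increase), the customer's Scope 3 total rises with it.
scope3_tco2e["cloud_provider_allocated_share"] *= 1.13
print(f"After supplier increase: {scope3_total(scope3_tco2e):,.0f} tCO2e")
```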
How Google is managing its AI emissions
Google’s environmental report highlights how the company is managing the energy demands of its AI development. Its latest tensor processing unit, Trillium, is 67% more energy efficient than the previous, fifth-generation TPU, while its data centers are 1.8 times more energy efficient than a typical enterprise data center.
Google’s data centers now deliver roughly four times more computing power per unit of electricity than they did five years ago.
At NVIDIA GTC in March 2024, TechRepublic spoke with Mark Lohmeyer, vice president and general manager of compute and AI/ML infrastructure at Google Cloud, about how its TPUs are getting more efficient.
“If you think about running a highly efficient form of accelerated computing with our own in-house TPUs, we leverage liquid cooling for those TPUs, which allows them to run faster, but it’s also a more energy efficient and as a result a more cost-effective method,” he said.
Google Cloud also uses software to sustainably manage up-time. “What you don’t want is a cluster of GPUs or any type of compute that’s deployed using power but not actively producing, you know, the results that we want,” Lohmeyer told TechRepublic. “And so driving high levels of infrastructure utilization is also important for sustainability and energy efficiency.”
Google’s 2024 Environment Report says the company is managing the environmental impact of AI in three ways:
- Model Customization: For example, it increased the training efficiency of its fifth-generation TPUs by 39% with techniques that speed up training, such as quantization, where the precision of the numbers used to represent a model’s parameters is reduced to lower the computational load (see the sketch after this list).
- Efficient Infrastructure: Its fourth-generation TPU was 2.7 times more energy efficient than its third generation. In 2023, Google’s water management program offset 18% of its water use, much of which goes to cooling data centers.
- Emission Reduction: Last year, 64% of the energy consumed by Google’s data centers came from carbon-free sources, including renewable sources and carbon capture schemes. It also deployed carbon-intelligent computing platforms and demand response capabilities at its data centers.
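To make the quantization idea in the first bullet concrete, the sketch below reduces 32-bit floating-point weights to 8-bit integers plus a single scale factor. It is a generic illustration of the technique, not Google’s training pipeline or TPU implementation.

```python
import numpy as np

# Generic illustration of weight quantization: store parameters as 8-bit
# integers plus a per-tensor scale instead of 32-bit floats, cutting memory
# (and, on suitable hardware, compute) at the cost of a small rounding error.
rng = np.random.default_rng(0)
weights_fp32 = rng.normal(0.0, 0.05, size=(1024, 1024)).astype(np.float32)

scale = np.abs(weights_fp32).max() / 127.0            # map the value range onto int8
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)
weights_dequantized = weights_int8.astype(np.float32) * scale

memory_saving = weights_fp32.nbytes / weights_int8.nbytes
max_error = np.abs(weights_fp32 - weights_dequantized).max()
print(f"Memory reduced {memory_saving:.0f}x, max round-trip error {max_error:.5f}")
```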
Additionally, Google’s AI products are being designed to tackle climate change in general, such as fuel-efficient routes in Google Maps, flood prediction models, and the Green Light tool, which helps engineers optimize the timing of traffic lights to reduce stop-and-go traffic and fuel consumption.
Demand for AI could impact emissions targets
Google says the power consumption of its data centers – which power its AI activities, among other things – currently accounts for just 0.1% of global electricity demand. Indeed, according to the International Energy Agency, data centers and data transmission networks are responsible for 1% of energy-related emissions.
However, this is expected to increase significantly over the next few years, with data center power consumption projected to double between 2022 and 2026. According to SemiAnalysis, data centers will account for around 4.5% of global energy demand by 2030.
Training and running AI models in data centers requires a significant amount of energy, but manufacturing and transporting chips and other hardware also contribute. The IEA estimates that AI in particular will use 10 times more electricity in 2026 than it did in 2023, due to rising demand.
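Taken at face value, those projections imply steep compound growth rates. The short calculation below derives them from the multipliers quoted above; the annual rates are an illustration worked out here, not figures published by the IEA or SemiAnalysis.

```python
# Implied compound annual growth rates from the projections quoted above.
# Only the multipliers and year spans come from the article; the CAGR
# arithmetic is an illustration derived from them.
def implied_cagr(multiplier: float, years: int) -> float:
    """Annual growth rate that turns 1x into `multiplier`x over `years` years."""
    return multiplier ** (1 / years) - 1

print(f"Data center power, 2x over 2022-2026: {implied_cagr(2, 4):.0%}/year")
print(f"AI electricity use, 10x over 2023-2026: {implied_cagr(10, 3):.0%}/year")
# -> roughly 19% and 115% per year, respectively
```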
See: AI is causing fundamental data centre power and cooling problems in Australia
Data centers also require a lot of water to cool down, even more so when running energy-intensive AI computations. A UC Riverside study found that the amount of water withdrawn for AI activities could be equal to half of the UK’s annual water consumption by 2027.
Rising electricity demand could push tech companies back to non-renewable energy
Technology companies have long been big investors in renewable energy, with Google’s latest environmental report saying it purchased more than 25 TWh of renewable energy in 2023 alone. However, there are concerns that skyrocketing energy demand as a result of their AI efforts will keep coal- and oil-fired plants in business that would otherwise have been shut down.
For example, in December, county supervisors in Northern Virginia approved the construction of 37 data centers on just 2,000 acres, prompting proposals to increase the use of coal power.