
AI is "sucking up" global electricity! The scarier part is yet to come

Ma Ya Lan Sun, Apr 21 2024 06:41 AM EST

In recent years, the rise of artificial intelligence (AI) has sparked widespread discussion and concern. Many worry that AI will lead to skyrocketing unemployment, while some optimistic folks jokingly remark, "As long as electricity costs more than steamed buns, AI can never fully replace humans."

Though said in jest, the joke points to a real issue: the energy consumption of AI. More and more people worry that high energy consumption will become a bottleneck for AI development. Just recently, Kyle Corbitt, a tech entrepreneur and former Google engineer, revealed on the social media platform X that Microsoft is facing exactly this challenge.

How much power does AI really consume?

Corbitt said that Microsoft engineers training GPT-6 are busy building an InfiniBand network to connect GPUs located in different regions. The task is extremely challenging, but they have little choice: deploying more than 100,000 H100 chips in a single region would overload the local power grid. Why would concentrating that many chips risk collapsing a grid? Some quick math shows why.

Data published on NVIDIA's website indicates that the peak power draw of each H100 chip is 700 W, so 100,000 H100 chips have a combined peak consumption of 70,000,000 W (70 MW). An energy-industry professional on X noted that this is roughly the entire output of a small solar or wind power plant. On top of that comes the energy consumption of the supporting facilities for such a large number of chips, including servers and cooling equipment. The strain that so much power-hungry hardware concentrated in a small area would place on the grid is considerable.
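The back-of-the-envelope arithmetic above can be reproduced in a few lines (a rough sketch: 700 W is NVIDIA's published per-chip peak, and all server, networking, and cooling overhead is ignored):

```python
# Rough estimate of the peak electrical load of 100,000 H100 GPUs.
# 700 W is NVIDIA's published peak power for a single H100 chip.
chips = 100_000
peak_watts_per_chip = 700

total_watts = chips * peak_watts_per_chip
total_megawatts = total_watts / 1_000_000

print(f"Total peak load: {total_watts:,} W ({total_megawatts:.0f} MW)")
# 70 MW is on the order of a small solar or wind farm's entire output,
# before counting servers, networking, and cooling overhead.
```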

AI Power Consumption: Just the Tip of the Iceberg

Concerning the issue of AI energy consumption, a report by The New Yorker once garnered widespread attention. The report estimated that ChatGPT alone consumes over 500,000 kilowatt-hours of electricity per day.

In reality, while AI power consumption may seem astronomical, it still pales in comparison to cryptocurrency and traditional data centers. The challenges encountered by Microsoft engineers also indicate that what hampers AI development is not only the energy consumption of the technology itself but also that of the supporting infrastructure and the capacity of the grid.

A report released by the International Energy Agency (IEA) revealed that in 2022, global data centers, artificial intelligence, and cryptocurrencies together consumed 460 terawatt-hours (TWh) of electricity, nearly 2% of global electricity consumption. The IEA projects that in its worst-case scenario, electricity consumption in these sectors will reach 1,000 TWh by 2026, equivalent to the entire electricity consumption of Japan.

However, the report also indicates that the energy consumption directly attributable to AI research and development is much lower than that of data centers and cryptocurrencies.

NVIDIA holds approximately 95% of the AI server market; the roughly 100,000 chips it supplied in 2023 consume approximately 7.3 TWh of electricity annually. In contrast, cryptocurrency mining consumed 110 TWh in 2022, equivalent to the entire electricity consumption of the Netherlands.

Figure caption: Estimated energy consumption of traditional data centers, cryptocurrencies, and AI data centers in 2022, with projections for 2026. AI's electricity consumption is currently far lower than that of traditional data centers and cryptocurrency. Image source: IEA

Cooling Energy Consumption: A Vital Consideration

The energy efficiency of data centers is typically assessed using the Power Usage Effectiveness (PUE) metric, which represents the ratio of all energy consumed to the energy consumed by IT loads. The closer the PUE is to 1, the less energy the data center wastes.

A report by the Uptime Institute, a data center standards organization, found that the average PUE of large data centers worldwide was approximately 1.59 in 2020. This means that for every unit of electricity consumed by IT equipment, an additional 0.59 units were consumed by supporting infrastructure.
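The PUE arithmetic is simple enough to sketch directly (the 1.59 figure is the Uptime Institute's 2020 global average; the 1.3 target is China's 2025 mandate mentioned below):

```python
# PUE = total facility energy / energy consumed by IT equipment.
# A PUE of 1.59 means 0.59 extra units of overhead (cooling, power
# distribution, lighting, ...) per unit consumed by IT equipment.
def overhead_per_it_unit(pue: float) -> float:
    """Energy consumed by supporting infrastructure per unit of IT load."""
    return round(pue - 1.0, 2)

avg_pue_2020 = 1.59
print(overhead_per_it_unit(avg_pue_2020))  # prints 0.59

# At a PUE of 1.3, overhead falls to 0.3 units per unit of IT load.
print(overhead_per_it_unit(1.3))  # prints 0.3
```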

The majority of the additional energy consumption in data centers is attributed to cooling systems. One study found that cooling systems can account for up to 40% of the total energy consumption in data centers.

In recent years, as chip technology has advanced, the power consumption per device has increased, leading to higher power densities in data centers and greater demands for cooling. However, improvements in data center design can significantly reduce energy waste.

Due to differences in cooling systems, structural design, and other factors, the energy efficiency of different data centers varies widely. According to the Uptime Institute report, European countries have achieved a PUE as low as 1.46, while over one-tenth of data centers in the Asia-Pacific region still have a PUE exceeding 2.19.

Countries worldwide are taking measures to encourage energy efficiency and emissions reduction in data centers. For example, the EU requires large data centers to install waste heat recovery equipment; the US government is investing in the development of more energy-efficient semiconductors; and the Chinese government has mandated that data centers achieve a PUE of no more than 1.3 starting from 2025, with the proportion of renewable energy usage rising annually to reach 100% by 2032.

Figure caption: Average PUE of large data centers worldwide in 2020, by region. From left to right: Africa, Asia-Pacific, Europe, Latin America, Middle East, Russia and CIS countries, United States and Canada. Image source: Uptime Institute

Tech Companies Struggle to Conserve Electricity, Let Alone Go Green

As cryptocurrencies and AI flourish, the scale of data centers operated by major tech companies continues to expand. According to the International Energy Agency (IEA), in 2022 the United States had 2,700 data centers, consuming 4% of the nation's electricity, a figure projected to rise to 6% by 2026. With coastal land becoming scarcer, data centers are gradually shifting inland to states like Iowa and Ohio. However, these inland locations lack developed industry, and the electricity supply may not meet demand.

Some tech companies are attempting to break free of the grid's constraints by purchasing electricity directly from small nuclear power plants, but both this purchasing arrangement and the construction of new nuclear plants involve complex administrative approval processes. Microsoft is experimenting with AI to help with these applications, while Google is using AI to schedule computing tasks in ways that improve grid efficiency and reduce its carbon emissions. As for when controlled nuclear fusion will become practical, that remains uncertain.

Climate Change Adds Insult to Injury

The development of AI requires stable and robust support from the power grid, but with frequent extreme weather events, many regions' grids are becoming more fragile. Climate change makes extreme weather more frequent, which not only drives surges in electricity demand and adds burden to the grid but also directly damages grid facilities. According to an IEA report, factors such as drought, insufficient rainfall, and early snowmelt pushed the global average capacity factor of hydropower in 2023 to its lowest level in thirty years, below 40%.

Natural gas is often seen as a bridge in the transition to renewable energy, but it proves unreliable in extreme winter weather. In 2021, a cold snap hit Texas, causing widespread power outages, with some residents going more than 70 hours without electricity. A major cause of the disaster was frozen natural gas pipelines, which forced gas-fired power plants offline. The North American Electric Reliability Corporation (NERC) predicts that from 2024 to 2028, over 300 million people in the United States and Canada will face growing risk of power outages.

To ensure energy security while achieving energy conservation and emission reduction, many countries also view nuclear power plants as a transitional measure. At the 28th United Nations Climate Change Conference (COP 28) held in December 2023, 22 countries signed a joint statement committing to increase nuclear power generation capacity to three times the 2020 level by 2050. Meanwhile, with countries like China and India vigorously promoting nuclear power construction, the IEA predicts that global nuclear power generation will reach a historic high by 2025.

The IEA report states, "In the face of changing climate patterns, enhancing energy diversification, improving cross-regional grid dispatch capabilities, and adopting more resilient power generation methods will become increasingly important." Securing grid infrastructure not only affects the development of AI technology but also concerns national welfare and livelihoods.

References:

[1] Kyle Corbitt. X. https://twitter.com/corbtt/status/1772392525174620355. <2024-03-26/2024-04-09>.

[2] IEA (2024), Electricity 2024, IEA, Paris https://www.iea.org/reports/electricity-2024, Licence: CC BY 4.0

[3] Andy Lawrence. Which regions have the most energy efficient data centers?. Uptime Institute. https://www.datacenterdynamics.com/en/opinions/which-regions-have-most-energy-efficient-data-centers/. <2020-08-04/2024-04-10>.

[4] Zhang, Xiaojing, Theresa Lindberg, Naixue Xiong, Valeriy Vyatkin, and Arash Mousavi. "Cooling energy consumption investigation of data center it room with vertical placed server." Energy procedia 105 (2017): 2047-2052.

[5] Evan Halper. Amid explosive demand, America is running out of power. Washington Post. https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/. <2024-03-07/2024-04-09>.

  • Jeremy Hsu. "US grid vulnerable to power outages due to its reliance on gas." New Scientist. Link. <2024-01-11/2024-04-09>.

  • Jeremy Hsu. "Much of North America may face electricity shortages starting in 2024." New Scientist. Link. <2023-12-23/2024-04-09>.