Daniel Rudis

Is Energy Consumption for AI Spiraling Out of Control?




The energy consumption of AI, particularly in comparison to other areas such as crypto mining, has been a topic of discussion. To put things into perspective, let's consider some key insights.


According to rmi.org, Bitcoin mining alone is estimated to consume 127 terawatt-hours (TWh) of electricity annually. For comparison, Switzerland's annual electricity consumption in 2018, as reported by Axpo, was 57.6 TWh. (Electricity and total energy consumption are not the same thing, and Switzerland's consumption has been trending downwards, but the comparison is still instructive.) In other words, Bitcoin mining consumes more than double the electricity of the entire country of Switzerland.
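The comparison boils down to a single ratio. A quick sketch using the two estimates quoted above:

```python
# Bitcoin's estimated annual electricity use vs. Switzerland's
# (rmi.org and Axpo figures quoted above -- estimates, not measurements).
BITCOIN_TWH = 127.0      # rmi.org estimate
SWITZERLAND_TWH = 57.6   # Axpo, 2018

ratio = BITCOIN_TWH / SWITZERLAND_TWH
print(f"Bitcoin uses {ratio:.1f}x Switzerland's annual electricity")  # ~2.2x
```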


As for Google, the company claimed in 2019 that a single search consumes approximately 0.0003 kilowatt-hours (kWh) of electricity. With over 1.2 trillion searches conducted globally each year, that works out to roughly 0.36 TWh, or about 0.6% of Switzerland's annual electricity consumption. While this figure is significant, it is considerably less shocking than the energy consumption of Bitcoin.
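The arithmetic behind that percentage is straightforward (a sketch using the per-search and per-country estimates quoted above):

```python
# Annual electricity for Google searches vs. Switzerland,
# using the estimates quoted above.
KWH_PER_SEARCH = 0.0003      # Google's 2019 estimate, kWh per search
SEARCHES_PER_YEAR = 1.2e12   # ~1.2 trillion searches per year
SWITZERLAND_TWH = 57.6       # Switzerland's 2018 electricity use, TWh

search_twh = KWH_PER_SEARCH * SEARCHES_PER_YEAR / 1e9  # kWh -> TWh
share = search_twh / SWITZERLAND_TWH
print(f"{search_twh:.2f} TWh/year, {share:.1%} of Switzerland")  # ~0.36 TWh, ~0.6%
```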


However, it's crucial to recognize that simple Google searches are just a small part of the energy-intensive side of the IT sector. Cloud computing, artificial intelligence (AI), and the rollout of 5G mobile networks all add to substantial electricity bills. Streaming services alone were projected to account for 87% of consumer internet traffic by now.


When it comes to large language models (LLMs), transparency about their energy consumption is limited. Bloomberg reported using close to 1.3 million GPU-hours on Nvidia A100 GPUs to train BloombergGPT. Assuming a total draw of roughly one kilowatt per GPU (including server and cooling overhead), that training run consumed about 1.3 GWh, or roughly 0.0023% of Switzerland's annual electricity needs.
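A rough back-of-the-envelope version of that estimate, where the one-kilowatt-per-GPU figure is an assumption rather than a reported number:

```python
# BloombergGPT training energy vs. Switzerland's annual electricity.
# The 1 kW per GPU-hour (GPU plus server and cooling overhead) is an
# assumption, not a figure reported by Bloomberg.
GPU_HOURS = 1.3e6        # reported A100 training time
KW_PER_GPU = 1.0         # assumed total draw per GPU, kW
SWITZERLAND_TWH = 57.6

training_kwh = GPU_HOURS * KW_PER_GPU
share = training_kwh / (SWITZERLAND_TWH * 1e9)  # Switzerland TWh -> kWh
print(f"{training_kwh / 1e6:.1f} GWh, {share:.4%} of Switzerland")  # ~1.3 GWh, ~0.0023%
```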


Moreover, estimates suggest that a single query to GPT-3 (the model generation behind the original ChatGPT) consumes approximately 0.004 kWh of electricity. At OpenAI's estimated one billion queries per day, that adds up to roughly 1.46 TWh per year, or about 1/40 of Switzerland's annual electricity consumption. While this represents a significant amount of energy, it is spread across a vast number of users.
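The same kind of sketch shows how the per-query estimate scales up to the 1/40 figure:

```python
# Estimated annual electricity for GPT-3 queries vs. Switzerland,
# using the per-query and query-volume estimates quoted above.
KWH_PER_QUERY = 0.004    # estimated electricity per GPT-3 query
QUERIES_PER_DAY = 1e9    # OpenAI's estimated query volume
SWITZERLAND_TWH = 57.6

annual_twh = KWH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e9  # kWh -> TWh
share = annual_twh / SWITZERLAND_TWH
print(f"{annual_twh:.2f} TWh/year, {share:.1%} of Switzerland")  # ~1.46 TWh, ~2.5%
```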


As LLMs and AI usage continue to surge, the energy requirements for OpenAI and similar organizations could soon reach levels comparable to smaller industrial countries. This raises questions about the incorporation of LLMs into search engines and whether the additional user benefits justify the potential surge in energy demand.


It's undeniable that data centers, cloud platforms, and computing centers consume substantial amounts of electricity. According to research published by ericsson.com in 2018, the IT sector as a whole accounted for about 800 TWh of electricity consumption, equivalent to 3.6% of global electricity usage and roughly 14 times the electricity consumption of Switzerland.


On a positive note, despite exponential data growth between 2010 and 2015, the electricity needs of the IT sector remained relatively stable due to improved energy efficiency. There is ongoing pressure to make IT more energy-efficient and reduce its environmental impact.


Looking ahead, quantum computing could be a game-changer in terms of energy consumption. IBM's recent research paper signals the arrival of a new era of quantum computing. Quantum computers have the potential to complete tasks that would take classical computers several years in a matter of hours. This level of efficiency could lead to significant energy savings in the long run.


To summarize, while the energy requirements of large language models are considerable, they may not be as shockingly high as initially expected. The uncertainties lie in the further proliferation of cloud services and AI, as well as the size of AI models. However, there is a good chance that technological advancements will prevent energy needs from spiraling out of control.
