Artificial intelligence: energy consumption and opportunities in balance

Artificial intelligence is associated with increased energy consumption, but AI systems can also be used to save energy within companies.


By Patrick Schnell

Artificial intelligence and machine learning are nothing new, but the discussion about the energy consumption of AI is more topical than ever. The rapid development and spread of AI has the potential to fundamentally change many areas of life and the economy. AI systems are increasingly being used in the IT and software industry to perform complex tasks more efficiently and quickly. However, the operation of many modern AI models requires enormous computing capacities, which are not only cost-intensive but also leave a significant ecological footprint.


Patrick Schnell is a computer scientist, author and speaker. As founder of schnell.digital GmbH, his mission is to help companies become digital pioneers in their industry. With his team, he supports companies from industry, finance, NGOs and trade in implementing custom software and digitization projects. schnell.digital is one of only a handful of Central European companies officially certified as a MongoDB partner, bringing NoSQL database expertise to its customers. In the workshop “NoSQL Mastery” at the WeAreDevelopers World Congress 2024, Patrick Schnell used practical examples to demonstrate the advantages and disadvantages of NoSQL for various application areas.

On the other hand, AI also opens up promising opportunities that go far beyond simply increasing efficiency. Companies and research institutions are continuously working on making AI technologies more sustainable and developing new applications that offer both economic and environmental benefits. For example, AI can contribute to the optimization of energy systems, which not only reduces costs but also protects the environment.

This article examines the various facets of sustainability in the context of AI. It looks at the current challenges and negative aspects as well as the potential and positive developments. Finally, it shows how companies can implement sustainable processes through the intelligent use of AI and thus make an important contribution to the energy transition.


An impressive example of the energy consumption of AI systems is the comparison between the consumption of ChatGPT and traditional search engines such as Google. According to a report by The Brussels Times, ChatGPT requires around 25 times more energy than Google's search engine to perform similar tasks, such as answering simple questions.

This discrepancy can be explained by fundamental differences between the technologies. While search engines such as Google rely on comparatively simple algorithms that index data and match search queries against it, modern large language models (LLMs) such as ChatGPT are built on highly complex neural networks. These networks, with billions of parameters, require enormous computing capacity during both training and execution.

Training a large language model such as OpenAI's GPT, on which ChatGPT is based, requires huge amounts of data and months of computation on specialized high-performance hardware: GPU or TPU clusters that consume energy continuously throughout training. This energy requirement can be managed via the training frequency (model updates, internal tests, and so on).

Another key factor is inference, i.e. the phase in which the trained model generates answers or executes tasks. Here, too, considerable computing power is required to perform the complex calculations with the lowest possible latency. The larger the model, the more energy it generally requires. Since ChatGPT is based on advanced language models that can understand and reproduce context and nuance in human language, the energy required is correspondingly high. This overhead is incurred every time a query is sent to the system. Factors such as input length (the context window, i.e. the maximum amount of information to be processed in a prompt) are decisive here.
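The relationship between model size, output length and energy use can be illustrated with a back-of-the-envelope calculation. The sketch below uses the common rule of thumb of roughly two FLOPs per parameter per generated token; the parameter count, answer length and hardware efficiency are illustrative assumptions, not published figures for ChatGPT.

```python
def inference_energy_wh(params: float, tokens: int,
                        flops_per_joule: float) -> float:
    """Estimate the energy (in watt-hours) for one query, using the
    rule of thumb of ~2 FLOPs per parameter per generated token."""
    flops = 2 * params * tokens          # total compute for the answer
    joules = flops / flops_per_joule     # energy at the assumed efficiency
    return joules / 3600                 # joules -> watt-hours

# Assumed values: a 175-billion-parameter model, a 500-token answer,
# and hardware delivering an effective 1e12 FLOPs per joule.
print(f"{inference_energy_wh(175e9, 500, 1e12):.3f} Wh per query")  # → 0.049 Wh per query
```

Doubling either the parameter count or the answer length doubles the estimate, which is why both model size and input/output length drive consumption.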

The high energy consumption involved in operating LLMs has not only ecological but also economic implications. Companies that operate such AI models have to make considerable investments in infrastructure to provide and maintain the required computing power. In addition, the high power consumption leads to ongoing operating costs that can rise significantly over time. This is precisely what makes cloud-hosted AI models so attractive for SMEs and private individuals: they require no hardware of their own.

The operating costs for a system like ChatGPT are immense. According to Digital Trends, the cost of operating ChatGPT is estimated to be several million dollars per month. These costs are made up of various components, including hardware, power supply and maintenance. As a result, OpenAI is facing a loss of almost 5 billion US dollars this year.

A large part of the operating costs for LLMs is therefore attributable to the operation of the hardware infrastructure in data centers, which not only house the physical servers, but must also have comprehensive cooling and power supply systems to ensure continuous operation. In addition to energy costs, there are other operating costs, including maintenance and management of the network infrastructure.

Energy costs vary depending on the location and energy source, but are always considerable. Particularly in regions with high electricity prices, energy costs can account for a large proportion of total expenditure. This consumption (per server) will certainly decrease over the next few years as new systems become more energy efficient and require less power for the same performance (performance per watt).

However, all of this applies not only to the use of AI, but to all cloud systems. Critics would say that this is the core of the problem and that the cloud trend is already driving enormous energy consumption and costs. Yet large, centralized data centers in particular, which can be built and operated in an optimized manner (in some cases with their own solar parks), are far more energy-efficient per rack unit than, for example, a small or medium-sized company setting up and operating its own physical data center infrastructure.

AI technologies can make a significant contribution to saving energy in companies by optimizing processes and workflows. One example is the intelligent control of energy consumers in production facilities and offices. By analyzing data on energy consumption and operating processes, AI can make predictions and recommendations to optimize energy use. This can be done, for example, by adjusting operating times, automating energy consumers or optimizing heating, ventilation and air conditioning systems.
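A recommendation of this kind can be as simple as shifting a flexible load into the hours with the lowest forecast electricity price. A minimal sketch, assuming a hypothetical 24-hour day-ahead price forecast that in a real deployment would come from the AI model's prediction:

```python
def cheapest_hours(price_forecast, hours_needed):
    """Return the indices of the cheapest hours, in chronological order."""
    ranked = sorted(range(len(price_forecast)), key=lambda h: price_forecast[h])
    return sorted(ranked[:hours_needed])

# Hypothetical day-ahead price forecast in ct/kWh (hours 0-23); midday
# is cheapest here because of assumed solar production.
forecast = [0.32, 0.30, 0.28, 0.27, 0.29, 0.35,
            0.40, 0.45, 0.38, 0.33, 0.25, 0.22,
            0.21, 0.23, 0.26, 0.31, 0.37, 0.44,
            0.46, 0.41, 0.36, 0.34, 0.33, 0.31]

print(cheapest_hours(forecast, 3))  # → [11, 12, 13]
```

The same ranking idea extends to automating machine start times or pre-heating and pre-cooling cycles.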

One practical example is the use of AI in the manufacturing industry. AI systems can monitor and adjust production processes in real time to minimize energy consumption. The integration of sensors and continuous data analysis makes it possible to identify and optimize inefficient processes, which leads to a significant reduction in energy consumption and operating costs.

Another application example is the use of AI in building management. Intelligent building control systems based on AI can significantly reduce the energy consumption of buildings by optimizing lighting, heating and air conditioning. These systems use sensors to learn from the occupants' usage habits and adapt the control of energy consumers accordingly in order to both maximize comfort and minimize energy consumption.
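A hedged sketch of this idea: estimate per-hour occupancy probabilities from binary sensor history and lower the heating setpoint for hours that are rarely occupied. The setpoints, threshold and sensor data are illustrative assumptions, not values from a specific product.

```python
def learn_occupancy(history):
    """history[day][hour] (booleans) -> probability that each hour is occupied."""
    days = len(history)
    return [sum(day[h] for day in history) / days for h in range(24)]

def setpoint(prob_occupied, comfort=21.0, setback=17.0, threshold=0.5):
    """Heat to comfort temperature only when occupancy is likely."""
    return comfort if prob_occupied >= threshold else setback

# Illustrative history: five days of sensor data, room occupied 08:00-18:00.
history = [[8 <= h < 18 for h in range(24)] for _ in range(5)]
probs = learn_occupancy(history)
print(setpoint(probs[12]), setpoint(probs[2]))  # → 21.0 17.0
```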

The systems described here require far smaller amounts of data than a general-purpose language model. The models can therefore be operated relatively cheaply and with low energy consumption. Depending on their type and size, they can already run well on standard mid-range notebooks or desktop computers.

One particularly promising area in which AI can contribute to sustainability is the optimization of power grids. The integration of renewable energy sources such as wind and solar energy poses new challenges for the electricity grid, as these energy sources are weather-dependent and therefore less predictable. AI can help to improve the stability and efficiency of the power grid.

According to a Handelsblatt report, AI is already being used successfully to support the integration of renewable energies and increase the efficiency of the power grid. AI systems can analyze large amounts of data in real time to make predictions about energy production and consumption. These predictions make it possible to manage the electricity grid more efficiently and plan the distribution of energy better. For example, surplus energy generated during periods of high production from renewable sources can be temporarily stored in battery storage systems and fed back into the grid during periods of high demand.
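The storage logic described here can be sketched as a greedy dispatch rule: charge the battery whenever renewable production exceeds demand, discharge it to cover deficits. The production and demand series and the battery capacity below are illustrative values, not data from a real grid.

```python
def dispatch_battery(production, demand, capacity_kwh):
    """Charge with surplus renewable production, discharge to cover
    deficits; returns the energy still imported from the grid per step."""
    soc = 0.0               # battery state of charge in kWh
    grid_import = []
    for p, d in zip(production, demand):
        surplus = p - d
        if surplus >= 0:
            soc = min(capacity_kwh, soc + surplus)   # store surplus, cap at capacity
            grid_import.append(0.0)
        else:
            discharge = min(soc, -surplus)           # cover deficit from the battery
            soc -= discharge
            grid_import.append(-surplus - discharge)
    return grid_import

# Illustrative hourly values in kWh, battery capacity 4 kWh:
print(dispatch_battery([5.0, 8.0, 2.0, 1.0], [3.0, 3.0, 4.0, 5.0], 4.0))  # → [0.0, 0.0, 0.0, 2.0]
```

In a real system, the AI forecast would supply the expected production and demand series ahead of time, so the dispatch can be planned rather than merely reactive.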

AI can also help to prevent outages in the power grid by detecting anomalies and potential problems at an early stage. By continuously monitoring the electricity grid, weak points can be identified and preventative measures can be taken to ensure the reliability and stability of the grid.
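Early detection of anomalies can be sketched with a simple rolling statistic: flag a reading that deviates far more from recent history than the usual variation. A real grid operator would use far richer models; the window size and threshold below are illustrative assumptions.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=24, threshold=3.0):
    """Flag indices whose reading deviates more than `threshold` standard
    deviations from the mean of the preceding `window` values."""
    anomalies = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Illustrative load readings: a stable baseline, then a sudden spike.
readings = [50 + 0.1 * (i % 2) for i in range(24)] + [60.0]
print(detect_anomalies(readings))  # → [24]
```

Flagged indices could then trigger a closer inspection or a preventative switching action before a fault propagates.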

The use of AI to save energy offers companies a number of benefits. Firstly, optimizing workflows and processes can achieve significant cost savings. Energy is one of the largest operating expenses in many industries, and even small improvements in energy efficiency can lead to substantial financial savings. Reported figures vary widely; realistic estimates range from 9 to 20 percent in office, hospital or factory operations. Independent, reliable figures are currently being determined by various research projects. Companies that invest in AI-supported energy management systems can reduce their operating costs and increase their competitiveness.

Secondly, reducing energy consumption also improves a company's environmental footprint. Lower electricity demand means that fewer fossil fuels are burned in the electricity mix, which reduces CO2 emissions. However, it is important to weigh up the costs and benefits carefully.


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.