Technology association: Germany should lead the world with energy-saving AI
The VDE believes that progress in energy-efficient hardware is important to make large-scale AI models a lasting success. This is an opportunity for Germany.
"Overall, large language models such as GPT-4 offer a wide range of possibilities that are currently being further developed economically. However, it is crucial to overcome the associated challenges," writes the Informationstechnische Gesellschaft (ITG) in the VDE electrical engineering and IT association in a recently published position paper. This is the only way to maximize the benefits of the current drivers in the field of artificial intelligence in a sustainable manner.
Generative AI, with assistants such as ChatGPT from OpenAI, Gemini from Google or Claude from Anthropic, is not just about developing algorithms and software for groundbreaking new applications, the ITG emphasizes in its analysis. Rather, further advances in energy-efficient digital hardware and its interaction with suitable software play a key role.
"There is global talk of building more nuclear power plants to satisfy the hunger for energy," explains Damian Dudek, ITG Managing Director and co-author of the paper. "We are talking about making data processing more efficient and reducing energy consumption so that AI can be used sustainably." If Germany were to lead the world with such approaches, "completely new market potential could be opened up for of society and technological sovereignty could be maintained".
LLMs need an enormous amount of computing power
According to the authors, only those who take a leading technological position through innovation have the opportunity to establish a globally accepted framework for ethical requirements and data security. "When it comes to generative AI, we need agile regulation that keeps pace with developments and creates trust among the population," summarizes Dudek. "This will make us competitive and allow us to actively shape the future."
Training large language models (LLMs) and transferring them to large-scale applications requires "enormous computing resources, which are nowadays provided by dedicated multicore systems", writes the ITG. These rely on graphics processors with special software environments "to enable highly parallel processing with high throughput and high memory bandwidth".
Focus on neuromorphic computing approaches
In addition to shrinking feature sizes and increasing the number of transistors per chip area in line with Moore's law, the authors say there are several new approaches to overcoming the limitations of traditional chip design. Corresponding fields of research include multi-level data storage in a single device, for example in phase-change materials, mixed-signal data-processing circuits based on state-of-the-art CMOS integration, and the integration of electro-optical signal-processing devices in system-on-chip (SoC) architectures.
The experts also consider approaches to neuromorphic computing, which are inspired by signal processing in biological systems, as well as quantum computing on various hardware platforms, to be promising. Germany should "stay on the ball" in these fields. Only recently, the ITG also brought photonics into play as a way to reduce the energy requirements of data centers in particular. The German start-up Q.ant presented an energy-efficient photonic AI chip in November. In general, the German research community should, according to the association, tackle the challenges associated with AI "with responsibility for the social benefits and with a view to national and European economic development and our technological sovereignty".
(nen)