Greener code: resources at a glance

Climate change is a reality. To achieve greater sustainability, IT can and must make a contribution to conserving resources – even in the development stage.


(Image: generated with Midjourney by iX)

By Richard Attermeyer

To reduce the effects of climate change and keep the earth livable for future generations, we are all called upon to help cut CO₂ emissions – and that includes IT. There are two aspects to consider here: on the one hand, IT has a positive effect on resource consumption, as digitalization helps to conserve raw materials and resources, among other things by saving paper and travel. On the other hand, IT itself is consuming more and more energy due to the growing number of data centers. It is currently unclear whether this additional energy consumption already outweighs the savings achieved. Consequently, the industry should focus on limiting its hunger for energy and reducing its resource consumption (see the Bitkom study on the climate effects of digitalization).

Richard Attermeyer

(Image: Opitz Consulting)

Richard Attermeyer works as Chief Solution Architect at Opitz Consulting. His current focus is on the modernization of large business applications.

Often, the focus is only on energy-efficient hardware and the overall energy consumption of data centers. Approaches such as using electricity from renewable sources or reusing waste heat point the way forward. However, the resource-efficient operation of data centers is only one side of the coin. It would be more important to minimize resource consumption directly in the applications running there. The question therefore arises whether it is possible to set the right course as early as the software development stage, for example by making suitable design decisions. The prerequisite, however, would be that the CO₂ emissions of software can be measured in a comprehensible manner. There are no recognized standards for this (yet) – and what you can't measure is difficult to control.


A fundamental decision also needs to be made as to whether only the emissions at the software's runtime should be considered, or also those incurred during its creation. The latter includes compiling and testing the code, but also the cost of training an ML model if AI is used. Despite this ambiguous starting position, a few basic considerations can be made, which are discussed below.

The general principle is to avoid unnecessary consumption of resources. This applies to all areas: the creation of software and AI models, the storage and processing of data and the operation of the software.

Specific design strategies and recommendations for action are derived from this principle. One source is the Handbook of Sustainable Design of Digital Services from the Institute for Sustainable IT. The institute offers awareness training on the topic and is a member of the European Institutes for Sustainable IT.

Figure 1 shows a selection and grouping of the tactics described in the handbook. The following list takes a closer look at some of the recommendations for action – the Green Software Foundation refers to these as Green Software Patterns.

Overview of tactics for greater energy efficiency (Fig. 1).

(Image: Opitz Consulting)

Considering the high operating speed of current computer systems, many users are wasteful with the available performance. When in doubt, they compensate for the poor runtime behavior of inefficient software by adding more hardware.

The aim, however, should be not to waste CPU cycles unnecessarily, but to solve problems with efficient algorithms. It is not only the runtime behavior of the algorithm that is decisive, but also the choice of programming language, because languages differ significantly in their energy efficiency. According to a comparative study on the energy efficiency of programming languages, C is the best option in terms of energy and time consumption at runtime. Rust performs only marginally worse, and Java comes in fifth place behind C++ and Ada. Java is the only language among the top ranks that is not compiled directly to native code. TypeScript and Python are in the bottom third.

As with most architectural decisions, the choice of programming language is a question of the individual advantages and disadvantages for the project. Even though C scores highly in terms of energy and speed, Java can be preferable due to its memory safety.
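
To make the algorithmic side of this argument concrete, here is a minimal Java sketch (class and method names are purely illustrative) that contrasts a membership check scanning a list for every element with one that builds a hash set once. Both return the same result; the second simply burns far fewer CPU cycles on large inputs.

import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class ActiveCustomerFilter {

    // O(n * m): scans the entire ID list once per customer.
    static List<String> slow(List<String> customers, List<String> activeIds) {
        return customers.stream()
                .filter(activeIds::contains)
                .toList();
    }

    // O(n + m): builds a hash set once, then checks each customer in constant time.
    static List<String> fast(List<String> customers, List<String> activeIds) {
        Set<String> active = new HashSet<>(activeIds);
        return customers.stream()
                .filter(active::contains)
                .toList();
    }
}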

Another approach to saving CPU time is caching calculated data. If web pages and their associated data are retrieved from the database, processed and rendered on every access, this requires more CPU time than serving them directly from a cache. A cache also generally shortens response times and thus improves the user experience.
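
A minimal sketch of this idea using only the Java standard library: the rendered page is computed once per product and afterwards served from an in-memory map. The class and method names are invented for illustration; a production cache would additionally need invalidation and a size or time limit, for example via a caching library.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ProductPageCache {

    private final Map<String, String> cache = new ConcurrentHashMap<>();

    // Returns the rendered page, computing and storing it only on the first request.
    public String getPage(String productId) {
        return cache.computeIfAbsent(productId, this::renderFromDatabase);
    }

    private String renderFromDatabase(String productId) {
        // Placeholder for the expensive part: database queries plus template rendering.
        return "<html><!-- rendered page for " + productId + " --></html>";
    }
}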

There are no standardized methods for measuring CO₂ emissions from software operation, but there are projects that at least make energy consumption traceable under certain conditions. The Kepler project allows CPU consumption to be measured in Kubernetes clusters. Combined with statistics from cgroups and sysfs, Kepler feeds this data into ML models to estimate the energy consumption of individual Kubernetes pods. The metrics can be exposed via a Prometheus exporter and evaluated with an observability and visualization tool such as Grafana, as shown in Figure 2.

Exemplary representation of the Pod Current Energy Consumption with the Kepler Exporter (Fig. 2).

(Image: Sustainable-computing.io)
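
As a rough illustration of how such metrics could be consumed programmatically, the following Java sketch queries the standard Prometheus HTTP API (/api/v1/query). The Prometheus address and the Kepler metric name in the PromQL expression are assumptions that depend on the concrete installation and Kepler version.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class KeplerEnergyQuery {

    public static void main(String[] args) throws Exception {
        // Assumed metric name; check which metrics your Kepler version actually exports.
        // rate() over a joules counter yields the average power draw in watts per pod.
        String promQl = "sum by (pod_name) (rate(kepler_container_joules_total[5m]))";
        String url = "http://prometheus.example.internal:9090/api/v1/query?query="
                + URLEncoder.encode(promQl, StandardCharsets.UTF_8);

        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body()); // JSON with one series per pod
    }
}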

The increasing energy consumption of IT has been criticized since long before the AI hype. At present, however, AI is particularly relevant from a sustainability perspective, as both the training of the models and the inference (the derivation of statements from them) must be considered. Training a model already consumes energy, but the more the models are actually used, the greater the impact of their operation. Applications with generative AI are experiencing a rapid upswing. Developers should therefore consider carefully whether and in which areas they use AI.

What often goes unnoticed when programming is that network traffic between servers in data centers and user systems contributes significantly to energy consumption. Simple measures to reduce the load include compression (which in turn costs CPU cycles) and the use of GraphQL instead of REST. By parameterizing requests, GraphQL avoids transferring unnecessary data, as often happens with REST interfaces: only what is needed to implement the use case at hand is transferred.
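
The following Java sketch illustrates the principle: instead of fetching a complete order resource via REST, the client posts a GraphQL query that names exactly the three fields the use case needs. The endpoint URL, the schema and the field names are invented for illustration.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OrderSummaryClient {

    public static void main(String[] args) throws Exception {
        // The query requests only id, status and totalAmount instead of the full order object.
        String body = """
                { "query": "{ order(id: \\"4711\\") { id status totalAmount } }" }
                """;

        HttpRequest request = HttpRequest.newBuilder(URI.create("https://api.example.internal/graphql"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // contains only the requested fields
    }
}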

While GraphQL and REST relate more to textual data, it is above all images and videos that drive up data volumes. Images should therefore be delivered at a suitable scale wherever possible instead of being scaled on the client. Choosing suitable formats such as WebP also helps to save data.
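
As a simple illustration, the following Java sketch scales an image down to the width actually displayed before it is delivered, using only the standard library. File names and target width are assumptions; note that plain ImageIO ships without a WebP writer, so producing WebP output would require an additional ImageIO plugin.

import java.awt.Graphics2D;
import java.awt.Image;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class ThumbnailWriter {

    public static void main(String[] args) throws Exception {
        BufferedImage original = ImageIO.read(new File("product-photo.jpg")); // assumed input file

        int targetWidth = 400; // width at which the image is actually shown in the UI
        int targetHeight = original.getHeight() * targetWidth / original.getWidth();

        Image scaled = original.getScaledInstance(targetWidth, targetHeight, Image.SCALE_SMOOTH);
        BufferedImage output = new BufferedImage(targetWidth, targetHeight, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = output.createGraphics();
        g.drawImage(scaled, 0, 0, null);
        g.dispose();

        // Writes a downscaled JPEG; far fewer bytes cross the network than with the original.
        ImageIO.write(output, "jpg", new File("product-photo-400.jpg"));
    }
}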

For typical business applications, it is important to implement data lifecycle management and to remove data that has become superfluous from the "hot" database, which holds the currently required, frequently accessed data. Depending on the use case, certain retention periods must be observed to ensure smooth processing; once they have expired, the data can be moved to other storage areas. This form of reduction can have a positive effect on the amount of data that has to be read to answer a query, and it also brings advantages for backups. Data storage also includes optimizing access: database tuning is therefore an important task, for both performance and energy-efficiency reasons.
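
A minimal JDBC sketch of such a lifecycle step, assuming a hot table orders, an archive table orders_archive with the same structure and a two-year retention period: rows past the retention period are copied to the archive and removed from the hot table in one transaction. Connection details, table and column names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.LocalDate;

public class OrderArchiver {

    private static final String COPY_SQL =
            "INSERT INTO orders_archive SELECT * FROM orders WHERE completed_at < ?";
    private static final String DELETE_SQL =
            "DELETE FROM orders WHERE completed_at < ?";

    public static void main(String[] args) throws Exception {
        LocalDate cutoff = LocalDate.now().minusYears(2); // assumed retention period

        try (Connection con = DriverManager.getConnection("jdbc:postgresql://db.example.internal/app")) {
            con.setAutoCommit(false); // copy and delete must succeed or fail together
            try (PreparedStatement copy = con.prepareStatement(COPY_SQL);
                 PreparedStatement delete = con.prepareStatement(DELETE_SQL)) {
                copy.setObject(1, cutoff);
                delete.setObject(1, cutoff);
                int archived = copy.executeUpdate();
                int removed = delete.executeUpdate();
                con.commit();
                System.out.printf("Archived %d rows, removed %d from the hot table%n", archived, removed);
            } catch (Exception e) {
                con.rollback();
                throw e;
            }
        }
    }
}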

Developers and software architects often only have the directly processed data in mind. But how should other data, such as logs and metrics, be handled? Is the amount of data collected appropriate for the purpose of the software system and its environment? How are retention times defined, and what clean-up procedures are in place? Here, too, it is worth taking a closer look, as carelessness in the use of storage resources has become widespread given the long-term decline in the cost of storage.
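
One possible clean-up procedure, sketched in plain Java: log files in a directory are deleted once they exceed an assumed 30-day retention period. The directory and the retention period are placeholders; in practice such housekeeping is usually configured in the logging framework or the observability platform itself.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.stream.Stream;

public class LogRetentionCleaner {

    public static void main(String[] args) throws IOException {
        Path logDir = Path.of("/var/log/myapp");                    // assumed log directory
        Instant cutoff = Instant.now().minus(30, ChronoUnit.DAYS);  // assumed retention period

        try (Stream<Path> files = Files.list(logDir)) {
            files.filter(Files::isRegularFile)
                 .filter(file -> isOlderThan(file, cutoff))
                 .forEach(LogRetentionCleaner::deleteQuietly);
        }
    }

    private static boolean isOlderThan(Path file, Instant cutoff) {
        try {
            return Files.getLastModifiedTime(file).toInstant().isBefore(cutoff);
        } catch (IOException e) {
            return false; // if in doubt, keep the file
        }
    }

    private static void deleteQuietly(Path file) {
        try {
            Files.delete(file);
        } catch (IOException e) {
            System.err.println("Could not delete " + file + ": " + e.getMessage());
        }
    }
}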

Careful planning of hosting and operations helps to avoid unnecessary resource consumption and therefore costs. In larger cloud environments in particular, this practice is known as FinOps. Important aspects to consider include:

  • Is it ensured that unused resources can be discovered and released?
  • Is there continuous rightsizing? Are factors such as memory, CPU requests and limits set correctly in the container environment or are resources being wasted?
  • Are unused resources (such as development and test environments) released again and shut down?

This brief overview of the recommendations for action makes it clear that many design decisions have an impact on resource consumption. In cloud environments in particular, the financial aspect can serve as a proxy for reducing the environmental impact: FinOps not only keeps an eye on the costs of cloud usage, it also makes a difference for the environment.

A number of trends have emerged in the design of IT systems, including microservices and real-time processing. The following considerations show how the individual measures for greater sustainability and resource conservation presented above fare with regard to these trends.

Microservices are an architectural style that implements an IT system as a set of independent services. Independence refers to both the runtime environment and the data. One consequence is that more CPU power and memory are required and both direct calls and data replication put a strain on the network.

Many projects switch to microservices too early. Distribution should always be justified by corresponding business-driven quality requirements. It is therefore often advisable to start with a technically well-structured monolith (a "modulith") and only move to distributed services later, where the benefits exceed the additional costs and resource requirements.

iX/Developer special issue on modern software architecture

(Image: iX)

The content of this article ties in with the iX/Developer special issue "Praxis Softwarearchitektur", which is aimed at software architects. In addition to the classic architecture content on methods and patterns, there are articles on socio-technical systems, quality assurance, and architecture and society. Domain-Driven Design is also a topic, as are Team Topologies and Green Scrum.

We have been able to attract well-known experts as authors, who pass on their knowledge in many exciting articles for both architecture beginners and specialists.

On the other hand, distribution can also help to reduce resource consumption if there are different deployment scenarios for the individual services, for example regarding availability or different scaling requirements. Monolithic applications are often provided with the maximum amount of resources required. Many resources then remain unused for a long time and cause high idle consumption. It's like buying an SUV and driving it 30,000 kilometers a year, even though you only need it for two weeks a year to pull your caravan. A small car and a rented motorhome would be cheaper and more environmentally friendly.

Real-time processing is increasingly replacing batch-oriented processes, a prominent example being instant payment in the financial sector. Banks whose payment infrastructure relies entirely on batch-oriented processes have difficulties meeting the time requirements for instant transfers.

Batch processing has the advantage of being able to shift the workload to times when more renewable energy is available. In Germany, this is often between 10 a.m. and 5 p.m., as the figures from Agora Energiewende show.

Alternatively, processing can also be shifted to times when resources are unused (such as at night) so that no additional resources are required for real-time processing. In addition, AI models can be trained in regions where less electricity is required to cool data centers and where this electricity is generated in a CO₂-neutral way. One example would be Scandinavia.
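
A deliberately simple Java sketch of such time shifting: a deferrable batch job runs only inside an assumed "green" window between 10:00 and 17:00 local time. A real implementation would more likely query a carbon-intensity or electricity-mix API instead of a fixed window, and would reschedule the run rather than simply skip it.

import java.time.LocalTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class DeferrableBatchJob {

    // Assumed window in which solar output in Germany is typically highest.
    private static final LocalTime WINDOW_START = LocalTime.of(10, 0);
    private static final LocalTime WINDOW_END = LocalTime.of(17, 0);

    public static void main(String[] args) {
        LocalTime now = ZonedDateTime.now(ZoneId.of("Europe/Berlin")).toLocalTime();

        if (now.isAfter(WINDOW_START) && now.isBefore(WINDOW_END)) {
            runBatch();
        } else {
            System.out.println("Outside the green window, deferring the batch run.");
        }
    }

    private static void runBatch() {
        System.out.println("Running the deferrable batch workload now.");
    }
}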

Considering energy efficiency as a key driver for the architecture of IT systems means that a number of additional trade-offs need to be considered. It is therefore important to continuously address this throughout the entire DevOps lifecycle.

Energy-optimizing activities in the DevOps lifecycle (Fig. 3).

(Image: Opitz Consulting)

Many of the relevant aspects have already been covered above in the context of solution strategies, but Figure 3 also includes the interesting point of purposeful testing, which relates to optimizing the CI/CD process in terms of energy efficiency. Ideally, every single commit should be tested to see whether it leaves the software in a releasable state. However, it makes a difference whether developers are working on a feature branch, creating a pull request for it, or merging their development branch into the master branch.

Particularly considering the large number of different tests in the CI/CD pipeline, the question arises as to when and how often they need to be run. How often should security scans or a static code analysis be carried out? Can the tests be limited to those parts of the code that have just been reworked? Should build caches be used? Timely feedback correlates strongly with energy savings and also increases developer satisfaction.
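
As a sketch of such purposeful testing, the following Java snippet derives a test scope from the build context: the full suite including scans runs only on the mainline, while feature branches run a reduced set. The environment variable name and branch conventions are assumptions; real pipelines would express this logic in their CI configuration rather than in application code.

public class TestScopeSelector {

    enum Scope { CHANGED_MODULES_ONLY, FULL_SUITE_WITH_SCANS }

    // Full suite plus security scans and static analysis only on the mainline;
    // feature branches run just the tests of the modules that actually changed.
    static Scope select(String branch) {
        boolean isMainline = "main".equals(branch) || "master".equals(branch);
        return isMainline ? Scope.FULL_SUITE_WITH_SCANS : Scope.CHANGED_MODULES_ONLY;
    }

    public static void main(String[] args) {
        // CI_BRANCH is a placeholder; CI servers expose the branch under their own variable names.
        String branch = System.getenv().getOrDefault("CI_BRANCH", "feature/greener-code");
        System.out.println("Selected test scope: " + select(branch));
    }
}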

One advantage of many decisions for the architecture of energy-efficient systems is that they often deliver an ecological as well as an economic benefit. Many of the tactics discussed in the article, as well as a few others, are summarized in Figure 4.

Flow chart on the tactical approach for greater ecological and economic benefits (Fig. 4).

(Image: Opitz Consulting)

A specific selection of tactics and their trade-off analysis must be made for each use case. Software that is only used by a few users must be evaluated differently over its life cycle than frequently used applications. Content-centric websites with a lot of static content require different tactics than classic business applications whose content is focused on specific users. Thinking about the energy efficiency of your own software should become a matter of course for every developer in the interests of greater sustainability.

(map)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.