Day 2 at 38C3: misappropriated research data and a dark review of the year

Recognizing depression by voice – how AI could be misused in recruiting. 38C3 is all about surveillance and the misuse of data.

CCC letters, 38C3 in Hamburg (Image: heise online / mack)


Day two at 38C3: Alongside classic security topics such as iOS malware and the remote control of public institutions via radio, the focus was once again on repressive government measures and the weakening of civil rights. In some cases, visitors had to wait in long queues outside the lecture halls for admission. Those who couldn't find a seat could at least watch almost all the presentations on stream – a small consolation, since people certainly didn't come to Hamburg to sit in front of a screen. For Young Hackers' Day, there were also many additional activities for 10- to 17-year-olds.

In future, research data will be made available for training AI models not only at national level but also at EU level, even though neither the AI Regulation, which came into force this year, nor the GDPR adequately protects this data. Health data comes first; eight further data spaces are planned, for example in the area of mobility. This is attracting the attention of civil rights activists, data protection advocates and security researchers for various reasons, and was therefore also a topic at 38C3.

The problem here is that AI models, or the data they were trained on, originally collected for public-interest projects could subsequently be put to commercial use. When it comes to applying such data or models, imagination is the only limit. This is shown by recent research by Dr. Rainer MĂĽhlhoff et al. Together with Hannah Ruschemeier, he presented in the lecture "Public welfare-oriented research with AI: curbing misuse through purpose limitation of AI models" which companies have already drawn on freely available AI models, or have exploited their participation in research projects, for dubious and in some cases ethically questionable software.

As one example, MĂĽhlhoff cited the company VoiceSense, which conducted a study together with the Neuropsychiatric Center Hamburg (NPZ Hamburg) to identify depressed people by means of voice biomarkers, among other things. The company analyzed the data itself and only then passed it on to the NPZ. Beyond doubts about the software's reliability (only 186 subjects took part in the study), it is unclear whether the data was deleted after the study ended. At the same time, the company sells software, also used in the human resources sector, that is supposed to identify depressed applicants, or gauge the mood within a company, from voice alone. The researchers therefore call for data use to be tied to the common good: data may only be used in line with the purpose for which it was originally collected.


While many children and young people were already bustling around the CCH the day before, the Saturday of the congress was dedicated to young hackers. The organizers had put together an extensive program for interested ten- to seventeen-year-olds, from reading sessions and soldering courses to mathematics workshops. A separate area for the youngest, the Kidspace, offered space for learning and playing, but also for rest breaks.

With a limited contingent of free day tickets, the 38C3 organizers supported families and teachers who wanted to introduce their children to hacking. However, the 300 tickets were quickly snapped up, as were the 800 or so workshop tickets that were only made available on Saturday morning. Apparently there is no shortage of children and young people interested in technology.

The CCC's review of the year turned out to be depressing, as Constanze Kurz, one of the CCC's spokespeople, noted. Among the projects mentioned at European level is the European Health Data Space, adopted in March, which collects all kinds of patient data. The year was also marked by advances in facial recognition: Vera, short for "cross-procedural research and analysis platform" (VerfahrensĂĽbergreifende Recherche- und Analyseplattform), is now in use in three federal states – in North Rhine-Westphalia and Hesse, and on a trial basis in Bavaria.

But that's not all: this year, the governing "traffic light" coalition repeatedly pushed for a security package with sweeping surveillance powers, including AI-based automated facial recognition and data retention. Interior Minister Nancy Faeser also wanted to amend the BKA law to allow police officers to secretly enter and search homes and install trojans on computers. Another step toward the surveillance state is chat control, on which no agreement has yet been reached.

The CCC is also particularly critical of developments at international level. The UN Cybercrime Convention threatens to introduce massive surveillance obligations and undermine IT security. Despite widespread criticism from civil rights organizations and IT experts, the EU appears to want to approve the treaty.

No less depressing was the analysis of the election software used in the state elections in Saxony and Thuringia: errors were discovered in the calculation of the distribution of seats. The CCC has criticized the lack of transparency of the software in use for years and is calling for open-source development. Although the BSI has since produced a 90-page paper, the fundamental problems remain.
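The report does not say which allocation procedure was affected or what the errors looked like. As an illustration of the kind of calculation at stake – and of why the CCC wants such software open to inspection – here is a minimal sketch of the Sainte-LaguĂ« divisor method, one of the procedures used in German elections; party names and vote counts are invented:

```python
def sainte_lague(votes: dict[str, int], seats: int) -> dict[str, int]:
    """Allocate seats by the Sainte-Laguë (divisor) method."""
    alloc = {party: 0 for party in votes}
    for _ in range(seats):
        # The next seat goes to the party with the highest quotient
        # votes / (2 * seats_already_won + 1).
        winner = max(votes, key=lambda p: votes[p] / (2 * alloc[p] + 1))
        alloc[winner] += 1
    return alloc

# Invented example: 10 seats, three parties
print(sainte_lague({"A": 53000, "B": 24000, "C": 23000}, 10))
# → {'A': 6, 'B': 2, 'C': 2}
```

Even in this tiny sketch, details such as rounding and tie-breaking decide who gets the last seat – exactly the kind of logic that only open code allows outsiders to verify.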

The review of the year closed with security gaps in the electronic patient record and at the VW Group, where terabytes of location data from electric vehicle owners were accessible.

The outlook for 2025 was no less gloomy: the CCC will be keeping an eye on the geopolitical shifts brought about by the new US presidency and, of course, on the early federal elections in Germany – and the voting software used there.

(mack)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.