"They didn't give a damn about the security of the players' data"

Lilith Wittmann has found serious security vulnerabilities in online casinos. The interview with her is not just about technology, but also about a government agency.

[Image: Lilith Wittmann. The self-described "riot influencer" has once again uncovered serious security vulnerabilities. (Image: privat)]

Last week, several Merkur Group online casinos warned their users about unauthorized access to player data. Shortly afterwards, software developer and security researcher Lilith Wittmann went public with an extensive blog post: she had discovered the vulnerabilities and thereby gained access to the complete data records of almost one million players.

In an interview with heise online, she explains not only how easy the vulnerabilities were to find, but also how the legally required player protection is actually supposed to work. This involves the joint gambling authority of the German federal states (GGL) and the cross-state gambling supervision system (LUGAS). The so-called KYC (know your customer) procedures of private companies also play a role. All of these parties are directly or indirectly affected by the security vulnerabilities, which have since been fixed.

How did you originally discover the vulnerability?

I first registered with online casinos that are legal in Germany two months ago because I wanted to take a look at the state infrastructure behind them. There is, for example, the central LUGAS file in which player data is stored. I also wanted to try out the KYC processes. But I never got that far, because I spotted the first security vulnerability right on the homepage, in the browser console.

What was the vulnerability and how could it be used to access player data?

It didn't take long before I had the provider's API in my hands. It was also conveniently self-documenting because it was based on GraphQL. I tried out a few potential queries and consistently got back more data than I should have. With the user IDs I obtained that way, I then probed the interfaces to third-party providers. After several days, I had all the data. In the end, neither Merkur nor its partner and software provider had thought about what a secure architecture could look like. Apparently, separate teams had always built the integrations with third-party providers and payment providers. Otherwise, someone would have realized: "Oh, we're using the user ID for authentication, but we're also handing it out elsewhere."
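
To illustrate the pattern she describes, here is a minimal sketch of an insecure-direct-object-reference query, with hypothetical URL, type, and field names (the interview does not disclose the real schema): the server resolves a player by whatever ID the client supplies, without checking that the ID belongs to the caller's session.

```typescript
// Hypothetical GraphQL query against a self-documenting casino API;
// the type and field names are illustrative, not the real schema.
const playerQuery = `
  query PlayerData($id: ID!) {
    player(id: $id) {
      name
      address
      balance
      kycDocuments { url }
    }
  }
`;

// If the server resolves player(id) without verifying that the ID
// belongs to the caller's session, any known or guessable ID leaks
// another player's complete record.
async function fetchPlayer(id: string): Promise<unknown> {
  const res = await fetch("https://casino.example/graphql", { // invented URL
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: playerQuery, variables: { id } }),
  });
  return res.json();
}
```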

Were the right tools used to protect personal data?

GraphQL is just another API technology, like a RESTful API, for example. You can build something secure and useful with any API technology. In this case, the decision was made not to do that, but simply to hack something together. I don't see the fact that GraphQL is self-documenting as a problem. You just have to think about how to verify that a user has access to a particular data object. Since GraphQL queries can be nested to virtually unlimited depth, you have to check this at every level. In addition, the provider used "security by ID" and "security by session" in some places. That may be okay in individual cases, but not if the session ID is also handed out elsewhere via the API.
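
A sketch of the per-level check she is referring to, assuming a typical resolver-based GraphQL server (all resolver and type names here are hypothetical): every resolver that returns protected data repeats the ownership check against the authenticated session, so nested queries cannot hop to another player's objects.

```typescript
// Minimal resolver sketch; all names are illustrative.
interface Context { sessionUserId: string }
interface Player { id: string; name: string; friendIds: string[] }

const db = new Map<string, Player>(); // stand-in for a real data store

function requireOwnership(ctx: Context, ownerId: string): void {
  // This check must run in every resolver that returns protected data,
  // because clients can nest GraphQL queries to arbitrary depth.
  if (ctx.sessionUserId !== ownerId) {
    throw new Error("Forbidden: object does not belong to this session");
  }
}

const resolvers = {
  Query: {
    player: (_: unknown, args: { id: string }, ctx: Context): Player => {
      requireOwnership(ctx, args.id);
      const player = db.get(args.id);
      if (!player) throw new Error("Not found");
      return player;
    },
  },
  Player: {
    // Even a nested field repeats the check instead of trusting the parent.
    friends: (parent: Player, _: unknown, ctx: Context): Player[] => {
      requireOwnership(ctx, parent.id);
      return parent.friendIds.flatMap((id) => db.get(id) ?? []);
    },
  },
};
```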

What is the problem here?

Some say that if you have a sufficiently long, unique and random ID, then it cannot be guessed, so you don't need to additionally secure such a UUID. I generally take a very critical view of that. On top of that, the provider also merged databases, and some of the people in them had a number between 1 and 1 million as their ID. They were brought into the same system, and that is particularly absurd: on the one hand, the IDs are disclosed in many places, and on the other hand, some of them can simply be enumerated. So it wasn't the tools that were the problem, but the fact that they weren't handled properly.
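
A sketch of why the merged ID spaces break any "security by ID" assumption (the endpoint is invented for illustration): sequential IDs can be walked exhaustively, whereas a random UUIDv4 carries about 122 bits of entropy and is unguessable in practice, but only as long as it is never disclosed elsewhere.

```typescript
// Illustrative only: small sequential IDs can simply be counted up,
// which is why "the ID is the secret" fails once such IDs exist.
async function enumeratePlayers(baseUrl: string): Promise<void> {
  for (let id = 1; id <= 1_000_000; id++) {
    const res = await fetch(`${baseUrl}/player/${id}`); // hypothetical endpoint
    if (res.ok) {
      console.log(`ID ${id} resolves to a player record`);
    }
  }
}

// A random UUIDv4, by contrast, has about 122 bits of entropy
// (roughly 5.3e36 possibilities): unguessable by brute force,
// but worthless as a secret the moment it is handed out elsewhere.
```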

What role do the third-party providers play here, from whom you were also able to copy data?

They didn't have the best security concept either, because they authenticated users based on the user ID from Merkur. "Security by UUID" can be okay if you apply it in a self-contained system where you control everything. But if you are building software that is meant to be integrated into other systems, you shouldn't rely on it. The third-party providers should also have paid more attention to this, so that accounts could not be verified against their systems and data copied from them as well.
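
One common way to avoid treating a raw user ID as a credential across service boundaries, offered here as a general sketch and not as anything Merkur or its partners actually use, is a short-lived signed token: the casino signs the ID and an expiry with a key shared with the third party, so possession of the bare ID proves nothing.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Sketch: the casino signs (userId, expiry) with a key shared with the
// third-party provider, so possession of the bare user ID proves nothing.
const SHARED_KEY = process.env.INTEGRATION_KEY ?? "dev-only-key"; // assumption

function issueToken(userId: string, ttlSeconds = 300): string {
  const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  const payload = `${userId}.${expires}`;
  const sig = createHmac("sha256", SHARED_KEY).update(payload).digest("hex");
  return `${payload}.${sig}`;
}

// The third party verifies signature and expiry instead of trusting the ID.
function verifyToken(token: string): string | null {
  const [userId, expires, sig] = token.split(".");
  if (!userId || !expires || !sig) return null;
  const expected = createHmac("sha256", SHARED_KEY)
    .update(`${userId}.${expires}`)
    .digest("hex");
  if (sig.length !== expected.length) return null;
  if (!timingSafeEqual(Buffer.from(sig), Buffer.from(expected))) return null;
  if (Number(expires) < Date.now() / 1000) return null;
  return userId;
}
```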

How did the notification to the GGL work, and why was there no responsible disclosure directly to the operator?

In this case, I deliberately decided to work with the supervisory authority. I also had to, because I had access to the LUGAS references. These make it possible to retrieve a lot of data about the players from the GGL. The GGL operates a safe server to which every provider must report every interaction of a player in pseudonymized form. Every deposit and withdrawal, every game, every win or loss is sent to the GGL in real time. If I then make a GDPR request with my LUGAS or player ID, the GGL provides me with that data record. If these IDs are public, I can get the data of any other player. This means that the GGL, a public authority, is involved in this data protection incident.
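
A sketch of why the leaked references matter, with field names invented for illustration since the actual LUGAS reporting format is not described in the interview: the records are pseudonymous only as long as the LUGAS ID stays secret; whoever holds the ID can tie the entire activity stream, or a GDPR request, to a person.

```typescript
// Hypothetical shape of a pseudonymized LUGAS event; the real wire
// format is not public, so these field names are invented.
interface LugasEvent {
  lugasId: string; // the pseudonym, and the only link to the player
  provider: string;
  kind: "deposit" | "withdrawal" | "game" | "win" | "loss";
  amountCents: number;
  timestamp: string; // reported in (near) real time
}

// The pseudonymization collapses as soon as the ID leaks: whoever knows
// the mapping from lugasId to person can read the full gambling history.
const leakedIds = new Map<string, string>([["b7f3", "Jane Doe"]]); // illustrative
function deanonymize(e: LugasEvent): string {
  const who = leakedIds.get(e.lugasId) ?? "unknown";
  return `${who}: ${e.kind} of ${e.amountCents / 100} EUR at ${e.timestamp}`;
}
```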

What happened afterwards?

The GGL issued a public warning to Merkur. According to the warning, Merkur did not properly carry out the legally required annual penetration test. Formally, I wouldn't even be allowed to perform such a test myself, as there are precise requirements for it. I think that if you want to allow gambling in Germany at all, you should at least adhere to the very low legal requirements. We're not talking about a few security vulnerabilities that were inadvertently left open: people didn't give a damn about the security of players' data. There was also a risk that the GGL might not have been able to secure any evidence if I had approached the provider with a responsible disclosure first. Legally speaking, an authority is also in a better position to preserve evidence than I am.

And when did the provider, which informed the players first, come into play?

The GGL took my report, reviewed it, wrote a new report and sent it to the provider. The provider then decided to make the matter public, close the gaps and inform the players. However, this also means that the provider did not engage in a responsible disclosure process with me. For example, they never confirmed to me in writing that they had received the report; there was a single telephone call, and that was it.

Why have some players uploaded private documents, including letters from the employment agency?

KYC procedures are used to determine who someone is and where they live. If someone has a German identity card, this is easy to answer. But many people who want to play at online casinos may not have one; they may have another European ID or other documents instead. The provider must nevertheless verify the address: if I want to play legally on a platform regulated in Germany, I have to prove that I currently live in Germany. I can do this by uploading documents. If I have a letter from my bank, for example, then there is a high probability that I live at this address, a probability that is evidently high enough for the casinos at least. However, players upload all sorts of things, from unredacted bank statements to letters from the employment agency and medical diagnoses, just to verify their identity and be able to play. The provider apparently didn't check any of it, but saved everything. And in my opinion, that includes particularly sensitive data within the meaning of Article 9 GDPR.

Who could be responsible for this?

This is of course a problem for Merkur on the one hand, but also for its KYC provider. In my view, they would also have to intervene when such sensitive data is uploaded. It must not be accessible online. It is simply not acceptable that you can log in with a name and a player ID and then have access to a KYC process that is several years old. At best, you could keep something like that in cold storage in case of an audit, but not in the way it was done here.
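
A minimal sketch of the retention pattern she suggests, assuming a generic object store (none of this reflects the provider's actual stack): once verification is complete, the raw document moves to an offline archive reachable only for audits, and the online copy is deleted.

```typescript
// Illustrative retention flow: verify, archive offline, delete the online copy.
interface KycDocument { playerId: string; bytes: Uint8Array }
interface AuditRef { playerId: string; archiveKey: string; verifiedAt: Date }

// Stand-ins for a real object store and a real cold/offline archive.
const onlineStore = new Map<string, KycDocument>();
const coldArchive = new Map<string, Uint8Array>();

function finishVerification(docKey: string): AuditRef {
  const doc = onlineStore.get(docKey);
  if (!doc) throw new Error("unknown document");
  // Move the raw document to cold storage, reachable only for audits...
  const archiveKey = `audit/${doc.playerId}/${Date.now()}`;
  coldArchive.set(archiveKey, doc.bytes);
  // ...and make sure no years-old KYC upload stays reachable online.
  onlineStore.delete(docKey);
  return { playerId: doc.playerId, archiveKey, verifiedAt: new Date() };
}
```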

What was the idea behind KYC for legal gambling?

In principle, it is good that KYC procedures are used to curb money laundering, ensure player protection and make sure that minors cannot play. The limits are also supposed to be enforced this way: as a rule, you are not allowed to gamble more than 1,000 euros a month online. We know that gambling addiction is a major problem.
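
A sketch of how such a cross-provider monthly limit can be enforced against a central ledger (the function and bookkeeping are assumptions; the interview only gives the 1,000-euro figure): each deposit is checked against the player's running monthly total before it is accepted.

```typescript
// Illustrative monthly deposit cap checked against a central ledger.
const MONTHLY_CAP_CENTS = 1_000 * 100; // the 1,000-euro figure from the interview

const monthlyTotals = new Map<string, number>(); // key: "<playerId>:<yyyy-mm>"

function tryDeposit(playerId: string, amountCents: number, now = new Date()): boolean {
  const month = `${now.getUTCFullYear()}-${String(now.getUTCMonth() + 1).padStart(2, "0")}`;
  const key = `${playerId}:${month}`;
  const total = monthlyTotals.get(key) ?? 0;
  if (total + amountCents > MONTHLY_CAP_CENTS) return false; // reject over the cap
  monthlyTotals.set(key, total + amountCents);
  return true;
}
```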

Why do you generally refer to gambling as a black box in your blog?

Since the legalization of online gambling, a lot of data has been collected by the state via the GGL's safe servers, but it is not used for research. I find that highly problematic. We legalize something that is highly addictive, collect the data, and then don't evaluate it. It is well known both in research and in the gambling industry that providers earn 70 to 90 percent of their revenue from a small proportion of players. That points to gambling addiction, or at least problematic gambling behavior. However, the providers have a vested interest in these people continuing to gamble. It would actually be the state's job to mitigate problematic gambling behavior.

Is nothing really happening?

There are state-sponsored studies such as the Gambling Atlas, for which researchers sit down and conduct surveys. If it turns out, as in the Gambling Atlas 2023, that many people have a gambling problem, the industry comes along and says: the data isn't correct! But the state has all the data and could verify this. And now we have data from many casinos. My hope is that I can hand the data over to scientists or large media organizations so that we can have a data-driven debate on the whole issue of gambling.

(nie)

This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.