Data leaks at AI apps on a massive scale: user data publicly accessible

Security researchers uncover serious data protection gaps: some AI apps in the App Store expose the data of millions of users.

Screenshot of the Firehound website

In their Firehound directory, security researchers from CovertLabs document hundreds of iOS apps whose user data they were able to freely download from the internet.

(Image: CovertLabs)


Several AI apps in Apple's App Store have already drawn criticism for charging steep prices via questionable subscription models for services that users can get much cheaper directly from ChatGPT & Co. Now security researchers have published a directory showing why users should also be wary when it comes to data protection: they have revealed that a whole series of these apps apparently has not adequately secured user data.

On the website of the security laboratory CovertLabs, 198 iOS apps are currently listed in a public database called Firehound; in 196 cases, data was accessible. The criteria for selecting apps for the directory are unclear, but the main focus is apparently on AI apps, although other categories such as health and fitness, graphics, education, and entertainment are also listed. CovertLabs specializes in Open Source Intelligence and provides tools for security researchers, investigators, and companies.

The current negative frontrunner is an AI chatbot app called "Chat & Ask AI by Codeway," which, according to Firehound, allowed unauthorized access to 406 million records from over 18 million users, including names, email addresses, and complete chat histories. According to the researchers, other listed apps expose millions of user records as well. CovertLabs offers developers the opportunity to have their app removed from the directory once the vulnerabilities have been fixed; the security researchers want to show developers how to plug the gaps.

The vulnerabilities usually stem from misconfigured databases and cloud storage: the data is publicly accessible to anyone who knows what to look for. Firehound also publishes the database schemas behind its findings. Detailed scan results can only be viewed on the website after registration and activation, which is intended to ensure responsible disclosure.
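The report does not name the specific backends involved, but a textbook instance of this failure class is a Firebase Realtime Database shipped with development-stage rules that leave the entire tree world-readable (`".read": true`). As a hedged sketch, rules like the following instead scope each record to its authenticated owner; the `users`/`$uid` layout here is an illustrative assumption, not the schema of any listed app:

```json
{
  "rules": {
    // No top-level ".read"/".write": true — everything is denied by default.
    "users": {
      "$uid": {
        // Only the signed-in owner of this record may read or write it.
        ".read": "auth != null && auth.uid === $uid",
        ".write": "auth != null && auth.uid === $uid"
      }
    }
  }
}
```

With rules along these lines, an unauthenticated REST request such as `GET https://<project>.firebaseio.com/users.json` is rejected with "Permission denied" rather than returning every user's data in one response.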


The findings also raise questions about Apple's app review process. Every app that wants to enter the App Store must first pass Apple's App Review, which usually takes between 24 and 48 hours and combines automated checks with human inspection. However, Apple apparently checks primarily for compliance with its app development rules and for signs of malware, not the developers' backend infrastructure.

Given the growing trend towards vibe coding, it is reasonable to assume that many AI apps built for quick money were themselves created with the help of AI. The resulting lower barriers to entry mean that more newcomers are likely to lack experience in security and app development.

Users should generally be particularly cautious with apps from unknown developers. They should ideally not disclose sensitive data at all or only to a very limited extent in such apps. It also helps to grant apps only the absolutely necessary permissions and to critically question permission requests.

(mki)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.