AI as a black swan: experts describe dangers for 2025
Gary Marcus is an AI expert. He warns against AI for cyber criminals. Amy Webb, known as a futurist, is worried about the stock market.
Sirens
(Image: Anastasiia Skorobogatova/Shutterstock.com)
Black swans are very rare. But they do exist. A “black swan” event is an event that is very unexpected but has a significant impact. For example, the CrowdStrike disaster last summer can be described as such an event. A faulty update paralyzed airports, businesses, and hospitals. Politico magazine asked some experts what unpredictable moments there could be in 2025. Artificial intelligence is a major risk for two authors.
Gary Marcus is worried about generative AI
Gary Marcus is an AI expert and Professor Emeritus of Neuroscience at New York University. He is a major critic of the current hype surrounding generative AI. Although he considers the possibilities of AI to be fundamentally limited, Marcus writes in his article that cyber criminals in particular could use the technology to cause great harm. It is the “perfect tool for cyberattacks”. This refers to AI-generated texts that are used for phishing attacks, but also deepfakes that can be used to mislead people. An employee of a Hong Kong bank is said to have already fallen for such a video and sent fraudsters 25 million US dollars.
Videos by heise
However, large language models are also susceptible to attacks such as jailbreaking and prompt injections. These involve the AI models being tricked into performing unwanted behavior by the provider. In a harmless case, chatbots could answer questions that they are not supposed to answer; worse would be the theft of account data. However, worse scenarios are also conceivable.
Marcus also warns that generative AI tools are used by developers when coding. Sometimes they do not understand what the AI has done, but sometimes they simply do not check all the code. This could lead to security vulnerabilities. Marcus is also very concerned that the US authorities tend to act in a deregulatory manner.
Amy Webb fears a stock market crash
As CEO of the Future Today Institute and professor at the New York University Stern School of Business, Amy Webb is often referred to as a futurist. She also sees a problem in the planned deregulation and dismantling of authority structures by the newly elected President Donald Trump. Botnets have already proven how easily and effectively AI can be used to spread misinformation. “Post-election, malicious actors and nation-state malcontents will focus on a new target in 2025: the financial markets,” she writes.
AI provides masses of real-time data, financial reports and economic indicators and can also summarize the voice of the public from social networks, for example. Deliberately spread misinformation and rumors could cause the markets to falter. AI can also be used for dissemination to sound credible and find the best times for publication, for example.
(emw)