Googlers resist secret military AI

Over 600 Google employees are calling on the company to abandon secret military contracts, because in such projects even the remnants of Google's AI ethics cannot be verified.

[Image: the word "Google" on the glass facade of an office building. How will Google's management decide? (Image: Daniel AJ Sokolov)]


“We are Google employees who are deeply concerned about ongoing negotiations between Google and the US Department of Defense,” AI experts employed at Google write to CEO Sundar Pichai. According to US media reports, the more than 600 signatories demand that Google reject the use of its artificial intelligence (AI) for secret military tasks. “As people working on AI, we know that these systems can centralize power and that they do make mistakes.”

Because large AI projects are running deep losses, their operators are eager to do business with militaries and intelligence services. Such contracts tend to be lucrative, and original intentions are discarded along the way. This unsettles the Googlers: the more than 600 signatories include over 20 managers, up to vice presidents of the data giant.

Earlier this year, competitor Anthropic refused to allow its technology to be used for domestic mass surveillance and for fully autonomous weapons. This angered the US Secretary of Defense so much that he placed Anthropic on a blacklist; since then, other suppliers to the military have been barred from buying from Anthropic. A lawsuit against the ban is pending.

Google's AI experts, including many from the DeepMind AI research lab, now want to rule out any secret use, because in secret deployments even Google itself cannot prevent its technology from being used in violation of its contracts. “We want AI to benefit humanity, not to be used in inhuman or extremely harmful ways,” the concerned Googlers write. “This includes lethal autonomous weapons and mass surveillance, but goes beyond that.”

As early as 2018, there were rumblings within Google's workforce. Over 4,000 employees signed a petition against a contract accepted by Google to create artificial intelligence for selecting targets for military strikes. In fact, Google let the contract expire; it was taken over by Palantir.

In addition, Google introduced its ethics rules for artificial intelligence at the time, consisting of seven general principles and four specific exclusion criteria: no technology that is likely to cause harm, no weapons or other technology designed to injure, no surveillance or espionage beyond internationally recognized norms, and no violation of international law or human rights.


However, Google abandoned this self-imposed restriction last year. In its place are three vaguely worded AI principles: first, bold innovation; second, responsible development and deployment; and third, collaboration. Lethal weapons are no longer explicitly excluded.

“The only way to guarantee that Google does not become associated with such harms is to reject any classified workloads,” the current letter from the workers states. “Otherwise, such uses may occur without our knowledge or the power to stop them.”

The US Department of Defense, for its part, wants to be allowed to use purchased AI for “all legal purposes.” What the department classifies as legal is subject to change. In the recent past, the US has repeatedly bombed civilian boats off the coast of Venezuela, citing the unverified accusation that drugs were on board. Survivors were not rescued and brought to justice, but killed.

Even the illegitimate president of Venezuela, Nicolás Maduro Moros, was forcibly abducted; dozens of his bodyguards were shot dead. AI is said to have supported this attack.

In the Iran war, Palantir's AI is said to have been used to select targets for military strikes. On February 28, the first day of the attack on Iran, a US missile hit a school in Minab. Over 100 schoolchildren and many other civilians died. Although the building was once used by the Islamic Revolutionary Guard, it had served as a civilian school for at least ten years. Palantir's Maven likely did not register this repurposing.

(ds)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.