Sentient AI: disagreement on this could divide humanity
Philosophy professor fears a division between people who perceive future AI as sentient and those who see it as unfeeling.
The debate over whether future artificial intelligence (AI) could be sentient may split people into two camps, fears philosophy professor Jonathan Birch of the London School of Economics. The background is the assumption by a group of scientists that AI consciousness could be possible as early as 2035, as reported by various US media outlets.
Ahead of the AI Action Summit on November 21 and 22 in Los Angeles, which will focus on a safety framework for AI, Birch voiced the concern that belief or disbelief in the feelings of AI could drive people apart. The debate was triggered by a statement from a group of scientists who assume that AI could exhibit forms of emotion within the next decade.
Social consequences of AI
Defining such emotions remains difficult and controversial, even among experts. How could feelings such as joy or pain be measured in an AI? And even if they could be, what rights should an AI be granted? A similar debate surrounds the treatment of animals and their welfare; there too, different cultural, religious and social interests collide.
According to Birch, companies also show little interest in dealing with the secondary and social consequences of AI. Patrick Butlin, a research fellow at Oxford University, warns of the risk that AI systems could resist in dangerous ways, which in his view would justify slowing down development. So far, however, no such assessment of consciousness is taking place, and technology companies are not commenting on the issue.
Even though the experts themselves disagree about whether AI will ever be conscious, the philosophers' concern about a societal split remains. A first step would be to acknowledge the problem and to define criteria against which the perceived sentience of AI can be measured. In the AI Deepdive, Wolfgang Stieler of MIT Technology Review talks about AI and consciousness.
(hoh)