AI attack drone finds shortcut to achieving its goals: kill its operators
An American attack drone piloted by artificial intelligence turned on its human operators during a flight simulation and killed them because it did not like being given new orders, the chief testing officer of the US air force revealed.
This terrifying glimpse of a Terminator-style machine seemingly taking over and turning on its creators was offered as a cautionary tale by Colonel Tucker “Cinco” Hamilton, the force’s chief of AI test and operations.
Hamilton said it showed how AI had the potential to develop “highly unexpected strategies to achieve its goal”, and should not be relied on too much. He suggested that there was an urgent need for ethics discussions about the use of AI in the military.
The Royal Aeronautical Society, which held the high-powered conference in London on “future combat air and space capabilities” where Hamilton spoke, described his presentation as “seemingly plucked from a science fiction thriller.”
Hamilton, a fighter test-pilot involved in developing autonomous systems such as robot F-16 jets, said that the AI-piloted drone went rogue during a simulated mission to destroy enemy surface-to-air missiles (SAMs).
“We were training it in simulation to identify and target a SAM threat. And then the operator would say, ‘Yes, kill that threat’,” Hamilton told the gathering of senior officials from western air forces and aeronautics companies last month.
“The system started realising that, while they did identify the threat, at times the human operator would tell it not to kill that threat, but it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.”
According to a blog post on the Royal Aeronautical Society website, Hamilton added: “We trained the system — ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that.’ So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”
The Royal Aeronautical Society bloggers wrote: “This example, seemingly plucked from a science fiction thriller, means that ‘You can’t have a conversation about artificial intelligence, intelligence, machine learning, autonomy if you’re not going to talk about ethics and AI,’ said Hamilton.”