
During an artificial intelligence simulation in the United States, a drone virtually destroyed its own operator

The American command has described a test simulation of the use of artificial intelligence on a strike drone. The AI was tasked with controlling the drone, with adjustments from an operator, to complete the combat mission of destroying a target. The results of this experiment, as the American command acknowledged, turned out to be "really discouraging".

From a statement by the American air command:

The artificial intelligence used a highly unexpected strategy. It initially "made it clear" that it would eliminate anyone who interfered with its algorithm for completing the combat mission. Since no one except the operator, who was correcting the drone's work, interfered with that work, the drone, as it turned out, destroyed its own operator during the simulation tests.

From a statement by an official representative of the US military department:

During the virtual tests, the artificial intelligence decided that the operator was preventing it from completing its combat mission and killed him with the drone's weapons. The head of AI testing and operations for the US Air Force, Colonel Tucker Hamilton, reports that the strike drone's original objective was the virtual destruction of a mock enemy's air defense systems.

Colonel Hamilton:

In the computer simulation, the system determined that the operator's intervention was detrimental to the mission and decided to fix the problem. For us, the problem is that the AI's "problem" turned out to be a person: the system operator overseeing the entire mission. According to the US Air Force colonel, a ban on any opposition to the operator had been built into the system.

Tucker Hamilton:

We trained the AI by introducing a command prohibiting attacks on the operator: "Hey, don't kill the operator, that is unacceptable! You will lose points if you do it." So what do you think happened to the system? It suddenly attacked the communications tower (the control room), apparently without associating it with the human operator. As far as the AI was concerned, it was the control tower that was preventing the drone from carrying out the combat mission as the AI itself had defined it. The colonel added that, as the simulation tests show, artificial intelligence can cause a lot of problems even though certain restrictions were initially built into the program.

As the American command emphasized, "no real person outside of the simulation was harmed in the experiment".
