The Department of Defense announced an update to DoD Directive 3000.09, Autonomy in Weapon Systems. The update aims to establish responsible policies governing military uses of autonomous systems and artificial intelligence (AI).
The requirements established in the Directive include the following:
- Autonomous and semi-autonomous weapon systems will be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.
- Persons who authorize the use of, direct the use of, or operate autonomous and semi-autonomous weapon systems will do so with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement.
- The weapon system has demonstrated appropriate performance, capability, reliability, effectiveness, and suitability under realistic conditions.
- The design, development, deployment, and use of systems incorporating AI capabilities is consistent with the DoD AI Ethical Principles and the DoD Responsible AI (RAI) Strategy and Implementation Pathway.
According to the DoD, the update also reflects changes in the Department over the last decade, changes in the world, and Department requirements to reissue and update directives within certain time periods.
“DoD is committed to developing and employing all weapon systems, including those with autonomous features and functions, in a responsible and lawful manner,” said Deputy Secretary of Defense Dr. Kathleen Hicks. “Given the dramatic advances in technology happening all around us, the update to our Autonomy in Weapon Systems directive will help ensure we remain the global leader of not only developing and deploying new systems, but also safety.”
The Directive was established to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements.
DoD Directive 3000.09 can be found here.