The Department of Defense has updated its Autonomy in Weapon Systems directive to reflect DOD's continuing efforts to develop policies on military uses of artificial intelligence and autonomous systems, as well as changes across the department over the last decade.
DOD Directive 3000.09 was developed to prevent unintended engagements by mitigating potential failures in semi-autonomous and autonomous weapon systems, the Pentagon said Wednesday.
“Given the dramatic advances in technology happening all around us, the update to our Autonomy in Weapon Systems directive will help ensure we remain the global leader of not only developing and deploying new systems, but also safety,” said Deputy Defense Secretary Kathleen Hicks, an inductee into Executive Mosaic’s 2023 Wash100 list.
Among the directive's requirements, autonomous weapon systems must be designed to allow operators and commanders to exercise appropriate levels of human judgment over the use of force.
The directive also states that systems incorporating AI capabilities should be designed, developed, deployed and used in accordance with DOD's Responsible AI Strategy and Implementation Pathway and its AI Ethical Principles.
Register now for the Potomac Officers Club's 4th Annual Artificial Intelligence Summit, where DOD and GovCon leaders will share insights into recent AI advancements and their outlook on cutting-edge development approaches for 2023 and beyond.