As the use of artificial intelligence and machine learning continues to grow exponentially in the public sector, the intelligence community is on the cusp of big changes in the way it operates. Because AI/ML is increasingly critical to U.S. intelligence capabilities, the IC is now updating and adapting its tradecraft to include these advanced technologies.
However, integrating AI into an intelligence agency is neither simple nor straightforward, especially given the classification and sensitivity of IC data and the regulations that govern it.
Ramesh Menon, chief technology officer of the Defense Intelligence Agency, said the first step of adopting AI is to understand where it will fit into intelligence cycles or mission processes.
“Are we introducing an AI-based system to speed up the collection time, or are we using the system to make additional recommendations to the analyst?” Menon asked virtual attendees during the panel conversation of GovCon Wire Events’ Second Annual Data Innovation Forum. “And from that perspective, there are some regulations, it’s not like we could do what we want.”
Menon said DIA employs a dedicated team to ensure that anything the agency does regarding AI complies with ICD 209 guidelines. AI is certainly relevant to intelligence missions, Menon suggested, but from a technology perspective, there are many more layers of complexity.
“There is the accuracy of the model, the validity of the inference. Is it a closed loop control system? Are we introducing bias and drift in the outcomes and results? How is that being used?” Menon asked, underscoring the importance of the responsible and ethical use of AI, as outlined in a DOD memo.
AI is also transforming IC workflows. Before AI/ML technology was introduced to the National Geospatial-Intelligence Agency, workflows were “linear and somewhat transactional,” according to Jim McCool, director of NGA’s data and digital innovation directorate.
Now, McCool said NGA is envisioning what the workflow will look like in an environment in which “the data is streaming.”
“We just started at NGA talking about how that tradecraft is applied by being above that stream of data, and not necessarily transactionally halting the data at any particular moment,” McCool revealed.
AI/ML is presenting opportunities for new data tradecraft, but McCool noted that “the existing analytic tradecraft needs to find its role without slowing down the machine learning” as the integration of these technologies accelerates.
McCool’s concept of not hindering the flow of data is one that industry embraces. Dan Carroll, field CTO for cybersecurity at Dell Technologies, said he’s seeing an increased capability for compute and processing at the tactical edge. To Carroll, this means “really acting on data where it is generated,” which can benefit processes both at the edge and within the larger enterprise.
Carroll said you can act on data “where it exists, you get an outcome, you support that mission locally, and then you can send the results and smaller sense of that data back to a core processing center to help you understand and expand the wider mission.”