Author: Jane Edwards | Date Published: November 28, 2016
Quentin Gallivan, CEO of Pentaho, a Hitachi Data Systems subsidiary, has said government agencies should develop a centralized plan that leverages business analytics tools and an open-source framework like Hadoop to facilitate data integration and access.
Gallivan wrote that agencies should adopt an open-source framework that includes data governance practices and supports big data processing operations.
He also called on agencies to encourage personnel to integrate and process Hadoop data, have an onboarding process that can support many different data sources, and be able to turn data into analytic data sets for users on demand.
Agencies should develop a common data refinery designed to facilitate information delivery to numerous users, said Gallivan, who is also a senior vice president at HDS.
To maximize return on their investments, data professionals need to consider how each phase of the analytics pipeline, from raw data to end-user analytics, adds value and supports overall business goals, he added.