Author: Jane Edwards | Date Published: November 28, 2016
Quentin Gallivan
Quentin Gallivan, CEO of Hitachi Data Systems' Pentaho subsidiary, has said government agencies should develop a centralized plan that leverages business analytics tools and an open-source framework such as Hadoop to facilitate data integration and access.
Gallivan wrote that agencies should adopt an open-source framework that includes data governance practices and supports big data processing operations.
He also called on agencies to encourage personnel to integrate and process Hadoop data, establish an onboarding process that supports a wide range of data sources, and turn data into analytic data sets for users on demand.
Agencies should develop a common data refinery designed to facilitate information delivery to numerous users, said Gallivan, who is also a senior vice president at HDS.
To maximize return on investment, data professionals need to consider how each phase of the analytics pipeline, from raw data to end-user analytics, adds value and supports overall business goals, he added.
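As a rough illustration of the "data refinery" concept Gallivan describes, the following minimal PySpark sketch shows how raw records landed in Hadoop might be cleaned and aggregated into an analytics-ready data set for downstream users. It is a hypothetical example, not Pentaho's actual product or any agency's pipeline; the file paths, column names, and aggregation are assumptions chosen for illustration.

# Hypothetical sketch of a "data refinery" step: read raw data from HDFS,
# apply basic cleanup, and publish an analytic data set for BI tools.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-refinery-sketch").getOrCreate()

# Assumed input: raw agency event logs landed in HDFS as CSV (path is illustrative).
raw = spark.read.csv("hdfs:///landing/agency_events.csv", header=True, inferSchema=True)

# "Refine" step: governance-style cleanup plus aggregation into an analytic data set.
refined = (
    raw.dropDuplicates()
       .filter(F.col("event_date").isNotNull())
       .groupBy("agency", "event_date")
       .agg(F.count("*").alias("event_count"))
)

# Publish in a columnar format that downstream analytics tools can query on demand.
refined.write.mode("overwrite").parquet("hdfs:///refined/agency_event_counts")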