5G, AI & More: Hitachi Vantara Federal’s Phil Fuster Talks GovCon Trends & Emerging Tech

In today’s ever-changing federal landscape, technologies like artificial intelligence and 5G are the future, and they’re already unlocking new opportunities and capabilities across a wide range of federal missions. In this Executive Spotlight interview, Hitachi Vantara Federal Chief Growth Officer Phil Fuster explained which technologies, trends and shifts are ruling the public sector today and why.

Read below for Phil Fuster’s Executive Spotlight interview.

Which emerging technologies do you anticipate will have the greatest impact on the federal landscape in the next few years?

The areas that I see having the most impact on the federal landscape are AI, DataOps and ubiquitous connectivity from technologies like 5G and Starlink. Artificial intelligence can learn how your best operator or analyst interacts with data and then perform that processing at line speed in a scalable way. That's especially beneficial when agencies face turnover, retirements and talent pipeline challenges.

Additionally, DataOps serves as the scientific framework for connecting vast datasets from diverse sources, ensuring they are seamlessly meshed, unbiased and readily accessible to decision-makers. Today, data is scattered across geographically dispersed and often incompatible systems in both IT and OT/IoT, which makes seamless integration and use of that information difficult. Compounding the challenge, data exists in different formats, creating hurdles at the level of both file structures and data types. The key is the ability to normalize data, transforming it into a standardized format and a common language so it can be used easily. Normalization is what overcomes the complexity of disparate data and produces a cohesive, accessible dataset for informed decision-making.
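As a toy illustration of that normalization step, the sketch below maps records from two hypothetical, incompatible sources (an IT feed and an OT/IoT feed) onto one common schema. All field names, units and conversions here are invented for the example and do not reflect any Hitachi Vantara product.

```python
from datetime import datetime, timezone

# Hypothetical raw records from two incompatible sources: an IT system
# (JSON-style dict, Fahrenheit, ISO timestamps) and an OT/IoT sensor
# feed (CSV-style row, Celsius, Unix epoch).
it_record = {"deviceId": "PUMP-01", "ts": "2024-01-15T08:30:00Z", "temp_f": 98.6}
ot_row = ["PUMP-01", "1705307400", "37.0"]  # id, epoch seconds, temp in Celsius

def normalize_it(rec: dict) -> dict:
    """Map an IT record onto the common schema (UTC time, Celsius)."""
    return {
        "device_id": rec["deviceId"],
        "timestamp": datetime.fromisoformat(rec["ts"].replace("Z", "+00:00")),
        "temp_c": round((rec["temp_f"] - 32) * 5 / 9, 2),
    }

def normalize_ot(row: list) -> dict:
    """Map an OT/IoT row onto the same common schema."""
    device_id, epoch, temp_c = row
    return {
        "device_id": device_id,
        "timestamp": datetime.fromtimestamp(int(epoch), tz=timezone.utc),
        "temp_c": float(temp_c),
    }

# After normalization, both records share one format and "language,"
# so downstream analytics can treat them as a single dataset.
dataset = [normalize_it(it_record), normalize_ot(ot_row)]
for rec in dataset:
    print(rec)
```

Once every source is expressed in the same schema, deduplication, bias checks and analytics can run over one coherent dataset instead of per-system silos.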

Lastly, the ability to access these transformative tools anywhere and everywhere is paramount for government agencies. Whether responding to a disaster or engaged in active military operations, the advent of 5G and Starlink enables organizations to deploy traditional data center capabilities at the edge. For instance, your most knowledgeable analyst can be on-site — be it at the U.S. border or on an airborne platform — processing raw data in real time, thereby expediting decision-making. The convergence of these technologies promises to significantly impact decision-making, ensuring the right information is available in the right place and at the right time.

Let’s talk more about AI. In which applications are you seeing the highest demand for AI/ML from your federal customers, and can you explain what’s driving that demand?

Two key areas where the demand for artificial intelligence is notably high among federal customers are intelligence gathering and sensor data analysis. In the realm of law enforcement, there’s a critical need to collect and process disparate data in various formats from multiple devices quickly and efficiently. The integration of AI with DataOps technologies proves instrumental in expediting data ingestion, reducing the timeline from months to days. This combination allows for the assembly of raw data into actionable intelligence, akin to the expertise of seasoned agents, making the entire process scalable.

Another significant demand area is sensor data analysis, particularly in the U.S. military. With an extensive portfolio of sensor-collection equipment, including airborne sensors, ground stations, HUMINT (human intelligence) and cyber assets, there's a constant influx of data flowing to our warfighters. Normalizing this data, coupled with AI processing capabilities, becomes pivotal for providing instant intelligence to field commanders. Expert AI platforms can handle data ingestion rates as high as 100TB per second, enabling real-time classification and processing — a game-changer for field commanders and decision-makers at the edge.
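To make that normalize-then-classify flow concrete, here is a minimal, hypothetical sketch of real-time triage over a simulated sensor feed. The threshold-based classifier is a stand-in for a trained AI model, and the field names and volumes are invented; the production platforms Fuster describes operate at vastly higher throughput.

```python
import random

def sensor_stream(n: int):
    """Simulate a feed of raw readings from mixed sensor sources."""
    for i in range(n):
        yield {"sensor": f"S-{i % 3}", "signal": random.random()}

def classify(reading: dict) -> str:
    """Toy stand-in for an AI model: flag strong signals for review.
    A real system would run a trained model here, not a threshold."""
    return "priority" if reading["signal"] > 0.8 else "routine"

# Ingest, classify and route each reading as it arrives, so a field
# commander sees "priority" items without waiting for batch processing.
for reading in sensor_stream(10):
    label = classify(reading)
    print(f'{reading["sensor"]}: {label} (signal={reading["signal"]:.2f})')
```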

We’re already seeing news about the next iteration of 5G. As technology develops, what do you think is on the horizon for 5G?

Looking ahead, the next iteration of 5G holds transformative potential, particularly in optimizing operations on the flight line and at ports. Consider the volume of data generated by aircraft during combat or training flights: many terabytes. Advanced 5G technology facilitates real-time transfers of sensor information, encompassing details on weapons, pilots and aircraft maintenance. This real-time capability equips ground crews with the information they need to efficiently resupply aircraft, assess a pilot's readiness for relaunch and make informed decisions about potential emergent repairs or maintenance needs. A similar paradigm can be applied to ships entering ports, ushering in a new era of data-driven efficiency and decision-making in both the aerial and maritime contexts.

One of the most pressing concerns about data in today’s digital landscape is the sheer amount of it there is to be processed. What do you see as some of the more viable solutions to tackling this challenge?

Data mesh and smart technologies are available in the market today to tackle data challenges. The bigger issue is that there are many solutions, and most only handle a piece of the problem. Hitachi has a broad framework called Pentaho+ (Pentaho Plus), a comprehensive platform that can ingest over 300 types of data natively and, through an open API, map many others. It offers ETL and cataloging capabilities as well as AI capabilities for learning how an individual interacts with data. It also has a tiering system called Data Optimizer, which determines how to tier data and where to place it based on performance, cost and other factors. This allows organizations to keep frequently used data on NVMe (nonvolatile memory express) drives for instant access and move the rest to spinning disks, the cloud or an archive, while keeping metadata available so agencies can rehydrate the data at any time. Being able to create this global mesh of data, with metadata pointing to where each piece resides, is critical. It helps with accessibility, deduplication and replication of data to the right place. This type of comprehensive framework is what every agency and department needs to optimize its data operations.
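As a rough illustration of the tiering idea, the sketch below applies a toy placement policy based on access frequency and records each object's location in a metadata catalog. The tier names, thresholds and data structures are hypothetical; they are not Data Optimizer's actual rules or API.

```python
from dataclasses import dataclass

@dataclass
class DataObject:
    name: str
    accesses_per_day: float  # observed access frequency
    size_gb: float

def choose_tier(obj: DataObject) -> str:
    """Toy tiering policy: hot data goes to NVMe, cold data to archive.
    A real optimizer would also weigh cost, SLAs and replication needs."""
    if obj.accesses_per_day >= 100:
        return "nvme"
    if obj.accesses_per_day >= 10:
        return "spinning_disk"
    if obj.accesses_per_day >= 1:
        return "cloud"
    return "archive"

# Wherever the bytes land, the catalog keeps metadata pointing at the
# location, so each object can be found and rehydrated at any time.
catalog = {}
for obj in [DataObject("ops_telemetry", 250, 40),
            DataObject("quarterly_report", 2, 1),
            DataObject("2019_audit_logs", 0.01, 500)]:
    catalog[obj.name] = {"tier": choose_tier(obj), "size_gb": obj.size_gb}

print(catalog)
```

The catalog is the piece that makes the "global mesh" work: the data itself can live on any tier, but the metadata always knows where it is.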

Lastly, Phil, how do you think the proliferation of commercial entrants in space is changing the domain? What trends or shifts can we expect to see from that going forward?

Commercial entrants into space are game changers. Being able to create and deploy technologies without the constraints of contracting, fair competition, budget and bureaucracy is transformative. Commercial organizations like SpaceX, Virgin and Blue Origin are developing technologies faster than traditional methods allow and can change direction on a dime if the mission or agency requires it. They are launching technologies like Starlink that are changing the ability to communicate in contested environments; field commanders and remote users can access data and maintain comms in the most challenging conditions. Other transaction agreements (OTAs) are helping traditional development move in a timely fashion, but not at the scale that commercial organizations can take on at their own cost and risk.
