We’ve come a long way since 1778, when George Washington’s spies gathered and shared military intelligence on the British Army’s tactical operations in occupied New York. But information broadly, and the management of data specifically, remains the critical factor for situational awareness, streamlined operations, and a host of other use cases across today’s tech-driven battlefields.
War is now fought in both physical space and cyberspace, and just as in the physical domain, battle staffs require a comprehensive situational understanding of the cyber domain to mitigate risk: a common operating picture fed by streaming data from all applicable sensors across multiple enclaves. This ability to see the cyber battlespace enables the hardening of defenses, informs how adversary actions will impact friendly networks, and supports the identification of vulnerabilities and the development of contingencies, all of which are vital to countering and targeting adversarial capabilities and stopping disruptive effects.
Beyond Cyber Use Cases
And while operations in the cyber domain are more likely to make the evening news, there is a vast array of critical use cases that support the military’s need for a data architecture that collects, processes, and delivers any type of data, anywhere.
Consider how data is vital to solving the massive logistical challenge of procuring and supplying subsistence items to the hundreds of U.S. military installations spread across the globe. Or how a data- and AI-powered predictive maintenance program like Condition Based Maintenance Plus (CBM+) extends the life cycle of military assets, reduces costs, and more efficiently keeps U.S. hardware in the air, at sea, and on land.
To drive these data use cases, Department of Defense (DoD) communities and branches require a reliable, scalable data transport mechanism that delivers data from any source, from origination through every point of consumption: at the edge, on premises, and in the cloud, in a simple, secure, and universal way. The data transport architecture should provide guaranteed delivery, even in contested environments or under denied, disrupted, intermittent, or limited-bandwidth (DDIL) conditions.
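To make the guaranteed-delivery requirement concrete, here is a minimal store-and-forward sketch: records are buffered locally and retried with backoff until the receiver acknowledges them. This illustrates the general pattern only, not how CDF implements it; the SensorRecord type and the send() callback are hypothetical placeholders.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Predicate;

/**
 * Illustrative store-and-forward sender for intermittent (DDIL-style) links.
 * Records are buffered in a bounded local queue and retried with exponential
 * backoff until the transport reports a successful, acknowledged send.
 */
public class StoreAndForwardSender {

    /** Hypothetical sensor record; in practice this would be a schema-backed payload. */
    public record SensorRecord(String sensorId, long timestamp, byte[] payload) {}

    private final BlockingQueue<SensorRecord> buffer = new ArrayBlockingQueue<>(10_000);
    private final Predicate<SensorRecord> send; // returns true only when the receiver acknowledges

    public StoreAndForwardSender(Predicate<SensorRecord> send) {
        this.send = send;
    }

    /** Called by the sensor side; blocks (applies backpressure) if the local buffer is full. */
    public void enqueue(SensorRecord record) throws InterruptedException {
        buffer.put(record);
    }

    /** Drains the buffer, retrying each record until it is acknowledged. */
    public void run() throws InterruptedException {
        while (!Thread.currentThread().isInterrupted()) {
            SensorRecord record = buffer.take();
            long backoffMillis = 500;
            while (!send.test(record)) {           // link down or NACK: keep the record and wait
                Thread.sleep(backoffMillis);
                backoffMillis = Math.min(backoffMillis * 2, 60_000);
            }
        }
    }
}
```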
Universal Data Distribution Solves DoD Data Transport Challenges
These requirements could be addressed by a Universal Data Distribution (UDD) architecture. UDD provides a wide range of extensible capabilities, from real-time data ingestion, edge processing, transformation, and routing through to descriptive, prescriptive, and predictive analytics. It can connect to any data source anywhere, with any structure, process the data, and reliably deliver prioritized sensor data to any destination. Data can be securely shared across on-premises, public cloud, and hybrid cloud environments, as well as across multiple enclaves, with cross-enclave movement facilitated by any DoD-approved Cross Domain Solution.
Once the architecture is set up to ingest sensor data from thousands of endpoints into the UDD, the system can scale up or out based on mission requirements; in fact, one of our customers is ingesting data from over 100,000 endpoints. The UDD can act as a neutral, bidirectional data movement engine between sensors and any of the required data platforms. Sensor data from thousands of endpoints can provide valuable operational insights about threats and vulnerabilities, resulting in improved battlespace awareness.
Cloudera Data Flow (CDF) is the enabling technology for the UDD architecture, providing secure, universal, hybrid, and streaming data distribution. CDF can help DoD agencies manage data flows and sensor feeds effectively, with ease of use in mind.
- Data transfer requirements are instantiated in a low-code environment that specifies and automates reliable data movement.
- UDD offers a flow-based, low-code development paradigm that provides the best impedance match with how developers design, develop, and test data distribution pipelines.
- With over 400 connectors and processors, UDD enables a broad range of data distribution capabilities.
- Mission-specific, custom processors can also be developed as needed (a minimal sketch follows this list). These data distribution flows can be version-controlled in a catalog from which operators can self-serve deployments to different runtimes.
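Because CDF’s flow engine is built on Apache NiFi, a mission-specific processor is typically a small Java class that routes or transforms flow files. The sketch below assumes the standard NiFi processor API and routes records on a hypothetical sensor.priority attribute; the class name and attribute are illustrative, not part of any shipped product.

```java
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.ProcessorInitializationContext;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;

/** Illustrative custom processor that routes sensor data by priority. */
public class PrioritizeSensorData extends AbstractProcessor {

    public static final Relationship REL_HIGH = new Relationship.Builder()
            .name("high-priority")
            .description("Sensor records flagged as high priority")
            .build();

    public static final Relationship REL_NORMAL = new Relationship.Builder()
            .name("normal")
            .description("All other sensor records")
            .build();

    private Set<Relationship> relationships;

    @Override
    protected void init(final ProcessorInitializationContext context) {
        final Set<Relationship> rels = new HashSet<>();
        rels.add(REL_HIGH);
        rels.add(REL_NORMAL);
        this.relationships = Collections.unmodifiableSet(rels);
    }

    @Override
    public Set<Relationship> getRelationships() {
        return relationships;
    }

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return;
        }
        // Route on a (hypothetical) attribute set by an upstream ingest step.
        String priority = flowFile.getAttribute("sensor.priority");
        if ("high".equalsIgnoreCase(priority)) {
            session.transfer(flowFile, REL_HIGH);
        } else {
            session.transfer(flowFile, REL_NORMAL);
        }
    }
}
```

In NiFi terms, a processor like this is packaged as a NAR file, after which it appears in the flow designer alongside the built-in processors and can be version-controlled in the catalog like any other flow component.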
Our technology would enable the DoD to capture, store, analyze, and act on any data at massive speed and scale with CDF, Cloudera Data Platform (CDP), and Cloudera Machine Learning (CML).
Built on proven open source components and open standards with wide industry adoption, Cloudera solutions are readily implemented and straightforward to integrate with other platforms. These capabilities ultimately provide U.S. military and civilian leaders and analysts with the real-time and predictive insights they need to meet their objectives and support the mission. CDF, CDP, and CML offer a comprehensive data ecosystem for the security and governance of mission data, as well as a rich toolset for its operational use. Cloudera technologies are accredited in the IC and DoD communities.
Join me at AFCEA TechNet Augusta
Join me at the upcoming AFCEA TechNet Augusta on August 15-18, where I’ll present how a Universal Data Distribution (UDD) architecture powered by Cloudera solves the data transport challenge by providing a wide range of extensible capabilities, including:
- real-time data ingestion of sensor data from thousands of endpoints
- edge processing
- transformation
- routing through to descriptive, prescriptive, and predictive analytics
I hope to see you there.