Army aims to speed up data fusion on the battlefield
The service wants to combine multiple modes of data on the sensor platforms.
With the military adding more sensors to the battlefield, one of its next goals is better data fusion—the ability to combine data from multiple, sometimes disparate, sensor systems and create a near-real-time operational picture that can be viewed from the field.
To date, most sensors have operated in a somewhat stove-piped manner, with information going to a single operator or otherwise being kept separate from other sensors. Any mixing of that data into an operational picture takes place at a fixed site such as a command post. The Army is now looking for ways to combine data from multiple sources—satellites, unmanned aerial vehicles, ground sensors, radar—closer to the action and regardless of the operational environment.
The service’s Communications-Electronics Research, Development and Engineering Center (CERDEC) has issued a Request for Information to industry asking for ideas on building a common architecture to handle multi-modal data.
“The ultimate goal,” the RFI states, “is to process all collected information, make it accessible and discoverable through a standards-based data management framework, and reduce the processing and exploitation time between sensor and analyst.”
The Army’s multi-sensor platforms collect data in a variety of modes, including imagery, video, audio and radar returns, CERDEC said. Although that data can be combined at a command post, transmitting it all back can strain the network and introduce latency. Current systems “represent a critical flaw in producing actionable intelligence in a timely manner,” the solicitation says. Sharing and fusing the data on the sensor platforms themselves would not only take a load off the network, it would also give those in the field better situational awareness more quickly.
CERDEC said it is looking for platform-independent Processing, Exploitation and Dissemination, or PED, applications that speed up the processing and sharing of information and can be applied to real-time analysis, forensic analysis, or both. Real-time algorithms should reside on the sensor platform; forensic algorithms could take advantage of a wider range of information available via the cloud.
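To make the real-time, on-platform half of that split concrete, the rough Python sketch below shows one way detections from several modalities could be fused locally into a single compact track, so that only the small fused record crosses the network while raw data stays on the platform or goes to a cloud archive for forensic work. The class names, fields and confidence-weighted averaging scheme are illustrative assumptions, not anything specified in the CERDEC RFI.

```python
# Illustrative sketch only: a notional on-platform fusion step that publishes a
# compact fused "track" instead of streaming raw multi-modal data back to a
# command post. All names here are hypothetical, not drawn from the RFI.
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class SensorReading:
    modality: str          # e.g. "video", "radar", "acoustic"
    lat: float
    lon: float
    confidence: float      # detection confidence, 0.0 - 1.0

@dataclass
class FusedTrack:
    lat: float
    lon: float
    confidence: float
    modalities: List[str]

def fuse_on_platform(readings: List[SensorReading]) -> FusedTrack:
    """Confidence-weighted average of co-located detections (real-time path).

    Raw readings would remain on the platform (or be archived for later
    forensic analysis in the cloud); only this small record is shared.
    """
    total = sum(r.confidence for r in readings)
    return FusedTrack(
        lat=sum(r.lat * r.confidence for r in readings) / total,
        lon=sum(r.lon * r.confidence for r in readings) / total,
        confidence=mean(r.confidence for r in readings),
        modalities=sorted({r.modality for r in readings}),
    )

if __name__ == "__main__":
    detections = [
        SensorReading("video", 34.0521, -117.2430, 0.9),
        SensorReading("radar", 34.0523, -117.2428, 0.7),
        SensorReading("acoustic", 34.0520, -117.2433, 0.5),
    ]
    print(fuse_on_platform(detections))
```

A forensic-path algorithm, by contrast, would work over the archived raw data after the fact, where cloud storage and compute make a heavier analysis practical.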
The deadline for responding to the RFI is April 6.
This isn’t the Army’s only foray into improving data fusion. Last fall, the service launched a geospatial intelligence research effort with GIS Federal, an enterprise cloud computing and big data startup. That project will explore whether graphics chips, such as those made by Nvidia, could speed up the processing of geospatial intelligence applications.