What's the answer to better geospatial intelligence?
The military has grown so good at collecting data that it risks suffocating in its own success.
It’s like visiting a high-end all-you-can-eat buffet. Where do you begin and when do you stop?
Growing intelligence demands are requiring military leaders to turn to highly advanced technologies such as Web-based geospatial information, high-resolution imagery, social media and secure mobile apps to create a seamless knowledge environment for identifying enemy intentions and movements.
Yet even as new intelligence sources are discovered and used, it is no surprise that the military continues to drown in information. An astounding amount of raw data is now pouring into fixed and mobile systems, challenging users to squeeze sense out of all the information at their fingertips.
Information Overload
“One of the biggest problems [military planners and troops] face is information overload,” said Richard Cooke, vice president of geospatial intelligence solutions at ITT Exelis in McLean, Va. “There's more data than they have people to possibly consume it.”
A superabundance of data, as well as information trapped inside incompatible formats, can prevent crucial intelligence from reaching the people who most need to use it.
“The soldier is faced with myriad challenges when working to achieve situational awareness of the battlespace through a common operating picture (COP),” said Joshua Delmonico, chief of the systems, engineering and architecture branch of the Army Geospatial Center (AGC) in Alexandria, Va. “Even when soldiers share a COP it is often incomplete, which can lead to uncoordinated and unsynchronized operations.” He added that “these challenges prevent the soldier from leveraging tactically collected intelligence within mission command networks and systems.”
Like many other parts of the Defense Department, the Army is working hard to ensure that troops not only receive the intelligence they need, but have the tools and skills necessary to accurately interpret the information.
“We deliver geospatial training, technical support and reach-back capabilities to the soldier in the field, host an on-demand environment for the force and are an authoritative provider for geospatial data, information and analytics,” Delmonico said. “The AGC also develops common data, analytical and decision applications across the force.”
Addressing intelligence access and manageability challenges, the AGC is developing the Army Geospatial Enterprise (AGE), an integrated system of technologies, standards, data, people and processes designed to provide a Standard and Sharable Geospatial Foundation, in effect a common COP, at all echelons. Delmonico observed that standardized, synchronized and shareable geospatial information is crucial to the Army’s modernization initiatives.
“It will provide the decisive advantage to the total force now and in the future through greatly increased mission command capabilities that ensure all our forces and mission partners have the right information, at the right time, in the right place,” Delmonico said. “The AGE provides the framework for the Army to achieve this advantage.”
Data Analysis/Going Commercial
A growing number of intelligence experts, inside and outside the military, are reaching the conclusion that high-speed analytical software is essential for giving users the ability to plow through all of the “big data” collected by satellites, aerial and terrestrial reconnaissance vehicles, networked sensors, Web data and other information-collection sources.
“The ability to automate the analysis of that data is going to become a key focus area,” said Jordan Becker, vice president and general manager of BAE Systems’ geospatial intelligence and intelligence, surveillance and reconnaissance (ISR) business.
To better cope with rapidly rising intelligence data volumes, for example, the National Geospatial-Intelligence Agency (NGA) is looking to shift a growing number of analysis tasks from humans to machines.
“Data management is what makes sense of all these data streams that come in from the outside world,” said Mark Reardon, chief of staff at the NGA’s InnoVision directorate, based in Springfield, Va. “I think we're going to need to look at technologies that help us harvest, and, even more so, help us manage these massive amounts of data that come in.”
As intelligence experts search for automated ways of efficiently identifying and disseminating critical intelligence data, more attention is being paid to commercial software and hardware tools, which often cost less and/or outperform military-developed or proprietary technologies. “Especially through the lens of geospatial technologies, we've seen a tremendous acceleration in commercial technologies, particularly in mobile computing and location-enabled devices and apps,” said Keith Masback, president of the United States Geospatial Intelligence Foundation, a Herndon, Va., organization dedicated to promoting geospatial intelligence tradecraft.
“Web apps such as Google Earth are commonly used as an effective tool to communicate GEOINT information,” said Curtis Rowland, technical director of the National Air and Space Intelligence Center’s data analysis group, headquartered at Wright-Patterson Air Force Base, Ohio.
Rowland noted that Google Earth is popular and widely used in the GEOINT field because it allows analysts to connect context and image visualization information to geographic data. “If the [user] can't understand the intelligence, or easily ingest it into their mission processes, then the information has little value,” Rowland said.
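Google Earth ingests data in KML, an XML format that ties descriptive content to geographic coordinates. As a rough illustration of how an analyst annotation might be packaged for that kind of viewer, the Python sketch below builds a single KML placemark; the site name, description and coordinates are invented for the example.

```python
# Illustrative sketch: packaging an analyst annotation as a KML placemark,
# the XML format Google Earth reads. The name, description and coordinates
# below are invented for the example.
import xml.etree.ElementTree as ET

def make_placemark(name: str, description: str, lon: float, lat: float) -> str:
    """Return a minimal KML document containing one annotated placemark."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    pm = ET.SubElement(doc, "Placemark")
    ET.SubElement(pm, "name").text = name
    ET.SubElement(pm, "description").text = description  # analyst context
    point = ET.SubElement(pm, "Point")
    # KML orders coordinates as longitude,latitude[,altitude]
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

print(make_placemark("Site A", "Observed vehicle activity, 14 May", 44.361, 33.312))
```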
“That NGA is providing the foundation data with tools like Google Earth is really remarkable,” said Donald Schiber, deputy chief of the Air Force ISR Agency’s requirements division at Lackland AFB, Texas. “We know we have a lot of data coming on line, and we're just now grasping how we can properly leverage that data—in the future that's really what we're looking toward.”
Commercial systems are also making headway in data acquisition, most notably in earth-orbiting satellites. When launched in 2013, GeoEye’s GeoEye-2 satellite, built by Lockheed Martin, will have the highest resolution and be the most accurate commercial satellite, according to the company. GeoEye-2 will be followed by DigitalGlobe’s WorldView-3, which is scheduled to launch in mid-2014 and will offer 0.31-meter resolution panchromatic and eight-band multi-spectral imagery.
“I think it's important that you talk about both GeoEye 2 and WorldView-3,” Masback said. “These two new satellites are a tremendous improvement over the last generation, and will keep these two companies as having the two most capable commercial satellites on orbit in the world.”
Workflow Improvements
Even as technology plays a greater role in analyzing, prioritizing and disseminating intelligence, it’s widely understood that some tasks simply can’t be automated. Machine vision, for instance, has its limits. A discerning human eye is often needed to detect important objects embedded deeply inside a photographic image. While a computer can provide an image’s precise date, time and geographic coordinates, a living expert is often required to supply situational or historical context.
But this doesn’t mean that there isn’t room for improvement. NGA’s Reardon, among others, believes that analytical work practices are going to have to change with the times.
“The idea is that we're not just looking at the software side, we're also looking at how we do business internally,” he said. “Is there something better we can do to improve our ability to collect data? Is there some kind of business process improvement we can make?”
Becker observed that the restructuring of traditional analytics promises to create major efficiency and productivity benefits. “The workflow that analysts practice today dates back to the World War II era, when aircraft would collect ground imagery,” he said. The photographs would then be studied for evidence of convoys, bomb-making facilities and other activities of interest, and disseminated to authorized users. The work was effective but time-consuming. It also created silos of data that provided only a partial view of the activity. “That's pretty much the same workflow we still use today,” Becker said.
But that process may not exist for much longer. “What you'll see five years from now is a breaking down from a linear, serialized process to a process where a lot of different activities go on in parallel,” he said. “It will turn things from looking at what happened in the past to more of the real-time analysis so you can tell what's going on today.”
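To make the contrast concrete, here is a minimal Python sketch, with invented stand-in stages, of the difference between a serialized pipeline and one where exploitation tasks run in parallel; it illustrates the shape of the change, not any specific program.

```python
# Minimal sketch contrasting a serialized exploitation workflow with a
# parallel one. The stage function and image IDs are invented stand-ins.
from concurrent.futures import ThreadPoolExecutor

def exploit(image_id: str) -> str:
    """Placeholder for a time-consuming analysis step."""
    return f"report for {image_id}"

images = ["img_001", "img_002", "img_003"]

# Linear, serialized process: one image at a time, start to finish.
serial_reports = [exploit(i) for i in images]

# Parallel process: many exploitation tasks in flight at once, so results
# arrive closer to real time as collection continues.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel_reports = list(pool.map(exploit, images))

assert serial_reports == parallel_reports
```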
Big Data Poses Major Challenge
Big data pouring into systems from multiple intelligence sources in different formats slows down analysis on even the most powerful servers and makes it difficult to spot nuggets of actionable intelligence nestled within streams of useless static. “We're drowning in petabytes of data, which is distributed and will remain that way because it's difficult to consolidate across different government agencies and geographies,” Becker said.
As social media, the king of big-data platforms, enters the intelligence mainstream, it’s becoming increasingly critical to have analytical systems that can automatically identify and manage different types of information.
“All this stuff can be feeding into the system, but if you aren't able to manage it, if you aren't able to attach the metadata to it and understand what it means and very rapidly assimilate it, then it's not of much use,” Masback said.
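As a sketch of what “attaching the metadata” might look like in practice, the Python fragment below normalizes a raw social-media post into a tagged, geolocated record a downstream system could index. The field names, record layout and sample post are all invented for illustration.

```python
# Hypothetical sketch: attaching provenance, time and location metadata to
# a raw social-media post so a downstream system can index it. All field
# names and the sample post are invented.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class TaggedRecord:
    source: str              # originating platform
    observed_at: datetime    # when the post was created
    lat: Optional[float]     # None when the post carries no geotag
    lon: Optional[float]
    text: str
    ingested_at: datetime    # when our system received it

def normalize(post: dict) -> TaggedRecord:
    geo = post.get("geo") or {}
    return TaggedRecord(
        source=post.get("platform", "unknown"),
        observed_at=datetime.fromisoformat(post["created_at"]),
        lat=geo.get("lat"),
        lon=geo.get("lon"),
        text=post.get("text", ""),
        ingested_at=datetime.now(timezone.utc),
    )

record = normalize({
    "platform": "microblog",
    "created_at": "2012-09-14T08:30:00+00:00",
    "geo": {"lat": 33.312, "lon": 44.361},
    "text": "Heavy truck traffic near the bridge",
})
```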
Crowd-sourcing applications, which use throngs of people on the ground in real-life situations to provide continuous intelligence feedback, create a particularly knotty big-data challenge. “The crowd can generate lots of different data, but if I can't put it together to create knowledge and make sense of it in time to make a decision, [then] it's not terribly useful except as a forensic tool,” Masback said.
Assuming it can ever be done, gaining control over large amounts of unruly data promises to take data intelligence to an entirely new level of value. “With the ability to handle really big data—to process it, store it, move it around, have algorithms that can quickly troll through data and look for connections and non-obvious relationships—that's really a hugely important capability moving forward,” Masback said.
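One simple form of “looking for connections and non-obvious relationships” is counting how often pairs of entities appear together across reports. The Python sketch below does exactly that over a few invented reports; production link-analysis systems are, of course, far more sophisticated.

```python
# Toy link analysis: count how often pairs of entities co-occur in the same
# report. Entities and reports are invented for illustration.
from collections import Counter
from itertools import combinations

reports = [
    {"vehicle_17", "warehouse_3", "person_A"},
    {"vehicle_17", "warehouse_3"},
    {"person_A", "warehouse_3", "vehicle_22"},
]

pair_counts = Counter()
for entities in reports:
    for pair in combinations(sorted(entities), 2):
        pair_counts[pair] += 1

# Pairs seen together more than once hint at a relationship worth a look.
for pair, count in pair_counts.most_common():
    if count > 1:
        print(pair, count)
```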
A Fused Future
With intelligence streams and data types rapidly multiplying, it is inevitable that disparate intelligence sources eventually will be fused into linked data structures for easier access and exploitation by users involved in forensic and anticipatory analysis. “Today you have independent collection and analysis, whether it's imagery, signals, hyper-spectral data or human intelligence reports,” Becker said. “I think you're going to see fusion of these different sources together where they're linked in new ways that allow analysts to make sense out of them.”
Rowland agrees. “The most challenging piece of the puzzle will continue to be the prediction of the adversary's potential courses of action,” he said. “Support for that requirement will necessitate the ability to fuse GEOINT data, spatially and temporally, for a large number of events or activities.”
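A bare-bones way to read “fuse GEOINT data, spatially and temporally” is to group events from different INT sources that fall inside a shared distance and time window. The Python sketch below does that with invented events and arbitrary thresholds; it illustrates the idea, not any fielded fusion system.

```python
# Toy spatio-temporal fusion: pair up events from different INT sources
# that are close in both space and time. Events and thresholds are invented.
from datetime import datetime, timedelta
from math import asin, cos, radians, sin, sqrt

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers via the haversine formula."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    h = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

events = [  # (source, time, lat, lon, note)
    ("IMINT",  datetime(2012, 9, 14, 8, 0),  33.312, 44.361, "vehicles staged at site"),
    ("SIGINT", datetime(2012, 9, 14, 8, 20), 33.320, 44.370, "burst of radio traffic"),
    ("HUMINT", datetime(2012, 9, 15, 9, 0),  34.000, 45.000, "unrelated report"),
]

MAX_KM, MAX_DT = 5.0, timedelta(hours=1)
for i, a in enumerate(events):
    for b in events[i + 1:]:
        close_in_space = km_between(a[2], a[3], b[2], b[3]) <= MAX_KM
        close_in_time = abs(a[1] - b[1]) <= MAX_DT
        if close_in_space and close_in_time:
            print("fusion candidate:", a[0], "+", b[0])
```

In practice, fused candidates like these would feed the kind of predictive, course-of-action analysis Rowland describes.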