Getting a handle on big data

DOD seeks cost savings through big data apps while service providers look to demystify the concept.

The looming age of budget austerity is ramping up pressure on U.S. defense planners to leverage big data capabilities, both on and off the battlefield. For the military, the amorphous term “big data” refers to everything from signals intelligence, mobile phone and electronic warfare interceptions to satellite images to video imagery. That means the military is drowning in data, and the tide is unlikely to turn any time soon. The number of drones in the military arsenal, for example, has exploded in the past decade, from just 50 in September 2001 to 7,500 in April 2012, according to a recent Council on Foreign Relations report.

“The sheer volume of information creates a background clutter,” said the Defense Advanced Research Projects Agency acting director, Kaigham Gabriel, in announcing its XDATA program. Launched as part of the Obama administration’s Big Data Research and Development initiative, the XDATA program will spend $25 million annually over four years to develop computational techniques and software for defense data. A central challenge, according to the White House, will be developing scalable algorithms for processing imperfect data and creating effective human-computer interaction tools.

“Let me put this in some context,” Gabriel said. “The Atlantic Ocean is roughly 350 million cubic kilometers in volume, or nearly 100 billion, billion gallons of water. If each gallon of water represented a byte or character, the Atlantic Ocean would be able to store, just barely, all the data generated by the world in 2010.” Added Gabriel: “Looking for a specific message or page in a document would be the equivalent of searching the Atlantic Ocean for a single 55-gallon drum.”

Getting a handle on big data, however, increasingly requires striking a balance between growing military requirements and shrinking budgets.

The Defense Department faces deep cuts as it enters what Defense Secretary Chuck Hagel has described as a period of “unprecedented budget uncertainty.” The across-the-board cuts will be felt in personnel and program accounts, including reductions in intelligence analysis and production at combatant command intelligence and operation centers. DOD’s Strategic Choices Management Review aims to “foster closer integration and reduce duplication across defense enterprises,” Hagel said in late July. 

Looking for the ‘why’

The proliferation of sensors has helped fuel the big data glut. “We’ve contributed significantly to some of the information overload that a lot of our customers are dealing with,” acknowledged Richard Cooke, vice president of Geospatial Intelligence Solutions at Exelis.

“It’s all relevant and important data. The issue we’ve come up against is that there’s just so much of it,” Cooke said.

The problem, he added, is the variety and volume of data. “They [DOD] simply don’t have enough people to take advantage of all the information that’s there.” 

In response, attention is shifting from data management to efficiently extracting as much useful information as possible.

“We’ve got systems that can handle and manage this data,” Cooke said. The next challenge is creating software tools that analyze the data and help military users extract relevant information for mission planning and intelligence gathering. “We’ve gotten really good at building tools that can analyze data and tell you what it was, where it was, when it was. Now the industry is focusing on the ‘why’ and the forward-looking aspect,” he explained.

The NSA effect

That forward lean has some military officials looking to take a page from the U.S. intelligence community’s playbook, including the National Security Agency’s recently revealed electronic surveillance data mining program, PRISM. Central to the agency’s program is the collection of massive volumes of seemingly unrelated and often unstructured data.

“If you take the privacy concerns off the table, and you just look at least what people are saying the NSA is actually doing, it is absolutely the model that the intelligence community at large and the military users at large really need to move to. It’s being able to come up with techniques able to look at a wide variety of data sources and look for the hidden correlations that are just not obvious,” Cooke said.

The challenge for intelligence analysts, however, is that unlike drone video streams or mobile phone signal patterns, open source data doesn’t follow a predictable pattern. “There are patterns there, but they are layered much deeper than our traditional sources of data. It’s having techniques that can dig those extra layers down and find those patterns of communication,” Cooke said.

Exelis’ approach focuses on meshing traditional intelligence gathering techniques like signals intelligence with emerging big data sources to get a clearer picture of what’s happening on the battlefield and beyond. 

NSA techniques, at least those disclosed in media reports, could also be applied to help automate the intelligence gathering process while reducing the manpower needed to obtain the most relevant information, Cooke argued.

“If we can help a commander weed through hours and hours of video to get down to the most critical 30 seconds of video, that’s a huge win,” Cooke said.

Observers stress that the services are anxious to apply these new big data techniques but lack the resources to do so.  “They recognize that they’re missing subtle correlations in the data that could tell them things that they need to know and don’t know today,” Cooke said. 

Low-hanging fruit

That deeper dig into the mountain of data isn’t just useful in combat. More mundane military tasks such as logistics can also benefit from applying big data techniques, said Rich Campbell, chief federal technologist at EMC Corp. The battlefield is the sexy part of the picture, but “then there’s the business part, which drives the majority of change and the majority of savings,” Campbell stressed. The logistics piece is “low-hanging fruit.”

Take, for example, RFID-tagged gear in the field, which has to be scanned and verified during inventory. During that process, there’s a small (10 to 12 percent) chance that an RFID tag will need to be rechecked. Big data analytics could add predictive capabilities that fill in a serial number when a few digits are missing, Campbell explained. That capability could help avoid the need to rescan an entire equipment shipment to find and correct the error.
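The kind of predictive recovery Campbell describes can be simple in principle. The sketch below is purely illustrative (not an actual EMC or DOD system): a partially read serial number, with unreadable digits marked “?”, is matched against the shipment manifest, and any manifest entry consistent with the readable characters is a candidate. A unique candidate means the item can be verified without rescanning the whole shipment.

```python
def candidate_matches(partial: str, manifest: list[str]) -> list[str]:
    """Return manifest serials consistent with every readable character.

    A "?" in the scanned serial means that position could not be read.
    """
    return [
        serial for serial in manifest
        if len(serial) == len(partial)
        and all(p in ("?", s) for p, s in zip(partial, serial))
    ]

# Hypothetical shipment manifest and a scan with one unreadable digit.
manifest = ["A12345", "A12845", "B99031"]
print(candidate_matches("A12?45", manifest))  # ['A12345', 'A12845']
print(candidate_matches("B9?031", manifest))  # ['B99031'] -- unique, verified
```

In practice the matching would run against a logistics database rather than an in-memory list, but the design choice is the same: treat the damaged scan as a constraint and let the existing data resolve it, rather than sending someone back to rescan.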

“Things like that are where analytics can help,” Campbell said.

Commercial ventures are also providing inspiration, with companies like FedEx and Wal-Mart sharing big data strategies aimed at boosting efficiency. “The military is really leveraging the commercial enterprise space to really understand how they’re doing it and why they’re doing it,” Campbell said.

“A lot of our DOD customers today look at the big data and analytics challenges as being this massively complex problem. Honestly, a lot of times they already have the data. It’s just how to really refine it and get something out of it,” he added.

The message from Campbell and other service providers is that applying big data techniques need not be complicated. “There’s a lot of ways to leverage it in a more simplistic fashion,” Campbell argued. “It’s not something to be afraid of.”