What Your Facebook Posts Mean to US Special Operations Forces
Social media as an intelligence asset is of growing value to special operations forces, but there are legal issues and controversy surrounding its use.
It was in the 1832 book “On War” that Prussian military scholar Carl von Clausewitz gave birth to the term “fog of war,” writing that “war is the realm of uncertainty; three quarters of the factors on which action in war is based are wrapped in a fog of greater or lesser uncertainty. A sensitive and discriminating judgment is called for; a skilled intelligence to scent out the truth.”
United States Special Operations Command, SOCOM, is trying to dispel some of that fog, moving forward with the development of advanced data mining tools that, if revealed, could make some of the capabilities outlined in documents disclosed by NSA leaker Edward Snowden look quaint. The utility of social media data is quickly moving beyond simple investigations and directly onto the battlefield, to that critical moment when a soldier decides whether or not to pull the trigger. According to some military thought leaders, it’s law and policy that isn’t keeping up.
Representatives from elite fighting squads, sometimes broadly referred to as special operations units, tasked with fighting America’s most dangerous, and often most secret, battles say that they need better information, including from social networks, to execute missions that take place across half the globe. That idea may be controversial, and, in fact, many of the tools being developed may never be legal to use. Regardless, according to one of the Defense Department’s top lawyers, “Legal uncertainty should not be a barrier to us developing a tool” for use by special operations fighters.
Todd Huntley, the head of the National Security Law Department of the Office of the Judge Advocate General, speaking at a special operations event in Washington, D.C., this week, said that the U.S. should continue to build possibly illegal data mining tools rather than relinquish capabilities.
“We should be very cautious in setting precedent that could limit the development of this technology,” he said, adding that if the military waits for the courts before building next-generation intelligence capabilities, “it will take too long.” (He did not say we should actually use them outside the law.)
The Defense Department policy that governs the way it collects information on foreign persons, whether for use in combat or just as part of investigations, is known as DoD 5240.1-R. It was originally drafted in 1982. Huntley says that’s one reason policy can’t keep up with technology or with the battlefield challenges. “If we can’t even determine who is and who is not a U.S. person, how do we determine how to use existing policies?”
In a wide-ranging discussion, various special operations thought leaders and key figures spoke to the need for much better situational awareness. That term used to mean understanding the location of enemies, what arms they might be carrying, and the like. Increasingly, it means instantaneous data from social networks like Twitter and Facebook to identify the target in the sniper scope, and who might be connected to him or her.
Stuart Bradin, a retired Army colonel who worked for SOCOM, put it this way: “It would be great if we could use social media to positively ID (PID) someone. Accuracy matters. So social media tools that can help would be a great capability.”
In highlighting the most pressing problems that special operations forces face, Anthony Davis, the director of science and technology for SOCOM, pointed to the following: enabling small teams through new cutting-edge gear like the TALOS (also known as the Iron Man suit); developing capabilities to conduct special operations in places like Africa, where communication infrastructure is absent; and better support and tools for non-kinetic operators, which primarily means assisting with humanitarian missions but can include gathering intelligence for operational use.
In a previous presentation identifying future needs, Davis highlighted data mining and behavior modeling as key to special operations’ future.
From that need, new tools are rising. Companies like Snaptrends can immediately connect every Tweet or Facebook post to a specific location. One satellite image analysis company can, reportedly, link any social media post to a point on an incredibly high-resolution map.
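The geospatial part of that linking is, mechanically, fairly simple. As a rough illustration only, and not any vendor’s actual code, the Python sketch below converts a hypothetical geotagged post’s latitude and longitude into pixel coordinates on a Web Mercator map, the projection most high-resolution web maps use; the coordinates and zoom level are invented.

```python
import math

def latlon_to_pixel(lat, lon, zoom, tile_size=256):
    """Project WGS84 coordinates from a geotagged post onto Web Mercator
    pixel coordinates at a given zoom level (standard slippy-map math)."""
    scale = tile_size * (2 ** zoom)
    x = (lon + 180.0) / 360.0 * scale
    lat_rad = math.radians(lat)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * scale
    return int(x), int(y)

# Hypothetical geotagged post near Times Square, viewed at zoom level 18
print(latlon_to_pixel(40.7580, -73.9855, zoom=18))
```

The hard part, of course, is not the projection math but acquiring, cleaning, and fusing the posts themselves at scale.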
But those data mining capabilities are still limited as special operations tools, and SOCOM has been looking to build beyond them. In May, the command announced its intent to build a new data-mining tool capable of crawling data from “pre-determined web sites” and able to “support geospatial, temporal, relationship, textual, and multi-media visualization and visual analytics” in service of “situational awareness in a constrained environment,” a program called Automated Visual Application For Tailored Analytical Reporting, or AVATAR.
As Paul McLeary writes for Defense News, the program would “perform link analysis and correlate that information with intelligence that has already been provided by the big U.S. intelligence agencies.” That means the FBI, the NSA and virtually any agency that has useful data. That interoperability in the form of a single platform sounds a lot like many of the products developed by Palantir to present and display data across law enforcement agencies to a variety of users. But the AVATAR program has several critical differences. Most importantly, it would query across government databases and the open web to deliver information to a very specific end user: a special operations fighter who may be using that information in battle.
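To see what “link analysis” looks like in practice, here is a minimal sketch using the open-source networkx library. It builds a graph from hypothetical account-to-account interactions, ranks accounts by centrality, and flags anything within two hops of a notional watchlist entry; the account names and watchlist are invented, and this illustrates the general technique rather than AVATAR itself.

```python
import networkx as nx

# Hypothetical inputs: interactions observed in public posts, plus a
# notional watchlist supplied from some other agency's database.
interactions = [
    ("user_a", "user_b"), ("user_b", "user_c"),
    ("user_c", "user_d"), ("user_a", "user_c"),
]
watchlist = {"user_d"}

G = nx.Graph()
G.add_edges_from(interactions)

# Basic link analysis: score accounts by betweenness centrality and flag
# anyone within two hops of a watchlisted identity.
centrality = nx.betweenness_centrality(G)
flagged = {
    n
    for w in watchlist if w in G
    for n in nx.single_source_shortest_path_length(G, w, cutoff=2)
}

for node in sorted(G.nodes, key=centrality.get, reverse=True):
    print(node, round(centrality[node], 3), "FLAGGED" if node in flagged else "")
```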
A Short History of Special Operations Forces Social Network Mining
It isn’t the first time that SOCOM has looked into mining social media data for use in operations. A 2012 project called Quantum Leap sought to show that open source data, and particularly social media data, could be made useful to active military operations.
The biggest technological outcome of the program was a plug-in piece of software called “Social Bubble,” designed by a Santa Rosa company called Creative Radicals. The Quantum Leap report authors describe Social Bubble as “a tool which summons data via the Twitter API to display Twitter users, their geographic location, posted Tweets and related metadata.”
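That description maps onto a fairly ordinary API workflow. The sketch below, which is illustrative and not a reconstruction of Social Bubble, queries Twitter’s published v2 recent-search endpoint for tweets carrying geo metadata; the search term, the environment-variable token, and the output format are assumptions, and the has:geo operator may require elevated API access.

```python
import os
import requests

# Assumed: a Twitter API v2 bearer token available in the environment.
BEARER_TOKEN = os.environ["TWITTER_BEARER_TOKEN"]

resp = requests.get(
    "https://api.twitter.com/2/tweets/search/recent",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    params={
        "query": "checkpoint has:geo",          # hypothetical search term
        "tweet.fields": "created_at,author_id,geo",
        "expansions": "geo.place_id",
        "place.fields": "full_name,geo",
    },
    timeout=10,
)
resp.raise_for_status()
payload = resp.json()

# Join each tweet to the place object returned in the "includes" block.
places = {p["id"]: p for p in payload.get("includes", {}).get("places", [])}
for tweet in payload.get("data", []):
    place_id = (tweet.get("geo") or {}).get("place_id")
    place = places.get(place_id, {}).get("full_name", "unknown location")
    print(tweet["author_id"], place, tweet["text"][:60])
```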
According to the authors of the document, the experiment was a success not just in identifying individuals who were actively tweeting and posting but also, far more importantly for the military, in identifying individuals who happened to be connected to them but who didn’t have a social media profile.
“Overall the experiment was successful in identifying strategies and techniques for exploiting open sources of information, particularly social media. Major lessons learned were the pronounced utility of social media in exploiting human networks, including networks in which individual members actively seek to limit their exposure to the Internet and social media.” [Emphasis added.] That’s key to developing an ability to deal with an enemy like the Islamic State, where every tweeting sympathizer could be connected to a target who would prefer to stay off the radar.
The end goal of much of this activity is something referred to as “human entity resolution.” In the simplest terms, that means figuring out not just the identity of the person visible in the sniper scope but also the identities of the people connected to him or her.
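One simple way to approach that problem, sketched below with invented data, is to count which accounts repeatedly show up around the same unnamed person across separate observations: accounts that co-occur most often become candidate members of that person’s network even if the person has no profile at all. The records and account names are hypothetical, and this is an illustration of the general idea, not any program’s actual method.

```python
from collections import Counter
from itertools import combinations

# Hypothetical observations: each record lists the accounts that mentioned,
# photographed, or checked in alongside an unnamed person of interest
# during a separate sighting.
sightings = [
    {"acct_1", "acct_2", "acct_3"},
    {"acct_2", "acct_3", "acct_4"},
    {"acct_2", "acct_3"},
]

# Crude entity resolution: accounts that keep appearing around the same
# target, and that keep appearing together, are scored as the target's
# most likely close connections.
account_scores = Counter()
pair_counts = Counter()
for group in sightings:
    account_scores.update(group)
    pair_counts.update(combinations(sorted(group), 2))

print("Most frequently present accounts:", account_scores.most_common(3))
print("Strongest pairwise ties:", pair_counts.most_common(3))
```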
Special operations fighters say that information could be critical during an operation. But how much of it can now be obtained quickly and legally? That’s become something of a murky issue. The 1982 document aside, not long ago, it was thought to be well settled that law enforcement and the military could use technology to collect information that would otherwise be public (such as your location in a car) and could use data that you gave to third parties like telephone companies. Huntley called those assumptions the basis for a lot of intelligence operations.
“Both of those assumptions have been called into doubt by recent Supreme Court decisions,” he said.
The Enemy Is Data Mining, Too
The ability to use social network data operationally is no longer unique to the U.S military. It also represents a growing vulnerability for people in uniform. Mathew Freedman, CEO of the firm Global Impact and a longtime Defense Department advisor, noted, “The digital exhaust issue becomes much more critical…when an airline knows everything you look at on Amazon...through data mining blogs and tweets that you are going to attend future NDIA events.” The bottom line for Freedman: “It will be harder for anyone to be clandestine.”
The military is currently testing new encrypted communications devices in Honduras that function like smartphones (see also how special operations forces pioneered use of the so-called Blackphone). But encryption alone can’t solve every potential digital exhaust problem.
Consider the recent hack targeting the Central Command’s Twitter and YouTube accounts, which occurred because a Defense Department official did not enable two-factor authentication. The department on Wednesday put out a special instruction document urging employees to take common-sense security precautions. The sheer volume of data we create suggests that invisibility is impossible, both for our enemies and for us. The human race is expected to produce 40 zettabytes of data a year by 2020, up from 4 zettabytes in 2013. “This is the technological context for every future special operations action,” said moderator Klon Kitchen, a special advisor for cyberterrorism and social media at the National Counterterrorism Center.
Because the work of special operations units is so valuable and so very dangerous, special ops fighters occupy a position of some privilege in the military. Republicans, Democrats and politicians of every stripe love the idea of small teams of highly talented super warriors doing what once required, quite literally, an army. And the American people love stories of extreme heroism, hence a seemingly unquenchable appetite for SEAL Team Six-type media.
But there’s a danger in relying on small teams to do too much, an intellectual trap to which two of the nation’s most controversial defense secretaries, Robert McNamara and Donald Rumsfeld, fell victim. It may be a pattern that we are repeating.
As McLeary notes, the 2012 White House National Security Strategy “places a premium on the use of special operations forces to operate — quietly — with allies on train and assist missions while continuing their counterterror mission wherever Washington deems fit.”
Washington will continue to see fit to send special operations fighters to do a lot more in the coming years. That could include training, equipping, or helping fighters in places like Iraq, Pakistan or Syria. At some point, those fighters may ask, more publicly, for the ability to use controversial intelligence tools to accomplish those missions.
We may not have an answer for them.