Winning “LikeWar”: A Conversation about Social Media and Conflict with Peter Singer
A new book looks at how “likes” and lies are reshaping the nature of war and peace around the globe. We sat down with the author.
Having explored themes of future war in his books Wired for War and Ghost Fleet, Peter W. Singer turns to the online dimension in LikeWar: The Weaponization of Social Media (HMH, co-authored with Emerson Brooking). This conversation has been edited for length and clarity.
Defense One: You note that the world first learned of the raid on Osama bin Laden’s hideout when a Pakistani man tweeted about a helicopter that zoomed overhead. How does that fact of modern life change warfare from an operations perspective? How does it change it from a strategic perspective? And when you think of big peer state-on-state competition — the U.S., Russia, and China — who has the upper hand in that new reality?
Peter Singer: You have a Pakistani café owner who is up late at night and hears the helicopters coming in. And so he does the new natural thing: he goes onto social media and complains, which is something that we all do. But his complaints double as a live battlefield report that can then also be mined subsequently about what was supposed to be a secret mission. It's a good illustration of the fact that you will no longer be able to operate without someone out there watching you and telling the story.
This has not just changed the tactics of what you might get away with and where you can move. We'll just say there have been operations in the Middle East more recently, in places like Syria and Iraq and Libya, that have been outed before they even began by things that were being posted on Facebook. But it's also altered the overall strategy. And that's where you look at something like what ISIS pulled off in terms of the first invasion of Mosul.
Unlike the D-Day invasion or the famous Left Hook in the Persian Gulf War, there was no way for ISIS to keep it secret. And so instead of trying to keep it secret, they leaned into it. They embraced the fact that everyone was watching. They actually wanted everyone to watch. They literally created a hashtag for it, #AllEyesonISIS. Then they tried to drive that hashtag viral through the use of both fanboy accounts and an army of online bots. And the reason they wanted everyone watching is that they believed they could weaponize it and weave it into their physical battlefield operations. It becomes part of the story of how a smaller ISIS force was able to rout a larger defending force, one that, oh, by the way, was backed by the most powerful military in the world, the U.S. military.

Now you get to your question, which was "who is advantaged or not in this space?" Whether they are individuals, organizations, or nations, actors that understand the new rules of the game and are organized to reflect those rules are going to do better than those who are not.
For instance, the lesson in the book of inundation and experimentation: It's not just pushing out one message. It's pushing out literally tens of thousands of them. And then getting feedback from them and cycling through them — versus, you know, the PowerPoint battles that happen back and forth with the State Department or U.S. military looking for that one perfect, single message. The failure to get this is part of why we're getting our butts kicked online.
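To make that feedback loop concrete, here is a minimal Python sketch of the inundation-and-experimentation pattern: push out many message variants, measure the response, keep the winners, and mutate them into the next wave. The post_variant and engagement functions are invented stand-ins for a platform API, not anything from the book or a real service.

```python
import random

def post_variant(text: str) -> str:
    """Hypothetical stand-in: post one message variant, return a post id."""
    return f"post-{abs(hash(text)) % 10_000}"

def engagement(post_id: str) -> float:
    """Hypothetical stand-in: simulated likes/shares earned by a post."""
    return random.random()

def inundate(seeds: list[str], rounds: int = 3, keep: int = 2) -> list[str]:
    """Push out many variants, measure the response, iterate on the winners."""
    pool = list(seeds)
    for _ in range(rounds):
        # Score every variant by the engagement it actually earned.
        ranked = sorted(pool, key=lambda m: engagement(post_variant(m)), reverse=True)
        winners = ranked[:keep]
        # Mutate winners into the next wave, rather than hunting
        # for one perfect, single message up front.
        pool = [w + suffix for w in winners for suffix in ("", "!", " #breaking")]
    return pool[:keep]

print(inundate(["All eyes on the city", "The city has fallen"]))
```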
But you also have a structural issue that’s particularly true when you think about the United States versus, say, a China or a Russia. We do not have the means to control what is said online in a way that Russia or China does. And I think that's okay. I actually don't want a Russian or a Chinese system.
But there have also been efforts by the U.S. military to learn from that and use it in its own operations. So for example, there was a U.S. military contract announcement that basically said, “Provide us with the means for one person to control multiple social media accounts simultaneously.” So we can say, “Oh, we wouldn't do that.” But we're doing this as well.
Defense One: What are the rules of LikeWar and can we adopt them in a way that comports with democratic ideals and standards of operation, or are we always going to be disadvantaged?
Peter Singer: One, as we were just talking about, is that it's transformed the speed, spread, and accessibility of information. And that's changed the very nature of secrecy and how you have to think about it. That's reshaped everything from military operational plans to the nature of the news business to how you run a political campaign.
However, while the truth is more widely available than ever before, Rule No. 2 is that it can be buried underneath a sea of “likes” and lies. And this is one of the hard truths of this space. Any war is a back-and-forth between thinking adversaries, and so more and more actors are learning the value and the lessons of LikeWar. So there's more and more contestation back and forth.
As part of this change, you get the final rule, which is that the rulemakers are a new set of actors. In this realm, which matters so much to war and politics, the laws of war are decided by a handful of tech geeks who never set out to have this role and really aren't all that interested in war and politics. Mark Zuckerberg originally created FaceMash to help fellow Harvard students rate who was hot or not. And now he's making highly political decisions that shape the outcomes of actual battles, terrorist recruiting campaigns, and even elections that literally reshape geopolitics.
Like any battlefield, terrain matters. The twist is that this is a battle space that's human-made and run by for-profit entities. The result is that these [social media] platforms are not designed to reward morality or veracity. They're designed to reward virality. So the online battles, which in turn influence everything from the outcomes of elections to the outcomes of physical battles, are propelled not just by the actions of the fighting online tribes but also by the attention economy and the underlying algorithms of the system.
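As a purely illustrative sketch (real feed-ranking systems are proprietary and far more complex), here is a toy engagement-driven ranker in miniature. The field names and weights are assumptions for illustration; the structural point is that veracity appears nowhere in the score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    shares: int
    comments: int
    likes: int
    age_hours: float
    accurate: bool  # known veracity; note the ranker below never reads it

def feed_score(p: Post) -> float:
    """Toy virality ranking: weighted engagement, decayed by age.

    Veracity appears nowhere in the formula, so a false but
    outrage-driving post outranks a careful correction.
    """
    reactions = 3 * p.shares + 2 * p.comments + p.likes
    return reactions / (1 + p.age_hours) ** 1.5

rumor = Post(shares=900, comments=400, likes=2_000, age_hours=2.0, accurate=False)
correction = Post(shares=40, comments=30, likes=300, age_hours=2.0, accurate=True)
print(feed_score(rumor) > feed_score(correction))  # True: virality wins
```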
Defense One: You bring up the Active Measures Working Group, a U.S. government-led effort in the 1980s to counter Soviet disinformation. What might a modern iteration look like? Who would lead it — the government, Silicon Valley, consumers?
Peter Singer: I think a good model is a hybridization of that old Cold War model with some of the best practices from other nations that have been at the forefront of LikeWar because they were the first to get targeted by it: the nations along Russia's border. We need to look at what the Estonias of the world have implemented, the Swedens of the world, and the like. It involves everything from identification of incoming LikeWar attacks to digital literacy campaigns that enable their citizens to better navigate a world of likes and lies.
This also changes the discussion of election security. We have put much of our energy, whether in the media or in Congress, toward the fear of someone hacking the voting booth, which is a very real danger to be sure, but it's something that has never been done successfully at scale. By comparison, we've done very little about the kind of attack that has happened, which is not hacking the voting booth but hacking the voter and the environment around them.
This also moves the response over to the private sector. A large part of the change is for these companies to realize that their creations have become the nervous system of the modern world. They've become for-profit engines, but they've also become battle spaces. So they've got to understand and treat them that way.
As an illustration, there is the common practice of beta testing: push the product out into the world, see how it goes, learn from the customers, and then rework it. It makes a lot of sense in consumer tech. Don't worry that the product hasn't been fully tested; the whole point is to push it out there, see how people react to it, and then refine it. That was fine when it was a food-rating app. It's not fine when it's, again, the nervous system of the modern world, politics, war, etc.
The result is that these companies are continually surprised by things that they shouldn't be surprised by. For example, they pushed out live video streaming and never thought that terrorists might broadcast their own attacks, never thought that teenagers might live-broadcast their own suicides. Well, actually, you should've. And that's where they can pull lessons from, actually, the military world. Wargame it first. Figure out how bad guys might use the technology, how customers and good guys might inadvertently misuse it.
Instead, there's this kind of repeated pattern: the companies create a technology that then causes all of these new ripple effects on war and politics, which creates new problems for them, which they then think they can solve with a whole new technology. Rinse, repeat, start all over again. That's what's happening right now with AI. They believe AI can be the solution to all of these problems. But of course, while AI can screen for certain language, extremists have discovered that if you insert intentional typos, the AI might not find it. Eventually, that turns into the threat actors using AI themselves.
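As a minimal sketch of that cat-and-mouse dynamic, the Python below shows how an exact-match keyword filter is defeated by a single intentional typo and how a simple normalization pass catches that one trick. The banned-word list and substitution table are invented for illustration, not drawn from the book or any real moderation system; the adversary's next move is simply a substitution the table doesn't cover.

```python
import re

BANNED = {"attack", "bomb"}

# Common look-alike character substitutions; the specific mapping here
# is an illustrative assumption, not an exhaustive or real-world table.
LEET = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "@": "a", "$": "s"})

def naive_flag(text: str) -> bool:
    """Exact keyword match: defeated by a single intentional typo."""
    words = re.findall(r"\w+", text.lower())
    return any(w in BANNED for w in words)

def normalized_flag(text: str) -> bool:
    """Normalize look-alike characters first, then match."""
    cleaned = text.lower().translate(LEET)
    words = re.findall(r"\w+", cleaned)
    return any(w in BANNED for w in words)

print(naive_flag("plan the att4ck"))       # False: the typo evades the filter
print(normalized_flag("plan the att4ck"))  # True: normalization catches this one
```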
Then we get to you and me. There are all sorts of things that we need to do in this space as individuals. I think there's an appropriate parallel to cybersecurity, which involves everything from governmental reorganization to private-sector best practices to your responsibilities on your own accounts. It's the same thing on the LikeWar side of things. It's like the line from the movie Rounders: if you sit down at the poker table and you don't know who the mark is, you're the mark. That's the case, I think, in a lot of this right now.
Defense One: You spend a good deal of time talking about the Arab Spring, which saw a beautiful debut in Tunisia; a stunning, very surprising success in Egypt; then a slog in Libya grinding toward civil war; and an absolute disaster in Syria. What was our sin in the Arab Spring, in the West and in the U.S. government? Do you think that online popular democratic movements still have a future?
Peter Singer: The sin was one of arrogance and ignorance, especially about the very nature of politics and war: that there's always a back-and-forth between two thinking adversaries. This is the Clausewitzian side of it. Yes, you saw groups use the power of social media to bring attention to a cause and mobilize larger numbers than ever possible before, which led to mass movements. However, in turn, other governments looked at that and said, "Huh, that's not gonna happen to me. I'm gonna learn how to control it, manipulate it, steer it to my own ends."
You're now seeing people try to move against that, even within China. Of course, you've got this series of control mechanisms, but you also have these attempts to work around them, right? What words are banned? I'll use coy language to steer around it. I'll use imagery to steer around the filters.
What's fascinating is that you can see that very same back-and-forth in the conversation, say in the United States, over the companies themselves trying to police their networks for extremism. Think about neo-Nazis, the alt-right, white supremacists, and the like; they are using the same strategy of terms and imagery with double meanings that Chinese activists use to try to get around certain controls.
Defense One: You follow Michael Flynn, Donald Trump's former National Security Advisor, whose story in many ways illustrates the personal dangers of life in the next era. An innovator in the military's use of open-source intelligence, Flynn eventually became a proponent of conspiracy theories and a person convicted of lying. What's the final takeaway of Flynn as a cautionary tale?
Peter Singer: I think the way we end the book is the answer. It is up to each of us, as individuals, organizations, and nations, to decide either to make this worse or to make this better. What's so special, so different about social media is that we each now have a choice. We can decide what we share, and through what we share, we reveal who we truly are.