This photo illustration shows the profile page of Elon Musk, whose purchase of Twitter has been followed by a rise in misinformation spreading on the social-media platform now called X. Matt Cardy/Getty Images

How the internet is making America more susceptible to rumors and lies

Online trends, combined with basic psychology, are increasingly helping to spread disinformation and fake news.

New trends in online interaction are making Americans more susceptible to the rumors and lies that undermine democracies and their militaries.

That’s the common thread through three recent studies of online behavior. One found that people’s views of reality are increasingly shaped by the online groups they identify with. A second found that views of society are shaped by the extreme voices that dominate social media. And a third found that groups that use encrypted messaging apps are becoming more insular. Their findings hold clues to the future of authoritarian currents and extremist political violence in the United States.

The first study, by Matthew Facciani of the University of Notre Dame and Cecilie Steenbuch-Traberg of the University of Cambridge, found that people with politically homogeneous personal networks (that is, people who largely interact with others they believe to be politically like them) are more likely to believe and share fake political news and rumors that reflect well on their own group or poorly on the opposing group.

The study analyzed data from 214 participants, evenly split between Democrats and Republicans. It looked at two factors: network homogeneity and what's sometimes called "cognitive reflection," a measure of whether people pause to reason through a problem or go with their quick, intuitive response. It measured those factors against the likelihood of the participants spreading two types of misinformation: political rumors and fake news headlines. Results showed that network homogeneity significantly increased belief in and sharing of fake news that matched the participants' group identity.
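As a rough illustration of how a homogeneity measure like this can be computed (the paper's exact operationalization may differ; the function name and data below are hypothetical), here is a minimal Python sketch:

```python
# Hypothetical sketch of a network-homogeneity score: the share of a
# respondent's reported contacts who share their party. The study's exact
# operationalization may differ; names and data here are illustrative.

def network_homogeneity(own_party: str, contacts: list[str]) -> float:
    """Fraction of contacts who share the respondent's party."""
    if not contacts:
        return 0.0
    return sum(c == own_party for c in contacts) / len(contacts)

# A Democrat whose reported network is four Democrats and one Republican:
print(network_homogeneity("D", ["D", "D", "D", "D", "R"]))  # 0.8
```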

“Both Democrats and Republicans with a politically homogeneous network will be more likely to rate politically congruent rumors as true,” the authors wrote, though Republicans were more likely to believe politically congruent fake headlines. 

Another study, appearing in the December issue of Current Opinion in Psychology, indicates that the polarization of the American public is only going to deepen. The authors found that social media distorts perceptions of social norms in what they dub “a funhouse mirror effect.” Political discourse online is largely dominated by extreme ideological voices, creating a false perception, even among normal, level-headed people, that these views are more widely held than is the case.

Thanks to social-media platforms and their engagement-seeking algorithms, a small fraction of users can have outsized effects. For example, just 3% of active accounts on Twitter produce one-third of the content, and 0.1% of users share 80% of fake news.
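To see how stark that concentration is, here is some back-of-the-envelope arithmetic in Python (a minimal sketch that takes the 0.1%/80% split at face value; real sharing distributions are messier):

```python
# Back-of-the-envelope arithmetic on the concentration figures cited above.
# Assumes the 0.1%/80% split is exact; real distributions are messier.

top_share, top_frac = 0.80, 0.001      # 0.1% of users share 80% of fake news
rest_share, rest_frac = 0.20, 0.999    # everyone else shares the remaining 20%

per_capita_top = top_share / top_frac     # 800x the population-average rate
per_capita_rest = rest_share / rest_frac  # ~0.2x the population-average rate

print(f"top 0.1% per-capita rate: {per_capita_top:.0f}x average")
print(f"other 99.9% per-capita:   {per_capita_rest:.2f}x average")
print(f"ratio: roughly {per_capita_top / per_capita_rest:,.0f} to 1")
```

Under those assumptions, one of the hyperactive sharers posts roughly 4,000 times as much fake news as a typical user, which is how so few accounts can dominate what everyone else sees.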

“Not only does this extreme minority stir discontent, spread misinformation, and spark outrage online, they also bias the meta-perceptions of most users who passively ‘lurk’ online,” write authors Claire Robertson, Kareena del Rosario, and Jay Van Bavel. 

Why does that happen? People naturally take mental shortcuts to make judgments, and one such shortcut is called ensemble coding. It's sort of like prejudice but a bit more nuanced. When you are thinking about what a particular group believes, you rely on the voices of its most prominent members and extrapolate that those voices are a good representation of the whole crowd.

“Ensemble coding is cognitively efficient, allowing people to encode a single representation of a set of stimuli, rather than encoding and memorizing every item,” they write. “Socially, ensemble coding allows people to form a single estimation of group emotion or opinion, rather than individually encoding each person’s reaction.”

That natural neurological tendency to assume that the loudest members speak for a group works fine when those voices are actually representative. It's why advocacy groups and political parties have leaders to push their causes to the broader population. But online, the study shows, the most prominent voices in various groups are not representative. They're just the loudest and angriest.
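A toy simulation makes the distortion concrete. In this hypothetical Python sketch (the opinion distribution, posting rule, and parameters are all invented for illustration, not drawn from the study), a mostly moderate population posts in proportion to its extremity, and an observer who ensemble-codes over the visible posts overestimates how extreme the group is:

```python
# A toy "funhouse mirror": everyone holds an opinion on a -1..1 scale, but
# the probability of posting rises sharply with extremity. A lurker doing
# ensemble coding over visible posts overestimates how extreme the group is.
# Parameters here are invented for illustration.

import math
import random

random.seed(0)
population = [random.gauss(0, 0.3) for _ in range(100_000)]  # mostly moderate

# Chance of posting grows exponentially with how extreme the opinion is.
posts = [op for op in population
         if random.random() < 0.01 * math.exp(4 * abs(op))]

actual = sum(abs(op) for op in population) / len(population)
perceived = sum(abs(op) for op in posts) / len(posts)

print(f"actual average extremity:    {actual:.2f}")
print(f"perceived (from posts only): {perceived:.2f}")  # noticeably higher
```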

“The most widely shared content on Twitter/X and Facebook is moralized content, such as expressions of outrage and hostility towards political outgroups,” the authors observe. 

That means that the more time people spend online, the more they're performing the normal mental task of ensemble coding on distorted data. "This may be especially problematic for topics like politics, where opinions are invisible and people are generally hesitant to share their opinions with others in everyday life," the authors write.

That’s a big issue for the future of democracy in general. Other research finds that when more people come to feel hostile toward other groups in their country, and become convinced that those groups hold very extreme views, the result can be “support for authoritarian regimes.”

Why does that matter to the military and to national security? Consider those findings in the context of a third study, which shows that small, insular groups increasingly congregate on encrypted messaging apps such as Telegram, WhatsApp, and Viber. Published on Thursday, the study shows that such group chats are exposing more people to propaganda and even recruitment into violent organizations, and doing so in ways that are increasingly invisible to the outside world.

Scholars at the NYU Stern Center for Business and Human Rights and the Center for Media Engagement at the University of Texas at Austin surveyed 4,500 encrypted-messaging-app users in nine countries. Those users included “propagandists,” which the researchers defined as “individuals or groups working to leverage media and communication in purposeful efforts to manipulate public opinion, particularly during elections and other events of civic significance.”

Apps like Telegram are playing a growing role in conflict. Violent extremists have used Telegram to plot attacks against Western targets, and members of the Russian military use it to communicate in the absence of more secure tools.

Such apps have important value for privacy, freedom of expression, and communication with the free press. And the scholars are adamant that policy restrictions on such apps should not aim to break encryption. But the way some individuals can use the “broadcast” feature to convene and then directly message self-selected groups is ripe for abuse.

“Our survey of messaging app users confirms that propagandists are able to reach non-consenting recipients despite existing safeguards. Among users who reported receiving political content via messaging apps in the last year, 55% said some of that content came from people or accounts they did not recognize,” they write. 

Content moderation on encrypted platforms is almost non-existent, even compared to the decreasing content moderation on other social media platforms. 

“The average removal rate of terrorist content is lower on messaging apps than on almost all other platform types, including social media, video-sharing sites, and file-sharing platforms,” they find, citing other published research on the topic.

These trends still may not seem directly relevant to military leaders—especially since, by law and job function, there is little officials can do to alter them. But when Col. Michael Kelley of the U.S. Army Reserve looked at Russian disinformation around the war in Ukraine, he concluded that U.S. defense leaders have “failed to assess [the] impact or sufficiently negate” foreign disinformation efforts.

In a May article published by the Army War College, Kelley suggests that greater media literacy, along with much more timely and accurate releases of information, could be an important tool for countering those trends, at least for the military.

“This proposal may raise the ire of those who believe the US military’s only purpose is to fight and win kinetic wars,” he writes. “The Joint Force is involved in many other activities to advance America’s strategic interests. An information literacy campaign can be seen as battlefield preparation. Correctly done, this campaign will better prepare allies, partners, and others to accept truthful Western narratives and reject disinformation disseminated by adversaries.”