Weapons of mass deception
Trolls, conspiracy theorists, hoaxers and Trump in the disinformation era
It may be getting harder and harder to figure out the truth, but at least this much is clear: It’s a good time to be a liar.
We’ve spent three years arguing over whether fake news swung the 2016 election—whether hordes of Russian bots, hoax Facebook pages and inflammatory, dishonest tweets tipped the democratic balance to elect Donald Trump as president.
Yet in those same years, we’ve learned that the stakes in the fight over truth, in the muddy world of social media platforms, go beyond politics.
In Brazil, public health workers were attacked after far-right activists lied on YouTube that they were spreading the Zika virus. Gunmen radicalized by false white-supremacist conspiracies on internet forums like 4chan and 8chan shot up a synagogue in California, a Walmart in Texas and mosques in New Zealand.
Elections have consequences. So do algorithms.
So now, heading into the 2020 election, experts are warning that trolls, hoaxers and dishonest politicians are arming themselves with a whole new arsenal of weapons of mass deception. New technology is making it easier to hoax audio and video, while advances in artificial intelligence are making it all the more difficult to weed out computer-automated “bot” accounts.
And there’s a deeper risk, beyond determining whether any one article is accurate.
The deluge of misinformation—full of Trump tweets, deepfakes, InfoWars videos, Russian bots, 4chan trolls, that Washington Post correction, those out-of-context memes and your great aunt’s latest questionable Facebook post—has become so overwhelming that some of us may simply give up trying to make sense of it all.
A lie doesn’t need to be believed. It just needs to create enough doubt that the truth becomes polluted. With enough pollution, it’s impossible to see what’s right in front of you.
“When you’re flooded with so much bullshit,” New York Times media columnist Charlie Warzel says, separating fact from fiction becomes so difficult that “the task of trying to do it becomes, you know, tiresome, so you just stop.”
It’s the sort of thing your college philosophy professor might call an “epistemic crisis.” We don’t know what to believe. Truth is hazy. Reality itself becomes irrelevant. It’s a phenomenon that has already happened in places like Russia and the Philippines—and experts say that in the past few years, the United States has suddenly found itself on the same path.
“And that, to me, is one of the scariest things to think about,” Warzel says.
The web of conspiracy
History has a pattern.
An advance in communications technology hands liars the means to lie louder and spread those lies further. Look at the 1830s, when the steam-powered printing press and cheaper paper-making technologies fueled the rise of the “penny press.” Newspapers became cheaper, more independent, more widespread and more competitive, and eager publishers discovered the power of the 19th-century version of clickbait. The New York-based Sun put out a series of entirely fictional stories claiming that “man-bats” and other exotic creatures were scurrying around on the moon.
Soviet-born British TV producer Peter Pomerantsev, author of This Is Not Propaganda: Adventures in the War Against Reality, argues that when tech rips open the floodgates of communication, the bad guys always find a way to exploit it. Dictators quickly harnessed the power of radio. Joseph McCarthy, as a U.S. senator in the 1950s, used television to spread his anti-Communist conspiracy theories.
Yet for decades, the internet was heralded as a new frontier that allowed “citizen journalists” to take on the stodgy media elite. In 1998, the Drudge Report, a right-wing news-aggregating website, broke the Monica Lewinsky scandal when Newsweek got cold feet. In 2004, when Dan Rather and 60 Minutes put out a 1973 memo purporting to show that President George W. Bush had received special treatment while in the Texas Air National Guard, Drudge elevated the conservative bloggers who persuasively argued the memo was a fake written in Microsoft Word.
An “Army of Davids”—as some bloggers dubbed themselves—swarmed to debunk flawed media accounts, trying to counter bias wherever they saw it. The gatekeepers were being overthrown; the drawbridge had been flung open.
But the villagers had their own standards for newsworthiness. Drudge also sent his readers to darker corners, where sketchy websites claimed Barack Obama wasn’t an American citizen and Bill Clinton had a secret love child.
Conspiracy theorists used to spread their gospel through books, newsletters, public access television shows, and by standing on street corners and handing out fliers. But the web gave every community a niche—no matter how fringe—and allowed them to spread their message in only a few keystrokes.
The internet, Warzel says, handed fringe figures like Alex Jones of InfoWars a powerful new megaphone.
“He was one of the early pioneers of internet radio and video,” Warzel says. “It was a way to get around the notion that it was hard to sell advertising around some of his kooky ideas.”
An audience of millions repeatedly tuned into Jones’ red-faced rants about 9/11 being an inside job, government chemtrails, chemicals turning frogs gay, and the Sandy Hook shootings being faked. Drudge repeatedly linked to him.
Social media sites only accelerated the spread of misinformation. It’s easier than ever for a single comment, particularly an untrue one, to go viral. In ancient times, the opinions of quacks were largely quarantined to a newspaper’s page of letters to the editor.
New York Times reporter Maggie Haberman and racist Twitter randos with names like “@WhiteGenocideTM” are all simmering in the same stew together. Both, after all, get retweeted by the president.
A Massachusetts Institute of Technology study published last year took a look at over a decade of Twitter posts and found that tweets about false news went viral six times faster than tweets about true news. After all, lies are often more sensational, tapping into human emotions of shock, fear and disgust.
It wasn’t just that humans were more likely to share these kinds of stories. It was that Facebook, Twitter and YouTube developed algorithms to elevate certain types of content into your social media feed. It usually didn’t matter if they were true—social media sites didn’t want to become the truth police. It mattered that the stories drew people in.
“The way they keep people clicking and sharing and commenting is prioritizing things that get your heart pumping,” says Andrew Marantz, author of Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation. “It’s like stocking a huge grocery store, but all of the visible aisles are Oreos and rat poison.”
For a time, YouTube’s own algorithm actively rewarded conspiracy theories over popular content. YouTube used to have what the company internally called the “Gangnam Style” problem, where its autoplaying recommendation engine would eventually send every viewer to the 2012 South Korean pop hit. In response, YouTube changed its algorithm in 2015, turning down the recommendation dial for merely popular videos and cranking up the preference for videos that led people down rabbit holes. Conspiracy-theory videos flourished.
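To make the incentive concrete, here is a minimal, purely hypothetical sketch of an engagement-first ranker in Python. The feature names and weights are invented for illustration; the real systems are proprietary and vastly more complex, but the basic trade-off is the same: the score rewards predicted watch time and engagement, and nothing in it asks whether a video is true.

```python
# Hypothetical sketch of an engagement-first recommendation score.
# Feature names and weights are illustrative, not any platform's real system.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # how long the model expects a viewer to stay
    predicted_engagement: float     # likes, comments, shares, normalized 0..1
    popularity: float               # raw view count, normalized 0..1

def recommendation_score(v: Video) -> float:
    # Weighting watch time and engagement far above raw popularity is what
    # pushes rabbit-hole content ahead of merely popular clips. Nothing in
    # this score asks whether the video is accurate.
    return 3.0 * v.predicted_watch_minutes + 2.0 * v.predicted_engagement + 0.5 * v.popularity

candidates = [
    Video("Catchy pop hit", predicted_watch_minutes=3.5,
          predicted_engagement=0.4, popularity=1.0),
    Video("40-minute 'what they aren't telling you' video",
          predicted_watch_minutes=22.0, predicted_engagement=0.8, popularity=0.1),
]

# The long conspiracy-style video wins the recommendation slot.
print(max(candidates, key=recommendation_score).title)
```

In this toy example, the obscure 40-minute conspiracy-style video beats the globally popular clip for the next autoplay slot, which is exactly the dynamic critics say the 2015 change amplified.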
Simultaneously, the internet had handed brand-new weapons to pranksters, vandals and assholes—“trolls” who could use misinformation and harassment to make life hellish for chosen targets. Image boards like 4chan combined anonymity and a near-total absence of moderation to become a frothing hive of racists, trolls and trolls pretending to be racists.
The boards delighted in pulling hoaxes—creating fake Jewish Twitter accounts to sow discord in the Jewish community, publishing coupons claiming black people were getting free coffee at Starbucks, and attempting to trick journalists into identifying mass shooters as the wrong person.
Sometimes the hoaxes became reality. A 4chan scheme to trick mainstream media outlets into reporting that the “OK” hand gesture was a white-supremacist sign resulted in white supremacists actually adopting the signal.
Marantz says he spent three years embedded in this world.
“There are people who just want to watch the world burn,” Marantz says. “And that’s a phrase I returned to again and again.”
The motivations vary. In Macedonia, Warzel says, there are clickfarms filled with teenagers pumping out hoax news stories for fake publications, buying Facebook likes, all as a way to make money.
“It’s essentially just like a lemonade stand for them,” he says. But there are also foreign governments trying to influence global trends, politicians trying to game power, and true believers who spread falsehoods because they think it’s the truth.
“To some degree, it doesn’t matter as long as there’s power to be gained and money to be made,” Warzel says.
Art of the lie
Politicians are known to lie. It’s what they do. Presidents lie, whether about WMD or keeping your health care or not having sexual relations with this or that woman. But there used to be limits.
“There were unwritten rules or norms about spin,” said Tim Miller, Jeb Bush’s former campaign spokesman. “You exaggerated for your candidate. You used hyperbole. You tried to muddy the waters.”
But there were unspoken, unwritten lines, implicit walls that mainstream candidates didn’t try to breach. Then came Trump.
Misinformation comes in a hundred ways from a hundred different sources. And yet Trump is somehow all of them.
Trump is America’s top troll. Other primary contenders nitpicked Sen. Ted Cruz’s policy record—Trump called him “Lyin’ Ted,” insulted his wife’s appearance and suggested Cruz’s dad helped assassinate JFK.
The president is America’s chief conspiracy theorist, building his political brand on the lie that Barack Obama wasn’t born in the United States. Everyone expected Trump to claim the election was rigged if he lost—but Trump one-upped the cynics. He claimed the election was rigged even when he won, falsely charging that millions of illegal votes had been cast.
Trump is America’s preeminent liar. At the Toronto Star, fact-checker Daniel Dale tallied over 5,200 false statements from the president since his inauguration, dealing with everything from tariff policy to payoffs to a porn star.
“I was flabbergasted by the frequency and the triviality of many of them,” Dale told the Los Angeles Times. “Trump was simply making things up about everything, for no apparent reason, about the smallest things.”
And now, as House Democrats pursue an impeachment inquiry into whether Trump inappropriately pressured Ukraine to investigate political rival Joe Biden, America gets to watch, once again, how powerful Trump’s misinformation machine is.
A September Monmouth University poll shows that only 40 percent of Republicans believe that Trump mentioned an investigation into Joe Biden during his call with the Ukrainian president, ignoring both the rough transcript of the call and Trump’s own words. Indeed, in March, a Quinnipiac University poll found that two-thirds of Republicans believe Trump is honest.
Some of that’s simple partisan psychology. Whether you voted for Trump because of immigration, judges, abortion or tax rates, your mind needs to continually justify your vote. You wouldn’t vote for a liar. You voted for Trump, so Trump must not be a liar. Besides, have you seen those wild claims the Democrats are making?
“At this stage it’s less about defending Trump,” writes columnist Peter Wehner in The New York Times, quoting a conservative psychologist friend. “They are defending their own defense of Trump.”
It’s the same principle that drove feminists to attempt to justify President Bill Clinton’s sexual escapades.
Trump has a legion of staffers and supporters willing to lie for him. He ordered his press secretary to lie about his inauguration crowd size. He pressed the National Oceanic and Atmospheric Administration to defend his inaccurate statements about Hurricane Dorian.
“I have no obligation to be honest to the media,” former Trump aide Corey Lewandowski told Congress in September, speaking under oath. “Because they’re just as dishonest as anybody else.”
Today, Trump has a loyal media apparatus willing to run interference for his falsehoods. The moment a negative story about him goes up, Fox News, the Federalist and a horde of Trump Twitter acolytes fire back with a mix of spin, falsehoods and irrelevancies.
The speed of that response, Miller says, makes it impossible for the truth to get a foothold.
“People are hearing the alternate story at the same time they are hearing the story,” Miller says.
So on the right, Trump’s Ukrainian scandal quickly became a story about the motivations of the anonymous whistleblower, the dishonesty of House Intelligence Committee Chair Adam Schiff, and the purported corruption of Biden and his son.
The Robert Mueller-led special counsel’s investigation into whether Trump’s team colluded with Russia to interfere in the 2016 election turned into a story about the malfeasance of the “deep state,” about Trump-hating FBI agents concocting a scheme to undo the results of an American election.
For those seeking clarity, the firehose of factually shaky information on Twitter from all sides didn’t help matters. It wasn’t just from Trump fans.
Self-proclaimed members of the anti-Trump #Resistance have racked up hundreds of thousands of followers by hawking anti-Trump conspiracy theories and assurances that Trump’s downfall is perpetually imminent.
It’s made social media stars of guys like Ed and Brian Krassenstein—who put out a children’s book featuring a muscled, shirtless Mueller.
In the meantime, the media outlets charged with sorting out the messy truth were being hammered from all sides. Republicans charged that journalists were clearly biased against Trump—just look at all the negative stories they wrote about him!—while Democrats slammed journalists for “false balance” for failing to call Trump’s falsehoods outright lies.
Millions of eyes watched every Trump story, ready to send a barrage of tweets attacking every misstep.
“You could get away with a lot more in the old days,” Miller says. “If you had an error on A17 on the LA Times, people weren’t going to see it.”
The vast majority of media reporting on the Russia scandal was proved to be accurate by the Mueller report. But some bombshells—like reports about a Trump computer server communicating with a Russian bank or WikiLeaks’ Julian Assange meeting repeatedly with former Trump campaign manager Paul Manafort—turned out to be irrelevant or entirely bogus.
Each big mistake plays into Trump’s hands. And since most national media outlets rely on anonymous sources, it’s relatively easy for Trump aides to intentionally trick reporters into making a mistake.
“In some cases, they’re actually trying to put out disinformation this way,” Miller says. “The media reports it, the White House dunks on them for being incorrect.”
In this environment, where liars are everywhere and the truth is almost too strange to be believed, journalists are constantly second-guessing themselves.
“It leads to exhaustion. It leads to burning out,” Warzel says. “And then those mistakes are a breeding ground for more potential misinformation.”
In 2016, BuzzFeed reported that in the final three months before the election, the top 20 fake news stories from hoax sites—Clinton sold weapons to ISIS! The Pope endorsed Trump!—received more engagement on Facebook than the top 20 stories from actual news sites.
But Trump hijacked the phrase “fake news!” and twisted it into his own catchphrase, a way to disparage any story he didn’t like. It was a joke, but the sort of joke that everyone repeats until it burrows into the national psyche.
If all news is fake news, anything can be true. Everything is a lie and nothing is. In the fight over truth, the fog of war is thick. It’s the perfect environment for an enemy to attack.
Deepfake impact
In video footage that circulated widely back in May, Speaker of the House Nancy Pelosi sounded drunk. Her words seemed artificially slow, like a drawling slur.
“What is wrong with Nancy Pelosi?” Trump lawyer Rudy Giuliani tweeted. “Her speech pattern is bizarre.”
In reality, the video itself had been doctored, slowed down to make Pelosi sound like she was slurring. Giuliani later deleted the tweet but refused to apologize.
“How could I have figured out that it was inaccurate?” he told The New York Times.
If an altered video that simple could get shared millions of times, experts worried, what could a more sophisticated hoax look like?
In 2017, a Reddit user named “deepfakes,” using Google’s open-source artificial intelligence software, developed a technique to take footage of one face and overlay it onto video footage of someone else. As is typical with new technologies, the internet immediately harnessed it for both pornography and Nicolas Cage memes. The faces of pop stars were imposed on the bodies of porn stars, while the face of the Face/Off actor was swapped onto footage of Gollum and Yoda.
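The technique behind those swaps has been widely described as a pair of autoencoders that share one encoder: the shared encoder learns pose and expression, while a separate decoder learns each person’s appearance. Below is a heavily simplified sketch of that idea in Python with PyTorch; the layer sizes, training step and variable names are illustrative assumptions, not the original poster’s code.

```python
# Simplified, hypothetical sketch of the shared-encoder / two-decoder idea
# behind early face-swap deepfakes. Sizes and details are illustrative only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds a face from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A
decoder_b = Decoder()  # trained only on faces of person B

def training_step(faces_a, faces_b, optimizer):
    """Both identities share the encoder, so it learns pose and expression,
    while each decoder learns one person's appearance."""
    loss = nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a) \
         + nn.functional.mse_loss(decoder_b(encoder(faces_b)), faces_b)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# The swap: encode a frame of person A, then decode it with person B's
# decoder, yielding B's appearance with A's pose and expression.
with torch.no_grad():
    frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for a real video frame
    swapped = decoder_b(encoder(frame_of_a))
```

Repeat that swap frame by frame, blend the result back into the source video, and you get the face-grafting effect described above.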
At the same time, another piece of the deception puzzle is clicking into place. Audio-editing prototypes like Adobe’s Project VoCo, unveiled at its Adobe Max conference, promise to let editors go beyond cutting and splicing sentences to editing individual sound fragments, making it appear that a speaker said things they never said. Mix and match the sounds, and with a large enough audio library of a politician, you could make them say anything.
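As a crude illustration of the cut-and-splice half of that threat, the sketch below rearranges single-word audio clips, assumed to have been lifted from real recordings, into a sentence the speaker never uttered. It uses the pydub library; the file paths and word list are hypothetical, and the tools described above go further, synthesizing new sound fragments rather than merely reshuffling old ones.

```python
# Toy cut-and-splice example: stitch previously recorded words into a new
# sentence. Paths and word list are hypothetical placeholders.
from pydub import AudioSegment  # pip install pydub (requires ffmpeg)

# Assume a library of single-word clips pulled from real speeches.
words = ["i", "never", "said", "that"]
library = {w: AudioSegment.from_file(f"clips/{w}.wav") for w in words}

# Rearranged, the same words make the opposite claim: "I said that."
fabricated = library["i"] + library["said"] + library["that"]
fabricated.export("fabricated_quote.wav", format="wav")
```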
Just imagine that a week before the 2020 election, a video is leaked. It appears to be Trump, engaged in criminal and/or sexually explicit acts. Trump denies it. Maybe the video is a deepfake hoax. Or maybe it’s real, and Trump is just using the existence of deepfakes to deny it. Now imagine experts are divided on which is which. When you can’t believe your lying eyes or your lying ears, you’re left to trust your lying gut.
Now, at least, government officials and technology companies are aware of the chaos that these hoaxes could cause. The House Intelligence Committee has already held hearings on the issue.
“The tech companies aren’t ready,” Schiff, the House Intelligence Committee chair, said on a Vox podcast in June. “The government isn’t ready. We don’t have the technologies yet to be able to detect more sophisticated fakes. And the public is not ready.”
In August, the Pentagon’s research agency started talking to potential partners for its new Semantic Forensics program, intended to develop technologies “to help to identify, understand and deter adversary disinformation campaigns.”
The private sector’s pushing for similar measures.
Last month, Facebook, Microsoft and a slew of research institutions announced they were joining forces for the “Deepfake Detection Challenge,” a contest to better understand the little clues that give even sophisticated deepfakes away. Deepfaked faces rarely blink the way real ones do. Heads might have a strange tic. The eye color might be off.
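As a toy example of the kind of signal such a contest hunts for, the sketch below flags clips whose subject blinks far outside a plausible human rate. It assumes an eye-openness score has already been measured for every frame (a real detector would derive one from facial landmarks), and the thresholds and the rough range of 10 to 30 blinks per minute are ballpark assumptions, not the challenge’s actual criteria.

```python
# Toy blink-rate check. Thresholds and scores are illustrative assumptions;
# a real detector would compute eye openness from facial landmarks.
from typing import List

def count_blinks(eye_openness: List[float], closed_threshold: float = 0.2) -> int:
    """Count transitions from open to closed eyes across a sequence of frames."""
    blinks, currently_closed = 0, False
    for score in eye_openness:
        if score < closed_threshold and not currently_closed:
            blinks += 1
            currently_closed = True
        elif score >= closed_threshold:
            currently_closed = False
    return blinks

def looks_suspicious(eye_openness: List[float], fps: float = 30.0) -> bool:
    """Flag clips whose blink rate falls far outside the rough human range
    of roughly 10 to 30 blinks per minute."""
    if not eye_openness:
        return False  # nothing to judge
    minutes = len(eye_openness) / fps / 60.0
    blinks_per_minute = count_blinks(eye_openness) / minutes
    return blinks_per_minute < 5 or blinks_per_minute > 45

# Example: a 10-second clip in which the subject never blinks gets flagged.
print(looks_suspicious([0.9] * 300))  # prints True
```

Real entries in the challenge train far more elaborate classifiers, but the spirit is the same: hunt for the small physiological details that current fakes still get wrong.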
Facebook chipped in $10 million to the effort. But those trying to create hoaxes are innovating, too, trying to think of ways to out-think the detection system.
The fight isn’t just about technology. It’s about corporate policies. In the last two years, tech companies have tried to change their policies, ditching their laissez-faire libertarian approach to try their hand at benevolent censorship.
White supremacists and conspiracy theorists like Alex Jones got banned from Facebook, Twitter and YouTube. YouTube shifted viewers away from straight-up conspiracy theory videos in its recommendation stream—although liberals may be unhappy to learn they often landed at Fox News instead. Twitter banned the #Resistance-tweeting Krassenstein brothers in June, citing rules that prohibit “operating multiple fake accounts and purchasing account interactions.”
Right now, both major political parties are calling for regulation, including raising the prospect of forcing Facebook to shrink in size. But Republicans and Democrats want different things. While liberals complain that lax regulation allows “Nazis” to run wild on the site, conservatives fret about overregulation, worried that right-leaning voices could be censored for their political opinions.
But the lack of censorship is dangerous, too, argue some experts. Whitney Phillips, author of the forthcoming book You Are Here: A Field Guide for Navigating Network Pollution, points to YouTube and Facebook’s recent announcement that since political statements were newsworthy, the sites would rarely take down posts from politicians, even if the posts broke the rules.
“At every turn, at every conceivable opportunity, despite how loud the chorus might get, these technology companies made a choice to protect their bottom line over protecting the democratic process,” Phillips says.
Facebook’s motto for its developers was “move fast and break things.” Phillips thinks they were successful.
“Yeah, they’ve broken democracy,” Phillips says.
Too much information
Phillips wants to make it clear that it’s not just Facebook or Twitter’s fault. It’s not just the fault of Alex Jones or Donald Trump or 4chan. It’s your fault, too.
“A lot of the misinformation being spread is not the result of bad actors,” Phillips says. “It’s everyday people doing everyday things.”
She thinks of it in terms of an ecological metaphor, where pollution is the accumulation of a billion little actions from individuals. All of the tweeting, retweeting and Facebook posting adds up.
“We’re sort of at the whims of everyday folks, disinformation agents, algorithms, white supremacists, all jockeying to win the attention economy,” Phillips says. “The result is an air that is so clogged that we can barely breathe.”
In that environment, with so many different competing and contradictory claims, people “don’t even necessarily trust there is such a thing as truth.”
But Phillips doesn’t necessarily agree that more information is the answer. Journalists like to say that sunlight is the best disinfectant. But Phillips argues that sometimes the sunlight simply heats up the petri dish and spreads the disease—especially when people are liable to believe a hoax is true because a journalist says it isn’t.
“The truth can contribute to pollution as much as falsehood can,” Phillips says. “It is easy to feel like you are pushing back against a story when you are saying, ‘This story’s terrible.’ But the algorithm doesn’t care about your righteous indignation. The algorithm cares that you’re engaging with content.”
She urges journalists and everyday people to shift the lens, focusing less on the liars and more on how lies and ideologies have impacted communities.
Warzel, meanwhile, urges social media users to slow down. Be wary about clicking that retweet button. If a story seems too perfect, doubt it. If a crazy news story doesn’t come from an established media outlet, wait until at least one outlet covers it—ideally two.
Marantz, the expert on online trolls, says the long-term solution to the disinformation crisis is a deep and philosophical one that he’d explain at length with phrases like “reaffirming our commitment to epistemic depth.” But for now, the simpler way to react to disinformation is to rely a little bit more on the old gatekeepers.
“If you read The New York Times or the BBC or the alt-weekly in your town or the New Yorker, you’re going to be better informed than if you read Facebook,” Marantz says.
Not because they’re perfect—there are a billion reasons to complain about mainstream journalists, he says—but because, for all their flaws, right now they’re the best we’ve got.
“It’s the best short-term solution,” he says, “as opposed to just living in a world where no one knows anything.”
A version of this article first appeared in the Inlander, a weekly based in Spokane, Wash.