Major digital companies have reacted differently to slow the viral spread of a subtly manipulated video of House Speaker Nancy Pelosi – a video that could be a harbinger of the political digital fakery to come in 2020 and beyond.
The video, which appeared to have been digitally altered to slow Pelosi’s speech at a recent public appearance and make her seem impaired, made the rounds on social media and major digital platforms in recent days.
YouTube said it had removed versions of the slowed-down video, saying it violated company policies.
A representative for Facebook said its third-party fact-checking partners had deemed the video “false,” and that the social media giant was “heavily reducing its distribution” in Facebook’s newsfeed. Versions of the video still exist on Facebook, however; one has been viewed more than 2.5 million times as of this report.
A spokesperson for Twitter declined to comment, but at least one version is still live on that platform as of this report. Twitter’s media policy bans videos that show things like “gratuitous gore” or “adult content,” but doesn’t mention deceptively altered videos.
The slowed-down video was one of two controversial Pelosi videos in the news. The other, heavily edited footage of Pelosi appearing to stammer during another speech, was retweeted by President Trump Thursday, a move that came as the two political leaders traded increasingly personal public jabs.
Ben Nimmo, an information defense fellow at the Atlantic Council’s Digital Forensic Lab, said that the slowed-down video is the latest example of the growing ease with which online content can be altered to sow chaos in the political process – a threat that he expects will only grow as the 2020 presidential race approaches.
“It doesn’t have to be massively complex… but it hits the crucial nerve,” he said. “There are people out there who will want to believe it and people out there who will want to share it because it suits their political beliefs.
“A fake doesn’t have to be 100 percent convincing; it just has to be convincing enough,” he said.
Nimmo said the Pelosi video is also representative of the “slippery slope” from relatively simple manipulation to much-feared “deep fake” videos.
Deep fakes are a more sophisticated brand of digital manipulation in which a programmer can convincingly make it appear as if an individual said or did something they did not, as highlighted in an ABC News “Nightline” investigation.
Experts have been warning that convincing deep fakes – like the one produced by Buzzfeed that put words in the mouth of former President Barack Obama – could make an appearance ahead of the 2020 race as the technology becomes more and more accessible.
“I think the challenge is that it is easier to create manipulated images and video and that can be done by an individual now,” Matt Turek, head of the media forensics program at the Defense Advanced Research Projects Agency, run by the U.S. Department of Defense, told ABC News of deep fakes in December. “Manipulations that may have required state-level resources or several people and a significant financial effort can now potentially be done at home.”
Deep fakes got a shout-out from U.S. intelligence leaders at this year’s Senate Intelligence Committee Worldwide Threat hearing in January. When discussing online influence and election interference threats, Director of National Intelligence Dan Coats testified U.S. “adversaries and strategic competitors probably will attempt to use deep fakes or similar machine-learning technologies to create convincing – but false – image, audio, and video files to augment influence campaigns directed against the United States and our allies and partners.”
Nimmo said it’s important that the public be aware of a threat that isn’t going away anytime soon.
“I think this is something we’re going to have to prepare for for every election in the coming years,” he said.