You’re looking at the future of fake news and propaganda.
Don’t be embarrassed if you were fooled, even if only briefly. Technology to trick our eyes and ears is advancing rapidly. Teams in Germany are working on Face2Face, the type of face- and voice-swapping technology used to create the video above. Software giant Adobe is creating a “Photoshop for audio” that makes it easy to edit and manipulate what someone has said, as is a Montreal startup called Lyrebird. After you’ve selectively edited someone’s words, you could take that audio and use tech developed at the University of Washington to generate a video of the very same person speaking those words, just to make it fully convincing.
The ObamaPeele video was created using another emerging technology: FakeApp. It’s a free tool that’s recently been used to insert the faces of celebrities into porn videos. Other people have put Nic Cage into Indiana Jones, face-swapped Nic Cage and Lois Lane, and created other cinematic crimes against humanity.
Part of the process for creating this video involved taking an original video of Barack Obama and pasting Jordan Peele’s mouth into it. The result looked bad and clumsy at first, but it improved remarkably the longer FakeApp was left to process, eventually producing a believable amalgamation of Obama’s head and Peele’s mouth. The whole process took roughly 56 hours and was overseen by a video effects professional.
So the good news is it still requires a decent amount of skill, processing power, and time to create a really good “deepfake.”
The bad news is that the lesson of computers and technology is that this stuff will get easier, cheaper, and more ubiquitous faster than you would expect — or be ready for. Even at this early stage, it’s proving difficult for humans to consistently separate deepfakes from real videos, according to Matthias Niessner, who runs the Visual Computing Lab at Technical University Munich and leads the research on Face2Face.
“Essentially, we have a user study where we asked people [to] try to spot the difference — turns out we humans are not so great at it,” he told BuzzFeed News.
This is why experts in computer science have been warning that an age of ubiquitous deepfakes could help usher in an “infocalypse.”
Niessner, however, is hopeful that groups like his are making progress training artificial intelligence to detect a fake.
“On the positive side, we managed to train several neural networks that are indeed pretty good at figuring out forged images/videos. … Ideally, we're imagining automated methods in a browser or social media platform to tell what's fake and what's real,” he said.
But right now the technology to create effective fakes is widely available, thanks to FakeApp. The tech to spot them is not. This is where you come in.
There are basic tips you can follow to ensure you don’t get fooled easily. If more people are equipped to spot a fake, then the world has a better shot at avoiding a dystopian hellscape where every movie stars Nic Cage, and governments, trolls, and companies manufacture perfect fakes that collectively make us lose touch with reality.
1. Don’t jump to conclusions
One general best practice is to “refrain from jumping to strong conclusions in either direction without additional information and confirmation,” said Hany Farid, a professor of computer science at Dartmouth College and one of the world’s leading image forensics experts. Wait and see what other information comes to light about the video or piece of content in question, and hold back on sharing it with others. This is especially true in breaking news situations, where information moves quickly and is often wrong or misinterpreted in the early hours.
2. Consider the source
Pay attention to where this piece of content originated. Was it uploaded to a random account on a social network? Who is claiming ownership of it, and where do they say it came from?
3. Check where else it is (and isn’t) online
Do an online search with the video’s title or other relevant keywords to see if multiple credible news outlets and the person or entity in question have weighed in on it. Again, no need to rush to a conclusion until you can consult multiple sources.
4. Inspect the mouth
Farid said that today’s deepfake videos often have telltale signs. One thing to look closely at is the mouth of the person speaking, because these tools often struggle “to accurately render the teeth, tongue, and mouth interior. So, keep a watchful eye on the mouth for [visual anomalies].”
5. Slow it down
Farid also encourages people to slow down and freeze parts of the video to watch it more closely. “Slowing a video down so that you can clearly see the frame-to-frame transition can sometimes reveal temporal glitches that are introduced in manipulated video,” he said.
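For readers comfortable with a bit of scripting, the frame-by-frame idea can be automated. The sketch below (a minimal illustration, not Farid’s actual tooling) measures how much each frame of a clip differs from the previous one; a sudden spike relative to the typical change can flag the kind of temporal glitch he describes. The frames here are synthetic NumPy arrays; in practice you would load real frames with a library such as OpenCV.

```python
# A minimal sketch of the "slow it down" tip: step through a clip
# frame by frame and measure how much each frame differs from the
# last. Abrupt spikes can flag temporal glitches in manipulated video.
# Synthetic frames are used here; real ones would come from, e.g.,
# OpenCV's VideoCapture.
import numpy as np

def frame_differences(frames):
    """Mean absolute pixel difference between consecutive frames."""
    diffs = []
    for prev, curr in zip(frames, frames[1:]):
        delta = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        diffs.append(float(delta.mean()))
    return diffs

def flag_glitches(diffs, factor=3.0):
    """Indices where the change jumps well above the typical change."""
    typical = max(float(np.median(diffs)), 1e-6)
    return [i for i, d in enumerate(diffs) if d > factor * typical]

# Example: a slowly brightening clip with one abrupt jump (frame value 80).
frames = [np.full((4, 4), v, dtype=np.uint8) for v in (10, 12, 14, 80, 16, 18)]
diffs = frame_differences(frames)
print(flag_glitches(diffs))  # → [2, 3]: the jump up to frame 3 and back down
```

This is deliberately crude — real detection research, like the neural networks Niessner mentions, goes far beyond pixel differencing — but it captures the manual habit Farid recommends: watch the transitions, not just the frames.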
You don’t need to be an expert in artificial intelligence or audio engineering to protect yourself and help prevent fake videos and other misleading content from spreading. All it takes is a bit of patience and skepticism. To follow ObamaPeele’s timeless advice: Stay woke, bitches.
Here's the same video without any swearing, in case you want to point people to a clean version: