By Will Wills
Imagine seeing a video of someone notable doing something amazing or disturbing, then finding out it was completely artificial. Not "oh, I can see it looks digital" artificial, but completely photorealistic, with a computer-generated voice that cannot be detected as patently phony. That is where Deep Fakes are taking us.
The term "Deep Fake" is a combination of "deep learning" and "fake," and extraordinary videos are flying around now. One recent example shows actor Bill Hader in an old interview with David Letterman; as Hader relays a story about an encounter with Tom Cruise and Seth Rogen, his face slowly and almost undetectably morphs into Cruise's (and later Rogen's). It's a weird, uncanny-valley effect: if you didn't know what Hader and Cruise looked like, you'd swear the blend of the two was a separate, real person.
"Uncanny valley" refers to the phenomenon in which a computer-generated video or image comes so close to seeming real, yet still not exactly natural, that it causes an odd, uneasy feeling. In the first Shrek movie, the non-ogre version of the princess had to be made less realistic to avoid subjecting audiences to that feeling for the entire film.
Accurate Deep Fakes date back to late 2017, when a Reddit user named "deepfakes" (the origin of the name) began posting them; the concern they raised eventually got Deep Fakes banned from Reddit in February 2018. Advances in a technology called the "Generative Adversarial Network" (GAN), invented in 2014 to classify (and create) images, combined with advances in computing power, have made it far easier to generate very convincing fakes.
The most popular use of Deep Fakes has been placing famous actors' faces onto performers in porn movies, most notably Scarlett Johansson's. Lately, as with everything it seems, as the technology has become more effective it is being used for political purposes, often to make subtle changes to videos in order to alter the events they depict.
Certainly, in the movie business, the ability to create photorealistic images of actors is useful and cost-effective. Did you notice the reanimation of Grand Moff Tarkin in Rogue One: A Star Wars Story? Knowing that the original actor, Peter Cushing, had already passed, we were studying the image closely for realism. But if we hadn't known he had died, would we have simply assumed he was a real actor? Maybe. It was very realistic, modeled on the movements of the original performance.
Now that you're (hopefully) disturbed by this information, how can you tell whether you're viewing a real video or a Deep Fake? Software that detects artificial modifications currently stays just ahead of the techniques used to falsify videos, but that may not last long. Google recently released 3,000 Deep Fake videos to highlight the problem and familiarize the public with just how realistic these videos can be.
A human rights group out of New York City called Witness has been training media companies to detect Deep Fakes, specifically the transfer of facial expressions from one person to another. Adobe, a specialist in photorealistic image editing, has been dabbling in completely fake, nearly undetectable audio that can be placed in the "mouths" of Deep Fake actors. At the same time, Adobe has also released technology that can detect such audio, essentially playing both sides of the equation.
Politically, there has been movement to address these concerns, namely the proposed "Deep Fakes Accountability Act," which would criminalize fraudulent behavior using artificial video and/or audio. And while that bill slowly makes its way through the federal government, individual states are making moves on the same front.
In the meantime, imagine being sent a video that shows your boss, a co-worker, or a family member doing or saying something uncomfortable or even illegal. How will you know whether it's real or a Deep Fake? First, consider the source of the video; the source often says a lot about its validity. Second, seek separate, corroborating evidence. Finally, ask the subject of the video directly. Either way, it looks like we're heading into a world where it will be increasingly hard to tell what's real and what's fake.
This post was previously published on CultureSonar and is republished here with permission from the author.