A guide to ‘deepfakes,’ the internet’s latest moral crisis

BY DAMON BERES AND MARCUS GILMER, from Mashable

Where there’s innovation, there’s masturbation — at least in one dark corner of the internet, where nearly 80,000 people have gathered to share fabricated videos of celebrity women having sex and Nicolas Cage uncovering the Ark of the Covenant.

These are “deepfakes,” a new kind of video featuring realistic face-swaps. In short, a computer program finds common ground between two faces and stitches one over the other. If the source footage is good enough, the transformation is nearly seamless.

The technology is relatively easy to use, which has created an enthusiast community on Reddit, where users compare notes and swap their latest work: “Emma Watson sex tape demo ;-),” “Lela Star x Kim Kardashian,” and “Giving Putin the Trump face” among them.

Motherboard did foundational reporting on deepfakes in December and continues to cover the trend, with depressingly predictable news last week that people are using the technology to create porn starring friends and classmates. But legal and computer science experts told Mashable that the technology’s grimier applications shouldn’t overshadow its potential for good, even if it’s difficult to see the upside when non-consenting stars are being jammed into hardcore sex scenes with hundreds of thousands of views on Pornhub and Reddit.

Reddit didn’t respond to requests for comment over the course of a week, but Pornhub said it will remove deepfakes from its platform.

“Users have started to flag content like this, and we are taking it down as soon as we encounter the flags,” Corey Price, Pornhub’s vice president, said. “We encourage anyone who encounters this issue to visit our content removal page so they can officially make a request.”

Still, to be very clear: All of this should freak you out.

Above, we see Gal Gadot’s face superimposed onto a porn actress’s body, moments before she pulls her shirt off and gets felt up. Consent didn’t factor into the equation for the Redditor who made this clip, and a casual observer wouldn’t know the video is fake if they received the file from a friend via text message or email, because the transformation is so well done.

The issue is pretty simple: A person who has not consented to a sexual situation should not be put into that situation, whether in physical or virtual life. But the genie is out of the bottle, and it’s not going back in. “Gal Gadot” remains one of the top terms associated with deepfake searches on Google, as the company’s own Trends data shows.

This underscores the urgency of the problem, even if it’s an emerging one. Content published to the internet can be hard to erase, particularly when there’s a group of people invested in duplicating and spreading it. People could stop creating new deepfakes tomorrow, but the fake Gal Gadot clips could live on indefinitely.

Want help? It’s murky

There’s not much legal recourse for those who fall victim to this new technology, according to Jonathan Masur, a professor who specializes in patent and technology law at the University of Chicago Law School. That’s true even for private citizens.

“There’s the copyright claim, if you took the [footage] yourself. There’s the defamation claim if someone tries to say that it’s actually you. And if you’re a celebrity, there’s a right to publicity claim if someone is trying to make money off of it,” Masur explained. “But each of those is just a narrow slice of what’s going on here that won’t cover the vast majority of situations.”

Many of these videos acknowledge they’re fake, which undermines a defamation argument.

“[You] could try to make a case it represents a form of defamation if you’re attacking the reputation of someone, but that’s also pretty hard to do because, by definition, you’re not alleging you’re posting a pornographic picture of that individual,” he said.

And, no, recent efforts to ban revenge pornography, led by Mary Anne Franks and Danielle Citron, wouldn’t apply in these cases, because those laws pertain to the release of private images or videos of an individual.

“There’s no pornographic picture of the actual individual being released,” Masur said. “It’s just the individual’s face on someone else’s body.”

There aren’t any laws against this practice yet, and none have been proposed. Tackling deepfakes via new legislation would be tricky, as doing so would bump up against the First Amendment.

“From a civil liberties perspective, I am… concerned that the response to this innovation will be censorial and end up punishing and discouraging protected speech,” David Greene, the civil liberties director at the Electronic Frontier Foundation, a nonprofit focused on digital free speech, said.

“It would be a bad idea, and likely unconstitutional, for example, to criminalize the technology,” he added.

The unexpected upside

Greene’s concerns aren’t unfounded. Though deepfakes are now synonymous with porn, the basic concept behind the technology is facial recognition, which theoretically has a lot of unexplored upside.

You may already be familiar with basic, live facial recognition from apps like Snapchat. The technology is programmed to map faces according to “landmark” points. These are features like the corners of your eyes and mouth, your nostrils, and the contour of your jawline.
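
For a concrete sense of what landmark mapping looks like in practice, here is a minimal sketch using the open-source dlib library and its pretrained 68-point predictor. The model file and image path are placeholders you would supply yourself; this is an illustration of the general technique, not any particular app’s code.

```python
# A minimal landmark-detection sketch, assuming dlib is installed and the
# pretrained 68-point model file has been downloaded separately.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

img = dlib.load_rgb_image("face.jpg")  # placeholder image path
for face in detector(img):
    landmarks = predictor(img, face)
    # The 68-point scheme covers the jawline (0-16), eyebrows (17-26),
    # nose (27-35), eyes (36-47), and mouth (48-67).
    points = [(p.x, p.y) for p in landmarks.parts()]
    print(len(points), points[:3])
```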

Snapchat is pretty good at understanding your face and applying transformative effects, which augment your features.

But its face-swapping feature leaves something to be desired.

Part of that has to do with Snapchat working in real time — it trades accuracy for speed.

Deepfakes work differently. The “FakeApp” program uses artificial intelligence to complete three major steps: alignment, training, and merging. Instead of placing one face over another in real time, FakeApp works from hundreds of still frames pulled from video footage. It digs through all of those images, identifies faces, and analyzes how they’re lit, what expressions they’re making, and so on. Once the program understands the faces it’s working with, it can use all of its “knowledge” to stitch one over the other.
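
That description maps loosely onto the shared-encoder design popularized by these tools. The sketch below is an illustrative assumption, not FakeApp’s actual code: a single encoder learns general face structure from both people’s frames, two decoders each learn to reconstruct one identity, and the swap comes from decoding person A’s frames with person B’s decoder. It’s written in PyTorch with toy image sizes and random stand-in data.

```python
# An illustrative sketch (not FakeApp's real code) of the shared-encoder,
# two-decoder autoencoder commonly used for deepfake face swaps.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Compresses a 64x64 RGB face into a compact code capturing
        # lighting, expression, and pose: the "knowledge" the app builds.
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64 * 3, 1024), nn.ReLU(),
            nn.Linear(1024, 256),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Rebuilds a face image from the shared code; each identity
        # gets its own decoder.
        self.net = nn.Sequential(
            nn.Linear(256, 1024), nn.ReLU(),
            nn.Linear(1024, 64 * 64 * 3), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

# Random tensors stand in for aligned face crops of each person.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

# Training: each decoder learns to reconstruct its own person from
# the shared encoding.
for step in range(100):
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# Merging (the swap): encode person A's frames, decode with B's decoder.
swapped = decoder_b(encoder(faces_a))
```

The swap works because the shared encoder is forced to represent pose, lighting, and expression in a way both decoders understand, so either decoder can render its identity under the other person’s conditions.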

Though it’s been put to a gross purpose, deepfakes’ seamlessness could be an encouraging sign, depending on your perspective. With enough development, real-time face swaps could achieve similar quality to deepfakes, which may have therapeutic uses, according to Dr. Louis-Philippe Morency, director of the MultiComp Lab at Carnegie Mellon University.

“This technology has very important applications apart from entertainment,” he said.

One moonshot example: Dr. Morency said soldiers suffering from post-traumatic stress disorder could eventually video-conference with doctors using similar technology. An individual could face-swap with a generic model without sacrificing the ability to convey his or her emotions. In theory, this would encourage people to get treatment who might otherwise be deterred by a perceived stigma, and the quality of their treatment wouldn’t suffer due to a doctor being unable to read their facial cues.

Another of Dr. Morency’s possibilities — and its own can of worms — would be to use generic face models in video interviews to remove gender or racial bias from hiring. But for any of this to happen, researchers need more data, and open-source, accessible programs like FakeApp can help create that data.

“The way to move forward with AI research is to share the code, and share the data. This is an enabler for AI research,” Dr. Morency said.

It somehow gets worse

As with many emerging technologies, the scariest part may be unseen. When Facebook first rolled out on college campuses, few would have anticipated its transformation into a multimedia Goliath that potentially destabilized American democracy as we knew it — but here we are.

Like the “fake news” that has exhausted so many of us on Facebook, deepfakes represent yet another capacity for the internet to breach our shared reality. If every video clip could potentially be fake, why believe anything is real?

And so, expect this response from your unborn grandchild: “Raiders of the Lost Ark? You mean the one with Nicolas Cage?”

IMAGES:

Gal Gadot’s face, digitally stitched into a porn clip IMAGE: REDDIT

This scene from “Raiders of the Lost Ark” originally starred Harrison Ford, not Nicolas Cage, though an untrained eye may not know the difference. IMAGE: NICK CAGE DEEPFAKES/YOUTUBE

 
