Can Deepfake Porn Impact Your Life?

Jack sits down at his laptop and opens the artificial imaging program known as Dall-E. He inputs a picture of himself and a few random words – blue flowers and rooftop. From this, Dall-E creates a realistic image of Jack standing on a rooftop dressed entirely in blue flowers. It’s funny. It makes him laugh. He might post it on Instagram, or he might simply delete it and create something else.

In addition to being loads of fun to play with, Dall-E and similar programs are useful for graphic artists, advertisers, and even fine artists working in digital media. But there is a downside to digital imaging. That downside is deepfake pornography.

A study in the journal Cyberpsychology, Behavior, and Social Networking defines deepfake imagery as “synthesized material wherein the face of a person is superimposed onto another body.” The article further states, “Most deepfakes found online are pornographic, with the people depicted in them rarely consenting to their creation and publicization.”

Deepfake technology is not a new phenomenon. People have been cutting and pasting one person’s head onto another person’s body and posting the results online for decades. But until recently, the technology wasn’t fooling anyone. The fakes were usually obvious. Not anymore.

As an article in the International Journal of Evidence and Proof states, “Ultimately, as the technology improves, parallel technologies will need to be developed and utilized to identify and expose fake videos.” Meaning that, as of now, well-produced deepfakes can easily pass as real.

In the past, high-end artificial imaging was the sole province of Hollywood special effects masters. But these days, with technologies like Dall-E, Lensa, and even the relatively unsophisticated face-swapping apps in the Google and Apple app stores, it’s become both free and easy to use. Pretty much anyone can now produce sexually explicit content without the consent or knowledge of those involved.

If this doesn’t scare you, it should. Because it’s entirely possible you’ll open your email while enjoying your latte some fine morning and find a pornographic picture of yourself that you never took or consented to. Or maybe that picture will be sent to your family or your employer or just plain posted on porn sites worldwide.

Deepfake pornography can be a huge problem for victims. Consider the following statement from an article in the Vanderbilt Law Review: “Not only does deepfake pornography shatter victims’ sexual privacy, its online permanency also inhibits their ability to [feel safe using] the internet and find [or keep] a job.” Kristen Zaleski, Director of Forensic Mental Health at USC’s Keck Human Rights Clinic, recently told the Washington Post that she’s been working with a schoolteacher who lost her job after parents and administrators learned about deepfake porn using her likeness.

Whether the images were created and posted by an angry ex-boyfriend, a disgruntled student, or even a total stranger who saw the schoolteacher’s face on social media and wanted to see it attached to a naked body is not known and likely never will be. What we do know is that an innocent schoolteacher was sexualized in a very public way, without her consent, with serious consequences to her and no consequences to the perpetrator.

And guess what? She and other victims of deepfake porn have little to no redress because, in most jurisdictions, deepfake porn isn’t specifically illegal. Yes, the United States enacted the National Artificial Intelligence Initiative in 2021, but that’s a research-and-policy strategy rather than part of our criminal code. Other governments have established similar frameworks, and the European Union is working to enact some actual deepfake laws, in particular, legislation banning artificial imaging programs that pose “unacceptable risk.” But what does “unacceptable risk” actually mean?

Not to mention the fact that it’s well-nigh impossible to police the internet because technology continually changes. Such is especially the case with sexualized technology, as developers create new ways to be sexual on an almost daily basis. Perpetrators of deepfakes (and whatever comes next) will always find ways to act with anonymity while also finding workarounds to whatever rules exist at any given time. Legislation and law enforcement simply cannot keep pace.

This is unhappy news for individuals who are victimized by the technology, most of whom are women. One widely cited study reports that 96 percent of deepfake videos online are pornographic, and that nearly all of the people depicted are women, though men (especially politicians) and even children have also been abused in this way. The same research finds that many victims are harassed or extorted based on this artificially generated imagery.

Whether we like it or not, artificially generated pornography is now a thing, and because it’s almost entirely unregulated, absolutely anyone can be victimized. Moreover, the amount of deepfake porn that’s available has increased almost exponentially in the last year or two, thanks to the recent proliferation of high-end imaging tools.

The online forum Reddit, long known as a haven for all things licit and illicit, has numerous threads dedicated to artificial imaging, with creators (and wannabe creators) sharing information about which programs work best and how to tweak them to work in ways that satisfy one’s personal arousal template. Plus, of course, there are countless deepfake posts to be admired, commented upon, and freely shared.

Worse still, once deepfake imagery hits the internet, it’s nearly always there to stay. Even if the creator feels remorseful and removes the initial posts, others will have downloaded and reposted it. And if these images do somehow get fully removed from the internet, more can be generated at the touch of a button.

Artificial imaging tools have moved sexual fantasy into sexual reality. That’s a fact. And the transformation is playing with human lives and emotions in painful ways. Technology that allows us to fantasize can also be used to hurt: a bad breakup can take on a whole new dimension; a random stranger can perpetrate sexual violation from a thousand miles away. Sadly, anyone and everyone is at risk. Even you.
