🔒Hey there, cyber adventurers!

Imagine waking up one day to find sexually explicit images of yourself online - images you never consented to, and that could ruin your life. This is the nightmare that thousands of women face every day, thanks to the insidious rise of deepfake pornography.

Noelle Martin's life was turned upside down at the age of 18 in ways she could never have anticipated. One day, she made a shocking discovery: graphic photographs of herself were being posted online without her permission. It was a nightmare scenario that no one should have to go through, made worse by the fact that she had never taken such pictures.

To make matters even more distressing, the face in these sexual photographs was hers, but the body was not. As she soon discovered, she had become a victim of the insidious practice of deepfakes - a term that would eventually become all too common in our digital age. Ordinary photos she had uploaded to her personal social media accounts had been altered, leaving her feeling violated, helpless, and completely exposed.

"This is a lifelong sentence," Martin said. "It can destroy people's lives, livelihoods, employability, interpersonal relationships, romantic relationships. And there is very, very little that can be done once someone is targeted."

Deepfakes are videos and images created or altered using artificial intelligence or machine learning. Deepfake pornography first gained widespread attention online several years ago, when a Reddit user published videos in which the faces of female celebrities were superimposed onto the bodies of porn performers. Deepfake producers have since distributed similar videos and images targeting online influencers, journalists, and anyone with a public presence.

Thousands of deepfake videos can be found across many websites. Some sites let users create their own images, essentially allowing anyone to insert anyone else into sexual imagery without their consent, or to use the technology to harm former partners.

Some AI providers claim to restrict access to explicit imagery. OpenAI removed explicit content from the data used to train DALL-E, limiting users' ability to generate certain types of images.

Midjourney blocks specific keywords and encourages users to flag problematic images to moderators. Stability AI has likewise restricted the creation of explicit images with its image generator, Stable Diffusion. TikTok and Twitch have also tightened their rules to better protect their platforms from harmful material.

However, keeping deepfakes off the internet requires constant diligence. Apple and Google recently removed an app from their app stores that used sexually suggestive deepfake videos of actresses to advertise itself. Research on deepfake porn is scarce, but one report issued by the AI company DeepTrace Labs found that it was almost exclusively weaponized against women, with Western actresses the most frequent targets, followed by South Korean K-pop singers.

Sexual deepfakes primarily target female celebrities in the United States and South Korea. These videos have become so common in South Korea that citizens have petitioned the government to address the issue. The women depicted in them, however, have little recourse. Many major social media platforms have banned non-consensual sexual deepfakes, but few countries have enacted laws to help victims get the content removed from all websites.

Laws addressing sexual deepfakes have been enacted in states such as Virginia and California, and in some jurisdictions in Australia. Globally, however, regulation remains rare. Scarlett Johansson, an American actress who has had numerous sexual deepfakes made of her, has voiced frustration that even with enormous resources to fight back, it can be impossible to remove these videos from the internet.

While celebrities remain the primary targets of deepfakes, everyday women and female public figures are increasingly affected as well. In some cases, these videos are manufactured specifically as tools of harassment.

Rana Ayyub, an Indian journalist who spoke out against the government's response to the rape of an eight-year-old girl, was targeted with a deepfake video created as part of a concerted online hate campaign. Noelle Martin, a young Australian woman who has been campaigning against image-based sexual abuse, has also been the subject of fabricated sexual photos and deepfaked video. Helen Mort, a UK poet and broadcaster, recently discovered deepfakes of herself online. Beyond harming women by co-opting their identities, these videos are intended to intimidate them and deter them from taking on public roles.

The rise of non-consensual deepfake pornography is a serious threat to women's safety and privacy, and it is essential to take measures to prevent it from spreading. Social media companies must establish and enforce strict policies to stop the spread of deepfakes on their platforms. It is equally crucial to implement laws and regulations globally to protect individuals from this form of online sexual violence.
