In The Know by Yahoo

Woman reveals that she’s being harassed by a man who’s created fake nude photos of her using AI: ‘It’s so scary being a women on the internet’

A woman on TikTok is urging followers to heed her warning about the dangerous, objectifying uses of artificial intelligence after she discovered that a man had altered fully clothed photos she had posted to create fake nude images of her.

Rachel (@rache.lzh5) posted an emotional video to her account in which she explains the chilling way she was made aware of these “deepfakes.”

“I don’t even know how to describe what’s been happening to me for the past 48 hours,” Rachel begins. “Two days ago, someone sent me a message request on Instagram from a faceless account. No followers, no posts, nothing… and it was pictures of me that I had posted. Fully clothed. Completely clothed, um, and they had put them through some editing AI program to edit me naked. They basically photoshopped me naked.”

“And it’s already weird to make that on your own time, but it’s even weirder to send it to me,” she adds.

The situation gets even more violating the following morning, when Rachel wakes up to more direct messages containing these images — this time without the watermark.

“So this person paid to have the watermark removed and started distributing it like it was real. And they’re really obviously fake too, like, if anyone has ever seen an actual picture or video of me, they’ll know I’m not built like that,” she explains. “I’m just letting you know that anything you see of me is edited or fake. I don’t have any content. I don’t sell content. None of that is real. And it’s so gross.”

Rachel provides additional ways to prove these images are fabricated — namely, that the editing is faulty.

“underneath the left hand there is a chunk of black where they were unable to completely erase my top,” she writes. “there are lines where my tattoos dont line up and folds on my body that arent there. my chest is much smaller than in the edited photos. i dont have any tattoos on my lower uterus area.”

Creating these deepfake nude photos requires little to no effort by the AI user. All it takes is the click of a button to set the service in motion.

Nonconsensual deepfake pornography reportedly began circulating years ago, after a Reddit user shared images that put the faces of celebrities on the bodies of adult actors. Deepfake tools have since grown in popularity, and many of their users target people like Rachel: online influencers with a public profile and a noteworthy online presence.

In October 2020, Drew Harwell of the Washington Post reported that an artificial intelligence service, freely available on the internet, had been used to "transform more than 100,000 women's images into nude photos without the women's knowledge or consent," in turn "triggering fears of a new wave of damaging 'deepfakes' that could be used for harassment or blackmail." Harassment from a nameless, anonymous user is exactly what Rachel has been dealing with since receiving that fateful direct message.

“Users of the automated service can anonymously submit a photo of a clothed woman and receive an altered version with the clothing removed,” Harwell wrote. “The AI technology, trained on large databases of actual nude photographs, can generate fakes with seemingly lifelike accuracy, matching skin tone and swapping in breasts and genitalia where clothes once were. The women’s faces remain clearly visible, and no labels are appended to the images to mark them as fake.”

This is the second time Rachel posted this video on TikTok. The first video she created about the fake nude photos was negatively received. Commenters were quick to unfairly, and disgustingly, accuse her of “wanting” these photos out there.

“And all the comments were so disgusting, like, actually vile. They made me want to throw up. Like, multiple times,” she reveals. “The only reason why you would want these pictures of me is because I don’t want them out there. The only reason why you would want these pictures of me is because you like that.”

“The reality is that the technology will continue to proliferate, will continue to develop and will continue to become sort of as easy as pushing the button,” Adam Dodge, founder of EndTAB, a group dedicated to providing training on technology-enabled abuse, told the Associated Press. “And as long as that happens, people will undoubtedly…continue to misuse that technology to harm others, primarily through online sexual violence, deepfake pornography and fake nude images.”


Commenters on Rachel’s video are shocked and supportive. Many of them are urging her to take legal action and do everything in her power to catch whoever is responsible.

“AI is getting dangerous. I know deepfakes has always been a thing but it’s becoming worse and worse. im so sorry,” @xjules222 wrote.

“It’s so scary being a women on the internet,” @reihaspiss replied.

“I wish AI could just disappear forever, it’s so harmful and scary at this point, I’m so sorry, you’re not alone,” another commenter said.

“That’s 100% illegal, call the police and find out who’s behind it than sue them and put them in jail,” @erina.yamamoto suggested.

The unfortunate truth is that Rachel is one of many women online who have been victims of sexual harassment and violence by way of artificial intelligence. Twitch streamer QTCinderella said someone created computer-generated nude photos of her and, like Rachel’s perpetrator, distributed them online.

“For every person saying it’s not a big deal, you don’t know how it feels to see a picture of yourself doing things you’ve never done being sent to your family,” QTCinderella said in a live-streamed video via the Washington Post.

Rachel’s heartbreaking, vulnerable response to this situation makes one thing clear: As AI takes on a larger role in our society, so does its use as a means of targeting and harming women.

