February 19, 2023
The recent Deepfake controversy on Twitch has triggered a debate no one knew we’d have to have in 2023. Believe it or not, whether non-consensual Deepfake pornography is dangerous is actually up for discussion on social media platforms.
A Deepfake is an image, video, or audio clip in which a person’s face, body, or voice is digitally altered to look or sound like someone else. It sounds like an incredible technology for producing art, creative wireframes, and more. But, unsurprisingly, 96% of Deepfake content is pornographic material, and close to 100% of it depicts women. Initially, malicious actors created non-consensual content of celebrity women who’d probably never find out about it. But the virus has now reached regular women in small communities.
The recent Deepfake controversy lifted the curtain on the ugly side of Twitch. Streamer Brandon “Atrioc” Ewing accidentally revealed that he had Deepfake pornography of popular female streamers open on his computer. One of those streamers was QTCinderella, who then took to her stream to call out the perpetrators.
“F-ck Atrioc for showing it to thousands of people. F-ck the people DM’ing me pictures of myself from my website. F-ck you all. This is what it looks like, this is what the pain looks like,” QTCinderella said.
While it should be a no-brainer that non-consensual pornography is unethical and cruel, that’s apparently not the case. Instead, QTCinderella’s justified outrage was met with some calling her a “snowflake” and saying she did not have real problems in her life. Unfortunately, this gross reaction from parts of the male Twitch community overshadowed the fans and personalities who came out in support of QTCinderella. So the question remains: who could think Deepfake pornography is not messed up?
Why are Deepfakes dangerous?

It’s a simple question with an even simpler answer: Deepfakes violate the basic definition of consent, which is why they are just plain wrong. But that is not all. They are far more dangerous for regular women and small creators than for popular film actors. Women who don’t typically put out risqué content, or who live in traditional settings, are more at risk of becoming a Deepfake’s victim.
Violation of consent harms all women. Even a movie star who shoots adult scenes wouldn’t want pornography of herself on the internet that was made without her permission. But streamers like QTCinderella, who produce pure gaming content for a small community, are at greater risk. Non-consensual fake pornography spreads faster in a small community, where it can be put to many uses. And unlike movie stars, who are barely on social media, streamers constantly have to deal with direct bullying.
Deepfakes aren’t primarily used for fun, light-hearted content. Instead, the technology is turning into a dangerous tool for revenge pornography targeted at women. The results are so convincing that anyone would need to do a double-take. That makes Deepfake pornography even more severe than leaked private photos and videos. Here’s why.
The victim never takes part in the act, yet the content uses their face without permission and portrays it as real. Malicious actors can use this type of content to shame, threaten, or blackmail the victim, or simply for revenge. Put simply, it’s capable of ruining lives. In QTCinderella’s case, the non-consensual fake pornography not only violated her fundamental rights but, she said, also gave her severe body dysmorphia.
“The amount of body dysmorphia I’ve experienced since seeing those photos has ruined me. It’s not as simple as ‘just’ being violated. It’s so much more than that,” she said.
Smaller streamers may face even more severe repercussions. Content on the internet never truly goes away, so the fake naked images exist forever without the victim’s consent. Deepfakes have become a deadly tool for ruining reputations online and holding power over victims by threatening to share realistic fake pornography.
Unfortunately, creating Deepfakes isn’t a crime everywhere yet. Some states, including California, where QTCinderella is based, treat creating such fake content as a punishable offense. But having people share and watch phony pornography of you even once can scar you in unspeakable ways. QTCinderella has decided to sue the perpetrators, but not everyone has that reach and capacity. Many victims would give in, erase their internet presence, and succumb to the bullying, which is precisely why Deepfakes are dangerous. Treating them as just everyday content encourages a heinous act.
If anything, the Twitch community’s response has made it clear that victim blaming knows no bounds. Tone-deaf internet users will always find a way to blame the victim, even when the victim played no part in the deed. It’s a misogynistic, dangerous world, and Deepfakes are a brand-new tool already being used against women worldwide.