
My take on the 17 year old girl suing AI companies for Fake Nudes

October 22, 2025

πŸ‘ 40

https://mastodon.social/ : πŸ‘ 2
https://blenderdumbass.org/articles/my_take_on_the_17_year_old_girl_suing_ai_companies_for_fake_nudes : πŸ‘ 2
https://blenderdumbass.org/do_login : πŸ‘ 1
https://blenderdumbass.org/ : πŸ‘ 1
https://blenderdumbass.org/articles : πŸ‘ 9
https://blenderdumbass.org/articles?page=1 : πŸ‘ 1

#AI #defamation #law #privacy #deepfake

License:
Creative Commons Attribution Share-Alike
...in reply to:

Teen sues to destroy the nudify app that left her in constant fear - Ars Technica

arstechnica.com



Lawsuit accuses nudify apps of training on teen victims' images.




by Blender Dumbass

Aka: J.Y. Amihud. Jewish by blood, a multifaceted artist with experience in film-making, visual effects, programming, game development, music and more. A philosopher at heart. An activist for freedom and privacy. Anti-Paternalist. A user of Libre Software. Speaking at least 3 human languages. The writer and director of the 2023 film "Moria's Race" and the lead developer of its game sequel "Dani's Race".


5 Minute Read



Just for context: in my article "The Copyright Mentality" I gave explicit consent to anybody who wants to make fake nudes of me. I wrote the following:

I grant you a right, to use any published photograph ( or video ) of myself, to do fake porn of myself, or any other fake video of myself. As long as it is disclosed that it is fake ( or it is absolutely obvious that it is fake ).


So a 17-year-old girl is suing an AI tech company for "allowing" people to make nude pictures of her. Do I have a problem with that? Do I think it is copyright mentality? No! Based on the story, she suffers humiliation because people she knows cannot tell whether the images are fake or real. And the boys who generated those images are trying to pass them off as real. As the girl herself states:

ClothOff adds no stamp to images to indicate that they are not real, and viewers unaware of the images' provenance have no way of knowing whether the images are authentic or fabricated,
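
( A side note: such a disclosure stamp would be technically trivial to add. Here is a minimal sketch in Python, assuming the Pillow imaging library is installed; the file names are made up for illustration: )

    # Minimal sketch: burn a visible disclosure stamp into a generated image.
    # Assumes Pillow ( pip install Pillow ); the file names are hypothetical.
    from PIL import Image, ImageDraw

    def stamp_as_fake(src_path: str, dst_path: str) -> None:
        img = Image.open(src_path).convert("RGB")
        draw = ImageDraw.Draw(img)
        # Scale the margin to the image, with a floor for tiny images.
        margin = max(img.width // 50, 8)
        # Draw the disclosure text near the bottom-left corner.
        draw.text((margin, img.height - 4 * margin),
                  "FAKE / AI GENERATED", fill=(255, 0, 0))
        img.save(dst_path)

    stamp_as_fake("generated.png", "generated_stamped.png")

That something like this takes a dozen lines suggests the absence of any stamp is a choice, not a technical limitation.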


So what kind of issue is this? A copyright issue? No. She is not saying that she needs to be paid for her "performance", or for the use of her likeness in the production of images. Is it a privacy issue? That's closer, but not quite. Those AI images do not actually reveal anything about her. They generate an imaginary version of her naked that looks plausible. But it is not her actual body. Looking at those images you cannot see anything that she might want to hide, like a tattoo, or a birth mark, or anything else that the AI algorithm cannot see on the source image. Of course, we have the problem of the original image being uploaded to a tech company. But that is a separate issue.

So what is the issue then? It is a defamation issue. An issue of reputation. She wants to build a certain image for herself, and those apps are used to create counter-arguments to that image. She wants to present herself as pure and innocent. And then those boys make porn of her. And show it to people, claiming that she is a "whore". And because people ( mostly / already ) cannot tell AI images from non-AI ones, we get a real problem for this girl.

The issue at hand is not just with fake porn AI apps, but with every AI app that can generate images. If not porn, it could be images of her generated with ChatGPT, say of her being a drug addict, or something. All of those can be used for defamation.

What I don't like about the article is how fucking ageist it is. It is talking about one and the same issue, but it uses two different terms, CSAM and NCII, to describe it. The difference is 1 year of age. For 17 year olds and younger they will use CSAM, and for 18 year olds and older it is already NCII. Come on. It is like using two different terms, one for the white folk and one for the black folk. The defamation problem is exactly the same whether the victim is a minor or an adult. If anything, it underplays the importance of the problem when it comes to adults.

The term NCII, or "nonconsensual intimate images", is the better fit here anyway. It works even for those who believe that what the law says automatically becomes reality ( which is called schizophrenia ) and who therefore believe minors physically cannot consent. Either way, NCII is the term to use.

Why not CSAM? Well, because it means "child sexual abuse material". It is about abusing a child, recording it, and then showing the recording to people. Fake stuff is automatically not a part of it, however much the law wants to pretend it is. The boys who sent the pictures of this girl to the AI didn't undress her in the real world. They didn't actually rape her and take pictures of it. They generated fake images. Which means it is only a problem of defamation.

The girl wants the company that produces the app to disappear from the face of the earth. Okay... But it is not a good strategy. First of all, there are always going to be other AI companies, and another one will spawn as soon as this one goes away. At the very least they can move to the dark web and operate entirely in crypto if that is what it takes. Perverts are some of the most resilient people out there.

I think two things need to happen. One, slightly better defamation law. What exactly should be changed about it, I don't know. I'm afraid I will introduce bugs into it if I suggest anything. But it should be easier for victims of fake images to do something about those who use fake images against their reputation.

And secondly, there should be public awareness that a lot of the time images aren't real. A photo is not concrete proof. If something is being proved by a photo, we should dismiss it until we know for sure that the photo is genuine.
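
( For what it is worth, here is one crude way such a check could work: if the claimed source publishes a cryptographic hash of the original file, anyone can compare against it. A minimal sketch in Python; the "published" hash below is a made-up placeholder: )

    # Minimal sketch: compare a photo against a hash published by its
    # claimed source. The published hash here is a fake placeholder.
    import hashlib

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Read in chunks so large photos don't need to fit in memory.
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    PUBLISHED = "0" * 64  # placeholder: the hash the source would publish
    if sha256_of("photo.jpg") == PUBLISHED:
        print("Matches the published hash: the file is unmodified.")
    else:
        print("No match: treat the photo as unverified.")

A matching hash only proves the file was not altered since the source published it; it says nothing about whether the scene it shows ever happened. But it is the kind of basic provenance habit that would make fake images far less dangerous.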

Happy Hacking!!!



Troler:


A photo is not concrete proof. If something is being proved by a photo, we should dismiss it until we know for sure that the photo is genuine.


What do you mean? All photos are true. Nikolai Yezhov was never near Stalin, never.


