Just for context: in my article "The Copyright Mentality" I gave explicit consent to anybody who wants to make fake nudes of myself. I wrote the following:
I grant you a right, to use any published photograph ( or video ) of myself, to do fake porn of myself, or any other fake video of myself. As long as it is disclosed that it is fake ( or it is absolutely obvious that it is fake ).
So a 17-year-old girl is suing an AI tech company for "allowing" people to make nude pictures of her. Do I have a problem with that? Do I think it is copyright mentality? No! Based on the story, she suffers humiliation because people she knows cannot tell whether the images are fake or not. And the boys who generated those images are trying to pass them off as real. As the girl herself states:
ClothOff adds no stamp to images to indicate that they are not real, and viewers unaware of the images' provenance have no way of knowing whether the images are authentic or fabricated,
So what kind of issue is this? A copyright issue? No. She is not saying that she needs to be paid for her "performance", or for the use of her likeness in the production of images. Is it a privacy issue? That's closer, but not quite. Those AI images do not actually reveal anything about her. They generate an imaginary version of her being naked, one that looks plausible. But it is not her actual body. Looking at those images you cannot see anything that she might want to hide, like a tattoo, or a birthmark, or anything the AI algorithm cannot see on the source image. Of course, we have the problem of the original image being uploaded to a tech company. But that is a separate issue.
So what is the issue then? It is a defamation issue. An issue of reputation. She wants to build a certain image of herself, and those apps are used to create counter-arguments to that image. She wants to present herself as pure and innocent. And then those boys make porn of her. And show it to people, claiming that she is a "whore". And because people ( mostly / already ) cannot tell AI images from non-AI ones, we get a real problem for this girl.
The issue at hand is not just with fake porn AI apps, but with every AI app that can generate images. If not porn, it could be images of her generated with ChatGPT, say, of her being a drug addict, or something. All of those can be used for defamation.
What I don't like about the article is how fucking ageist it is. It is talking about one and the same issue, but it uses two different terms, CSAM and NCII, to describe it. The difference is one year of age. For 17-year-olds and younger they will use CSAM, and for 18-year-olds and older it is already NCII. Come on. It is like using two different terms, one for the white folk and one for the black folk. The defamation problem is exactly the same whether the victim is a minor or an adult. If anything, this underplays the importance of the problem when it comes to adults.
The term NCII, or "nonconsensual intimate images", is the more applicable one here anyway. Even for those who believe that what the law says automatically becomes reality ( which is called schizophrenia ) and who believe minors physically cannot consent. In both cases NCII is the term to use.
Why not CSAM? Well, because it means "child sexual abuse material". It is about abusing a child, recording it, and then showing it to people. Fake stuff is automatically not a part of it, however much the law wants to pretend it is. The boys who sent the pictures of this girl to an AI didn't undress her in the real world. They didn't actually rape her and take pictures of it. They generated fake images. Which means it is only a problem of defamation.
The girl wants the company that produces the app to disappear from the face of the earth. Okay... But it is not a good strategy. First of all, there are always going to be other AI companies, and another one will spawn as soon as this one goes away. At the very least they can move to the dark web and operate entirely in crypto, if that is what it takes. Perverts are some of the most resilient people out there.
I think two things need to happen. One: slightly better defamation law. What exactly should be changed about it, I don't know. I'm afraid I will introduce bugs into it if I suggest anything. But it should be easier for victims of fake images to do something about those who use fake images against their reputation.
And two: there should be public awareness that, a lot of the time, images aren't real. A photo is not concrete proof. And if something is being proved by a photo, we should dismiss it until we know for sure that the photo is genuine.
Happy Hacking!!!