When pointing out my frustrations with AI Generated Art flooding every nook and cranny of the internet these days, I often stumble upon AI Art apologists who say
things like this:
the publisher of arcane tried to use ai generated advertisement, they received a lot of backlash because of the poor quality and obviousness that it was ai generated
The person argues that the whole backlash against AI is based on an apparent lack of quality, or some sort of "obviousness" when it comes to AI, and not on the actual problem: the feeling of a lack of respect from the publisher toward the audience. The feeling that the publisher didn't even take the time to put some effort into the thing we, as the audience, are supposed to engage with. This cheap feeling of being used is what makes people angry. Not the lack of quality.
Other people, who sort of agree with me on this AI problem, also say things that I feel are not sustainable. Such
as this:
I can immediately spot proprietarily copied art.
This is not an answer. AI is getting better. Last year it had problems rendering hands. Today the hands are almost working properly. Last year it had a problem generating video. Today it can generate very convincing video. We are headed into a future where nothing is clear. Where it is impossible to just look at a thing, and know for sure that it wasn't generated.
And therefore we have to push, now, for ways for people to make sure they are engaging with something made with effort. And we have to regard anything else, anything without a proof of sorts, as something generated by AI.
I'm currently recording the screen of my computer as proof that this article is written by me and not AI generated. Maybe this proof will be enough. But I am afraid. I'm scared that in the not so distant future, even a video of typing out an article will not be sufficient proof. So let's talk about it... Shall we?
Ways to prove your art isn't AI
I think for a while, recording a video of you making the damn thing, and publishing it alongside the work, will make for a decent proof. Until, maybe, AI gets to a point where it can generate both the art and the proof. Which is scary to think about.
There is AI that can generate video, and it only gets better. So while we can use video for now, we need to think about the future.
For certain projects, like an animated film, we can provide source files. For software it's not as easy. There is plenty of AI that can generate source code, and that is pretty much what AI generated software is. So having the source code of a piece of software does not prove it isn't generated.
There are talks about laws to mark anything AI generated as AI generated. And I'm not against that. Maybe there could also be laws about proofs of something being not AI, and a requirement that those proofs cannot themselves be AI generated. Or, like, there could be a penalty for proving something isn't AI while it actually is. But then how would the law know?
There are people trying to develop AI detectors. Often AI-based AI detectors. And I can see how those could be useful for now. But think about the way AI is trained: by comparing its output to the training data and rewarding itself when it matches that data faithfully. Train it against a detector instead, and, basically, a successful future AI is one that can fool AI detectors very well. So even that is becoming hard to know for sure.
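That arms race can be sketched with a toy example. Nothing here is a real detector or a real model; every name and number is made up purely to illustrate the dynamic: a generator whose only training signal is the detector's own score will, by construction, drift toward output the detector accepts.

```python
# Toy sketch of the adversarial dynamic: a "generator" is trained
# purely on a "detector's" score, so it ends up fooling the detector.
# All thresholds and values here are invented for illustration.
import random

random.seed(0)

def detector_score(sample: float) -> float:
    """Hypothetical detector: higher score = 'looks more AI generated'.
    Here, 1.0 stands in for the statistics of human-made work."""
    return abs(sample - 1.0)

def detector(sample: float) -> bool:
    """Flags a sample as AI generated when its score is too high."""
    return detector_score(sample) > 0.1

def train_generator(steps: int = 1000) -> float:
    """Hill-climbs on the detector's score alone, the way a generator
    trained against a discriminator optimizes the discriminator's signal."""
    value = 0.0  # starts far from anything 'human-looking'
    for _ in range(steps):
        candidate = value + random.uniform(-0.05, 0.05)
        # Keep the candidate only if the detector likes it better.
        if detector_score(candidate) < detector_score(value):
            value = candidate
    return value

generated = train_generator()
print(detector(generated))  # → False: the trained output slips past the detector
```

The point of the sketch is only that the detector's verdict is the very thing being optimized away, so the better the detector, the better the generator it eventually produces.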
Maybe the only way to prove something is to know the people who are making the art. But if those interactions happen on the internet, they could potentially all be faked with AI as well. And Meta is apparently already trying to do it.
So I really don't know how we will move forward. I know that if I wrote something, I wrote it. I know that what I wrote isn't AI, but why would anybody else believe me?
So at the very least we need to talk about it. And research it. And make sure we can prove it somehow, in ways that are sustainable for the future.
Acceptable AI?
I wrote an article not so long ago about Blender's denoiser ( which is apparently AI ) and how I don't think using it is necessarily a bad thing. The denoiser is just a filter that makes a very minor adjustment to an image that the artist needs to fully make first.
But that kind of thing opens up a whole other can of worms: how much of something has to be human-made for it to be acceptable?
We had this kind of conversation about CGI before. Which is not over, by the way. Film studios around the world have been caught, numerous times, lying that their films don't use CGI.
But with CGI, in comparison with AI, I can at least see the effort. I am myself very familiar with how hard CGI can be. The fact that it is made on a computer doesn't make it easier. I really like films like Avatar because of the enormous effort put into making the CGI. And even with bad CGI in bad films, the CGI is rarely easy to make.
Moria's Race took me years, because CGI isn't simple.
AI on the other hand is actually lazy. Some people tell me
things like this:
a lot of people that show off their ai generated works, don't show you how many images they had to generate, how much tuning of prompts, how many images were discarded before they found one they like. the job itself is more exploratory in a sense.
This is not the same as actually putting in the work to make the damn thing, not even talking about all the training one needs to go through to even be able to make art in the first place. People learn to draw for years, sometimes decades, and then meticulously work on something for hours or even days, just to make one image that the manager or the director of the project might simply discard. Equating fiddling with the knobs and whistles of an AI prompt to the sweat and pain of a real artist is, frankly, insane.
Yet not all algorithms are made equal. What about Blender's Geometry Nodes? Those are made to quickly generate complex things. Is this the same kind of cheating? Or, if the artist had to make the geometry nodes setup perself, does it qualify as being made by a human?
Then, if that qualifies, maybe if an artist trains an AI model specifically for a project, and then presents an output of that model as the art, that qualifies too? Or is it that because this is AI, and that is technically not AI, one is bad and the other is not?
Conclusion
I don't want to spend time engaging with something that was generated by an algorithm and thrown out there as some sort of trash. I want actual people actually doing something, for it to even register as something worth paying attention to.
But with this explosion of AI crap, it becomes harder and harder to trust that something is not AI. There needs to be something. Some way... Some way to make sure. And I'm losing hope that there will ever be one good enough.
Happy Hacking!!!