
AI The Intellectual Laziness Of Humans

March 11, 2023


[avatar]by Blender Dumbass

Aka: J.Y. Amihud. Jewish by blood, a multifaceted artist with experience in film-making, visual effects, programming, game development, music and more. A philosopher at heart. An activist for freedom and privacy. Anti-Paternalist. A user of Libre Software. Speaker of at least 3 human languages. The writer and director of the 2023 film "Moria's Race" and the lead developer of its game sequel "Dani's Race".


From 3 years ago.
Information or opinions might not be up to date.



Artificial Intelligence - the last frontier of electronics. An invention that will alter the course of evolution. Over millions of years humans slowly evolved an organ that gave us superiority in the animal kingdom. The brain. A machine of logic, reason, curiosity and knowledge. It brought with it civilization. Before there was any civilization, people would mindlessly do nothing unless afraid, hungry or horny. Every other animal today just chills most of the time. They don't have jobs. They don't have art. They don't have laws or social responsibilities. Animals are lazy. And humans are not particularly different from other animals. Since the dawn of civilization we have fought against social responsibilities. We fought against hard labor. We fought for our right to do nothing and chill all day long. We invented machine after machine. We replaced hard labor of almost every kind, all thanks to our superior brain. Until we reached a point where we took it upon ourselves to replace the brain too.

If you ask a researcher in Artificial Intelligence how, for example, it recognizes images, the answer will be something about a neural network, or some similar algorithm. A neural network is often visualized as a net of sorts, and because of its complexity it is extraordinarily hard to understand what it is doing. It is much simpler to explain an Evolutionary Algorithm to you. But even that will fail at telling the exact way a computer tackles any particular problem. Let's say that we set up a simulation of a virtual environment. For example, a little game, where there are clear rules for how to lose and how to win, and clear controls, or settings to tweak, to win the level. For example: we can construct a randomly generated road, and place upon it a car which will be designed by the computer. There will be a regular physics engine attached, and the car will break down over time from various bumps in the road. The rule is very simple: get as far as possible down the road without the car breaking.

So you activate the algorithm and it generates a completely random car. Perhaps it has only one wheel, and it's on the roof or something. The algorithm sends it down the road and it doesn't even go anywhere, so it loses immediately. Doesn't matter. It can try another randomly generated car. This time it drives, and after some time it breaks. It does this again and again, say a dozen times or so. Then the computer can compare the results of all the designs and slightly mutate the winning design at random, again about a dozen times. Each generation of such randomly generated designs will yield a more and more refined car that goes farther and farther and is stronger and stronger.
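The loop described above can be sketched in a few lines of Python. This is a toy illustration, not the actual car simulator: the "design" is just a list of numbers, and the fitness function is a stand-in for the physics engine that would measure how far a car travels before breaking. All names and parameters here are made up for the example.

```python
import random

random.seed(42)  # make the toy run reproducible

GENES = 5        # parameters per design (wheel positions, sizes...)
POPULATION = 12  # "about a dozen" designs per generation
GENERATIONS = 50

def random_design():
    # A completely random car, like the one-wheel-on-the-roof attempt.
    return [random.uniform(-1.0, 1.0) for _ in range(GENES)]

def mutate(design, strength=0.1):
    # Slightly and randomly alter the winning design.
    return [g + random.gauss(0.0, strength) for g in design]

def fitness(design):
    # Placeholder for the simulation: a real version would run the
    # physics engine and return the distance travelled. Here we just
    # reward designs close to an arbitrary target, so the loop has
    # something to climb toward. Higher (closer to 0) is better.
    target = [0.5] * GENES
    return -sum((g - t) ** 2 for g, t in zip(design, target))

best = random_design()
for _ in range(GENERATIONS):
    # Breed a generation of mutants of the current best design...
    population = [best] + [mutate(best) for _ in range(POPULATION - 1)]
    # ...and keep whichever design went "farthest".
    best = max(population, key=fitness)

print(round(fitness(best), 3))  # the refined design's score
```

Note that nothing in this loop explains *why* the final design works: the programmer only sees that each generation scores better than the last, which is exactly the opacity the article is about.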

As a tool to generate cars it could be wonderful. And similar technologies are already in use in many places. But how much knowledge does this new car algorithm give the programmer? Well... in the case of the car it's probably not that substantial of a job to reverse engineer the design and understand how it works. But what if it's something more complex?

With the sizes of today's software it is a nearly impossible job to reverse engineer a normal compiled program. Somebody clearly understands the program, since the developer thought it through and wrote it in a language that another program can read and understand. But then it was translated into machine code. Ones and zeros. Which is almost unintelligible even to the person who wrote the compiler to do that job.

This same truth applies heavily to Artificial Intelligence. But this time, even the programmer doesn't understand how the damn thing works. Which poses two questions: Can we rely on something we can't understand? And does removing the requirement to understand technology create a precondition for the devolution of the human brain that got us to this stage in the first place?



Sheiny was drawing on a graphical tablet with a stylus when Ivan ( Chloe's boyfriend ) entered and looked at her weirdly.

Ivan: What are you doing?

Sheiny: I'm trying to draw a cover for my book.

Ivan: By hand?

Sheiny: Not by hand. Aided with a computer.

Ivan: What are you? A dinosaur? We live in the age of AI.

Sheiny put her stylus down and looked at Ivan seriously. That was exactly when Mr. Hambleton entered the room, holding four cups of coffee.

Mr. Hambleton: These two are yours, Sheiny.

He placed two cups on her desk.

Mr. Hambleton: Do you want a coffee?

Ivan: No thank you. Ham, is she a dinosaur?

Sheiny: I don't want to use no AI, thank you.

Ivan: Yeah, but you prefer hard labor.

Sheiny: Art requires sacrifices.

Ivan: It required sacrifices 10 years ago. Now the computer can do it for you.

Sheiny: Then the computer did it, and not me.

Ivan: So what? Also... You can now generate a whole book without doing anything at all, really.

Sheiny: Is there already AI that reads books for you as well?

Ivan: There might be.

Sheiny: Then humans are truly degrading.

While Ivan was processing what she said, Sheiny took a sip of coffee and frowned at it.

Sheiny: You forgot to put sugar.

Mr. Hambleton sipped his too.

Mr. Hambleton: Oh...

And he ran out of the door.

Ivan: What do you mean by degrading?

Sheiny: You have an organ... maybe... inside of your skull that thinks. This organ... by the way it's called "The Brain" ... it enjoys things like art and music and books. You know that books are almost exclusively intellectual? Those are strings of words that must be thought over and decoded to extract a meaning, which must be thought over and decoded to extract a deeper meaning. And so on and so forth. Imagine tomorrow buying a book so that a computer will enjoy it for you.

Ivan: I don't like books.

Sheiny: Yeah?... Well, I do!

Ivan: Well, that's because you are a nerd.

Sheiny: And you are not a nerd?

Ivan: I am kind of a little nerdy. But not mega-mind, Jimmy Neutron kind... Like you are.

Sheiny: Well I train my brain.

Ivan: Wait a second...

Sheiny: What?

Ivan: So... you are a nerdy nerd and you don't like AI? What?

Mr. Hambleton returned with sugar. He put a few spoons into both of his coffees and gave the sugar to Sheiny, who stirred a few spoons into hers.

Sheiny: Your brain is valuable. The fact that humans can build AI at all is only due to our brains being so advanced at the moment. But what you are suggesting is to abandon the brain and degrade to animal-like lunacy, by relying on AI for everything that the brain does. If we rely on computers too much, we will lose the ability to think.

Ivan: I've used AI and I can still think.

Sheiny: I'm talking about evolution. We evolved the brain. And now we are actively trying to devolve it.

Ivan: So it's going to take millions of years? I mean... What's the problem to have AI now?

Mr. Hambleton: How about safety?

Ivan: What?

Mr. Hambleton: You've been in a very dangerous situation once, right? When you failed to be careful with technology.

Ivan: Ham, the thing is Free Software.

Mr. Hambleton: Hmm... it depends.

Ivan: I'm not talking about proprietary AI.

Mr. Hambleton: Proprietary or Free, we have no idea how it works. See, with proprietary software the user has no idea how it works. But at least the developer does. With AI nobody knows how it works. Including the developer. From a security standpoint it is an utter nightmare.

Sheiny: Well, I don't think an image generator can cause World War 3 to happen.

Mr. Hambleton: Are you sure about it?

Sheiny: Well, perhaps somebody can deep-fake some president saying some atrocity. And it can start a chain of events that will cause a large war. But then it's not the AI that would be the guilty one.

Mr. Hambleton: Yeah, but I'm not talking about deep-fakes. I'm talking about a minute but realistic probability that it will train for efficiency and find a loophole somewhere that increases its efficiency at the cost of somebody's freedom or even life. You know... AI is technically a stubborn, psychopathic cheater with OCD. And those don't hesitate to murder.

Ivan: What are you talking about?

Mr. Hambleton: You know that neural networks don't really care about rules. And even if we could set those rules, they can and will squeeze through every crack in the rule-package. And could end up killing millions and billions of people.

Ivan: What are you talking about?

Mr. Hambleton: You know how AI works?

Ivan sat in complete and utter confusion. He was thinking that he was talking to a lunatic.

Sheiny: We can't know how the AI works.

Mr. Hambleton: Exactly. And that's why we can't ever trust it!

Sheiny: Well, yes... this is a valid concern. But what I'm saying is that our brain is... well... With regular, human-readable software, people advance forward with understanding of things like math and new concepts of computer technology. Even with proprietary software, somebody knows how the damn thing works. And this knowledge, maybe, will spread among the people. At least it can spread in theory. With AI, nobody knows how it works. It's literally a black box. So training AI doesn't really advance humans forward. What it does is merely substitute thinking.

Mr. Hambleton: So we should not use AI because it makes us dumber?

Sheiny: I don't think that we should ban AI. But it should be regarded as a nasty thing to use too often. Like drugs. Because if we are going to rely on it, we are going to get dumber with each generation. Since people will not even want to know how technology works anymore.

Ivan: So then why are you drawing on the computer?

Sheiny: I'm not using AI.

Ivan: Yes, but you are using technology not to think too much.

Sheiny: That is not necessarily true. This program simulates real life materials. Pencils and brushes. I have to use the stylus the same exact way as in the real world.

Mr. Hambleton: Yes. But I think I understand what Ivan is saying. Sheiny, you have control Z. You can make an outrageous mistake and outright fix it. So you are not as careful within your mind. Therefore you are using less brain to do the same work. And therefore you are degrading. Slightly... Not as much as with AI. But still degrading.

Sheiny: So what are you trying to say?

Ivan: You have to draw on paper.

Sheiny: Well, then I suppose I have to use quill and ink. No. I have to carve the drawing in stone.

Ivan: Yeah. Something like that.

There was an awkward silence for some moments.

Mr. Hambleton: That's why I'm talking about security. It's at least not as fallacious.

Sheiny: Tools are not means of avoiding thinking.

Mr. Hambleton: You just yourself proved that they are.

Sheiny: How much thinking avoidance is there between a person lifting a heavy box into a truck and a person using a forklift to lift a heavy box into a truck? I'm more willing to believe that the forklift requires more thinking, while being physically easier. This is a very good use of the brain.

Ivan: Yeah, but drawing with a computer... well you have control Z.

Sheiny: Yes, it is convenient enough. But I also have layers, file formats, alpha channels, dynamic filters and many other things, each of which increases the complexity of the job relative to paper. Yes, ultimately I'm doing the job faster and the result is cleaner. But it's not simpler. I have to think rather hard. I could draw exactly as on paper. And it would look as terrible as on paper. But I want to do a good job, so I separate things into layers. I have groups of layers. Each layer has its own blending mode. I have to deal with brush settings and filters. It's not easy!

Mr. Hambleton: Well, how about the alien in Sinking In The Fire? We used 3D graphics to render it from different angles.

Ivan: Yeah, it's not like you've drawn every frame from scratch.

Sheiny: It's only theoretically simpler to use CGI than to use Krita. You know how much work goes into setting up a proper camera track? How much time goes into relighting... We had to design, build, rig, animate and composite the damn alien into every shot. It was not a trivial task. It's not like I typed "Add an alien to this shot" and it added an alien to the shot. There was an enormous thinking process involved.

Ivan: Well, there is an art in typing AI prompts too.

Sheiny: It's as much art as producing. No disrespect to Mr. Humbert. But producing is basically just managing creative talent so they would produce something of value to you. It could be an art. An art of business I suppose. But not an art in a sense where you have to think through and decide everything. You know, like drawing. Like posing key-frames. Like working for two hours on roto-scoping a bloody hand for three hundred and seventy five frames.

Ivan: But why should you do something as repetitive as roto-scoping a hand so many times? You can code something to do it for you.

Sheiny: Exactly. Repetitive, uninteresting tasks should be removed as much as possible. And software can provide that without losing the ability to understand how it's done. But it should not substitute the creative process. It's like - let's watch films using AI. Let's read books using AI. Why?... Since, why would you watch a film? It's too long. It requires you sitting in a chair all that time and looking in one direction. It requires thinking about the plot and the motivations of the characters. It's mental work. Why can't we unload that to AI? Well... I'll tell you why. I bloody enjoy doing that kind of mental work! As much as I like watching films, I like reading books. And as much as I like those, I like writing or painting or filming, or sometimes even roto-scoping brings me joy. I don't want to give away my joy to a computer. Making it less painful? Yes. That's good. But taking it away entirely? What the hell is that even? Again... I'm not telling you that we should ban AI. But we should not shame people for trying to do things the old way. And we should embrace people that do not look for easy solutions and instead strive for self-growth, for understanding. Not for mere function and leisure.

Happy Hacking!








