pouët.net

AI tooling in the Demoscene

category: residue [glöplog]
Another example where I don't think the line is very clear, given the above discussion.

Recently I have been looking at area lights. That's a pretty complicated thing from a mathematical point of view. It would take me a lot of effort to reimplement it from scratch, starting from the papers.

But this OpenGL reference exists, and it looks pretty easy to integrate: https://learnopengl.com/Guest-Articles/2022/Area-Lights

Is there a difference between integrating the reference code vs prompting AI to implement it?

What about IQ's noise code? Mercury's SDF library? Crinkler?
added on the 2026-04-08 11:41:31 by revival revival
Quote:

I shudder at the idea that the code isn't considered a direct art artifact to some.


Just shows that some don't bother to read to the end, or just want to troll.

Code that ends up in the intro/demo of course is a direct artifact.

Code that allows you to create spline control points or other data that will ship with your intro/demo is not.
added on the 2026-04-08 11:43:41 by gopher gopher
(Just to be clear, my question was about the direct artifacts, which I guess was not really what BoyC wanted to discuss. I just have way more angst about that line than about tooling.)
added on the 2026-04-08 12:12:02 by revival revival
If you use IQ's code as opposed to AI-hallucinated code, you can give proper credits, and in the case of IQ's stuff the licensing is clear too (MIT, generally, I think).
added on the 2026-04-08 12:15:50 by Moerder Moerder
Quote:
Is there a difference between integrating the reference code vs prompting AI to implement it?


Say, a beginner coder wants to write something simple, like a Bresenham line algorithm. He would google it, look at someone's sample code. Then he writes it.

In the 1990s, he would go to a party, talk to other coders, ask them, ask for their sources and learn it from that. Same process really, just the access to information is slower.

With AI, he types not into Google but into Claude: "Explain the Bresenham algorithm." Same information arrives.

Now, how often have you written a Bresenham? It happened to me exactly two times in my life. Others likely did it more often, maybe learnt it perfectly at a young age, but how common is that? Next time I would have to look up how it was done like I'm a complete beginner.
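For the record, the whole algorithm is just an error accumulator and two conditional steps. A rough sketch, in Python here for readability (the `bresenham` function name and the list-of-tuples output are my own choices, purely illustrative):

```python
def bresenham(x0, y0, x1, y1):
    """Integer-only Bresenham line rasterization.

    Returns the list of (x, y) pixels from (x0, y0) to (x1, y1),
    endpoints included. Works for any octant.
    """
    points = []
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)          # negative so the error term stays in one variable
    sx = 1 if x0 < x1 else -1   # step direction on each axis
    sy = 1 if y0 < y1 else -1
    err = dx + dy               # accumulated error
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:            # error says: step in x
            err += dy
            x0 += sx
        if e2 <= dx:            # error says: step in y
            err += dx
            y0 += sy
    return points
```

The same loop structure ports directly to 6502 or whatever your target is; only the pixel-plotting part changes.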

So where is the harm when I swap out Google, GitHub or asking another coder? The latter isn't even a very good option any more. Imagine newcomers calling you all the time like a helpdesk.
added on the 2026-04-08 12:16:52 by tomcatmwi tomcatmwi
@revival:
Just to be clear. Perhaps I misunderstood. You have angst because lotsa people in the scene detest AI?
added on the 2026-04-08 12:19:36 by 4gentE 4gentE
As usual, groepaz nailed it in just a few words.
The outcome will be (and already is) that anybody uses the tools they like in the way they like, and discloses their tool usage to their liking. Everything else would be insane. The demoscene is the natural outcome of having fun playing with technology, old or new: turning DIY/punk tinkering into small art pieces, going to parties, showing them to others, and talking about it. Not even the distinction between runtime and tooling holds water for a second. (I love to do my splines and movements in realtime in a small scripting language inside demos. Nobody will ever notice.)
added on the 2026-04-08 12:28:28 by bifat bifat
4gentE: it's quite clear what's going on. "They" want to continue winning the parties at all cost, but they know you have to do it in a politically safe way to avoid PR issues. Hence the "change of winds". All this talk about the love for craft and the demoscene being the last bastion against Evilbot was also just a PR stunt.
added on the 2026-04-08 12:29:02 by tomkh tomkh
(@4gentE: A bit besides the point, but a good question. No, more because I don't have a good model of how to draw the line myself. I believe some level of detesting is appropriate, but on the other hand AI also speeds me up a lot, so - conflicting emotions.)
added on the 2026-04-08 12:29:36 by revival revival
Quote:
As usual, groepaz nailed it in just a few words.
The outcome will be (and already is) that anybody uses the tools they like in the way they like, and discloses their tool usage to their liking. Everything else would be insane. The demoscene is the natural outcome of having fun playing with technology, old or new: turning DIY/punk tinkering into small art pieces, going to parties, showing them to others, and talking about it. Not even the distinction between runtime and tooling holds water for a second. (I love to do my splines and movements in realtime in a small scripting language inside demos. Nobody will ever notice.)


But you have no problem with someone prompting gfx for demos from an LLM. Or have you, hopefully, changed your mind about that?
added on the 2026-04-08 12:52:49 by 4gentE 4gentE
@4gentE, the only answer appropriate for binary thinkers would be a clear: Hm, what's it to you? :)
added on the 2026-04-08 12:56:33 by bifat bifat
Nothing man, relax. I just thought about how you were (again) a very special flake. Most people seem to agree on hating gfx and music promptmachines, while the jury on vibecode machines is still out. What you wrote made me think your take is the exact opposite. So, I guess you did not change your mind after all.
added on the 2026-04-08 13:01:33 by 4gentE 4gentE
I think there's a difference between actual vibecoding (one massive prompt and gacha your way into it) and assisted programming, where you boilerplate some makefiles and user interfaces, start cracking yourself until you reach an impasse, then just go "implement bezier on these automation lines" in "plan mode", review, refine, and keep going at it. You know how to build something? It makes it much faster.

The thing I've come to realise from my own experience is that you can't direct the machine correctly if you don't have any idea what you're doing, but it's still pretty easy to one-shot "easy experiments" by vibecoding (and that's probably the part where it rips off a lot of free stuff from its training); but if you already know the problem space and you need a bit of speed, I guess it's very helpful.

And for the type of stuff people do in the demoscene, it's never only "render one effect in real time", especially on oldschool machines; working around the limitations is, I guess, much more complicated than vibing "render a realtime cube". The tooling is what makes these oldschool demos run, right? It's what creates the illusion of realtime, am I right?
It's very silly and shows the levels of bad-faith trolling we're dealing with in the community, when every discussion on the topic has to go into "it's just a tool, just like photoshop or compilers!" territory. it's very obviously not that.

*Tools* have a defined function, and if their output doesn't follow from the inputs given to them, we call them defective and either fix or discard them. But when an LLM does that, it instead gets subsidized so it can cause as much disruption to creativity as possible.
added on the 2026-04-08 13:20:08 by vurpo vurpo
Quote:

Is there a difference between integrating the reference code vs prompting AI to implement it?
What about IQs noise code? Mercurys SDF library? Crinkler?


there is no real difference.
apart from the fact that we usually credit other sceners' inputs/artifacts we use.
but even without AI that is not happening all the time, and it's not like we disqualify entries or call the demoscene/factcheck police when it happens.

this topic is about honesty, but also about trust.
just because something is not properly attributed/credited does not necessarily imply it was done with bad intent.

we also all accept the mathematical and technical foundations that are simply there, and we build upon them without crediting anyone, from the ancient Greeks to today's 1337 gfx researchers.
would be pretty silly anyway.
added on the 2026-04-08 13:26:53 by gopher gopher
Quote:
It's very silly and shows the levels of bad-faith trolling we're dealing with in the community, when every discussion on the topic has to go into "it's just a tool, just like photoshop or compilers!" territory. it's very obviously not that.

*Tools* have a defined function, and if their output doesn't follow from the inputs given to them, we call them defective and either fix or discard them. But when an LLM does that, it instead gets subsidized so it can cause as much disruption to creativity as possible.


The "This AI is a tool just like --place some past technology here--" line has been debunked and obliterated a million times. But I don't think bringing that tired false trope out again is mere trolling. There's something about describing some new situation in an "oh, it's like that old situation we've seen, don't fret" style that makes the ones using it feel smart. And smug. And cool.
added on the 2026-04-08 13:27:36 by 4gentE 4gentE
As a music maker I feel the need to ask: if I generate a lot of samples with AI, break them down and use them creatively… is this bad? (I'm not saying I've done that but I've been thinking about this process recently)

Because what I and a lot of demoscene musicians have done in the past is certainly sample a loooooooot of breakbeats and a lot of sample CDs (sometimes even very uncreatively), and it was "OK", and that blurs the line a lot more.
Quote:

the tooling is what makes these oldschool demos run, right? It's what creates the illusion of realtime, am I right?


No, that would be mostly the animation-assisted kind of effect. Actual real-time effects still exist on old platforms, and some offer a user mode for playing around with all degrees of freedom to prove it.
added on the 2026-04-08 13:30:06 by Krill Krill
@4gentE: I haven't made up my mind at all; why should I? It's all in flux. I have worked with people using AI in an unusual and artistic process, even with gfx prompting involved. Attempts to reduce this to a "prompt or not" distinction are totally off the mark and unhelpful, from my experience.

I find playing around with AI much more interesting than the demoscene at the moment, but that may pass. The remark about realtime splines was intended to drive home this point: the fun in tool usage can of course outweigh strictly compo-winning concerns, and even be to your disadvantage in those terms. Why not. I have already suggested relaxing this compo-winning folklore instead.

The demoscene is trying to open up art-scene kinds of subspaces with gfx and music compos. With this come problems. But these do not really matter much to me. I have used photos and clipart from the internet, which I modified a bit, for textures and stuff. What's the difference? It's commonplace in my circles and not even an important factor for me.
added on the 2026-04-08 13:30:57 by bifat bifat
Quote:
As a music maker I feel the need to ask: if I generate a lot of samples with AI, break them down and use them creatively… is this bad?


sure, ok in my world, even without chopping things apart to compose a song
(edge case: one big generated sample that is the whole song).

people have been using other people's samples in trackers for ages, and today you just take them from internet databases, so no real difference to the above question about "gfx algorithms" in my book
added on the 2026-04-08 13:43:40 by gopher gopher
Quote:
Now, how often have you written a Bresenham? It happened to me exactly two times in my life. Others likely did it more often, maybe learnt it perfectly at a young age, but how common is that? Next time I would have to look up how it was done like I'm a complete beginner.

So where is the harm when I swap out Google, GitHub or asking another coder? The latter isn't even a very good option any more. Imagine newcomers calling you all the time like a helpdesk.

Bresenham is an interesting example to use. Not from the demoscene, but there's a work-in-progress remake of Zarch/Virus for the BBC Micro that has been an ongoing experiment in bullying Claude Code into writing the actual 6502 code. As part of this, Claude ground out 1400 lines of (commented) 6502 assembly to implement the fastest known (to our community) line drawing routine, specifically for one BBC Micro screen mode (MODE 2, 160x256, 4bpp). To do this it wrote its own (simplified) BBC Micro emulator from scratch in order to verify the correctness and speed of the routines. AI coding agents are quite happy to shave an effectively infinite number of yaks in the process of solving a problem (with human guidance necessary, of course).

If I was making another BBC Micro demo with 3D wireframe graphics, should I use this line drawing code? I'd be mad not to, as it's the fastest one we know of and I'd be unlikely to beat it myself (at least in the time I have available for such a pursuit), but thankfully I don't have to answer this question right now.

As a corollary, we have line drawing code for a different screen mode (MODE 4, 320x256, 1bpp) that was human-written back in the '90s, and Claude was not able to improve it any further (at least not in any meaningful way, given the size/speed tradeoffs).
added on the 2026-04-08 13:55:41 by kieran kieran
Quote:
No, that would be mostly the animation-assisted kind of effect. Actual real-time effects still exist on old platforms, and some offer a user mode for playing around with all degrees of freedom to prove it.

And those that DO largely rely on tooling often come with detailed papers and full disclosure (Mahoney's stuff, for example)

I'd totally delegate "graphics" to an AI once it becomes good enough for C64 stuff, for that matter - because it will enable me to do what I want, without waiting forever or dealing with self-proclaimed artists (there, I said it) who only want to do exactly what they want.
added on the 2026-04-08 13:56:19 by groepaz groepaz
Quote:
I'd totally delegate "graphics" to an AI once it becomes good enough for C64 stuff, for that matter - because it will enable me to do what I want, without waiting forever or dealing with self-proclaimed artists (there, I said it) who only want to do exactly what they want.

I remember being called out and mocked ("oooh, the evil coders") for making this insinuation. Now I know I was right about some of those.
added on the 2026-04-08 14:00:29 by 4gentE 4gentE
Interestingly, I never had those problems with musicians - YMMV
added on the 2026-04-08 14:03:39 by groepaz groepaz
Quote:
I'd totally delegate "graphics" to an AI once it becomes good enough for C64 stuff, for that matter - because it will enable me to do what I want, without waiting forever or dealing with self-proclaimed artists (there, I said it) who only want to do exactly what they want.


wow.
