Demoscene: my first hands-on with realtime 3D engines
category: general [glöplog]
Yaye,
I’d like to share a personal reflection and (maybe) open a discussion around how many of us got into real-time 3D, as an ARTIST or designer, before the rise of major commercial engines like Unity or Unreal.
Back in the late '90s, it seems the only way for an amateur 3D artist like me to access a real-time 3D engine was through the demoscene.
I joined the scene in 1998 and started working with NxNG, a 3D engine developed by Xbarr, which I used to make the demo Couloir 14. Then I made most of the Red Line demo on JFYE by NoRecess, briefly experimented with Hulud's engine, and Stil's tools. Later, I returned to Xbarr's next engine, GameStart, and used it to make Within the Mesh. When we decided to turn it into a real business, it became Harfang, which powered Marine Melodies and Dreversed. Harfang 3D never really became mainstream (though we would have loved that, from a business point of view).
Now, I could have switched to Unity or Unreal at any point ... but I didn’t, for reasons I won’t go into here.
So this long (self-promoting) intro leads me to a few (interrelated) questions I’d love to ask the community:
- Did others here also discover real-time 3D engines through the demoscene, before Unity/Unreal were accessible for free?
- Now that Unity and Unreal are available (with a shitload of features), how come the scene isn't dead (yet)? Why are we still here? And more provocatively: when today’s kids discover 3D interactive art, don’t they all just go straight to Unity?
- Between those two worlds, there’s a sort of middle ground of scene-born engines/tools that are easily accessible online: Tooll3, Notch, Cables.GL, Three.js, etc. -> Are there others I’m forgetting?
Curious to hear if anyone else had a similar experience (or completely different).
(and see ya around at EVOKE)
Quote:
Why are we still here?
Tradition at best, mostly.
Quote:
when today’s kids discover 3D interactive art, don’t they all just go straight to Unity?
They do.
nowadays three.js and shadertoy are probably the most accessible and popular ways into 3d graphics programming..
otherwise, high end 3d graphics engine development in the traditional sense (well, PC, DirectX, C++ etc) is a slowly dying art - thanks to the vast complexity and knowledge requirement, sheer scale / volume of work and scope, super high bar, and the obvious domination of unreal+unity in the games industry leading to fewer commercial opportunities. this leads to the gulf between the top (say UE5) and what most hobbyists or even companies can achieve independently being bigger than ever.
I think it's still worth it for small projects with custom-made optimized tools and original ideas.
Many of us actually got into it before 3D engines were a thing. It was more about 2D scrollers, rasterbars, sprites/bobs and whatnot. And there were engines for that. AMOS on Amiga is a good example: it was basically Unity, but for 2D. On Commodore 64, there was Simons' Basic and Graphics Basic. However, a demo written in any of these would've been considered extremely lame. 3D only began to emerge around 1998-ish, right when you started, due to 3D games like Quake and 3D accelerator cards, which were a novelty at the time.
To answer one of your questions: The demoscene is not merely about putting something on the screen. It matters how you put it there. There's a whole galaxy between Unity - a tool even a beginner can use with great effect - and a self-created engine, like Farbrausch or Conspiracy created. Not to mention the traditional way of democoding, which generally means working from scratch and writing your own code. Even Windows demos were frowned upon until 1997-98 because DirectX or OpenGL was considered "lame" by many, a tool that removes challenges and allows anyone to be successful without effort. While this is far from true, it holds some truth if you look at how demos and intros were coded just a few years prior.
I, for one, still don't consider Unity or Javascript demos true demos. They belong to their own category. Granted, they can be very artistic and outright cool, but not in the way "real" demos are. They are like entering a bicycle race on a motorcycle.
Yes, there are many other tools. They're nice and useful. Just right now I'm coding something using three.js. It's not going to be a demoscene production, but I've shown it to some sceners and they liked it. The bottom line: there's nothing against having fun with any tool, but compo categories are meant to have boundaries.
Something I hadn’t fully realized (Smash pointed it out) is how huge the code gap has become between "doing nothing" and "trying to do something cool-looking with a mainstream engine" (... and the line count required to set up a 3D pipeline in Vulkan doesn't help).
That makes raymarching in Shadertoy or Bonzomatic a surprisingly effective path to putting something visually rich on screen with minimal setup, while offering a deeply expressive and technical playground.
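For the curious, the core of that "minimal setup" is small enough to sketch on a CPU. Here is a hedged, illustrative sphere-tracing loop in Python; a Shadertoy shader runs the same idea per pixel in GLSL, and the function names here are just this sketch's own:

```python
import math

def sdf_sphere(p, radius=1.0):
    # Signed distance from point p to a sphere centred at the origin.
    return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2) - radius

def raymarch(origin, direction, max_steps=64, eps=1e-4, max_dist=20.0):
    # Sphere tracing: step along the ray by the distance to the
    # nearest surface until we hit it (or give up).
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + t * direction[0],
             origin[1] + t * direction[1],
             origin[2] + t * direction[2])
        d = sdf_sphere(p)
        if d < eps:
            return t          # hit: distance along the ray
        t += d
        if t > max_dist:
            break
    return None               # miss

# A ray fired down the z-axis from z = -3 hits the unit sphere
# at z = -1, i.e. at distance 2.
print(raymarch((0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))  # -> 2.0
```

Swap the sphere SDF for any distance field (or a min/max combination of several) and the same dozen lines render an arbitrarily rich scene, which is exactly why the Shadertoy route needs so little setup.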
The Unreal Engine productions have that extra layer of shading (shadows, photorealistic materials) and postprocessing (motion blur, bokeh) that you get for free. It works wonders on otherwise mediocre scenes (* with exceptions of course, see the latest Orange or CNCD etc. productions).
Of course, nothing can substitute true flair - see
https://www.pouet.net/prod.php?which=104003
which could have been easily done in OpenGL's immediate mode from the '90s. And maybe it is much closer to the original spirit of the demoscene, and probably a more significant reference point for the future.
Quote:
added on the 2025-08-08 11:59:35 by fra fra
The Unreal Engine productions have that extra layer of shading (shadows, photorealistic materials) and postprocessing (motion blur, bokeh) that you get for free. It works wonders on otherwise mediocre scenes (* with exceptions of course, see the latest Orange or CNCD etc. productions).
ha, i would actually say almost the opposite.
what quite a few "demos made with engines/tools" from the last few years demonstrate to me is that there's no substitute for experience - how many of those demos make even an amazing render engine look bad because they use it so poorly, with badly lit, over-processed scenes and poor materials full of rendering artefacts that shouldn't be there, falling right into the uncanny valley? that's definitely not about the engine.
like maybe what some of those who spent 30 years grinding making demos with their own code actually learnt most of all was how to get the most out of their routines via clever scenes and good demo craft - things that definitely don't come with the unreal installer or any other 3rd party tool/engine.
Even though some of Unreal's features (like realtime shadows, and even radiosity to some extent) can help the viewer better read a scene (as Navis sort of pointed out), I think we all agree: a shitty 3D scene will still look shitty, even with tons of post-processing :)
That said, for reasons I can't fully articulate, I wouldn't work on a demoscene production made with Unreal. It doesn’t feel right ... not in this context. I'd have to make a real effort to justify it rationally, but the feeling is strong.
Same goes for Unity, more or less.
And to a slightly lesser degree: same goes for Godot.
Weirdly, I find the idea of making a demo with Godot somehow more acceptable. Why is that?
Maybe because it carries that "open source" badge of respectability that appeals to my anticapitalist eyes?
Or maybe because I feel (perhaps wrongly) that the Godot team is driven by motivations not too far from those of sceners (yeah, now that I've written it, I can sense how wrong it might be)?
With ThreeJs, it’s even easier to accept: it’s made by one of us.
So to me, making a demo with ThreeJs is closer in spirit to using DemoPaja back in the day, or Tooll3 now (with an extra effort because of coding with Javascript, yikes).
Anyway, I can feel the topic is drifting (in a good way) and reading this thread just makes me want to try 64K.
Maybe that’s part of the rationale behind 64K/8K/4K productions: one simply can’t use a mainstream engine.
smash: you are spot on!
I think sometimes using Unreal etc. confines you to a space where you need to use the 3D scenes and the latest materials, modifiers etc., thus making the same-ish stuff we've seen before many times (and yes, I know the cardinal rule, which is that you can do whatever the hell you like and have fun with it).
Quote:
Maybe that’s part of the rationale behind 64K/8K/4K productions: one simply can’t use a mainstream engine.
You can still use one for WYSIWYG/sync and export... already been done (64k with Unity iirc).
Quote:
You can still use one for WYSIWYG/sync and export... already been done (64k with Unity iirc).
I vaguely remember a group at EVOKE (was it EVOKE 2013?) that worked on their 64K with a custom plugin, showing a preview of each sequence right in Maya's viewports.
ThoseWereTheDaysEtc.
yeah, probably Inque :)
Quote:
high end 3d graphics engine development in the traditional sense (well, PC, DirectX, C++ etc) is a slowly dying art - thanks to the vast complexity and knowledge requirement, sheer scale / volume of work and scope, super high bar, and the obvious domination of unreal+unity
For games, the bar was always super high. The main reason is not so much the rendering code IMHO, but rather all the features on top, like animation, LOD, physics, AI, lighting etc., not to mention all the tools you have to provide around it. For demos the bar is *significantly* lower.
Quote:
Did others here also discover real-time 3D engines through the demoscene, before Unity/Unreal were accessible for free?
I got to know about 3D engines through gaming magazines in the early '90s, then websites like the 3D Engines List. Honestly, not so much through the demoscene.
As, I assume, the youngest person in this thread, I'm happy to say I write my own engines from scratch; I'm fed up with a lot of the generic options. Only a few of the big 3D toolkits are well developed, and they usually incur some licensing challenges and at least a little bloat. It's been fun, but if you're the perfectionist type like me it can also be a ridiculous time sink, so I get why indies more focused on getting commercial work done have moved on to large third-party systems. Especially if you're planning to implement a full PBR pipeline.
I spent some years getting good in Unity and I don't think it's a bad engine when used appropriately, but I really dislike the company + sometimes it just isn't a good fit for your performance target. That was my first journey into 3D, but I learned way more from Allegro + OpenGL and doing draw call dumps of games I liked.
Unreal is basically a glorified gaslighting operation meant to sell graphics cards and is a nightmare of engineering shortcuts. The over reliance on Unreal's heavy lifting for visual composition both makes these problems universally inescapable + hurts the artistic value or potential of the work being done. Some games use it to great effect but most don't. Watch some Threat Interactive videos if you have the time as he does great critical analysis of Unreal in my opinion.
Agreeing with el mal: the tooling and the implementation of how your systems interact is the hardest part. Just getting the graphics down isn't really a problem (unless you're reinventing PBR, like I said); APIs for video accelerators already abstract away some of the boilerplate steps of the pipeline, i.e. the actual blitting of the pixels to the screen. Pixel shaders and depth testing are a thing, which makes z-sorting irrelevant. Etc. etc.
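To illustrate that last point, here is a toy sketch (not any real API, just a hedged model of what the hardware does) of why a per-fragment depth test replaces painter's-algorithm z-sorting: fragments can arrive in any order, and the nearest one wins per pixel anyway.

```python
# Minimal depth-buffer sketch: a tiny framebuffer with a depth value
# and a colour per pixel, initialised to "infinitely far" and empty.
WIDTH, HEIGHT = 4, 4
depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
color = [[None] * WIDTH for _ in range(HEIGHT)]

def shade(x, y, z, c):
    # Per-fragment depth test, as the GPU performs after the pixel
    # shader: keep the fragment only if it is closer than what's there.
    if z < depth[y][x]:
        depth[y][x] = z
        color[y][x] = c

# Far fragment drawn first, near fragment second...
shade(1, 1, 0.9, "far")
shade(1, 1, 0.2, "near")
# ...and in the reverse order at another pixel:
shade(2, 2, 0.2, "near")
shade(2, 2, 0.9, "far")

print(color[1][1], color[2][2])  # prints: near near
```

Either submission order yields the same image, which is exactly why sorting whole polygons back-to-front stopped being necessary once depth buffers became standard.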
Quote:
Watch some Threat Interactive videos if you have the time as he does great critical analysis of Unreal in my opinion.
Not to sidetrack but he really does not. He genuinely does not know what he's talking about.
Also, PBR is just another illumination model, not terribly hard to implement. :) Yeah, the integrals look intimidating and all, but your implementation doesn't need them anyway; you can just ignore the brutal-looking math.
Bonus point if you dive into that as well, but not required.
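As a hedged illustration of that claim, here is a minimal, scalar, single-channel sketch of the Cook-Torrance/GGX lobe most engines label "PBR". The function names are this sketch's own, the k = alpha/2 geometry remapping is one common convention among several, and a real implementation would work per RGB channel and handle edge cases; but note there is no integral anywhere:

```python
import math

def d_ggx(n_dot_h, alpha):
    # GGX/Trowbridge-Reitz normal distribution function.
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

def f_schlick(v_dot_h, f0):
    # Schlick's approximation of the Fresnel term.
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def g_smith(n_dot_v, n_dot_l, alpha):
    # Smith geometry term with the common k = alpha / 2 remapping.
    k = alpha / 2.0
    g1 = lambda x: x / (x * (1.0 - k) + k)
    return g1(n_dot_v) * g1(n_dot_l)

def brdf(n_dot_l, n_dot_v, n_dot_h, v_dot_h, albedo, roughness, f0=0.04):
    # Cook-Torrance specular plus a Lambertian diffuse lobe:
    # the whole "PBR" shading model in a dozen lines.
    alpha = roughness * roughness
    d = d_ggx(n_dot_h, alpha)
    f = f_schlick(v_dot_h, f0)
    g = g_smith(n_dot_v, n_dot_l, alpha)
    specular = (d * f * g) / max(4.0 * n_dot_v * n_dot_l, 1e-6)
    diffuse = (1.0 - f) * albedo / math.pi
    return (diffuse + specular) * n_dot_l
```

Evaluate `brdf` once per light per pixel with the usual clamped dot products and you have the core of a physically based shader; the intimidating integrals only show up if you derive or pre-convolve environment lighting.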
Quote:
Maybe that’s part of the rationale behind 64K/8K/4K productions: one simply can’t use a mainstream engine.
What are your thoughts on link-time compressors in this regard? Specifically public pre-made ones, e.g. Crinkler.
Quote:
Not to sidetrack but he really does not. He genuinely does not know what he's talking about.
Could you give an example?
No, the Internet is full of them, and as I said, it's a sidetrack. Just felt like pointing out that he's a grifter who's asking people for $900k to "fix" Unreal Engine.
Where I can bring this back on topic is the irony of it: the render pipeline is actually reasonably well done; it's the rest that's a huge, HUGE problem: no memory management, bad base code, terrible containers, etc. These are all things that coders who have dug in and had to work on a 64k should know, because they're table-stakes things, and those are the coders we as a scene should be producing, but we're not even making an attempt to.
Don't know much about Unity these days, but fwiw, Unreal isn’t just a lighting toolbox. It’s a massive, open-source platform where you can drop in your own parts without thinking about the boilerplate, which is especially handy if UE is your day job. Our UE demos were about experimenting with features that didn’t exist yet - voxel GI before it was in the main branch, a C4D motion-design pipeline before Datasmith, a GPU-side "particle system" before Niagara. Sure, we caught some flak just for using UE, but we expected sceners to be conservative and didn’t mind, and the folks whose opinions we actually care about welcomed us just fine, so it was worth it in the end.
Quote:
It’s a massive, open-source platform
You are granted access to the source code of UE only if you sign a contract with Epic (by doing it online on GitHub). This doesn't allow you to publish the source code in your own repository in the open, and you aren't allowed to share it with anyone else (friends or customers) unless they individually sign the same contract with Epic :)
Just for that (and for other reasons that were in the license when I read it, a couple of years ago), UE cannot be considered an open source project.
Otherwise I agree it is a massive lighting tool and an incredible platform for experimenting.
Idk, "3D engine" seems such an archaic term at this point. It's usually just a go-to scene representation and custom asset format, plus custom tooling and a rendering pipeline built around it. The problem with it: you are pretty much forced to work within those confinements, which also locks you into a specific provider.
Isn't it actually better to rely on a lightweight GL/Vulkan wrapper plus a collection of flexible libs with minimal interdependencies? And if you need non-procedural assets, you can use any general 3D editor with scripting capabilities (Blender etc.).
Funny thread.
Just to annoy you we released a C++ Vulkan demo on Evoke 2025 ;)
https://www.pouet.net/prod.php?which=104778