Maximum file size for demo (esp. @Assembly)
category: general [glöplog]
I'm so going to rule the Assembly 2018 demo compo with my precalculated Mandelbrot zoomer!!! And they'll think it is in real time HA HA HA HA HA
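For illustration only (a hypothetical sketch, not anyone's actual compo entry), the "precalculated zoomer" trick joked about above boils down to rendering escape-time frames offline and merely playing them back at the party:

```python
# Hypothetical sketch of a "precalculated Mandelbrot zoomer": render each
# zoom frame offline with the classic escape-time algorithm; the "demo"
# then just plays the stored frames back and pretends it's realtime.

def mandelbrot_escape(cx, cy, max_iter=64):
    """Escape-time iteration count for the complex point (cx, cy)."""
    zx = zy = 0.0
    for i in range(max_iter):
        zx, zy = zx * zx - zy * zy + cx, 2 * zx * zy + cy
        if zx * zx + zy * zy > 4.0:
            return i
    return max_iter  # treated as "inside the set"

def precalc_zoom(center, start_scale, frames, size=64, zoom=0.9):
    """Precalculate a zoom sequence as a list of iteration-count grids."""
    cx0, cy0 = center
    out = []
    scale = start_scale
    for _ in range(frames):
        frame = [[mandelbrot_escape(cx0 + (x / size - 0.5) * scale,
                                    cy0 + (y / size - 0.5) * scale)
                  for x in range(size)] for y in range(size)]
        out.append(frame)
        scale *= zoom  # zoom in a little for the next frame
    return out

# Tiny demo run: four 32x32 frames around an assumed zoom target.
frames = precalc_zoom((-0.745, 0.113), 3.0, frames=4, size=32)
```

The punchline, of course, is that the playback loop is trivial while the dataset grows linearly with frame count, which is exactly the kind of size-vs-realtime trade the rest of this thread argues about.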
That filesize discussion, again. I'm on the "consumer" side of things, but: I prefer 256MB or more of a good demo to 16MB of a bad one. And we don't party like it's 1699 anymore...
The nicest physically based renderer in the world sucks if the textures are only 256*256 pixels in size. Throw in some high-quality albedo textures, normal maps, light probes... and 256MB is used up before you even have a soundtrack in.
So you want that shiny new stuff, but don't want to eat the filesize? Meh... time for the 640*480 compo.
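To put rough numbers on that (assumed figures: uncompressed RGBA8, 2048x2048 maps, a four-map PBR material set), here's how fast texture data eats a 256MB budget:

```python
# Back-of-envelope texture memory for one PBR material set, assuming
# uncompressed RGBA8 maps at 2048x2048: albedo, normal, rough/metal, AO.
# These numbers are illustrative, not from any actual production.

def texture_mb(width, height, channels=4, bytes_per_channel=1):
    """Size of one uncompressed texture in MB."""
    return width * height * channels * bytes_per_channel / (1024 * 1024)

per_map = texture_mb(2048, 2048)            # 16.0 MB per map
maps_per_material = 4                       # albedo, normal, rough/metal, AO
per_material = per_map * maps_per_material  # 64.0 MB per material set
materials_in_256mb = int(256 // per_material)  # only 4 sets fit in 256 MB
```

Block compression (BCn etc.) cuts this by 4-8x in practice, but the point stands: a handful of high-quality material sets is already most of the budget.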
Psonice: Obviously not, but I wouldn't consider machine-learned fluid dynamics very cool if it takes 600 MB and it's possible (but technically challenging) to get the same result in 100 KB. Would a 600 MB machine-learned starfield or scroller be impressive? I would consider it an inefficient novelty, and not a reason to abolish file limits or hide filesizes. Use your machine learning for something that couldn't be done previously, and I'll gladly vote for it despite the huge dataset.
Quote:
Preacher: I'm not talking about the last few kilobytes, which are rounding errors in modern demo sizes. In my example, I talked about a factor of 2, and Navis asked about 600+MB of data.
"Pure" movie players may be forbidden by compo rules, but there's a whole continuum of faking impressive effects with a ton of data. Are you calculating that fractal zoom in realtime, or simply zooming into precalculated images? Are you doing real fluid dynamics on a million particles, or just interpolating between a ton of keyframes? That's what I'm talking about, and knowing the filesize can help me judge what is real vs. faked (or faked cleverly vs. stupidly).
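The "interpolating between a ton of keyframes" fake can be sketched like this (hypothetical code, not from any actual demo): store coarse precalculated particle snapshots and lerp between them at playback, trading filesize for per-frame cost:

```python
# Hypothetical sketch of the keyframe-interpolation fake: instead of
# simulating particles every frame, store snapshots sampled every
# key_dt seconds and linearly interpolate positions at playback time.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def playback_position(keyframes, key_dt, time):
    """Interpolate one particle's (x, y) between stored keyframes."""
    i = min(int(time / key_dt), len(keyframes) - 2)  # clamp to last segment
    t = time / key_dt - i
    (x0, y0), (x1, y1) = keyframes[i], keyframes[i + 1]
    return lerp(x0, x1, t), lerp(y0, y1, t)

# Two keyframes 1 s apart; halfway through we get the midpoint.
keys = [(0.0, 0.0), (10.0, 4.0)]
pos = playback_position(keys, key_dt=1.0, time=0.5)  # (5.0, 2.0)
```

Playback is a handful of multiplies per particle, but the dataset scales with particle count times keyframe count, which is exactly why the filesize gives the trick away.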
600MB, that's CD-ROM era... I think most demos are hampered more by the code vs. content ratio (and the fact that they're not 90 minutes long) than by any lack of size. Take a peek at popular productions from, let's say, the past 12 years, and you'll notice that, exceptions notwithstanding, most don't really seem to make any effort at all to keep file size in check; in fact, quite the opposite.
If we were to allow 4GB (or whatever it is now), it's not like that's what we'd suddenly start seeing. I have no inclination to make a graph, but I'm willing to suggest that plotting release date or resolution/fidelity against download size would even look a tad conservative.
Are we interpolating datasets? Are we calculating everything from the ground up each frame? This is the kind of shit we've been trying to either fool or impress one another with, and it's one of the fundamental things that made the demoscene a group of people who pioneered practical real-time audiovisual solutions while academia, with all due respect, fell behind for a good while.
If I play a video somewhere in a demo and it makes the experience cooler and it's not blatantly excessive or obvious, why the hell not? Essentially, is that so much different from what we see on oldschool platforms these days, where, let's be fair, a lot of productions (not all, don't shoot me Britelite!) look interesting because smart data authoring and cooking is now possible?
Where to draw the line? I suggest we don't. Executable, set within limits defined by organizers. Done, done and done. Plus be sure to upload a quality Youtube grab.
Quote:
Plus be sure to upload a quality Youtube grab.
Because I'd rather see something cool blasting off my screen than shit holding back so it can run on my unsuitable hardware. I'm an optimization guy at heart so I get it, but still if I ever were to do another demo I'd buy a shiny new graphics card first, like I did a few years back.
I think it's really simple. With more, simpler, and better-known technical restrictions, it's easier for a computer geek to understand why a prod has merit and is a cool hack, even if the geek doesn't understand or like the artsy-fartsy side of the prod. It is totally acceptable not to understand artsy-fartsy stuff. I don't. With the tech stuff removed, what's left? I like shiny and groovy things that move smoothly. Stuff that explodes.
I'm with you - I'm all about flash and thrills - but to each his/her own, right? And about that merit thing: it's just not as feasible a metric as it used to be in, let's say, the late 90s - it's less of a "can you do that effect" kind of deal these days. And rightfully so. Still, if you know what to look for you can pick a great routine right out of the lineup and appreciate it - and if you're being fooled, all the better :D
Beauty is in the eye of the beholder. Having said that, I conclude with: yes, bring software-rendered 640x480 compos back!!
Quote:
I prefer 256MB or more of a good demo to 16MB of a bad one.
Yeah, because people that fail to do decent demos in 16MB will magically manage to do amazing 256MB demos, because that makes everything suddenly so much easier obviously.
Quote:
Throw in some high quality albedo textures, normal maps, light probes... and 256MB are used and you don't even have a soundtrack in.
Because the demoscene is so full of people who can actually produce decent normal maps and high-def models and materials. Because that's obviously much easier than doing beautiful textures for a 16MB demo. Ah, I forgot, just use UE4, problem solved.
So, back in the Amiga days, a single-floppy demo was about the same size as the available memory. A modern mid-range computer has what, 20 GB memory? How do you utilise that with a 256 MB demo? There are enough categories for competing in procedurally generated content already.
Quote:
Yeah, because people that fail to do decent demos in 16MB will magically manage to do amazing 256MB demos, because that makes everything suddenly so much easier obviously.
That's not what he said, and you know it.
And that’s a fucking awful attitude.
(Spike’s that is.)
Absence: There's some truth to that. However, filling up my RAM or HDD with more crap textures (this time in high-def yay) doesn't gain anything.
I'm not per se against lifting the size restrictions - i just don't agree to some of the reasonings for it mentioned in this thread.
Britelite & psonice: you are right. I'm probably overdoing it a bit here. Sorry EvilOne.
If you kept it to "more megs doesn't mean more quality" you are right though.
i think he was more on a "more megs without having the artists to fill it up"-thing there, but, also true :)
Spike: 👍
It’s not just about artists btw. Some of the stuff I’ve done recently just takes ~10 minutes to precalc. Per object. That’s unoptimised, but it’s also for stuff that’s not that complex yet. Maybe we could have 30 minutes precalc time instead of 30s if big files are a problem? :D
Quote:
it amuses me to see people who more or less solely work on tiny intros or on other platforms, discussing the file size limit in the pc democompo they are highly unlikely themselves to enter. real-time is itself enough of a (very real, challenging) limitation.
It amuses me how the discussion rises to higher and higher meta-levels of discussing how we discuss things. Here's another one. Add more, please! ;)
That said, I'm exactly the kind of person who amuses smash (on this particular occasion), so I will just rephrase what others have said:
For every piece of information you forcefully show on the compo slide (be it size, requirements, compatibility or whatever) you implicitly add an optimization parameter which people will spend time on. They then spend less time on the demo itself, giving us poorer demos. If the creators think there is something important the audience should know about, they can write it in the free-form part of the compo slide.
Quote:
Often after a compo I’ve learned one entry is say 30KB, and in a compo where size is the focus that’s meaningful.
Yes, it means they ran out of time, ideas or motivation halfway through.
Quote:
Yes, it means they ran out of time, ideas or motivation halfway through.
*high five*
That's usually what it is.
There is that ;) Or they optimised too early and the compressor worked better than expected (I've been there, cut content to try and hit 4k, ended up with a 2.3k...) But sometimes it does seem to be finished (as these things go at least).
There's also the case of the failed 4Ks that end up in the 8K compo. Not so sure there, should we reward them for only being 4097B or punish them for being 1B too big? :D
Well it's simple. 4096 is the limit, so 4097 is too much. I also don't believe tales of people pre-optimizing and being "surprised" by a somewhat deterministic process by margins large enough to fit meaningful content.
And rules are rules. 4097 bytes is too much. You lose, goodbye :)
Plek: This abomination is 2.3kb. I removed the nice lighting hoping to get it under 4kb :D It does happen, probably because of deadline panic and lack of experience.
Quote:
Absence: There's some truth to that. However, filling up my RAM or HDD with more crap textures (this time in high-def yay) doesn't gain anything.
Navis' volumetric 3D texture might not be crap. And how can you claim it doesn't gain anything when you haven't seen what he's going to do with it yet? Why not give him the benefit of the doubt?