pouët.net


Maximum file size for demo (esp. @Assembly)

category: general [glöplog]
As said I was referring to the majority of demoscene productions that are not in the same league as the top 5% groups. Navis knows his stuff and for sure will make something great, I agree.
added on the 2017-11-14 00:30:10 by arm1n arm1n
Feel free to submit demos of any size to the Outline PC demo compo! We have better things to do than dedicating brain power to determine an arbitrary size limit, or figuring out whether you're faking stuff or actually making decent use of large datasets. Promise!

(And it's your reputation on the line once - inevitably - it comes out you snatched that Fanta and six-year-old surplus party t-shirt from a more deserving competitor.)
added on the 2017-11-14 01:13:03 by Alpha C Alpha C
Let's sum it up... the whole thread is about Navis begging to drop the size limit for his new effect at Assembly.

So I'd recommend following Alpha C's suggestion and just submitting it at Outline - there it definitely won't hit a max size limit, and we could find out whether burning over 200 megs is worth it or not, still in time to adjust the Assembly rules accordingly =)
added on the 2017-11-14 02:09:43 by T$ T$
It's worth emphasizing that it's not just any 'fake effects' that are the problem, but a new generation of fake effects that are only possible when you allow 1GB+ entries.

Although, to be fair, those might be really interesting to see in action (if done right).
added on the 2017-11-14 04:02:33 by tomkh tomkh
I was actually looking for some breathing space. 256 MB is not that much once you add music, DLLs, models, textures etc. My big resources are not 3D textures (I mentioned those as an example, maybe relevant for somebody else) but point clouds that I cannot compress with octrees.

Nevertheless, I managed to reduce the sizes of these things by about 300 MB (!), so even a 256 MB target is very feasible.
added on the 2017-11-14 10:35:36 by Navis Navis
So how did you compress your point clouds given your context or will divulging any of that information compromise the huge edge you've got over most anyway?
added on the 2017-11-14 13:09:15 by superplek superplek
I compressed by quantization + entropy encoding. Octrees would probably give me an extra 50%, but at the expense of complexity/time to unwind the data in memory.
In the end, the two datasets come to under 150 MB, which is OK for what is a combined 60M-point cloud; I can live with that.
As for the rest, I moved away from .objs (no idea why it took me so long), so sizes and loading speed should be a lot better.
added on the 2017-11-14 13:31:18 by Navis Navis
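As a rough illustration of the quantization + entropy-encoding pipeline Navis describes above - not his actual code; the 16-bit precision is a hypothetical choice, and zlib (LZ77 + Huffman) stands in for the entropy stage:

```python
import struct
import zlib

def quantize_points(points, bits=16):
    """Quantize float XYZ triples to fixed-point integers over the bounding box."""
    lo = [min(p[i] for p in points) for i in range(3)]
    hi = [max(p[i] for p in points) for i in range(3)]
    scale = (1 << bits) - 1
    q = []
    for p in points:
        for i in range(3):
            span = (hi[i] - lo[i]) or 1.0   # avoid divide-by-zero on flat axes
            q.append(round((p[i] - lo[i]) / span * scale))
    return lo, hi, q

def compress_points(points):
    """16-bit quantization gives 6 bytes/point before the entropy stage,
    versus 12 bytes/point for raw float32 (far more as ASCII text)."""
    lo, hi, q = quantize_points(points)
    raw = struct.pack("<%dH" % len(q), *q)
    return lo, hi, zlib.compress(raw, 9)
```

Decompression reverses the steps: `zlib.decompress`, unpack, then `x = lo + q / 65535 * (hi - lo)`; the scheme is lossy, as discussed below, with a maximum error of half a quantization step per axis.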
Navis: great that you got your data crunched down. I didn't want to say "did you even try compressing it", because I'm not familiar with point cloud data. Anyway, people took your "the only practical solution is to remove the size limit" claim as the truth, because you're a famous person. And now there'll be a bigger size limit, or no size limit at all in the Assembly compo. I hope it will produce more entries. And at least this thread produced many opinions from different perspectives.
added on the 2017-11-14 16:27:43 by yzi yzi
@yzi, no, people did not take Navis' claim as truth because he is famous. People took it because he has consistently and systematically proved that he knows a few things when it comes to democoding.

It is time for people to start differentiating "fame" and "reputation". They are not the same.

Which is why I would not dare suggest that Smash or Navis or, lol, Blueberry should learn a few things about modern compression algorithms when/if they choose to complain about current compo size limitations. The idea that they had not already thought of compressing their data is pretty much insulting.
added on the 2017-11-14 16:48:40 by introspec introspec
Heh, according to Abyss, they were going to remove the constraint anyway starting from this year. Anyway, as I said, there are some new emerging datatypes such as volumetric content that need a lot of space and are very hard to compress further. Volumetric content is now easier to produce with the advent of 3D cameras and lidars.
added on the 2017-11-14 16:49:59 by Navis Navis
There are people who can chop 8 megs off a memory budget to make a PS2 game run on a GCN, yet those same people won't give that particular damn when it comes to demomaking.

Quote:
I compressed by quantization + entropy encoding. Octrees would probably give me an extra 50%, but at the expense of complexity/time to unwind the data in memory.
In the end, the two datasets come to under 150 MB, which is OK for what is a combined 60M-point cloud; I can live with that.
As for the rest, I moved away from .objs (no idea why it took me so long), so sizes and loading speed should be a lot better.


Quantization however does imply that it's potentially a bit lossy, right? And on the .obj thing, the whole hardcoding thing you had/have going on sets an example and at times reinforces a modus operandi that obviously suited you (and others) but requires a certain mindset to be even remotely fruitful ;)
added on the 2017-11-14 16:56:13 by superplek superplek
introspec: to me it intuitively felt like Navis might not have spent very much effort trying to either crunch the data himself or find someone who can. Let's assume that I'm a random dude who knows how to type words into Google, something like "how to compress point cloud data": I find this paper
http://cg.cs.uni-bonn.de/aigaion2root/attachments/GollaIROS2015_authorsversion.pdf
And this site with ... a library? And what sort of bytes-per-point figures are they talking about?
http://pointclouds.org/documentation/tutorials/compression.php

I might think, maybe it's not entirely impossible. But then again, we might speculate on things like, "time is better spent on such-and-such things instead of data compression, and besides, Amiga disks were so-and-so many kilobytes compared to its RAM, and games are so-and-so big, so therefore ... "
added on the 2017-11-14 17:36:07 by yzi yzi
Regarding .obj: any ASCII format for mesh description is pretty lame (slow/big) but human-readable. Best to replace it with a similar binary format.
The point cloud compression in the PCL library is octree-based and I have used it in the past (and it is very good!), but here the point cloud has some more information that I need to retain (topological), so it's not a straightforward use. And the pointclouds.org libraries, even if I managed to use them with what I have, would add some extra tens of megabytes too.

The compression is lossy, yes, but I make up for it by other means. I'm happy with the balance.
added on the 2017-11-14 18:21:19 by Navis Navis
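A minimal sketch of the kind of "similar binary" replacement for ASCII .obj that Navis mentions - a hypothetical ad-hoc layout, not any standard format: fixed-size little-endian records load with straight reads instead of per-line text parsing.

```python
import struct

def write_mesh_binary(path, vertices, faces):
    # Header: vertex count, face count; then raw float32 XYZ triples
    # and uint32 index triples. 12 bytes per vertex versus ~25+ for
    # a typical "v x y z" ASCII .obj line.
    with open(path, "wb") as f:
        f.write(struct.pack("<II", len(vertices), len(faces)))
        for v in vertices:
            f.write(struct.pack("<3f", *v))
        for face in faces:
            f.write(struct.pack("<3I", *face))

def read_mesh_binary(path):
    with open(path, "rb") as f:
        nv, nf = struct.unpack("<II", f.read(8))
        vertices = [struct.unpack("<3f", f.read(12)) for _ in range(nv)]
        faces = [struct.unpack("<3I", f.read(12)) for _ in range(nf)]
    return vertices, faces
```

Since every record has a known size, a loader can also slurp the whole vertex block in one read and hand it to the GPU as-is, which is where most of the loading-speed win over text parsing comes from.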
On this subject, does anyone know why it is .zip and not .rar? Are there historical reasons for delivering compressed productions as zip?
added on the 2017-11-17 14:06:55 by Navis Navis
rar is a proprietary format & not very practical with lower end systems (as in, a500/st/c64/etc)
added on the 2017-11-17 14:29:18 by havoc havoc
Why not .7z? Better compression ratio and afaik faster as well.
added on the 2017-11-17 14:46:51 by MuffinHop MuffinHop
Well, .zip is not very practical for pretty much any 8-bit system either, due to its memory requirements. Both .rar and .7z are even worse, to the extent of being impossible to deal with.

So, lower-end systems are definitely not a likely reason for the choice. As a guess, the wide availability of (and common familiarity with) .zip tools, including open source options, made .zip the de facto standard in the 1990s, which is probably the reason it was picked up. .rar did not become similarly widespread until the mid-to-late 2000s.
added on the 2017-11-17 15:24:17 by introspec introspec
If I learned anything about the scene, it is that it likes to stick to ancient standards and traditions. That's why in warez land you get absurdities like one (big) install.exe + one .nfo split up into release.rar - release.r<n>, which are then individually zipped again and then zipped or tarred again into a single file, and stuff like that (don't do warez, kids, it's not only illegal but also bad for sanity).
That's of course less important with demoscene releases, as they are usually smaller and not automatically generated and processed by some release script (at least until now ;), but I know that, for example, scene.org wants zip files so it can look into the archive and show a preview on the download page (and afaik there's even some script to repack rar and the like to zip automagically).
added on the 2017-11-17 16:49:09 by wysiwtf wysiwtf
So we had the Assembly compocrew meeting this weekend and we went through things concerning the compo rules. There are still some changes under consideration, but I can tell you for sure that the demo compo will not have a size limit anymore.

We will also do some updates to the maximum showing times of the 4k, 64k and demo compos. They will most probably be 5 min, 10 min and 10 min, but we will get back to this when we finalize the rules. There will also be some memory and size limitation changes in the oldskool compo compared to last year. I think britelite can inform you more about this, as I would end up forgetting or typoing those anyway. Our goal is to finalize and release the rules for next year in February at the latest, so we will get back to this then. If you have any questions or suggestions before that, please contact us! :)
added on the 2017-11-19 17:06:30 by rimina rimina
Quote:
Feel free to submit demos of any size to the Outline PC demo compo! We have better things to do than dedicating brain power to determine an arbitrary size limit, or figuring out whether you're faking stuff or actually making decent use of large datasets. Promise!

(And it's your reputation on the line once - inevitably - it comes out you snatched that Fanta and six-year-old surplus party t-shirt from a more deserving competitor.)


I'm so happy to hear that :D No seriously.. Not hot but HUGE stuff.
added on the 2017-11-20 16:24:26 by superplek superplek
Quote:
rar is a proprietary format & not very practical with lower end systems (as in, a500/st/c64/etc)


I'm not even sure there are any (un)zip tools for the C64, and on Amiga you should use LHA.
added on the 2017-11-20 21:37:55 by ___ ___
