pouët.net

parties with 4K projection in 2024

category: parties [glöplog]
 
got something brewing and I wonder if any demoparties will have a 4K projector for compos this year...
added on the 2024-01-11 15:04:09 by nosfe
I'm sure many parties will have 4K projections in 2024; it's a popular choice for coders
added on the 2024-01-14 00:10:12 by v3nom
There are many parties with 4k intro competitions.
added on the 2024-01-16 08:07:44 by Adok
Ok, before this thread goes into residue (thanks to die drei lustigen zwei above me):

I can't really promise anything, but, in case my equipment wish list for the parties I'm helping at goes through, there's a tiny chance at Revision and a pretty good chance at Evoke that the projector itself and most of the chain before it will, IN THEORY, be 4K capable. That means that no, both parties will very probably (p=.99) stay at good old 1080p officially, but in case I'll be running a compo, I might be down to some experimentation with an entry or two, if everyone involved is ok with it. Just to see how it looks and if it's worth thinking about it in the future.

(everyone else, optimize your shaders pls, you got at least a year :D)
added on the 2024-01-16 11:39:15 by kb_
iirc that weebl classic went

shaders shaders shaders shaders shaders shaders shaders shaders
shaders shaders shaders shaders BANDWIDTH BANDWIDTH

right?
added on the 2024-01-16 12:04:13 by ferris
kb: I think this is the only reasonable way. :-) Stay with the safe option as baseline, and offer to do experiments if time and equipment and energy and authors' appetite for risk permit.

(I don't stream anymore, otherwise it'd be a fun experiment to see what codec/encoder/resolution/settings/whatever yields the best result across a range of demos, given a set of hardware and bandwidth constraints)
added on the 2024-01-16 16:45:41 by Sesse
Quote:

I can't really promise anything, but, in case my equipment wish list for the parties I'm helping at goes through, there's a tiny chance at Revision and a pretty good chance at Evoke that the projector itself and most of the chain before it will, IN THEORY, be 4K capable. That means that no, both parties will very probably (p=.99) stay at good old 1080p officially, but in case I'll be running a compo, I might be down to some experimentation with an entry or two, if everyone involved is ok with it. Just to see how it looks and if it's worth thinking about it in the future.


thanks, this is the kind of info I'm after! Will ask you closer to those events if some things are ready...
added on the 2024-01-16 23:53:09 by nosfe
Can we have HDR too?
Every time I enjoy _well made_ HDR content, everything else looks washed out and grey afterwards...
Trickier to get right than more pixels though.

Still, serious question. What would be required to get HDR content on the Revision big screen?
added on the 2024-01-17 01:45:26 by jco
Quote:
Can we have HDR too?


Well thanks for the trigger question (in a good way) :D

So the answer is: No. Yes. Perhaps. I don't know. Let's break it down, and take the current Evoke setup as an example, as in a screen that's around 10x6m, and this pretty recent and rad projector. The good news first: It _does_ in principle support Rec. 2100, or what we peasants call "HDR10".

But sadly there are things that might be in the way. First, there's the brightness issue. That projector is already at the top of the brightness game, but is that enough for those sweet HDR highlights? Possibly, BUT: in order to make HDR content literally shine, we'd have to dim down SDR stuff quite a bit, and basically make everything that's not totally exaggerated HDR worse. Or we keep SDR content at max brightness, and suddenly the HDR stuff will be too dim except for the highlights. Of course we could also decrease the big screen size in order to squeeze more brightness out of the image, but seriously, who would want that :). In summary, the HDR experience would be akin to buying a cheap "HDR 400 certified" PC monitor - as we call it in Germany: "witzlos" (pointless).
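
(Back-of-envelope illustration, with made-up but plausible numbers rather than the actual Evoke figures: say the projector puts 30,000 lumens onto that roughly 10x6m matte screen with a gain of about 1. Then

illuminance ≈ 30,000 lm / 60 m² = 500 lux
luminance ≈ 500 lux × gain (≈1) / π ≈ 160 nits

and HDR10 highlights are typically mastered towards 1,000 nits and beyond, so even before the hall lighting comes into it, the peak is roughly an order of magnitude short.)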

Then, possibly even worse, we've got a contrast problem. HDR usually also means deep blacks and lots of shadow detail. There's a teeny tiny problem though. Let me pose a trick question here: What color is this screen?

[image: photo of the big screen]

The trick answer is: This screen is black. This is exactly how that screen looks if we project a 100% black image onto it.

Basically, unless we paint the hall black and cut the power for everyone else during compos, no details in the shadows for us. I mean, that's the main reason we try to get brighter and brighter projectors every now and then (apart from making the screen Actual Size(TM)): We want people to actually be able to _see_ something. But given that HDR shadows are quite a bit darker than SDR ones unless we up the lumens, like, another 10 times, I fear the real world watching experience will be even worse than it already is sometimes.

There's one thing we _could_ already do though: HDR also defines a wider color space with more saturated colors than previously possible, and that part our projector could do. Full Rec.2100 (or 2020) is still kinda unrealistic and really rare, but what most people master their stuff in is DCI-P3, and yeah, according to the data sheet that projector can do 91% of that space, so yes, we could do more colors - IF the demos are actually made for that color space, and that's the big caveat here (see below).

So, you're asking what would be required for a High Dynamic Range Demo Party? Three things, two of which I don't really see in the immediate future:

1. A fucking LED screen instead of projection. :) Instantly solves the brightness and contrast issues. Sadly, that's not the only criterion here: We also need a big one with a small pixel pitch, we need accurate color reproduction, AND it needs to be portable, and either able to be quickly set up / torn down by idiots that don't have any budget, or by professionals that suddenly need that budget.

2. An A/V chain that can actually handle HDR. This rules out a lot of cheaper video mixers, but you know what? We're used to switching sources directly on an HDMI matrix anyway, so worst case we can just fall back to that. (there's also the question of what to do with the video in case we're also streaming, but not here, not now)

3. Now we get to the worst part: Entries that actually use HDR to good effect and are mastered properly. The good news: Turning on HDR (at least for Windows and Mac) is trivial.

The bad news: We're all hobbyists here, and up to now we all (and also most professionals) were like "there's a red, a green, and a blue, and they go from 0 which is black to 255 (or 1.0) which is white", and whatever numbers you put into your colors, they look perhaps ugly but not completely wrong. But for proper HDR content, that's not enough. Suddenly you need to know _which_ red, green, blue exactly we're talking about. Also, which white. Also, what even IS white, and how bright is a "normal" white? And does the display device even support the brightness and colors we send it (see above: Rec. 2020 vs DCI-P3), and how bad does it look if it doesn't? This is a lot of knowledge, and there's a lot to get wrong here.
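
(To make the "which red, green, blue exactly" point concrete, here's a minimal sketch, not anything any party requires: if your demo thinks in good old Rec.709/sRGB primaries but the output expects Rec.2020, the difference is literally a 3x3 matrix applied in linear light. The coefficients below are the usual rounded BT.2087 values.)

[code]
// Sketch: convert linear Rec.709 RGB to linear Rec.2020 RGB.
// Must be applied in linear light, i.e. after undoing the sRGB/709 transfer
// curve and before re-encoding with whatever transfer function the output uses.
struct RGB { float r, g, b; };

RGB rec709_to_rec2020(RGB c)
{
    return {
        0.6274f * c.r + 0.3293f * c.g + 0.0433f * c.b,
        0.0691f * c.r + 0.9195f * c.g + 0.0114f * c.b,
        0.0164f * c.r + 0.0880f * c.g + 0.8956f * c.b,
    };
}
[/code]

(Doing this on gamma-encoded values instead of linear light is one of the classic ways to get it subtly wrong.)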

Also, up until let's say last year, PC monitors generally SUCKED at HDR, and tried to fake it by just turning the backlight to 100% and then competing in the Worst Tone Mapping compo. That's why turning on HDR in Windows makes the normal desktop look like shit. In reality, Windows is sending 100% accurate colors out to the screen, but the screens have to squash thousands of nits into 400 somehow, and take the easy way out: "make everything else worse".

Again the good news is that this is currently changing a lot - if you don't want to put a 55" or bigger TV on your table just to watch HDR the way it is intended, there are a lot of options now for either MiniLED or OLED monitors with proper brightness and colors, and judging from the CES that just ended, the whole topic is finally taking off. Yay. That probably means we should give the scene a bunch of years until HDR screens have become standard AND people have actually bought one, AND there hopefully are good resources and best practices for everyone.

So, er, yeah. That would be required, IMHO :)
added on the 2024-01-17 14:37:25 by kb_
We clearly need to make ICC profiles as mandatory as file_id.diz is. :-)

I'm not too worried about the white points, though; they're mostly the same in all relevant profiles :-) And I can certainly live with P3 colors just getting clipped into sRGB. The tone mapping part is much gnarlier. I am sure Gargaj will love the implications for Pouët screenshots.
added on the 2024-01-17 20:55:05 by Sesse
ICC profiles don't really matter IMO; at least under Windows the HDR color space (scRGB, aka signed linear Rec709 with 1.0=80 nits, or 10bit Rec.2100 using PQ) is well-defined. Sending static HDR metadata (avg/max luminance, color extents) might be nice but you only know those when the prod is finished, so good luck with last minute submissions :D
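
(For illustration, a minimal sketch of what "well-defined" means in practice on the Windows side - assuming a plain DXGI 1.4+ swapchain, with error handling and the surrounding D3D setup omitted:)

[code]
// Sketch: switch an FP16 swapchain to scRGB, i.e. linear Rec.709 with 1.0 = 80 nits.
// Assumes the swapchain was created with DXGI_FORMAT_R16G16B16A16_FLOAT.
#include <dxgi1_4.h>

void enable_scrgb(IDXGISwapChain3* swapChain)
{
    const DXGI_COLOR_SPACE_TYPE cs = DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709; // scRGB
    UINT support = 0;
    if (SUCCEEDED(swapChain->CheckColorSpaceSupport(cs, &support)) &&
        (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
    {
        swapChain->SetColorSpace1(cs);
        // Now writing (2.5, 2.5, 2.5) to the backbuffer means a 200-nit white.
        // For the 10-bit Rec.2100/PQ route you'd use a 10:10:10:2 UNORM format with
        // DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020 instead.
    }
}
[/code]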

And as far as Pouet screenshots go, Gargaj would only have to add support for AVIF or PNG; both can embed the color space as a CICP tuple (for PNG via the cICP chunk in the current spec), and both are supported by all modern browsers; you'll have to trust those browsers' built-in tone mapping though :)
added on the 2024-01-17 23:03:04 by kb_
Having worked with implementing wide-gamut color in browsers, I cannot say I trust them to do this reliably, and would probably recommend doing a static Rec. 709 fallback for non-HDR clients on the server side. :-)

(Also, scRGB is such a sad thing)
added on the 2024-01-17 23:45:38 by Sesse
I don't mind scRGB too much; I don't think it's worse than the alternative, and it kinda allows you to choose how HDR and WCG you want to get by just not doing stuff. My main gripe is that for both output formats you have to query the output SDR level and scale your brightness if you want to match e.g. your tool UI to the rest of your OS, and of course it's not just a field in the DXGI output desc but some completely unrelated and complicated other API. :D
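
(For the curious, that "completely unrelated and complicated other API" - to the best of my knowledge, and trimmed down to the first active output with no error handling - looks roughly like this:)

[code]
// Sketch: ask Windows for the SDR white level, i.e. how bright "SDR white" should
// be rendered while the output is in HDR mode. Link against user32.
#include <windows.h>
#include <vector>

float sdr_white_nits()
{
    UINT32 numPaths = 0, numModes = 0;
    GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &numPaths, &numModes);
    std::vector<DISPLAYCONFIG_PATH_INFO> paths(numPaths);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(numModes);
    QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths.data(),
                       &numModes, modes.data(), nullptr);

    DISPLAYCONFIG_SDR_WHITE_LEVEL level = {};
    level.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
    level.header.size = sizeof(level);
    level.header.adapterId = paths[0].targetInfo.adapterId; // just the first output here
    level.header.id = paths[0].targetInfo.id;
    DisplayConfigGetDeviceInfo(&level.header);

    // SDRWhiteLevel: 1000 == 80 nits, so scale accordingly.
    return level.SDRWhiteLevel / 1000.0f * 80.0f;
}
[/code]

(In scRGB terms, that value divided by 80 is the multiplier you'd apply to your tool UI / SDR content to make it match the rest of the desktop.)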
added on the 2024-01-18 00:04:51 by kb_
On the one hand I love HDR (while being kinda indifferent to 4K resolution), but on the other hand HDR in the real world kinda sucks right now if you don't have one of the maybe 10 available high-end TV models (or maybe a next-gen/high-end HMD), which you need not only to enjoy HDR content but also to master it properly. Even then, due to the market being filled with devices with bad implementations, the user experience will suffer until good devices are the majority. I think that to get projectors bright enough to deliver decent HDR in the context of demoparties we'd need liquid nitrogen cooling and/or a different screen canvas; that being said, maybe we can start slow by having a small HDR compo on an 80" high-end TV (#kuschelstufen)?
added on the 2024-01-19 16:35:34 by LJ
hey kb, thanks for the detailed comment about HDR. Most of what you say aligns with what I assumed: the projector can "kind of" do it, at least spec-wise, and the central problems are that it's a projection (and not an LED wall) and that authoring the content is hard.

Despite the shortcomings and limitations you mentioned, I do see a few potential benefits that could work at a demoparty. One is, as you also mentioned, escaping the color gamut limitations of sRGB. This makes a difference; who would have believed a bright red can be THAT red :D. Of course that's not directly related to HDR, but HDR to my knowledge comes with a wider color profile out of the box (Rec.2100, I take from your comment), and content made for it can be expected to adhere to it.

Then there is the simple fact that we get more bits per channel, so within the possible contrast range, more detail is possible, and we'd see less of that ugly color banding.

But the central problem is that none of that matters if content isn't properly prepared to make good use of it. Even AAA games seem to get this wrong: I've recently played Spiderman 2 on a fancy contemporary OLED TV with all the bells and whistles, and the game mostly looks like it was simply blown up to higher contrast (like setting the TV to bright). As soon as you enter a darker or brighter area, auto-exposure kicks in way too aggressively - the point is to NOT do that... colors were nicer due to the wider gamut. Similar with Cyberpunk.
Same with movies: lots of crap examples and a few gems from what I've consumed so far.

...and then I played Alan Wake 2. That's how to do it :D Auto exposure is barely noticeable, dark stuff is dark, bright stuff is bright, both are detailed. It's consistent, effective, and supports immersion. Tone mapping also works great once the screen is calibrated correctly. Must be Remedy's demoscene background. ;)

I doubt that many sceners have a screen suitable for authoring, and I'm afraid of the development process, expecting weird toolchains, drivers, screens, compatibility issues, the OS and the GPU each messing with the signal, and so on.

Is there a compromise here somewhere? If we've got the fancy beam, make it an option for demogroups, a checkbox to tick in Partymeister and a simple spec sheet? Is there an easy way to at least support non-sRGB color profiles? Can we get more bits per pixel with a simple approach, while ignoring everything else?
added on the 2024-01-20 17:54:52 by jco
@LJ nice idea having a Kuschelstufen-side-compo on a TV. Maybe for GFX and video content, or completely open, just to see if there is interest.
(do not allow music entries claiming to be HDR just because they're 32-bit float wav files though. scnr.)
added on the 2024-01-20 17:58:42 by jco
Quote:
Then there is the simple fact that we get more bits per channel, so within the possible contrast range, more detail is possible, and we'd see less of that ugly color banding.

Perhaps today's SDR demos should simply start dithering the output to fix that banding :-) No 10-bit color output needed, really.
added on the 2024-01-21 12:50:16 by Sesse
@Sesse good point, especially at high resolution. Now that I think about this, I actually noticed it once when I was working on UI things and compared different renderers in a cross-platform situation, observing that large, low-contrast gradients looked way better on macOS. The reason was simply that rendering on macOS applied dithering, iirc due to Core Graphics doing it out of the box...
Still feels a little fake though, and it would be cooler to actually use the tech that's present in most displays nowadays. *shrug*
added on the 2024-01-29 01:44:16 by jco
Dunno, dithering isn't cheating or anything, any more than e.g. antialiasing is. It's just a better way of dealing with quantization than truncation or rounding is (has a more predictable and generally less objectionable effect on the spectrum).

I'm unsure to what degree most displays actually can do real 10-bit output? I don't know what the current status is, but for the longest time, a bunch of LCD panels were 6-bit with internal dithering. I can agree that in a perfect world, demos (at least those that don't go for e.g. lo-fi looks) should adapt to any display characteristics, but my point was a bit that this is the harder task. :-)
added on the 2024-01-29 09:14:47 by Sesse
We've got a kind of eclectic selection of displays at the office, so I've got a bit of experience there... The better (aka more expensive) ones can all do 10 bits per channel nowadays, and it definitely looks better than 8 without dithering on all of them. Good enough in fact that if you stare long enough at those gradients, you'll find all the other artifacts of that respective display (tiling, line flicker etc) first :D

Properly detecting 10bit and adjusting to it is a bit of a hassle tho (especially with GL/Vulkan), so there's absolutely nothing wrong with slapping on some blue noise texture at 1/256 gain and fudging the UVs a bit as the last step of your postproc pipeline, and calling it a day.
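
(Roughly what that boils down to, written out as plain C++ over a single float value instead of the actual shader pass, and with a cheap hash standing in for the blue noise texture - with a real blue-noise tile the structure is the same, it just looks nicer:)

[code]
// Sketch: quantize a [0,1] float channel to 8 bits with ~1/256 of noise added
// first, shifting the noise lookup every frame ("fudging the UVs").
#include <algorithm>
#include <cmath>
#include <cstdint>

static float noise_at(int x, int y)            // white-noise stand-in, returns [0,1)
{
    uint32_t h = (uint32_t)x * 374761393u + (uint32_t)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (float)(h & 0xffffu) / 65536.0f;
}

static uint8_t dither_to_8bit(float value01, int x, int y, int frame)
{
    float n = noise_at(x + frame * 17, y + frame * 31);   // per-frame offset
    float d = value01 + (n - 0.5f) * (1.0f / 256.0f);     // the 1/256 gain
    return (uint8_t)std::clamp(std::lround(d * 255.0f), 0L, 255L);
}
[/code]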
added on the 2024-01-29 14:57:36 by kb_
UVs?
added on the 2024-01-29 16:02:27 by Sesse
Like, randomly move the noise texture around every frame :)
added on the 2024-01-29 16:09:25 by kb_
Ah, UV coordinates of the blue noise texture :-)

TBH I prefer just having the noise stable, because it means that still images in a video remain still (and thus easy for any video codec to encode).
added on the 2024-01-29 16:14:30 by Sesse
