pouët.net


YCrCb to freaking black

category: general [glöplog]
 

Am I fading to black wrong, or is my math not good? My YCrCb planes are 8-bit and I'm fading like this:

newy  = (oldy * brightness) / 0xff;
newCr = 0x7f + ((oldCr - 0x7f) * brightness >> 1) / 0x7f;
newCb = 0x7f + ((oldCb - 0x7f) * brightness >> 1) / 0x7f;

When I do this it looks like the whole color spectrum is rotated, e.g. blues become pinks and whatnot. But I don't get it: Y=0, Cr=0x7f, Cb=0x7f is black, right? So scaling Cr/Cb towards 0x7f,0x7f should fade to black, right?

I converted everything to floats just to make sure it's not an integer problem; it looks exactly the same. How do you fade to black?
added on the 2010-11-19 06:25:08 by sigflup
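
For reference, a minimal sketch of the fade described above, done in floating point as mentioned in the post. It assumes full-range 8-bit planes with 0x7f as the chroma pivot; the function and variable names are invented for illustration:

#include <stdint.h>

/* Per-pixel fade to black, full-range assumption: Y scales towards 0,
   Cb/Cr scale towards the neutral value 0x7f. brightness is 0.0 .. 1.0.
   Sketch only -- names are hypothetical, not from the original post. */
static void fade_pixel(uint8_t *y, uint8_t *cb, uint8_t *cr, float brightness)
{
    *y  = (uint8_t)(*y * brightness + 0.5f);
    *cb = (uint8_t)(0x7f + (*cb - 0x7f) * brightness + 0.5f);
    *cr = (uint8_t)(0x7f + (*cr - 0x7f) * brightness + 0.5f);
}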
Nah!!! Forget it I figured it out. I was doing it right, just a typo in my code.
added on the 2010-11-19 06:34:48 by sigflup
fine
added on the 2010-11-19 08:57:20 by elkmoose
Another remark: If you're processing video content, you probably should not fade luma towards 0, but 16. Also, the chroma values for neutral gray are 128; you end up with a very small greenish tint when using 127.
added on the 2010-11-19 09:48:02 by KeyJ
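
If the material really is limited-range video, the same fade with the constants from the remark above (black at Y=16, neutral chroma at 128) might look roughly like this; integer math, brightness 0..255, names again invented:

#include <stdint.h>

/* Limited-range ("studio swing") fade sketch: Y fades towards 16 instead
   of 0, chroma fades towards 128. Integer division truncates towards
   zero, which is close enough for a fade. Hypothetical helper. */
static void fade_pixel_limited(uint8_t *y, uint8_t *cb, uint8_t *cr, int brightness)
{
    *y  = (uint8_t)(16  + ((*y  - 16)  * brightness) / 255);
    *cb = (uint8_t)(128 + ((*cb - 128) * brightness) / 255);
    *cr = (uint8_t)(128 + ((*cr - 128) * brightness) / 255);
}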
huh, thanks for the tip. Why 16? Does it have to do with the lineage of "blacker-than-black" syncs in video signals?
added on the 2010-11-19 16:28:14 by sigflup
BB Image
(i couldn't resist)
added on the 2010-11-19 16:48:46 by v3nom
@v3nomsoup Gonna fuck up your syncs!
added on the 2010-11-19 16:59:31 by sigflup
sigflup: Exactly. In most cases, the limits for 8-bit luma are 16 (black) and 235 (white). Chroma is centered around 128, with extremes of 16 and 240.
Exceptions which use full range 0...255 encoding are JPEG and video with full_range_flag=1 (rare!). And there's always xvYCC, which is a completely different story :)
added on the 2010-11-22 09:42:24 by KeyJ
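
To make those numbers concrete, a rough sketch of squeezing a full-range (0..255) sample into the limited range quoted above (luma 16..235, chroma 16..240 around 128), i.e. scaling by 219/255 and 224/255; the helper names are invented:

#include <stdint.h>

/* Full-range to limited-range conversion, per sample. Sketch only.
   Luma:   0..255 -> 16..235  (scale by 219/255)
   Chroma: 0..255 -> 16..240  (scale by 224/255, centered on 128) */
static uint8_t full_to_limited_luma(uint8_t y)
{
    return (uint8_t)(16 + (y * 219 + 127) / 255);       /* rounded */
}

static uint8_t full_to_limited_chroma(uint8_t c)
{
    int d = (c - 128) * 224;                             /* signed offset */
    return (uint8_t)(128 + (d + (d >= 0 ? 127 : -127)) / 255);
}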
A lot of material on capped uses full range, as do other mp4 demo captures.

Not quite all that rare.
added on the 2010-11-22 10:32:39 by micksam7
Actually, grey is the new black.

fade to grey
added on the 2010-11-22 10:36:37 by cg_
16-235 is a disease. Use 0-255. Welcome to 1995.
added on the 2010-11-22 11:45:56 by dr_evil
The whole reason for 16-235 was that the sync pulses in the multiplexed video signal stand out. So feel free not to use full range if you're still working with analog composite video. For everything else, get over it already. And no, connecting an old computer to a TV doesn't count. :)


added on the 2010-11-22 11:59:37 by kb_
General rant: What sucks most about video is how much legacy crap certain people were able to carry over even into the digital age. 16-235? Interlacing? 29.97 Hz? WHAT THE FUCK? All of this made kind of sense in the past (except the 29.97 Hz NTSC signal, which was a bad cover-up for blatant engineering mistakes), but makes absolutely NONE as soon as displays don't shoot electron beams or video signals get discrete, digital, and/or compressed. So WHY do we still have to cope with that crap? And worst of all, why are there still people _defending_ all that obsolete shit? Oh wait... same mechanism as religions. Or conservative people. Ok, I get it. Doesn't make it the slightest bit better tho.
added on the 2010-11-22 12:04:42 by kb_
@kb_ What you kiddin' me? ntsc/pal iz 4 l1f3!!!!!!!!!!1

I mean it's a little interesting, but I agree with you that it's antiquated. What drives these standards? decoder chips? capture chips? software alone? I think it's all a little silly too.
added on the 2010-11-22 18:05:04 by sigflup
kb, i always thought that even at high resolutions and rates, interlacing still is believed to give "smoother" motion at half the bandwidth. or why else are there all these HD "i" formats? (not attacking your point here, merely being a noob who'd like to know)
added on the 2010-11-22 18:41:10 by skrebbel
IT'S SO TYPICAL YOU TO ATTACK PEOPLE LIKE THAT, SKREBBEL!!!1 LEAVE KB ALONE!!! *CRY*
added on the 2010-11-22 18:43:13 by kusma
micksam7: What's the rationale behind that? My best guess would be "otherwise the colors look washed out", which translates to "the player doesn't get it right, so we work around it". But I'm not complaining -- as long as you properly set full_range_flag=1, it's at least technically correct :)

kb_: Sync pulses are *not* the reason for the level limitation. Even if the sync signals are transmitted in-band, they use special 4-byte codes (SAV and EAV). The limitation to 16-235 was done to allow for a bit of headroom and footroom in the video signal, so that a small amount of filtering can be done without having to clip the overshoots.

By the way, if you really care about embedded sync, you can't use 0-255 either, because SAV/EAV use these codes. You'd have to limit to 1-254 anyway or risk that certain patterns in the signal might fuck up the sync.

skrebbel: You're right, the interlaced HD modes were kept in to allow for smooth (50 Hz / 60 Hz) motion without requiring significantly more bandwidth or processing power. However, this was a very short-sighted decision, because interlaced material is hard to work with, deinterlacing is generally not possible in perfect quality, and the bandwidth / hardware argument gets more and more moot with every year that passes.

But anyway, bitching about how much better the world could be without all this legacy stuff isn't of any use either, so why not simply try to live with it? :)
added on the 2010-11-22 19:31:28 by KeyJ
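
As a footnote to the SAV/EAV point: if sync really is carried in-band (as in BT.656-style streams, where the code words 0x00 and 0xFF are reserved for the timing reference codes), the sample data has to stay out of those codes, which amounts to a clamp; trivial sketch, name invented:

#include <stdint.h>

/* Keep sample values away from the reserved timing-reference codes
   (0x00 and 0xFF) when sync is embedded in the data stream. */
static inline uint8_t clamp_for_embedded_sync(int v)
{
    if (v < 1)   return 1;
    if (v > 254) return 254;
    return (uint8_t)v;
}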
The biggest problem is that technology advances way faster than mass consumers... except for 3D gamers (if you can call that "mass").
added on the 2010-11-22 22:07:54 by Jcl
The problem about "oh, let's include interlaced modes because they're only half the bandwidth" was twofold:

a) the "half" only applies to uncompressed transmission. The only thing that was ever going to be a problem in that regard was SDI cabling, and that could have been solved differently. And as soon as compression comes into play, the savings melt like let's say the world's economy.

b) The whole freaking point of interlacing was that it was an easy trick to get more out of CRT screens. Due to how they're built they display an interlaced signal very well, and above all, interlacing requires almost no additional electronics on the receiving end. Analog TVs pretty much do everything necessary by themselves when presented with an interlaced signal (to be precise, the non-interlaced signals of e.g. a C64 exploit exactly that, albeit in the opposite direction).

But then, on the other hand, today's display technologies don't just sweep a point across the screen; they present whole images to the viewer. And that alone means that interlacing cannot ever work. At least not without quite an amount of CPU power for a non-sucky deinterlacer. Which makes stuff more expensive. They literally turned one of the main purposes of interlacing (as opposed to other means of saving bandwidth) on its head, as it makes things more complicated for everyone now.

As a sidenote, there was lots of discussion when the HD standards were devised, and most manufacturers of video equipment were very strongly opposed to the idea of interlacing. But in the end "we always did it this way, we do it this way now" prevailed.
added on the 2010-11-23 00:55:16 by kb_
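
To illustrate the "extra processing on the receiving end" point: even the dumbest possible deinterlacer, a bob/line-doubler that just stretches a single field to full height, already costs a full pass over every frame, still flickers and halves the vertical resolution; anything non-sucky (motion-adaptive, etc.) costs considerably more. A sketch assuming a planar, one-byte-per-sample layout, with made-up names:

#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Naive "bob" deinterlace: duplicate each line of one field into the two
   output lines it covers. field holds height/2 lines of width bytes;
   frame receives height lines. top_field selects the line parity. */
static void bob_deinterlace(const uint8_t *field, uint8_t *frame,
                            size_t width, size_t height, int top_field)
{
    for (size_t y = 0; y < height / 2; y++) {
        const uint8_t *src = field + y * width;
        /* the line the field actually carries ... */
        memcpy(frame + (2 * y + (top_field ? 0 : 1)) * width, src, width);
        /* ... and a duplicate standing in for the other field's line */
        memcpy(frame + (2 * y + (top_field ? 1 : 0)) * width, src, width);
    }
}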
Quote:
most manufacturers of video equipment were very strongly opposed to the idea of interlacing

Actually the showdown was between ACATS ("Advisory Committee on Advanced Television Service", mostly comprised of long-time TV manufacturers) and CICATS ("Computer Industry Coalition on Advanced Television Service", comprised of computer industry players - Apple, Compaq, Cray, Dell, HP, Intel, Microsoft, Novell, Oracle, SGI, Tandem).

It originally dates back to 1996 (HDTV standardization efforts started in 1990 - this is all analog HDTV of course; the resolution of choice then was 848x480...). It was also way before LCDs became mainstream.

For a rundown of the whole debate in its full retardedness, check out Alvy Ray Smith's (inventor of the alpha channel among other things; he worked for MS at the time) page on the subject. The most awesome part of it is that 1080i as originally implemented was actually 1440x1035i upsampled to 1080 pixels of height.

Luckily LCDs did become popular before digital HDTV was finalized, which got rid of some of the more idiotic ideas - and at least we did get 1080p24 and 1080p30 (now if only they'd also added 1080p60...).
added on the 2010-11-23 04:44:14 by ryg
