Emulators will become more important in the future...
category: general [glöplog]
How could the TVs tell which field was the "first field", if there's no "start of program" signal for them to catch and reset their "frame counter" with? And what about the end of a program... what if the whole program is one frame long? ;) Then the old 240p/288p home computers could be seen as transmitting a very long sequence of one-frame-long programs, each starting on the left foot.
By the way, someone or something has written on Wikipedia: "Conversely, the FCC forbade TV stations from broadcasting in this format." http://en.wikipedia.org/wiki/Low-definition_television
Think of interlaced video as transmitting a whole picture in two fields, and each field normally contains exactly half the scanlines.
For example, NTSC has 525 scanlines per frame. Each field transmits 262 and a half scanlines. How do you transmit half a scanline? Well, in the vertical retrace you transmit sync pulses each half a line (so the CRT keeps sync), and then in the middle of a scanline, begin transmitting the next frame. You use the momentum of the vertical retrace to effectively place the top field one scanline apart from the bottom field to produce the total picture. PAL uses a similar scheme with 625 scanlines.
http://web.mit.edu/jhawk/tmp/p/interlacing.pdf
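As a quick sanity check on the numbers above, here's a small Python calculation using the standard figures (the 4.5 MHz / 286 derivation of the NTSC line rate is the usual color-subcarrier relationship):

```python
# NTSC: the color line rate is defined as 4.5 MHz / 286
ntsc_line_rate = 4_500_000 / 286          # ~15734.27 Hz
ntsc_lines_per_field = 525 / 2            # 262.5 -- there's the half scanline
ntsc_field_rate = ntsc_line_rate / ntsc_lines_per_field   # ~59.94 Hz

# PAL: line rate is exactly 15625 Hz, 625 lines per frame
pal_line_rate = 15625.0
pal_lines_per_field = 625 / 2             # 312.5
pal_field_rate = pal_line_rate / pal_lines_per_field      # exactly 50.0 Hz

print(ntsc_line_rate, ntsc_field_rate, pal_field_rate)
```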
On a "studio grade" monitor that has an "underscan" mode, or on a dying TV set where the top of the raster bends down into the visible area, you can see the half-a-scanline at the top of the picture easily.
Video game consoles omit the half-a-scanline because they don't need it, so there's no half-scanline offset to produce the vertical bob of interlaced video, and each field effectively becomes one progressive-scan 240p frame. On an actual tube TV set that also means the raster is always filling only half the lines of the picture tube, which is why 240p output looks dimmer than normal video. That's my understanding of why so many NES games use such bright colors.
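Dropping the half line also nudges the field rate slightly: with 262 whole lines per field (a common console choice) instead of 262.5, each field repeats a little faster. A rough illustration, using the idealized NTSC line rate (real consoles derive everything from their own crystal, so the exact figure varies):

```python
line_rate = 4_500_000 / 286                 # nominal NTSC line rate, ~15734.27 Hz

interlaced_field_rate = line_rate / 262.5   # ~59.94 Hz, broadcast NTSC
progressive_field_rate = line_rate / 262    # ~60.05 Hz, "240p" with whole lines

print(interlaced_field_rate, progressive_field_rate)
```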
Analog capture cards and flat panel TVs look at the sync pulses to determine even vs odd field, which is why they tend to be pickier than an older TV set.
Quote:
How could the TVs tell what was a "first field", if there is no "start of program" signal they'd have to catch to reset their "frame counter".
Afaik they don't have to.
'Fields' are a concept that only exists on the sender side, not the receiver side.
That is, the only difference is that the odd fields are displaced by half a scanline compared to the even fields.
This is done by the analog timing signals (slight variation in vsync pulse length if I'm not mistaken).
So the CRT just displays the signal as it comes in.
Which also explains how many home/personal computers just display 50 Hz non-interlaced: the even and odd fields are exactly the same.
The Amiga is one of the few machines I know of, which can actually properly output an interlaced PAL/NTSC signal.
Generally you use two copperlists for that, one for even and one for odd frames.
CGA has an 'interlaced' mode, but it is broken because the timing of the even/odd fields is not adjusted accordingly.
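That slight variation in the vertical sync timing is essentially *where* vsync falls relative to the line period: in one field it starts at the beginning of a line, in the other it starts half a line in. A toy Python sketch of how a receiver could classify field parity from that (the threshold and the 64 µs PAL line period are just illustrative, not from any real decoder):

```python
LINE_PERIOD = 64e-6   # PAL line period, 64 microseconds

def field_parity(vsync_time, last_hsync_time):
    """Classify a field by where vsync starts within the line period.

    If vsync begins right at a line start, call it the 'first' field;
    if it begins roughly half a line in, call it the 'second' field.
    """
    offset = (vsync_time - last_hsync_time) % LINE_PERIOD
    return "first" if offset < LINE_PERIOD / 4 else "second"

print(field_parity(0.0, 0.0))      # vsync at a line start
print(field_parity(32e-6, 0.0))    # vsync half a line in
```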
Quote:
what if the whole program is one frame long. ;) Then the old 240p/288p home computers could be seen as transmitting a very long sequence of one-frame-long programs, each starting on the left foot.
Every other field must be upper/lower, so you'd need to add filler fields for such a signal to be valid PAL/NTSC.
So does PAL say that a program must consist of at least two fields? Or an even number of fields? I probably mixed the words field and frame above.
Quote:
On an actual tube TV set that also means the raster is always filling only half the picture tube, which is why 240p output looks dimmer than normal video.
I don't think that's quite right. There should be no difference in perceived brightness, because even if the beam covers only half the area, it goes over it twice as often (every field instead of every other). I can't remember any difference in brightness between interlaced and non-interlaced mode on the Amiga, at least. Bright colours were probably used for a different reason (toys often have bright colours).
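The arithmetic behind that argument: each lit line in a 240p raster is redrawn every field, while each line of an interlaced raster is redrawn only every other field, so the total line-draws per second come out the same (idealized, ignoring phosphor decay and beam current details):

```python
field_rate = 60   # fields per second, nominal NTSC

# 240p: ~240 lit lines, each redrawn every field
progressive_light = 240 * field_rate        # line-draws per second

# 480i: ~480 lit lines, each redrawn every other field
interlaced_light = 480 * (field_rate // 2)  # also line-draws per second

print(progressive_light, interlaced_light)
```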
Quote:
So does PAL say that a program must consist of at least two fields?
I don't think the signal has any concept of programs or their durations.
Well, how about a signal that ends after the first field. Displayed or not?
While it might be fun to split words and extrapolate edge cases, the signal is continuous and doesn't start or stop as part of the operation, especially not several times per second. :)
Quote:
Well, how about a signal that ends after the first field. Displayed or not?
If we're talking about a completely analog signal, then yes. Every (part of a) field is just passed through the decoding logic and drives the cathode ray directly.
It depends on your definition of 'displayed' though, I suppose.
If you only output a single field, the display is not likely to be in sync, so it may look garbled.
But once you're in sync, everything is just displayed as-is. It's all continuous, there is no concept of fields, frames, scanlines or anything on the CRT side. It just tries to follow the sync pulses in the incoming signal, and that may or may not result in a (stable) image.
There's a reason why the chip on the computer side is called a CRT controller. It is the chip that generates all the pulses at the proper intervals.
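In spirit, such a controller is just a pair of counters compared against programmed totals. A minimal Python sketch of the idea (the timing constants are made up, only roughly NTSC-field-shaped, and real CRTCs also handle borders, equalizing pulses, etc.):

```python
# Horizontal: pixels per line; sync pulse between the two thresholds (made up)
H_TOTAL, H_SYNC_START, H_SYNC_END = 340, 310, 335
# Vertical: lines per field; sync pulse between the two thresholds (made up)
V_TOTAL, V_SYNC_START, V_SYNC_END = 262, 245, 248

def sync_signals():
    """Yield (hsync, vsync) for every pixel clock of one field."""
    for line in range(V_TOTAL):
        for pixel in range(H_TOTAL):
            hsync = H_SYNC_START <= pixel < H_SYNC_END
            vsync = V_SYNC_START <= line < V_SYNC_END
            yield hsync, vsync

pulses = list(sync_signals())
print(len(pulses))   # one field's worth of pixel clocks
```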
Quote:
Pretty sure I've watched "Total Triple Trouble" 100% on WinUAE, have you configured it properly? (don't just use the stock settings for each device)
I wouldn't be sure what WinUAE settings to use for that example.
i am surprised at the level of know-how of some here :) just one thing:
when it comes to TV, think of the fact that pal/ntsc (color) are a hack on top of regular b/w broadcast, which was invented in the dark ages - and it uses the power grid frequency as a reference sync. this frequency can vary quite a bit (and back then varied even more than it does today), so the whole apparatus _must_ be capable of syncing in a relatively wide range, or it wouldn't ever work at all :)
Quote:
The specs of analog display era were always a bit flexible. Nobody expected exact frequencies but rather a close range.
when it comes to TV, think of the fact that pal/ntsc (color) are a hack on top of regular b/w broadcast, which was invented in the dark ages - and it uses the power grid frequency as a reference sync. this frequency can vary quite a bit (and back then varied even more than it does today), so the whole apparatus _must_ be capable of syncing in a relatively wide range, or it wouldn't ever work at all :)
Yes. Exactly.
I wonder if building an HDMI encoder that uses the console or home computer's pixel clock would solve the problem. Then you wouldn't have the problem of the TV sampling the analogue signal at a different frequency than it is generated. Of course, such a project is far from trivial. :)
afaik the hdmi clock is _very_ strict on timing, ie 50Hz must actually be 50Hz and 50.125Hz (as in C64) will not work :(
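That 50.125 Hz figure checks out from the PAL C64's clocks: the VIC-II draws 312 lines of 63 CPU cycles each, driven by the ~0.985 MHz PAL CPU clock:

```python
cpu_clock = 985_248       # PAL C64 CPU clock in Hz
cycles_per_line = 63
lines_per_frame = 312

refresh = cpu_clock / (cycles_per_line * lines_per_frame)
print(refresh)            # ~50.12 Hz -- not quite an HDMI-friendly 50 Hz
```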
Actually, why not drive the pixel clock from external SDI sync? ;o Okay okay, I'll go away with the crazy ideas now!
Absence: Isn't that what the XRGB mini Framemeister does?
http://junkerhq.net/xrgb/index.php/XRGB-mini_FRAMEMEISTER#Sync_Mode
Quote:
When set to AUTO the output frame rate more closely matches the input frame rate, even if this deviates from the official HDMI timings.
I have not witnessed how it actually works with retro machines like the C64, and modern HDMI-eating displays. Yet.
As far as I know HDMI can carry a wide range of clock rates from 25MHz up. I've played with the xrandr and gtf commands on Linux installs to make custom X modelines and my TV set will take anything up to 60Hz 1920x1080p or down to really low res modes like 320x200 at 70Hz (that's 200 lines, not 200 doubled to 400 as VGA would do). In fact my TV set is far more accepting of weird HDMI modes than the strict subset of modes it will accept over VGA. It wouldn't surprise me in the least if it could take a straight dot clock conversion from a C64 at 50.125Hz without problems though it might have to transmit each pixel twice to produce a clock rate above 25MHz.
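To put numbers on that: the PAL C64 pixel clock is 8× the CPU clock, and you can check how many times each pixel would need repeating to clear a 25 MHz minimum TMDS clock (the 8× ratio and the 25 MHz floor are the assumptions here; whether a given TV actually accepts the result is another matter):

```python
cpu_clock = 985_248              # PAL C64 CPU clock, Hz
pixel_clock = 8 * cpu_clock      # ~7.88 MHz VIC-II pixel clock
HDMI_MIN = 25_000_000            # assumed minimum TMDS pixel clock

repeat = 1
while repeat * pixel_clock < HDMI_MIN:
    repeat += 1
print(repeat, repeat * pixel_clock)   # repeats needed, resulting clock
```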
Quote:
Absence: Isn't that what the XRGB mini Framemeister does?
Nice, the result is probably similar to generating HDMI from the pixel clock, and so much less invasive... If TVs accept weird HDMI timing, at least that part of the problem is solved. If they don't, there's still the more insane reclock-the-entire-machine-from-external-sync idea! :D
Could Amiga's genlock support be used for something like that without having to solder the whole thing to pieces?
I was planning to sell my HD Ready TV and get a larger full 1080p HDMI monitor - hope the C64-HDMI interface will be ready by then ;)
Methods exist already to put C64 onto VGA right? There are VGA to HDMI cables available for under £20 (I have one) So is that an easier route?
It doesn't make timing issues magically disappear. Either it maintains the bad timing from the source signal, or it reclocks it and deals with glitches internally. The result could be different from connecting the C64 directly to the TV in either case due to different signal path. It won't be perfect, maybe it's an improvement or maybe not, maybe it's good enough.
A quick web search reveals the result of connecting a C64 to one of the VGA converters currently common on Ebay:
you get what you pay for - don't expect a proper picture from that cheap shit :=)