
True rasterdemo ("Raster Interrupts"-like) working on GeForce/Radeon! -- Comments?

category: general [glöplog]
There are also mid-screen input reads for same-frame response -- e.g. a mid-screen button read for bottom-of-screen pinball flippers, for fast snappy response and all the original input latencies.
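Conceptually, that looks something like the sketch below. All helper names here are hypothetical stand-ins (stubbed so it compiles); a real implementation would read the raster from the GPU or the emulated chipset instead:

#include <stdbool.h>
#include <stdio.h>

#define VISIBLE_LINES 480
#define FLIPPER_TOP   400    /* flippers live in the bottom slice */

/* Hypothetical stand-ins, stubbed so the sketch runs standalone. */
static int  raster = 0;
static int  get_raster_scanline(void) { return raster = (raster + 13) % 525; }
static bool poll_buttons(void) { return false; }
static void render_slice(int top, int bot, bool flipper)
{
    printf("slice %d-%d flipper=%d\n", top, bot, flipper);
}

int main(void)
{
    /* Race the beam: draw the top of the screen first... */
    render_slice(0, FLIPPER_TOP, false);

    /* ...then re-poll input just before the raster reaches the
     * flipper region, so a press still shows up in THIS refresh. */
    while (get_raster_scanline() < FLIPPER_TOP - 16) { /* busy-wait */ }
    render_slice(FLIPPER_TOP, VISIBLE_LINES, poll_buttons());
    return 0;
}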
I’m probably being ignorant, but are our reflexes faster than 1/50th of a second?
added on the 2018-07-18 18:39:36 by rloaderro
Scratch that... I think there are bobsled people who compete in the ±1/100 league
added on the 2018-07-18 18:43:26 by rloaderro
If you can anticipate the motion, then definitely. For a good drummer or comping rhythm player, inadvertently missing a beat by a 50th of a second is a mistake.
added on the 2018-07-18 20:23:38 by yzi
Elite sprinters react in just a little over 100 ms (under 100 ms is considered impossible and counted as a false start). But it's not about reflexes here (which is about reacting to an input), just noticing whether you got a response in 50 ms or 100 ms (which is about measuring reaction to your output), which is entirely possible.
added on the 2018-07-18 20:43:30 by Sesse
Yeah, you can definitely tell if your drum sounds suddenly come 20 ms later. So yeah... I can kind of understand it. It might help in using ProTracker or Deluxe Paint by making the keyboard and mouse feel better. But for watching demos it shouldn't provide any improvement.
added on the 2018-07-18 20:51:46 by yzi
>"Elite sprinters react in just a little over 100 ms (under 100 ms is considered impossible and counted as a false start). But it's not about reflexes here (which is about reacting to an input), just noticing whether you got a response in 50 ms or 100 ms (which is about measuring reaction to your output), which is entirely possible."

There's the race-to-finish effect. The best eSports players in the tightest leagues are dependent on milliseconds, much like 100-meter sprinters may cross the finish line only milliseconds apart. Y'know, the "see-react-shoot" moment after coming around a corner is kind of like the sprint to the finish. In this situation, milliseconds can matter.

In a different situation, consider lag offsets in reaction expectations. For example, an archery game with a moving target: timing the vertical arrow shot as the target moves horizontally. If the archery target is moving at 1000 pixels/second on your display (1 ms = 1 pixel), timing your shot 10 ms too slow or 10 ms too fast means the bullseye is 10 pixels to the left or to the right by the time the arrow hits the target. So when you move to a different system with a different lag (e.g. an emulator with 16 ms more lag than another emulator), it interferes with your pre-trained archery aim.
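To put numbers on that archery example, here is a trivial back-of-envelope sketch using the figures above:

#include <stdio.h>

/* Aim error from an unexpected lag difference: a target moving at
 * 1000 px/s means every 1 ms of mistimed lag shifts the bullseye
 * by 1 pixel at the moment of impact. */
int main(void)
{
    double speed_px_per_sec = 1000.0;  /* horizontal target speed */
    double timing_error_ms  = 10.0;    /* shot fired 10 ms early/late */
    double miss_px = speed_px_per_sec * timing_error_ms / 1000.0;
    printf("Bullseye offset at impact: %.0f pixels\n", miss_px);  /* 10 */
    return 0;
}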

Etc.

Using beamracing preserves the original input-lag mechanics 100%, which means you get the same lagfeel as the original arcade/console/etc. machine. For some, that can be important.

Either way, everyone has different reasons for reducing emulator lag.
I have used "the UFO" when comparing my gaming screen to a newer one. I have posted my skepticism vis-à-vis blur-reduction techniques, and 2D as a measurement, on your forum as HenrikErlandsson.

And from this I conclude it's incorrect to use your test for the new techniques, and I would like you to focus hard (for flat screens, that is) on extreme-brightness 200/240 Hz screens offering black frame insertion. This is the only thing that will give flatscreens a chance to outperform CRTs for real-time graphics. (As in, no, not "RTG", although it could be, some day. ;))

~

I have shamelessly taken the opportunity to air my pet peeve (well, anyone's pet peeve, surely) with non-CRT displays off the back of you just posting a question. Sorry about that.

In answer to your question: raster is nothing to chase when you have the luxury of a graphics card taking care of much more demanding chores. If you want it oldschool, just pick your favorite oldschool platform and a lovely CRT, and enjoy coding without blur :)
added on the 2018-07-23 01:06:33 by Photon
The correct UFO test needs to be used for the right application -- e.g. Ghosting test versus Photo/Crosstalk test. There are 25 different UFO tests now, including new ones invented this year. The Right Test For The Right Job...

Anyhow, if you've experimented with blur reduction, you've probably noticed a phenomenon called strobe crosstalk (trailing afterimages) that CRTs do not have.

For example, strobe crosstalk (www.testufo.com/crosstalk), the double images produced by strobe backlights, is a big problem, and it varies a lot between the 25 displays I've tested -- e.g. www.blurbusters.com/crosstalk .... So there are different UFO tests optimized for very different aspects of displays, plus the difference between a stationary and a pursuit camera.

Also, the higher the Hz, the less time LCD GtG has to complete in the dark phase between strobe flashes, so strobe crosstalk unfortunately tends to get worse at higher Hz. The easiest trick to eliminate strobe crosstalk (double images) is a faster scanout followed by a VBI that's much longer than the LCD GtG time. That's why ~144 Hz with a 1/240 sec scanout tends to have less strobe crosstalk than native 240 Hz. Sometimes this is done via "Large Vertical Total" tweaking.
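A rough sketch of that "Large Vertical Total" arithmetic (the numbers are illustrative assumptions for a 1080-line panel, not from any real EDID):

#include <stdio.h>

/* Large Vertical Total: keep the refresh rate, but scan the visible
 * lines out faster and pad the rest of the refresh with blanking,
 * giving LCD GtG more dark time to finish between strobe flashes. */
int main(void)
{
    double refresh_hz  = 144.0;
    int    active      = 1080;
    double scanout_sec = 1.0 / 240.0;          /* 240 Hz-speed scanout */
    double line_rate   = active / scanout_sec; /* 259200 lines/sec    */
    int    vtotal      = (int)(line_rate / refresh_hz + 0.5);
    printf("Vertical total: %d lines (%d active + %d blanking, ~%.1f ms VBI)\n",
           vtotal, active, vtotal - active,
           1000.0 * (vtotal - active) / line_rate);
    /* -> 1800 total: 720 lines of blanking, ~2.8 ms for GtG to settle */
    return 0;
}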

Moreover, CRTs reign supreme for colors and black levels, but on some of the better flat panels with the lowest strobe crosstalk, adjusting the strobe length (e.g. "ULMB Pulse Width" or BenQ's equivalent) can make blur reduction much sharper than at default settings, at the cost of greatly reduced brightness.

Several 240 Hz monitors can now do >300 nits in strobed mode, and the excess brightness helps a lot because the display darkens so much in strobe mode -- especially with the variable-persistence strobe backlights ("pulse width" adjustment on BenQ/Zowie, or NVIDIA ULMB), since doubling motion clarity halves brightness. At ULMB pulse widths under 50%, I can now read the TestUFO Panning Map Readability Test at 3000 pixels/sec (www.testufo.com/photo -> Toronto Map -> 3000px/sec) just like on a CRT, which default ULMB cannot achieve. Still, strobe crosstalk is somewhat bad unless you raise the black level by about 5% to allow overdrive to function better (it turns blacks into super dark greys, but greatly reduces strobe crosstalk). That's a contrast-ratio tradeoff, though.
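The persistence arithmetic behind that (a sketch; it assumes the usual approximation that perceived motion blur in pixels is eye-tracking speed times the time the image is lit):

#include <stdio.h>

/* Strobe persistence: blur (px) ~= eye-tracking speed * pulse width.
 * Halving the pulse width halves the blur -- and the brightness --
 * which is why >300 nit strobed panels matter. Pulse widths below
 * are illustrative, not any specific monitor's settings. */
int main(void)
{
    double speed = 3000.0;                 /* px/s, TestUFO map speed */
    double pulse_ms[] = { 2.0, 1.0, 0.5 };
    for (int i = 0; i < 3; i++)
        printf("%.1f ms pulse -> ~%.1f px motion blur\n",
               pulse_ms[i], speed * pulse_ms[i] / 1000.0);
    return 0;
}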

If you wanna stick to CRT, one of the better CRTs to buy is the Sony FW900 series - but I've also liked the Nokia 445PRO. We talk about those tubes a bit in the Forums too. Great tubes!
I read so much and watched several videos, but I'm still unsure what the takeaway should be for me here... I usually keep vsync off, since at 144 Hz and high fps it is rare for me to perceive tearlines. It's not usually noticeable for me these days, definitely not like it was with <60 fps on 60 Hz.

In the rare cases where I do notice them and it bothers me, I just turn on G-SYNC and be done with it. However, my display has 4 ms of lag normally, and G-SYNC brings that up to 8 ms, so when I saw those claims in the very beginning about being able to cleverly "code" the tearing away, I was intrigued. I'm still not sure how this knowledge benefits me and what I should take away from all this... does this mean that manufacturers should have an easier time combating tearlines in the future, and I should just look forward to some kind of firmware that implements your method, or is there something I can do right now?

If I could use this to cut my latency in half that would be great, but I'm not sure if that is the case here. Either way I'm glad to see such passion here, not sure I've ever seen someone so proud of a project.
added on the 2020-09-28 07:42:17 by joe7dust
The takeaway is simple: time spent on absolutely pointless stuff is pointless.
added on the 2020-09-29 19:24:55 by EvilOne
Actually... Some of the best discoveries / inventions come from spending time on random pointless stuff.
There was a dude (can't recall the name) who did some research on what leads to Nobel-prize-worthy work, and the reason most Nobel prize winners are young when they get one, and don't get a second one, is that it's the time in their life when they do the most random pointless stuff. And once you do something that gets notoriety, you tend to specialize in that field, and it stops being pointless, and it stops being random.
added on the 2020-10-02 19:10:31 by BarZoule
Correct -- my Tearline Jedi has already inspired practical applications:

- Software emulators with identical lag to an FPGA simulation (emuraster=realraster sync): WinUAE has a beamraced lagless vsync mode inspired by my Tearline Jedi research.

- RTSS frame rate capping software now includes Scanline Sync, a low-latency tearingless VSYNC OFF mode that works by steering the tearline between refresh cycles (see the sketch below). Guru3D's Unwinder collaborated with me on this.
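A minimal sketch of that scanline-sync idea. The helpers here are hypothetical stand-ins (stubbed so it runs standalone); on Windows, real code would read the raster via something like D3DKMTGetScanLine and present with sync interval 0:

#include <stdio.h>

/* Scanline sync (low-latency tearingless VSYNC OFF): with vsync off,
 * the tearline appears wherever the present lands, so busy-wait until
 * the raster passes a chosen line, then present -- steering the tear
 * into the blanking interval, off-screen. */

#define VISIBLE_LINES 1080
#define TOTAL_LINES   1125   /* vertical total incl. blanking (assumed) */

static int  raster = 0;      /* dummy raster so the sketch runs alone  */
static int  get_raster_scanline(void) { return raster = (raster + 7) % TOTAL_LINES; }
static void present_frame(void) { puts("presented inside VBI"); }

int main(void)
{
    int sync_line = VISIBLE_LINES + 10;  /* just inside vertical blanking */
    for (int frame = 0; frame < 3; frame++) {
        while (get_raster_scanline() < sync_line) { /* busy-wait */ }
        present_frame();                 /* tearline lands off-screen */
        while (get_raster_scanline() >= sync_line) { /* wait for wrap */ }
    }
    return 0;
}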

Meanwhile, I'm still looking for co-credits, so I can finally open-source the whole shebang.
Update: one module has been released as Apache 2.0 -- not the demo itself, but it's available for anyone here to use.

blurbusters/RefreshRateCalculator on GitHub

We have released a cross-platform, mathematically dejittering VSYNC listener that produces dejittered timestamps accurate enough for estimating a raster scanline number (to within an ~1% error margin relative to screen height, on AMD/NVIDIA GPUs).

It's used by both VSYNCtester and TestUFO.

It is suitable for creating cross-platform scanline-sync applications -- like Windows' RTSS Scanline Sync and Special K's Latent Sync, but on other platforms -- and is being considered by emulator authors for cloning WinUAE's beamraced "lagless vsync" in a more cross-platform manner.
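A rough sketch of how such dejittered VSYNC timestamps turn into a raster estimate (function and variable names here are assumptions for illustration, not RefreshRateCalculator's actual API):

#include <stdio.h>

/* Estimate the current raster line purely from time: take the phase
 * within the refresh since the last (dejittered) vsync timestamp and
 * scale by the vertical total. Numbers below are assumed examples. */
static double estimate_scanline(double now_sec, double last_vsync_sec,
                                double refresh_period_sec, int vertical_total)
{
    double phase = (now_sec - last_vsync_sec) / refresh_period_sec;
    phase -= (int)phase;                  /* wrap into [0,1) */
    return phase * vertical_total;
}

int main(void)
{
    /* 60 Hz, 1125-line vertical total, sampled 3 ms after last vsync */
    double line = estimate_scanline(10.003, 10.000, 1.0 / 60.0, 1125);
    printf("Estimated raster line: ~%.0f\n", line);   /* ~202 */
    return 0;
}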
