pouët.net


How do i do this ? :)

category: code [glöplog]
first tho*

it's still the old concept of a backbuffer, which should still be used nowadays to avoid tearing (while vsync is history unless you still use some CRT)

so it's there anyway; blending the backbuffer by some amount less than 1 shouldn't be a problem, eh?
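
something like this, old-school GL style (rough sketch: the fade factor is arbitrary, identity matrices are assumed, and in practice you'd render into a texture so the previous frame is actually still around):

#include <GL/gl.h>

/* classic feedback blur: don't clear the colour buffer each frame, just fade
   what's already there by drawing a translucent black quad over it, then
   render the new frame on top */
static void fade_previous_frame(float fade)   /* e.g. fade = 0.15f */
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(0.0f, 0.0f, 0.0f, fade);
    glBegin(GL_QUADS);
    glVertex2f(-1.0f, -1.0f);
    glVertex2f( 1.0f, -1.0f);
    glVertex2f( 1.0f,  1.0f);
    glVertex2f(-1.0f,  1.0f);
    glEnd();
    glDisable(GL_BLEND);
}
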
@ fragment, gargaj, kusma:

You folks are a bit uncreative. "In hardware" does not necessarily only cover the GPU. The screen itself, for example, is hardware too. In fact, a while ago TFT screens had an effect similar to this built in (called "ghosting"). The problem was that it was not controllable, and thus the effect would happen regardless of whether one wanted it in a given situation or not. I suspect a screen could rather cheaply emulate spatial and temporal blur... but it would need to be controllable by the GPU. And that IMO is the problem: what are the odds of screen manufacturers AND gpu manufacturers agreeing on a standard to do such a thing?
added on the 2013-01-20 04:02:53 by myka myka
so in other words, "uncreative" or "reasonable"? :)
added on the 2013-01-20 12:20:51 by Gargaj Gargaj
So, GPU and screen manufacturers should get together and spend a ton of time and money to get this thing working, all in the name of making their hardware act like a really shitty old monitor purely to recreate one basic, trivially implemented demo effect?

I'd like to request a rotating wireframe cube feature in new monitors too, cause rotating cubes are so damn hard.
added on the 2013-01-20 14:21:19 by psonice psonice
@gargaj:
Replace "reasonable" with "realistic" and i'll agree :)

@psonice:
"spend a ton of time and money" <-- Wrong. The mechanics and in fact controllers to do this are already part of any current TFT. "Overdriving" for example uses the same tools - just to achieve the opposite. Digital cables also have the ability to send custom data alongside of video and audio. All that is missing is a protocol and a standard.

"one basic, trivially implemented" <- Nope. Implemented maybe. Processed not. If you have followed the evolution of GPUs a bit, you should know just how many shader cycles get burned by motion blur, FSAA and the like. This stuff is expensive, and the ressources spent on it, could be spent more efficiently for other things.

"Demo effect" <-- Maybe you just forgot the entire game industry.

What you are really arguing is that instead of doing something efficiently, one could just as well throw a lot of cycles and processing power at it. That's a popular and - granted - "accepted" attitude. It's not mine. Yes, it's an "unrealistic" proposal, not just because two sectors of the industry would have to cooperate, but also because neither of them would benefit from it - the only ones benefiting would be users and programmers.

Now, since apparently no one can bring a good technical argument against the idea, i guess i have to play both sides: a big downside would be that if a display is responsible for some effect, the "final" picture cannot easily be recorded anymore, unless you loop the output through a video-in and then record that. I.e. simply doing a screen capture and posting something on youtube wouldn't work properly... plus even if one were to record via a loopback cable and then play it back elsewhere, effects may be applied multiple times. Those are big downsides, especially in the days of youtube and the like.
added on the 2013-01-20 15:18:02 by myka myka
myka: the biggest problem is still a given - framerate dependency.
added on the 2013-01-20 15:30:09 by Gargaj Gargaj
myka, also: Nobody is using this "effect" anymore. It's ugly as fuck. It has nothing to do with actual motion blur. It screws up your colors and replaces them with a banding artefact show. People only ever used it because it was cheap to do and nothing better was around.

Oh, and a reasonable game/demo rendering pipeline is comprised of so many postprocessing steps that it's trivial to sneak some feedback into some blit operation that you do anyway and make it free, basically (instead of the tiny fraction of a millisecond it would take standalone). Guess why still nobody is doing it. Oh yeah, because it's ugly as fuck.
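
(If you want to see just how trivial: folded into whatever composite pass you already run, it's one lerp per pixel. Rough CPU-side sketch; buffer names and the factor are made up:)

/* feedback folded into an existing composite/blit step:
   prev_out = lerp(prev_out, cur, k) per channel value.
   k is how much of the new frame comes through; (1 - k) is the trail strength */
static void composite_with_feedback(const float *cur, float *prev_out,
                                    int count, float k)
{
    for (int i = 0; i < count; ++i)
        prev_out[i] += k * (cur[i] - prev_out[i]);
}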

Seriously, get over your avatar pic and read up a bit on computer graphics in the last 20 years, will you?
added on the 2013-01-20 17:07:23 by kb_ kb_
Gargaj, I believe it's fairly straightforward to compensate for framerate differences.
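
(e.g. derive the per-frame feedback factor from the frame time so the trail's half-life stays the same at 30 or 120 fps; a sketch, names and constants are made up:)

#include <math.h>

/* fraction of the previous frame that should survive after dt seconds,
   if the trail is meant to fade to 50% every half_life seconds */
static float feedback_keep(float dt, float half_life)
{
    return powf(0.5f, dt / half_life);
}
/* e.g. at 60 fps with half_life = 0.1 s: keep = 0.5^(0.0167/0.1) ~ 0.89 */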

Except for what kb_ said, of course :-)
added on the 2013-01-20 17:18:22 by revival revival
revival: the point isn't the blur factor, but the blur continuity.

imagine a square moving from left to right at medium speed: on a fast computer, the feedback motion trail would be fairly continuous because the square would move less than one pixel per frame, whereas on anything slower than that, there would be a visible "onion skin" of the previous frame fading out instead of actual continuous motion blur.

don't get me wrong, it can be used for great effect (Halla did this I think), but it's not motion blur.
added on the 2013-01-20 17:43:54 by Gargaj Gargaj
Also it gets kinda messy if your frame rate skips around a bit.

And when you use it for great effect (I mean, The Hobbit was doing so :D) you normally add some displacement or blur or color shifts to it. But that kinda kills all the above-mentioned hardware shortcuts. Those stay with the ugly-as-fuck case. :)
added on the 2013-01-20 18:03:14 by kb_ kb_
Framerate compensation would be straightforward IF the framerate were mostly constant on a given machine (thanks to the GPU being able to control area, effect and strength).

Where i imagine it would get messy (and thus framerate dependency would become a problem) is when the framerate changes frequently. The problem is that when you send a frame to the screen, along with data about spatial/temporal blur, you need to know - ahead of time - what the framerate will be. But in practice this is hard to know, unless - well - the framerate is mostly constant. I really think this approach isn't feasible for situations where the framerate is variable on the same machine and scene.
added on the 2013-01-20 18:29:36 by myka myka
P.S.: The comments about "it being ugly as fuck" are made from ignorance... or perhaps from a lack of considering how such an approach would work nowadays. This wouldn't (at least not necessarily) be your old TV blur, where the amount of blur was based on the real distance between LEDs. It technically has nothing in common with that (though it could emulate it). What it technically would be is manipulating the behaviour of the LEDs on the screen beyond the picture that is to be shown... for example, spatial blur would not be created by "light leakage" but by making a triplet of LEDs (RGB) give some extra charge to neighbouring LED triplets. It's about screen LED control, not about "light leakage".
added on the 2013-01-20 19:09:51 by myka myka
Gargaj, yeah I got that after thinking about it for a bit.
added on the 2013-01-20 19:18:25 by revival revival
they might as well include screen STFU control
added on the 2013-01-20 21:30:25 by maali maali
Quote:
The comments about "it being ugly as fuck" are made from ignorance...
Actually, those comments stem from the opposite: years and years of experience, but whatever.
added on the 2013-01-21 11:07:02 by gloom gloom
We all know years and years of experience result in an arrogant sense of superiority combined with a complete unwillingness to accept any change, right? And total lack of any clue results in deep insight. All true.
added on the 2013-01-21 11:35:18 by psonice psonice
you can also just buy a shitty LCD with a shitty refresh rate, no need to redesign GPU architecture as you have free motion blur!
added on the 2013-01-21 11:44:57 by maali maali
I seem to remember really REALLY old CRTs having a motion blur effect too. And CRTs have free, high quality hardware AA too. And you can probably get such an old CRT pretty much free (in fact people might pay you to take them away!)

This is clearly the future of gaming. Free AA and realistic, high quality motion blur. The curved glass and poor beam focusing even give you slight fisheye and depth of field for free!

It's no wonder games look so unrealistic now compared to the 80s.
added on the 2013-01-21 12:27:23 by psonice psonice
THE FUTURE IS AMIGA!
added on the 2013-01-21 12:36:42 by maali maali
psonice:

Quote:
Nobody is using this "effect" anymore. It's ugly as fuck. It has nothing to do with actual motion blur. It screws up your colors and replaces them with a banding artefact show. People only ever used it because it was cheap to do and nothing better was around.

Oh, and a reasonable game/demo rendering pipeline is comprised of so many postprocessing steps that it's trivial to sneak some feedback into some blit operation that you do anyway and make it free, basically (instead of the tiny fraction of a millisecond it would take standalone). Guess why still nobody is doing it. Oh yeah, because it's ugly as fuck.


:)
added on the 2013-01-21 13:24:18 by gloom gloom
Nah, kb's as clueless as the rest of the gamedevs and graphics people. Old CRTs are definitely the future :P
added on the 2013-01-21 13:41:31 by psonice psonice
