pouët.net

particles & DOF

category: code [glöplog]
Hello, it's been a long time since I posted. I have a problem that has been bothering me for a week: I want to render properly DOFed particles.
Currently what I do is to render particles to offscreen buffer like explained here:
http://http.developer.nvidia.com/GPUGems3/gpugems3_ch23.html
I store the depth or blurriness of the particles and DOF them separately. After that I combine the offscreen particle buffer with the color buffer. I get halos like in the article, but mine come from something else: color bleeding during the DOF pass, so it's pretty hard to fix with a Sobel filter. Do you know how to handle it?

I'm attaching a screenshot for better explanation.
BB Image
added on the 2012-06-12 17:38:04 by bonzaj
use the mip level bias of the particle texture for the focal distance and render directly. a dither kernel sampler is standard, i guess.
added on the 2012-06-12 17:45:44 by yumeji
Quote:
Hello, It's been a long time since I posted.

Posters anonymous?

Try http://www.slideshare.net/DICEStudio/five-rendering-ideas-from-battlefield-3-need-for-speed-the-run page 12.

If your particles were simple dots or lines you could simply calculate the particle radius and transparency according to the CoC (radius), but that doesn't seem to be the case in your screenshot.

You can also try having a look at http://developer.amd.com/gpu_assets/ShopfMixedResolutionRendering.pdf
added on the 2012-06-12 18:02:00 by xernobyl
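For the dots-or-lines case xernobyl describes, the sprite radius and alpha can be driven directly by the circle of confusion. A sketch using one common form of the thin-lens CoC; the function names and the energy-conserving alpha scale are my own assumptions, not something from the thread:

```python
def coc_radius(z, z_focus, focal_len, aperture):
    """Circle-of-confusion radius for an object at distance z with the
    lens focused at z_focus (thin-lens model, all units consistent):
    coc = A * f * |z - z_focus| / (z * (z_focus - f))."""
    return aperture * focal_len * abs(z - z_focus) / (z * (z_focus - focal_len))

def particle_radius_and_alpha(base_radius, base_alpha, coc):
    """Grow the sprite with the blur and dim it so total energy
    (alpha * area) stays constant."""
    r = (base_radius ** 2 + coc ** 2) ** 0.5
    a = base_alpha * (base_radius / r) ** 2
    return r, a
```

An in-focus particle (z == z_focus) gets coc == 0 and is drawn unchanged; a defocused one gets bigger and fainter, which is the cheap stand-in for a real blur.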
bonzaj: this problem has been bugging me too .. :)

added on the 2012-06-12 18:07:28 by smash
i've never been in this territory, but it might be some "wrong" maths you are doing? like you have to DOF the alpha channel, but not the color of the particles? or use different equations for DOF-ing color and alpha? again, this is unknown territory to me, so just saying out of intuition.
added on the 2012-06-12 18:39:42 by iq
Let's try a shot in the dark: Are you using premultiplied alpha? Because it's the only way to get blending+compositing right, and everything else will look horrible and produce halos not unlike the ones in your screenshot.
added on the 2012-06-12 18:48:07 by kb_
(by that I mean: clear the buffer with 0,0,0,0, render the particles with premultiplied alpha and blend them into the offscreen buffer using factors one/invsrcalpha - then blit the offscreen buffer to the screen ALSO using one/invsrcalpha).
added on the 2012-06-12 18:50:18 by kb_
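kb_'s recipe, written out as plain math rather than actual GPU blend state (a CPU sketch; the helper name and the sample particle values are made up):

```python
def over_premul(dst, src):
    """One blend step with srcblend=ONE, destblend=INV_SRC_ALPHA.
    dst and src are premultiplied (r, g, b, a) tuples."""
    sa = src[3]
    return tuple(s + d * (1.0 - sa) for s, d in zip(src, dst))

# 1) clear the offscreen buffer to (0, 0, 0, 0)
offscreen = (0.0, 0.0, 0.0, 0.0)
# 2) blend each premultiplied particle into it, back to front
particles = [(0.5, 0.0, 0.0, 0.5), (0.0, 0.3, 0.0, 0.3)]
for p in particles:
    offscreen = over_premul(offscreen, p)
# 3) blit the result to the screen with the SAME blend factors
screen = over_premul((0.2, 0.2, 0.2, 1.0), offscreen)
```

Because every step uses the same one/invsrcalpha factors, the composite is associative: going through the offscreen buffer gives exactly the same pixel as blending the particles straight onto the frame, which is what makes the offscreen-particles trick legal in the first place.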
Non-premultiplied alpha blending looks like a hot contender from the screenshot, indeed.
added on the 2012-06-12 19:59:12 by kusma
yumeji: tried that - looks really bad.
kb: yep, I'm doing it this way and it gives me a headache in terms of the math.
My idea was to store the culling information for the particles and render them offscreen without any culling - that way I would have complete information about the color behind the object. Then blur the culling information with the DOF and use it when blitting to screen. I failed heavily with that approach, ending up with culling artifacts near intersections (but the halos were partially gone ;) ).

I know the remedy for that - depth peeling of the whole scene including particles, and then raytracing the slices when doing the DOF. The problem is that I would get around 2 fps on current hardware (but it scales \o/ ). I just need some kind of trick - I suspect the Unreal 4 guys simply aren't writing the Z of particles and accept a wrong DOF on them. Most people won't notice it in common scenes. On the other hand, they might actually know how to do it.

Another thing that bothers me is how to mix raymarched fluids with particles. I was doing that on an offscreen premultiplied-alpha buffer as well (the fluids work with it), raymarching the fluid until it intersects the particles, then blending the fluid behind the particles with the particles and with the fluid in front. Unfortunately I failed at the premultiplied-alpha math and it just sucks from a visual standpoint.

iq: I also thought about using different equations for alpha and color, but if you look at the paper I posted, there's this awful sequence of (1-a1)(1-a2)...(1-an) alpha factors that ruins standard blending intuition. But I encourage you to think about it :).

xernobyl: thanks for the papers, I'm looking at them now.

added on the 2012-06-12 20:13:12 by bonzaj
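The (1-a1)(1-a2)...(1-an) product bonzaj mentions is exactly what the alpha channel of a premultiplied offscreen buffer accumulates under one/invsrcalpha blending. A minimal sketch (the particle values are made up):

```python
def accumulate(particles):
    """Blend premultiplied (color, alpha) pairs back-to-front with
    srcblend=ONE, destblend=INV_SRC_ALPHA, starting from a cleared
    (0, 0) buffer.  Single channel for brevity."""
    color, alpha = 0.0, 0.0
    for c, a in particles:            # c is already premultiplied by a
        color = c + color * (1.0 - a)
        alpha = a + alpha * (1.0 - a)
    return color, alpha

# three particles, back to front, with alphas 0.2, 0.4, 0.5
parts = [(0.2 * 0.2, 0.2), (0.5 * 0.4, 0.4), (0.1 * 0.5, 0.5)]
color, alpha = accumulate(parts)
# the buffer's final alpha is 1 - (1-a1)(1-a2)(1-a3):
transmittance = (1 - 0.2) * (1 - 0.4) * (1 - 0.5)
```

So the "awful sequence" never has to be handled explicitly - it falls out of repeating the same blend, which is why kb_'s all-premultiplied chain stays simple.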
I should also get into DOF particles eventually. I'm currently thinking of doing it by storing pre-blurred sprites in a 3D texture and doing the usual particle stuff, with the addition of choosing the right blurred slice from the 3D texture instead of always using the same texture.
added on the 2012-06-12 20:44:28 by xernobyl
bonzaj: Why not just pre-filter the billboards into a 3d-texture, and look up with the CoC as the z-component?
added on the 2012-06-12 20:48:32 by kusma
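kusma's CoC-as-z lookup, sketched on the CPU with a stack of progressively blurred slices standing in for the 3D texture (the helper names and the separable box-blur choice are my own; on the GPU the hardware would also interpolate between slices):

```python
import numpy as np

def build_blur_stack(sprite, levels):
    """Pre-filter a sprite into `levels` progressively blurred slices,
    the stand-in for the z-slices of a 3D texture."""
    stack = [sprite.astype(float)]
    for i in range(1, levels):
        k = 2 * i + 1                       # kernel widens per level
        kernel = np.ones(k) / k
        img = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 0, stack[0])
        img = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, img)
        stack.append(img)
    return np.stack(stack)

def sample_by_coc(stack, coc, max_coc):
    """Pick the slice whose blur matches the CoC - the z component of
    the 3D-texture lookup (nearest slice here, linear on the GPU)."""
    z = min(int(coc / max_coc * (len(stack) - 1) + 0.5), len(stack) - 1)
    return stack[z]
```

A sharp particle (coc near 0) samples slice 0, i.e. the original sprite; a heavily defocused one samples the widest pre-blur, so no post-process ever touches the blended result.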
I mean, the problem with doing DoF as a post-process on particles is transparency. You need to store multiple z-values per pixel to avoid nasty artifacts.
added on the 2012-06-12 20:50:43 by kusma
kusma: yep, order-independent transparency would help here. On the other hand, the pre-blurred particles might be nice to investigate, but:

1. you really want to spend that 3D texture on an animated alpha fade, so you would need a 4D texture for the blurred ones.
2. blurring blended particles gives nicer results visually.
added on the 2012-06-12 20:58:09 by bonzaj
bonzaj:
1. Sure, if you want to animate the bitmap data, then you need another dimension. It sounds a bit wasteful to use a full RGBA texture if all you're animating is the alpha, though. In that case I'd use an RGB texture and a separate alpha texture or something. Perhaps with careful filtering you could even keep the alpha texture at a lower resolution?
2. I don't agree. Properly pre-filtered billboards look damn awesome.

But perhaps you can get a cheap forward-rendering blur by using summed area tables? Then you don't end up wasting the z-dimension of the texture for the DoF, but still get a nice constant-speed kernel.
added on the 2012-06-12 21:28:29 by kusma
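The summed-area-table idea in minimal form: one prefix-sum pass over the image, after which a box filter of any radius costs four taps per sample, so the kernel size (and hence the CoC) can vary per particle at constant speed. A CPU sketch with made-up helper names:

```python
import numpy as np

def summed_area_table(img):
    """2D inclusive prefix sum; built once, reused for any kernel size."""
    return img.cumsum(0).cumsum(1)

def box_blur_at(sat, x, y, r):
    """Mean of the (2r+1)^2 box around row x, col y - four SAT taps
    regardless of r, with clamping at the image borders."""
    h, w = sat.shape
    x0, y0 = max(x - r - 1, -1), max(y - r - 1, -1)   # exclusive corners
    x1, y1 = min(x + r, h - 1), min(y + r, w - 1)     # inclusive corners
    total = sat[x1, y1]
    if x0 >= 0: total -= sat[x0, y1]
    if y0 >= 0: total -= sat[x1, y0]
    if x0 >= 0 and y0 >= 0: total += sat[x0, y0]
    return total / ((x1 - x0) * (y1 - y0))
```

The trade-off kusma hints at: you pay for one prefix-sum build per sprite instead of a z-stack of pre-blurred copies, and precision wants a wide accumulation format for large images.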
I have an unstoppable urge to post this.
added on the 2012-06-12 21:37:03 by hornet
kusma: I may try the summed-area-table blurring. Can you point me to some examples of forward-rendered particle blur (except for we cell of course ;) )
added on the 2012-06-12 21:39:50 by bonzaj
hornet: Yeah, no. We cell isn't exactly top-of-the-line these days. Its biggest problem by today's standards is, well, that the blur itself pretty much sucks :)

bonzaj: I don't have anything at hand that I can give you. But I have done some experiments with it that worked out pretty well.
added on the 2012-06-12 22:10:43 by kusma
kusma: it doesn't need to be done by GNU Rocket Corp. If you know of something that was achieved with that solution, that will be fine. I just want to check whether it's worth digging into.
added on the 2012-06-12 22:45:39 by bonzaj
bonzaj: Looked at the "alpha blending" section of the article and frankly it does not make any sense at all. Pure premultiplied alpha, as in "every freaking color value you have in the whole chain has its alpha premultiplied, NO EXCEPTIONS" (which means that if you've got non-premultiplied textures, you do a premul pass on them before uploading them to the GPU, NOT in the shader - the interpolation is very much different), is absolutely easy. Just use srcblend=1, destblend=invsrcalpha for every blend operation and you're done. No need to handle the color and alpha channels separately, and every filtering/blur/compositing operation comes out mathematically correct when you just handle all 4 channels at once.

There are lots of papers out there explaining it at various levels of detail, but it helps to imagine a premultiplied r,g,b,a tuple as a homogeneous color value, with the alpha being the "w" :)

I don't see how this could be giving you any headaches - that part of the article clearly shows the author doesn't know this. Which is no shame; I also spent years as a professional game dev without realizing that whoever propagated srcalpha/invsrcalpha was an idiot. :)
added on the 2012-06-13 00:03:41 by kb_
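A tiny numeric illustration of kb_'s "NOT in the shader" point - why the premul pass has to happen before interpolation. The texel values are made up; the scenario is an opaque red texel bilinearly filtered against a fully transparent neighbor:

```python
def lerp(a, b, t):
    """Componentwise interpolation, what the texture filter does."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

red = (1.0, 0.0, 0.0, 1.0)
transparent = (0.0, 0.0, 0.0, 0.0)   # RGB of a transparent texel is arbitrary

# Straight alpha: the filter interpolates first, the shader premultiplies after.
mid = lerp(red, transparent, 0.5)                      # (0.5, 0, 0, 0.5)
shader_premul = tuple(c * mid[3] for c in mid[:3]) + (mid[3],)
# -> (0.25, 0, 0, 0.5): the red got darkened by the transparent neighbor.
# That darkened fringe is the classic halo.

# Premultiplied before upload: interpolating the premultiplied texels
# directly yields (0.5, 0, 0, 0.5) - full-strength red at 50% coverage.
mid_premul = lerp(red, transparent, 0.5)
```

With premultiplied data the filter, the blur, and the blend are all the same linear operation, so they commute; with straight alpha each of them needs its own special-case math.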
OK, now I get it... you accumulate the particles with alpha and THEN you try to DOF them? How is that practical? Of course pre-blurred particles will come out wrong after culling, but other than that this whole effort smells like one of those hundreds of techniques that are all the rage for three papers and then get discarded because they turn out to be just too impractical :)

added on the 2012-06-13 00:21:38 by kb_
kb: well, you just answered your own question: it works perfectly fine until the culling breaks everything down. I'll give forward blurring a try; it doesn't break the whole offscreen particle rendering part. I'll check out premultiplied alpha though, since I'd never heard about it before.
added on the 2012-06-13 00:35:32 by bonzaj
one hint then: when debugging stuff with premultiplied alpha, everything always looks like it's properly blended against black. Every texture, every rendertarget, they all should look this way. :)
added on the 2012-06-13 00:51:23 by kb_
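kb_'s "everything looks blended against black" hint doubles as an automated sanity check: in valid premultiplied data no color channel can exceed its alpha, since each channel was multiplied by it. A trivial checker (my own helper, not from the thread):

```python
def is_premultiplied(texels):
    """True if no color channel exceeds its alpha - a necessary
    condition for premultiplied (r, g, b, a) data.  A small epsilon
    absorbs quantization from 8-bit storage."""
    return all(max(r, g, b) <= a + 1e-6 for r, g, b, a in texels)
```

Running this over a texture or a rendertarget readback catches the classic bug of one asset in the chain slipping through without its premul pass.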
Ah yes, the old "these trees look like cheap rearview-mirror-christmas-decoration" issue.
added on the 2012-06-13 01:02:09 by superplek
Hornet: thanks for a great read. I've always had problems with those. This time, however, the issue is completely different: it comes from the lack of information about what color is behind a blurred object.
When I don't cull, I get proper bleeding, but of course the particles are everywhere. The solution is to find a way to properly cull when combining the offscreen particle buffer with the main render target. Soft-particle-style culling would of course be best.
added on the 2012-06-13 08:37:17 by bonzaj