Unreal Engine - SDF Ambient Occlusion
category: code [glöplog]
Epic Games recently published a presentation showing how Unreal Engine computes AO from a signed distance field of the scene.
Even with this explanation, I don't understand how they manage to compute the SDF.
They use a 3D texture atlas of SDFs (for better memory usage), which holds the SDF and the transform of each object separately.
But how can we traverse the SDF quickly with that? If we have a million trees in the scene, how can we know the right texture offsets without running min() over every object?
In a more general way, what is the best method you use to generate the SDF of a polygonal scene?
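To make the question concrete, here is a minimal sketch of the naive approach being asked about: transform the sample point into each object's local space and min() over every object. The analytic sphere SDF stands in for the per-mesh 3D texture lookup, and translation-only transforms are a simplification; none of this is UE's actual code.

```python
import math

# Hypothetical scene: each object is (translation, radius). A real
# engine would sample a 3D SDF texture in the atlas; an analytic
# sphere SDF stands in for that lookup here.

def sphere_sdf(p, radius):
    return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2) - radius

def scene_sdf(p, objects):
    # The naive approach: transform the sample point into each
    # object's local space and take the min over all distances.
    # This is O(number of objects) per sample -- the problem with
    # a million trees.
    d = float("inf")
    for (tx, ty, tz), radius in objects:
        local = (p[0] - tx, p[1] - ty, p[2] - tz)
        d = min(d, sphere_sdf(local, radius))
    return d

objects = [((0.0, 0.0, 0.0), 1.0), ((3.0, 0.0, 0.0), 0.5)]
print(scene_sdf((1.5, 0.0, 0.0), objects))  # 0.5: closest surface is the unit sphere
```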
Last question first: you can do a classic raytracing intersection test against the geometry to generate the SDF.
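That "classic raytracing" baking, sketched: the unsigned distance at a sample point is the min over all triangles of the point-to-triangle distance, and the sign comes from counting ray-triangle intersections (odd = inside). The tetrahedron test mesh and the helper names are my own illustration, not anything from UE.

```python
import math

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def closest_point_on_triangle(p, a, b, c):
    # standard barycentric region test (Ericson, RTCD 5.1.5)
    ab, ac, ap = sub(b, a), sub(c, a), sub(p, a)
    d1, d2 = dot(ab, ap), dot(ac, ap)
    if d1 <= 0 and d2 <= 0: return a                  # vertex a
    bp = sub(p, b)
    d3, d4 = dot(ab, bp), dot(ac, bp)
    if d3 >= 0 and d4 <= d3: return b                 # vertex b
    if d1*d4 - d3*d2 <= 0 and d1 >= 0 and d3 <= 0:
        return add(a, mul(ab, d1 / (d1 - d3)))        # edge ab
    cp = sub(p, c)
    d5, d6 = dot(ab, cp), dot(ac, cp)
    if d6 >= 0 and d5 <= d6: return c                 # vertex c
    if d5*d2 - d1*d6 <= 0 and d2 >= 0 and d6 <= 0:
        return add(a, mul(ac, d2 / (d2 - d6)))        # edge ac
    if d3*d6 - d5*d4 <= 0 and d4 - d3 >= 0 and d5 - d6 >= 0:
        return add(b, mul(sub(c, b), (d4 - d3) / ((d4 - d3) + (d5 - d6))))  # edge bc
    va, vb, vc = d3*d6 - d5*d4, d5*d2 - d1*d6, d1*d4 - d3*d2
    denom = 1.0 / (va + vb + vc)
    return add(a, add(mul(ab, vb * denom), mul(ac, vc * denom)))  # face interior

def ray_hits_triangle(orig, direction, a, b, c):
    # Moeller-Trumbore intersection test
    e1, e2 = sub(b, a), sub(c, a)
    h = cross(direction, e2)
    det = dot(e1, h)
    if abs(det) < 1e-9: return False                  # ray parallel to triangle
    inv = 1.0 / det
    s = sub(orig, a)
    u = dot(s, h) * inv
    if u < 0 or u > 1: return False
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0 or u + v > 1: return False
    return dot(e2, q) * inv > 1e-9                    # hit in front of the origin

def signed_distance(p, triangles):
    d = min(math.dist(p, closest_point_on_triangle(p, *t)) for t in triangles)
    hits = sum(ray_hits_triangle(p, (1.0, 0.0, 0.0), *t) for t in triangles)
    return -d if hits % 2 == 1 else d                 # odd hit count = inside

# a tetrahedron as the test mesh
v0, v1, v2, v3 = (0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)
tetra = [(v0, v1, v2), (v0, v1, v3), (v0, v2, v3), (v1, v2, v3)]
print(signed_distance((0.2, 0.2, 0.2), tetra))  # negative: the point is inside
```

To bake a volume texture you would evaluate `signed_distance` at every voxel center of the object's bounding box. Production bakers shoot many rays per sample and vote on the sign, so that cracks in the mesh don't flip whole regions.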
For the question about how they test against the whole scene fast in real time: first, they have a tight bounding box on each object, as you can see in the paper, so they only march a few steps. What they also have, if you read further, is a volume around the camera that stores only the objects which are visible, and it is only updated as the camera moves. The paper says the worst case is the initial setup, or the camera getting teleported to a completely new location. Other than that they just scroll the clipmap and only need to update the edges each frame. You can read this on page 51 of that document. So yes, there may be a million trees in the entire scene, but everything is culled down to what is viewable before doing anything else.
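A toy sketch of that culling idea (as I read page 51): bin objects into a coarse grid of cells around the camera, so each sample only tests the few objects near it instead of min()-ing over the whole scene. The cell size, the sphere stand-in for the per-mesh SDF texture, and all names here are assumptions for illustration, not UE's implementation.

```python
import math

CELL = 2.0  # assumed coarse cell size

def cell_of(p):
    return tuple(math.floor(c / CELL) for c in p)

def build_grid(objects):
    # objects: list of (center, bounding_radius); a real engine would
    # store atlas offsets and transforms per cell instead of indices
    grid = {}
    for i, (center, radius) in enumerate(objects):
        # conservatively insert the object into every cell its
        # bounding sphere can touch
        lo = cell_of(tuple(c - radius for c in center))
        hi = cell_of(tuple(c + radius for c in center))
        for x in range(lo[0], hi[0] + 1):
            for y in range(lo[1], hi[1] + 1):
                for z in range(lo[2], hi[2] + 1):
                    grid.setdefault((x, y, z), []).append(i)
    return grid

def scene_sdf(p, objects, grid):
    # only min() over the objects binned into this sample's cell;
    # beyond one cell the local min can't be trusted anyway, so
    # clamp the returned distance to the cell size
    best = CELL
    for i in grid.get(cell_of(p), []):
        (cx, cy, cz), r = objects[i]
        best = min(best, math.dist(p, (cx, cy, cz)) - r)
    return best

objects = [((0.5, 0.5, 0.5), 0.4), ((10.0, 0.0, 0.0), 0.4)]
grid = build_grid(objects)
print(scene_sdf((0.5, 0.5, 0.5), objects, grid))  # -0.4: inside the first sphere
```

With a million trees the grid build is the expensive part, which matches the point above: you pay it once at setup (or on a teleport), then only re-bin the cells that scroll in at the edges as the camera moves.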
Another thing to remember: this is only producing a basic blurred sky-occlusion map. So they run the raymarching at half screen size, with not very accurate marching, then blur it and upscale.
Apparently the source code is available, so you could always look at that. I've never seen any of it myself, but I've heard the UE source code is quite well commented and documented, so it may be worth a look.
Aaahhhha!
Am I missing something? Is there a paper? All I can find are presentation slides.
Reference links are on the last 5 pages of the linked PDF.
Yeah. This partially inspired me in the past and got me into signed distance functions.
But they use it almost only for low-res occlusion mapping. Still impressive.
They use it for some cheap bounced sunlight etc. too.