How to filter a 64-bit cubemap with an ATI card
category: general [glöplog]
I hear this is not possible with an ATI card.
Does anyone know 1. if this IS possible?
and 2. how to do it?
thanx in advance for advice.
By 64 bits - you mean the 4xFP16 one? (Because there are lots of 4x16 integer formats that are filtered by lots of ATI hardware.)
If it's about 4xFP16 - then yes, no current ATI hardware filters FP16 textures. So either you don't filter, or you roll your own filtering in the pixel shader.
gf can.
but then you have a GF-only demo and that suxx
isn't there a coding trick that can solve this problem,
so that at least it looks like the 64-bit cubemap is filtered
on an ATI card, or something like that?
if you know what i mean?
Hm... you can always blur your cubemap a bit and then just point sample! :)
Other than that... either do the filtering yourself in the pixel shader, or use high-precision integer formats. ATI has a nice whitepaper in their latest Radeon SDK that discusses the use of integer textures for HDR and such - it's quite feasible.