determining why glsl shaders fail on ati

category: code [glöplog]
I work on a project which generates shaders at runtime, and we are trying to move from Cg to GLSL. The new GLSL stuff seems to work great...but only on nvidia hardware.
I'm not really experienced enough to look at the shaders and determine what is angering the ati driver/hardware gods. Is there a nice list of such things to look out for, or any write-ups/posts about this compatibility issue someone could direct me to?

FWIW, here are some example shaders:
http://min.us/mboAd2fK4t#1o
added on the 2012-01-04 09:16:22 by shuffle2
Being an NVIDIA guy myself, I've found ATI's ShaderAnalyzer very useful for pinpointing the problems - did you give it a try? Perhaps the most common pitfall I keep running into is skipping channels when storing color output.
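For illustration, something like this compiles fine on NVIDIA but leaves the skipped channel undefined elsewhere (a made-up snippet, assuming color is a vec3):

Code:
// risky: only .rgb is written, .a is never stored -
// NVIDIA tends to hand back something sane, other drivers may not
gl_FragColor.rgb = color;

// safer: store every channel explicitly
gl_FragColor = vec4(color, 1.0);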
added on the 2012-01-04 09:34:50 by kbi
what i learnt first was:
always have a dot in your floats... e.g. "float a=1;" should be at least "float a=1.;" (1.0 works of course, but if you are trying to make it tiny you can skip zeros), while "0" itself should be "0." or ".0", doesn't matter.
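for illustration, a quick sketch of which literal forms a strict compiler accepts (strict meaning GLSL 1.10 - the f suffix only became legal in 1.20):

Code:
float a = 1;    // int to float: NVIDIA lets it slide, strict GLSL rejects it
float b = 1.0f; // f suffix: only legal from GLSL 1.20 onwards
float c = 1.;   // legal everywhere, and a byte shorter than 1.0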
i hated having to do so back when i had an nvidia card and all worked nice except on other people's ATI cards, but what has to be done has to be done :/ (in the end it's very few bytes you "lose" to that tho.)
you also could sketch your shaders in "ATI RenderMonkey" (it's free, google it, download/install it, love it); if they work in there they should compile in your code as well.
shuffle2: In my experience, outputting the compilation log even if the compilation succeeds, and fixing the warnings reported by the NVIDIA compiler, fixes 99% of ATI incompatibilities (or rather, NVSL dependencies).

Another option is to declare the shader version (e.g. "#version 110" even if you only use GLSL 1.10 features); this seems to enable a strict mode in NV's GLSL compiler.
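A minimal host-side sketch of that first point, in plain C against the GL 2.0 API (assuming shader already holds a shader object and the usual GL/stdio/stdlib headers are included):

Code:
/* fetch and print the info log even when compilation succeeded, so
   NVIDIA's warnings (often hard errors on ATI) aren't silently dropped */
GLint status, logLength;
glCompileShader(shader);
glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &logLength);
if (logLength > 1) {
    char *log = malloc(logLength);
    glGetShaderInfoLog(shader, logLength, NULL, log);
    fprintf(stderr, "shader %s:\n%s", status ? "warnings" : "errors", log);
    free(log);
}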
added on the 2012-01-04 09:54:13 by kusma
hArDy: RenderMonkey doesn't prevent all NV-isms in shaders, so that won't help.
added on the 2012-01-04 09:55:08 by kusma
I had a look at your shaders. They are all wrong, in that they declare floating-point numbers with an f suffix. For example (Shader003),

Code: vec3 c01 = (c0 + c1) * 0.5f;


is not legal GLSL. It should be:

Code: vec3 c01 = (c0 + c1) * 0.5;


Another mistake you are making is using saturate(). That doesn't exist in GLSL. For example, in shader Shader119, replace

Code: prev.a = saturate(cprev.a*crastemp.a);


with

Code: prev.a = clamp(cprev.a*crastemp.a, 0.0, 1.0);



Another error is to use frac(), which doesn't exist in GLSL. Use fract() instead. For example, in Shader119, replace

Code: cprev = frac(prev * (255.0f/256.0f)) * (256.0f/255.0f);


with

Code: cprev = fract(prev * (255.0/256.0)) * (256.0/255.0);



etc etc etc. Basically, your shaders will work in GLSL when they follow the GLSL standard. More info in the Wikipedia article: http://en.wikipedia.org/wiki/GLSL

i know i can be bitchy, but eh, how many times do we have to answer the same question?



added on the 2012-01-04 10:15:00 by iq
shuffle: Read and follow the spec (btw it seems the document has changed a lot since I last looked) instead of using what happens to work in "nvsl" ;)
IIRC it also covers when you need to write out float constants explicitly and when you don't (but I guess that's only really relevant for size coding)

hardy: RenderMonkey (naturally) uses the OpenGL driver on the system, so no, that won't help. But as kbi says, ShaderAnalyzer is really useful for testing whether shaders at least compile (which will catch most pitfalls).
And as kusma says, nvidia's compiler is actually quite strict (i.e. GLSL and not NVSL) once you start specifying the version - at least in the higher versions I remember it as stricter than AMD's.
added on the 2012-01-04 10:22:53 by Psycho
OK, thanks everyone!

iq: It didn't come off as overly bitchy. However, we #define frac and saturate, among other things, when needed (i.e. when generating glsl):
Code:
#define frac(x) fract(x)
#define saturate(x) clamp(x, 0.0, 1.0)


So with any luck this will just leave the problem to the syntax of float constants. Thanks a lot for looking at the files though.

also we are currently using
Code: #version 330 compatibility

so i'll fiddle with this to make it more strict
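e.g. something like this, assuming our generated shaders don't rely on deprecated built-ins like gl_FragColor (hypothetical fragment shader header):

Code:
#version 330 core
// core profile: deprecated built-ins (gl_FragColor, ftransform, ...) become errors
out vec4 fragColor; // explicit output replaces gl_FragColor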
added on the 2012-01-04 10:55:41 by shuffle2
why not just use the standard naming instead of macros? :)
What iq and others said, basically. You could have easily detected all these errors using AMD's GPU ShaderAnalyzer, and just stick to the GLSL 330 specification, which you can get from the Khronos OpenGL registry.

It's really annoying to see people just blaming hardware vendor X or API Y instead of actually trying to figure out what is going wrong. :/ OpenGL and ATI work fine, you just have to produce good code and test. It's not so difficult.
added on the 2012-01-04 13:37:03 by nystep
Quote:
The new GLSL stuff seems to work great...but only on nvidia hardware.


so gl continues to be a total jackass festival, regardless of what iq justly pointed out.
added on the 2012-01-04 13:40:38 by superplek
Quote:
so gl continues to be a total jackass festival, regardless of what iq justly pointed out.

No, nvidia just continues to make their drivers more "developer friendly" than "standards compliant". =)
added on the 2012-01-04 14:43:03 by sol_hsa
sol: No, it's more that NVIDIA fucked up a long time ago (they used the same compiler for Cg and GLSL), and doesn't want to break existing code that depends on this misbehavior.

So they need to default to non-strict compilation and find a future-proof way to introduce strictness. Enabling it through "#version" makes sense for that.

In other words, they are "user friendly" rather than "developer friendly", as people get to play the games they bought, even if the developer went belly-up after shipping.
added on the 2012-01-04 14:58:21 by kusma
kusma: thanks for that. I've long thought nvidia were just catering to lowest-denominator coders or something with their glsl compiler; now I can move from general disgust to grudging acceptance of the situation :)
added on the 2012-01-04 15:32:18 by psonice
Informative thread, spread the word.
added on the 2012-01-04 17:38:04 by Claw
Don't only blame NV - ATI shader compilers are crap too. _really_
added on the 2012-01-04 17:51:52 by las
kusma: Yeah, I figured it'd be that, the Cg conflict. But any more or less standardized API that allows major vendors to pull off jackassery like that, for whatever legacy reasons they have... is questionable :) But I know all this doesn't really help the discussion, so I'll leave it at that.
added on the 2012-01-04 17:52:34 by superplek
(then again, NV has also told "people" that if "you set obsolete renderstate X to Y" your "game will run better come next driver release", so it's all good business I guess)
added on the 2012-01-04 17:53:46 by superplek
i couldn't even compile my last intro on ATI. (it gets stuck without bailing out, no error message, task manager needed. still no fix for that, that's why there's no FINAL yet :/ )
fuck them shader compilers!
rasmus: because currently we can still produce Cg shaders.
added on the 2012-01-04 18:08:20 by shuffle2
las: +1
added on the 2012-01-05 13:09:29 by xTr1m
I have been curious for some time how Apple deals with this problem. Anyone know? I find it doubtful that Jobs would have allowed such a discrepancy in 'his' OS and hardware, but there do indeed exist NVIDIA, ATI and Intel gfx Macs.
added on the 2012-01-05 17:08:58 by Yomat
I doubt he'd give a flying fuck about differences between GLSL compilers tbh, but who knows :)

Never had a mac with nvidia somehow, but I've not seen anything like the nvidia-only mess we've seen over the years on PC (and I'm involved with other scenes where a problem like this would be well known). In general ati/nv incompatibilities are rare. If anything I'd say ATI has a better reputation for drivers than nvidia on mac, but it's nothing people really bitch about. The general state of opengl and various other stuff gets plenty of complaints though ;)
added on the 2012-01-05 17:28:02 by psonice
wow, someone quoting a dead jobs in a glsl context. props for that feat, really. and i thought i had seen it all.
added on the 2012-01-05 18:01:32 by superplek
