pouët.net


ATI+OGL+Geometry Shaders

category: general [glöplog]
Is it just my new HD4650 which sucks so bad it doesn't support GL_EXT_geometry_shader4, or is it AMD who won't provide that particular extension in their drivers?

glewinfo tells me that, although GL_EXT_geometry_shader4 is not in the extensions string, the entry points for all its functions can be found (using Catalyst 9.1); trying to compile a geometry shader (GLSL v1.20) doesn't work either, none of the geometry-shader-related keywords are recognized by the compiler.

If I want to use geometry shaders, do I have to restrict my audience to nvidia only (because nvidia cards *do* support GL_EXT_geometry_shader4) or to Vista only (DirectX 10)? Has anyone had any luck using geometry shaders with ATI and OpenGL? I'd really like some help, since you all know that vista/nvidia-only demos suck.
added on the 2009-02-16 17:58:28 by xTr1m
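Seeing entry points resolve while the extension string lacks the name is exactly why the string, not the loader, has to be the gatekeeper - and the string must be parsed as whole space-separated tokens, since a plain substring search can produce false matches. A minimal sketch in C (`has_gl_extension` is a hypothetical helper; in a real program the first argument would come from `glGetString(GL_EXTENSIONS)`):

```c
#include <string.h>

/* Check whether `name` appears as a complete token in a space-separated
 * OpenGL extension string. A bare strstr() is not enough: searching for
 * "GL_EXT_geometry_shader" would also match "GL_EXT_geometry_shader4". */
static int has_gl_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    if (!extensions || len == 0)
        return 0;

    while ((p = strstr(p, name)) != NULL) {
        /* The token must start at the beginning or after a space... */
        int starts_ok = (p == extensions) || (p[-1] == ' ');
        /* ...and end at a space or at the end of the string. */
        int ends_ok = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

Only when this returns true for the extension you need is it safe to load and call the corresponding entry points.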
just do it all in 100% ASM and it will work
added on the 2009-02-16 18:03:27 by kusma
ATI + OpenGL = CANCER

(btw, even my crappy GMA950 claims to support GL_EXT_geometry_shader4, hahaha)
added on the 2009-02-16 18:38:48 by blala
you're right in that vista only demos suck. nvidia only is completely acceptable though.
added on the 2009-02-16 18:53:08 by pommak
No, it's not.
I agree: nvidia-only is weak, graphics card vendor wars are weak. If you want everyone to see your demo, don't limit it to one vendor.
added on the 2009-02-16 23:49:01 by Claw
Oh, and by the way, you're right: they don't support that extension, or a lot of others. No idea why, but ATI's OpenGL support has always been rubbish.
added on the 2009-02-16 23:55:27 by Claw
EXT_geometry_shader4 is nvidia's very own extension; however, the very similar ARB_geometry_shader4 has been ratified (but is not part of 3.0), so use that instead.
The fact that you can see the entry points even though the extension is not in the extension string usually means that they are currently being implemented. Of course you shouldn't expect them to work when the extension is not exposed.
I don't have it in my beta either, so it most probably won't be in 9.2.
So, if you're doing something for Breakpoint, the question is whether we can expect it to work (stably) in 9.3, 9.4 or 9.5...
As it's a real/important hardware feature (unlike someone's use of ext_texture_rectangle or reliance on nvidia bugs), and it's about to be implemented / maybe not quite stable yet / maybe ready by the release, I think it would be ok to use it in this case.
added on the 2009-02-17 00:35:07 by Psycho
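For reference, ARB_geometry_shader4 keeps essentially the same GLSL interface as nvidia's EXT variant, so a shader written for one should port to the other with little more than a changed `#extension` line. A minimal pass-through geometry shader for triangles might look like this (a sketch only; the input/output primitive types and the maximum vertex count are configured from the API side via `glProgramParameteriARB`):

```glsl
#version 120
#extension GL_ARB_geometry_shader4 : enable

// Pass-through: re-emit the incoming triangle unchanged.
void main()
{
    for (int i = 0; i < gl_VerticesIn; ++i) {
        gl_Position = gl_PositionIn[i];
        EmitVertex();
    }
    EndPrimitive();
}
```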
sadly, ARB_geometry_shader4 is missing entirely... all corresponding function pointers are NULL. I quickly checked over the 3.0 specs; there are no geometry shaders there. So until ATI makes them available in OpenGL (3.1?), demos that want to use the geometry shader are limited to nvidia or Vista... that sucks, considering that geometry shaders are not that "new".

So I guess no one has had any success in this matter.
added on the 2009-02-17 14:53:49 by xTr1m
Make two codepaths and put in a text insulting ATI when you can't show what you'd want?
added on the 2009-02-17 16:08:28 by Preacher
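The two-codepath suggestion amounts to picking a renderer once at startup based on what the driver exposes. A minimal sketch in C (both render functions and the `has_gs` flag are hypothetical stand-ins for real init-time capability checks):

```c
#include <stdio.h>

/* Hypothetical render paths: the real ones would issue GL calls. */
static void render_with_geometry_shaders(void) { puts("GS path"); }
static void render_vertex_shader_fallback(void) { puts("fallback path"); }

typedef void (*render_fn)(void);

/* Choose a code path once at startup. `has_gs` would come from checking
 * the extension string (and non-NULL entry points) at init time. */
static render_fn select_render_path(int has_gs)
{
    return has_gs ? render_with_geometry_shaders
                  : render_vertex_shader_fallback;
}
```

The main loop then calls through the selected pointer and never branches on the capability again, keeping both paths cleanly separated.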
If I were the only coder, I'd probably do that... but with two coders and just one nvidia card, we've got a problem...
added on the 2009-02-17 16:52:49 by xTr1m
Get the coder with the degenerate card/driver to write a software fallback.
Gee thanks, that'd be me. And no, no software fallback... a rather inefficient vertex shader fallback is possible... but still, many planned features like GPU shadow volume creation (amongst others) wouldn't be possible.
added on the 2009-02-17 17:18:55 by xTr1m
The EXT and ARB versions seem identical, so they will probably come at the same time. Have you tried older drivers to see when the entry points were introduced? It's an important/real feature, so it's probably coming before being "forced" by 3.1.
And of course your 4650 supports geometry shaders; the only thing missing compared to the 48x0 is double precision support (which I don't think you can access from graphics APIs anyway).
added on the 2009-02-17 17:20:23 by Psycho
xTr1m, sorry ;)
Psycho: Do you have any reliable source of information? We were targeting Evoke 09 for release; it'd surely be cool if ATI could come out with GS support within a few months. But that's just a utopian and optimistic hope.
added on the 2009-02-17 22:59:51 by xTr1m
Not really; I have beta driver access, but am not really in contact. You could try asking on opengl.org, or even write devrel@amd.com (show them some ogl interest ;)
If it's on the way, I would expect it to be in place in good time for Evoke, but you never know. That's also why I asked when the entry points came in - 8.12 would be better than 9.1, of course, but if they had been there since 8.6 or something, it wouldn't be a good sign.. ;)
added on the 2009-02-18 01:48:53 by Psycho
Psycho: They were missing in 8.12.
added on the 2009-02-18 10:08:59 by xTr1m
you'd better switch to d3d then...
added on the 2009-02-18 11:13:20 by hcdlt
hcdlt: Geometry shader support was introduced with DirectX 10, which only runs on Vista, not on XP.
added on the 2009-02-18 13:29:58 by xTr1m
create a dx9 fallback then, in case the demo is running under xp. or screw xp and use dx10 only. or wait a year; people will be switching to windows 7, it's much better than vista, and then coding dx11-only stuff will make far more sense than coding dx10-only stuff does now.

no, seriously. use dx10 with a dx9 fallback 'just in case', or screw geometry shaders. effects are less than half of a demo's success, imho. it's not 4k, where effects are almost everything you can do.
added on the 2009-02-18 14:10:52 by unic0rn
unic0rn, if he's going to write a fallback, he may as well not switch to D3D.
imho dx10 with a dx9 fallback is better than ogl with an ogl fallback, because everyone with vista/win7 and a decent gpu (ati, nvidia, doesn't matter) will be able to see geometry shaders in motion. which is not the case with ogl; ati owners will be screwed, and hell knows for how long. depending on software is better than depending on hardware, imho. (yeah, i know - you can get a pretty nice gpu for vista's price, but some people just hate nvidia, and coding nvidia-only stuff is as bad as coding intel-quadcore-only stuff. well, almost.)
added on the 2009-02-18 14:30:22 by unic0rn
Yup, true about ATI owners having to wait to see the geometry shaders in action. From a coding point of view, though, writing a dx9 "fallback" involves writing an entire abstract renderer - which is not the case with GL.
that's why i've suggested 'screw geometry shaders' as an option. it can wait; the demo should be worth much more than its shaders.

and one can always do software raytracing:P
added on the 2009-02-18 14:59:20 by unic0rn
