OpenGL multisampling problems

category: general [glöplog]
 
Hi fellow pouëteers.

I have a problem concerning OpenGL and GL_MULTISAMPLE. I know there are forums for this and everything...
Anyway. I create a render context using a multisample pixel format, checking for the highest possible number of sample buffers (<= 8). Then I enable/disable multisampling with glEnable/glDisable(GL_MULTISAMPLE). Standard stuff.
The problem is that turning multisampling on/off only works on some systems and fails on others, effectively giving me multisampling when I don't need/want it.
I tried querying GL_SAMPLE_BUFFERS and GL_SAMPLES. The first one always returns 1 (multisampling supported) or 0 (no multisampling). The latter returns 2 on systems where switching works and other values (8) on systems where it doesn't. The thing that puzzles me is that I get a rendering context with 8x multisampling and GL_SAMPLES returns 2...
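Roughly, the relevant bits look like this (just a sketch; Qt4 does the actual context setup, and "wantAA" is a placeholder flag):
Code:
// Ask the context what it actually gave us.
GLint sampleBuffers = 0, samples = 0;
glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers); // 1 if a multisample buffer exists
glGetIntegerv(GL_SAMPLES, &samples);              // samples per pixel

// Toggle multisampled rasterization per pass.
if (wantAA) glEnable(GL_MULTISAMPLE);
else        glDisable(GL_MULTISAMPLE);
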
The gfx cards are an Nvidia FX 5xx (works) and an Nvidia FX 1100 (works). It doesn't work on the laptop I tried, which has some Nvidia Geforce 6xxx chipset, or on another PC with some Geforce 4 Quadro something chipset.
The driver settings are adjusted to let the application control antialiasing. Drivers are more or less fresh versions...

Oh, and I'm using Qt4 to get the render context here...

any ideas?
added on the 2007-03-08 02:03:11 by raer
Never used GL_MULTISAMPLE but I use GL_MULTISAMPLE_ARB and it seems to work fine on most configs I've tried.

Did you check that?
http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=46
My code is based on this example.
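The core of it is picking the format with wglChoosePixelFormatARB, roughly like this (sketch; it assumes the entry point was already fetched through a temporary context, and hDC is the target window's DC):
Code:
int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_COLOR_BITS_ARB,     24,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_SAMPLE_BUFFERS_ARB, GL_TRUE, // ask for a multisample buffer
    WGL_SAMPLES_ARB,        4,       // 4x; retry with 2 if no format matches
    0
};
int  pixelFormat = 0;
UINT numFormats  = 0;
if (wglChoosePixelFormatARB(hDC, attribs, NULL, 1, &pixelFormat, &numFormats)
    && numFormats >= 1)
{
    // SetPixelFormat(hDC, pixelFormat, &pfd) on a fresh window,
    // then wglCreateContext as usual.
}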
added on the 2007-03-08 02:14:13 by keops
OpenGL is death!!!

killed by Vista.

even the almighty god Carmack is coding for DX and the 360 now :(
added on the 2007-03-08 06:54:37 by Zest
Quote:
Standard stuff.


OpenGL makes even standards exciting ;)
added on the 2007-03-08 08:41:05 by Preacher
for me, GL_MULTISAMPLE and GL_MULTISAMPLE_ARB have the same enum value (0x809D) anyway...
The strange thing is I don't get a GL error. I query glGetError() after glDisable(GL_MULTISAMPLE), but it returns GL_NO_ERROR, so it should have disabled multisampling.
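Though glGetError() only flags things like invalid enums; the closest sanity check for the toggle itself is glIsEnabled (sketch):
Code:
glDisable(GL_MULTISAMPLE);
// A valid cap never raises an error, so this proves nothing by itself:
assert(glGetError() == GL_NO_ERROR);
// This at least confirms the state bit flipped...
assert(glIsEnabled(GL_MULTISAMPLE) == GL_FALSE);
// ...but neither call tells you what the hardware actually rasterizes.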

Quote:
OpenGL is death!!!
Yeah, that's 'cause people stopped using it; it supports everything DX can do... and Vista kills GL because it uses all of the texture memory for the Aero windows. There are other OSes though, and they usually don't support DX...
added on the 2007-03-08 09:14:16 by raer
OpenGL state machine = CANCER

Anyways, you should also check the gfx card's driver preferences, since they usually allow you to force multisampling on/off or force a sample size...
added on the 2007-03-08 09:21:36 by kurli
I don't force the settings... they are set to "application controlled" on all machines...
added on the 2007-03-08 09:26:57 by raer
gfx cards = state machines

IMHO DX will do the same as GL behind the scenes, you just don't see it...
added on the 2007-03-08 09:29:21 by raer
all sequential processes and events = state machines
hi,

well, first of all, make sure you do it like in the tutorial keops pointed to: you have to create a normal window first to get the extensions and check whether multisampling is available, and if so, destroy it and create a new one with the multisample pixel format.
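In outline the dance looks something like this (Win32 sketch; assumes a registered window class "dummy" and a filled-in PIXELFORMATDESCRIPTOR pfd):
Code:
// 1) Throwaway window + context with a plain pixel format.
HWND tmpWnd = CreateWindowA("dummy", "", WS_POPUP, 0, 0, 16, 16,
                            NULL, NULL, GetModuleHandle(NULL), NULL);
HDC  tmpDC  = GetDC(tmpWnd);
SetPixelFormat(tmpDC, ChoosePixelFormat(tmpDC, &pfd), &pfd);
HGLRC tmpRC = wglCreateContext(tmpDC);
wglMakeCurrent(tmpDC, tmpRC);

// 2) Extensions are queryable now:
PFNWGLCHOOSEPIXELFORMATARBPROC pChoosePixelFormatARB =
    (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");

// 3) Tear the temporary window down again...
wglMakeCurrent(NULL, NULL);
wglDeleteContext(tmpRC);
ReleaseDC(tmpWnd, tmpDC);
DestroyWindow(tmpWnd);

// 4) ...and only then create the real window with the multisample format
//    (SetPixelFormat can only be called once per window).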

granted that's done right, other problems might come from the fact that on some configs it's possible to enable multisampling but not to disable it... so it's typically the kind of feature you'd enable at startup and never touch... ;)

hope this helps somehow,
added on the 2007-03-08 09:36:58 by nystep
rarefluid: some nvidia cards enable high amounts of "multisampling" by combining multi- and supersampling. On some systems, supersampling can't be turned off. Perhaps this is what's happening? Try selecting 2 or 4 samples...

And no, d3d won't do the same, since multisampling is a render-target configuration, not a state setting (which makes so much more sense).
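Roughly, in d3d9 (sketch):
Code:
D3DPRESENT_PARAMETERS pp = {};
pp.Windowed           = TRUE;
pp.SwapEffect         = D3DSWAPEFFECT_DISCARD;    // required for MSAA back buffers
pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES; // part of the target itself
pp.MultiSampleQuality = 0;
// ...pass pp to IDirect3D9::CreateDevice: the sample count is part of
// the render target, chosen up front at creation time.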
added on the 2007-03-08 10:44:29 by kusma
Quote:
even the almighty god Carmack is coding for DX and the 360 now :(


Carmack writes code?
added on the 2007-03-08 10:52:39 by すすれ
...seems to be working on the FX cards, but not on the Go ones (the laptop cards)
added on the 2007-03-08 18:48:56 by raer
well, it seems to work like nystep said. You can enable it, but you can't disable it...
It also seems that if I glEnable multisampling on a window that already has some sort of antialiasing, the picture gets "even more multisampled"... The only safe way for the application to turn it off is to NOT acquire a multisampling render context in the first place. I need a NON-multisampling context for 2D stuff, so I might try using different windows for 2D and 3D. Hope that'll work.
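In Qt4 that would be roughly (sketch; the two widgets are placeholders):
Code:
// One plain context for the 2D stuff, one multisampled one for the 3D view.
QGLFormat fmt2d;
fmt2d.setSampleBuffers(false);           // no multisample buffer at all
QGLWidget *gui = new QGLWidget(fmt2d);

QGLFormat fmt3d;
fmt3d.setSampleBuffers(true);
fmt3d.setSamples(4);                     // request 4x for the 3D window
QGLWidget *scene = new QGLWidget(fmt3d);
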
I'll do some more tests next week...

Thanks for your replies so far guys!
added on the 2007-03-09 00:33:16 by raer
Well. Guess what. Problems are gone with either a new driver or - I rather suspect this - a new version of the Qt library (4.3.0)... That stuff drove me crazy and now it is fucking gone...

And a note on a similar problem: if you're using "GL_GENERATE_MIPMAP_HINT_SGIS" for textures on cards that tell you they support the "GL_SGIS_generate_mipmap" extension: DON'T TRUST THEM! Some older cards only support it in SOFTWARE (to be OpenGL 2.0 compliant)!
It will be fuckin' slow; that's how you'll notice...
I suspect the Quadro FX1100 (Geforce 5700), the Quadro FX500/600 (Geforce 5x00), and probably some ATI Radeons here too...
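If you do use it, guard it and benchmark it; roughly (sketch, "tex" being some existing texture id):
Code:
// Check the extension string, but remember: advertising the extension
// says nothing about it being hardware-accelerated.
if (strstr((const char*)glGetString(GL_EXTENSIONS), "GL_SGIS_generate_mipmap"))
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP_SGIS, GL_TRUE);
    // Upload with glTexImage2D as usual; mip levels get generated on upload.
    // If uploads suddenly take ages, you've hit the software path.
}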

jsyk.
added on the 2007-07-10 11:19:31 by raer
You could also take a look at the opengl.org forums.
added on the 2007-07-10 11:27:43 by SilkCut
Quote:
That stuff drove me crazy and now it is fucking gone...
Or maybe gone fucking. In that case: it will be back!
added on the 2007-07-10 20:25:45 by Joghurt
