
Weird OpenGL errors?

category: general [glöplog]
 
So I dug up a game I was posting about months ago (the timing issues), and it looks like something must be wrong with my OpenGL code, but I fail to see *what*. First of all, I programmed the game on a Radeon HD 2600 XT (desktop PC). Now I'm on a laptop which has an HD 3750 Mobile or something like that, plus an Intel chip. On the left side you can see the Radeon graphics (fucked up), and on the right side the Intel chip (which is also what it looked like on the old Radeon):

[BB Image: screenshot, Radeon output on the left, Intel output on the right]

A less notable thing (and not too important) is that the textures are a bit too dark on the Radeon card (just compare the tree colours).
The more severe problem is the cloud and the pie chart. The pie chart is built from a few display lists which are basically just vertex coordinates:

Code:
glNewList(glJetpackPie + i, GL_COMPILE)
glBegin GL_TRIANGLE_STRIP
glVertex2f(Sin(f - pie_piece_width), Cos(f - pie_piece_width))
glVertex2f(Sin(f - pie_piece_width) / 10, Cos(f - pie_piece_width) / 10)
glVertex2f(Sin(f), Cos(f))
glVertex2f(Sin(f + pie_piece_width) / 10, Cos(f + pie_piece_width) / 10)
glVertex2f(Sin(f + pie_piece_width), Cos(f + pie_piece_width))
glEnd
glEndList


Those are called in the program like this:

Code:
glColor4f(jetcolor.r, jetcolor.g, jetcolor.b, jetcolor.a)
glCallList(glJetpackPie + i)


I would now expect the vertices to get the proper RGBA values, but even if I use one fixed colour there, they stay completely white on the Radeon card. What the heck is going wrong there?
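
For reference, one way to narrow this down is to ask the driver what it thinks the current colour is right after the list call. A minimal C sketch (the BlitzMax-style calls above map directly onto these entry points; list_base is a stand-in for glJetpackPie):

Code:
/* Sanity check around the draw call; list_base stands in for glJetpackPie. */
#include <GL/gl.h>
#include <stdio.h>

void draw_piece(GLuint list_base, int i, float r, float g, float b, float a)
{
    GLfloat cur[4];
    GLenum err;

    glColor4f(r, g, b, a);
    glCallList(list_base + i);

    /* The compiled lists contain only vertices, so the current colour
       should survive the call unchanged. */
    glGetFloatv(GL_CURRENT_COLOR, cur);
    printf("current colour: %.2f %.2f %.2f %.2f\n", cur[0], cur[1], cur[2], cur[3]);

    err = glGetError();
    if (err != GL_NO_ERROR)
        printf("GL error: 0x%04x\n", err);
}

If the printed colour is right but the triangles still render white, the colour is being eaten later in the pipeline (lighting, texturing) rather than lost on the way in.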
Try using GLIntercept.
http://glintercept.nutty.org/
added on the 2009-11-18 23:36:45 by xernobyl
Very strange. By using the opengl32.dll from glIntercept, the graphics actually look correct on the ATI card. O_o
Looks to me like some uninitialized "something" which differs between the two machines... As for OpenGL, with current drivers one should never expect any default value; everything should be set up explicitly for every context.
added on the 2009-11-19 18:09:51 by Jcl
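
A minimal sketch of what "set up explicitly" could mean for a fixed-function context like this one. These are generic guesses rather than the game's actual settings; lighting or a stale texture binding left enabled are the classic ways to end up with white vertices:

Code:
/* Pin down every piece of state the drawing code relies on, once per context. */
#include <GL/gl.h>

void init_gl_state(void)
{
    /* Anything here left enabled can turn glColor4f into a no-op: */
    glDisable(GL_LIGHTING);        /* lit geometry ignores the current colour */
    glDisable(GL_TEXTURE_2D);      /* a stale texture would modulate it */
    glDisable(GL_COLOR_MATERIAL);

    /* Blending, so the alpha passed to glColor4f actually does something: */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /* A known starting colour instead of whatever the driver defaults to: */
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
}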
If it suddenly works with glIntercept, it might be a problem related to the graphics driver's attempts at multi-threading. I had that a while ago on nVidia, and it's absolutely possible that ATI has similar bugs.
To find out where the critical spots are, put a few glGet() calls in your code -- this forces the graphics driver to flush all pending commands and wait for the graphics chip until they're actually done. (And no, glFlush() is not sufficient.) If this still doesn't help, try glReadPixels() :)
added on the 2009-11-19 18:38:29 by KeyJ
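
A sketch of such a checkpoint in C (the helper names are made up):

Code:
/* Forced-sync checkpoints to scatter between suspect calls. */
#include <GL/gl.h>
#include <stdio.h>

/* glGetError() round-trips to the driver and reports anything queued up. */
static void gl_checkpoint(const char *where)
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        fprintf(stderr, "%s: GL error 0x%04x\n", where, err);
}

/* The heavier hammer: reading back even one pixel stalls until the
   GPU has actually finished rendering. */
static void gl_hard_sync(void)
{
    unsigned char px[4];
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);
}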
If you use more than one context, check that wglMakeCurrent() has been called correctly before any GL operation on your thread.
added on the 2009-11-19 18:46:24 by krabob
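
Roughly like this in C, where hdc and hglrc are whatever the app created at startup:

Code:
#include <windows.h>
#include <GL/gl.h>

void render_on_this_thread(HDC hdc, HGLRC hglrc)
{
    /* A context is current to at most one thread at a time; every thread
       that issues GL calls has to claim it first. */
    if (!wglMakeCurrent(hdc, hglrc))
        return;  /* GL calls without a current context are silently dropped */

    /* ... GL calls ... */

    wglMakeCurrent(NULL, NULL);  /* release it before another thread takes over */
}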
Quote:
If it suddenly works with glIntercept, it might be a problem related to multi-threading attempts of the graphics driver. I had that a while ago on nVidia, and it's absolutely possible that ATI has similar bugs.

Actually, the drivers of this card are optimized for OpenGL (it's from the FireGL series) - do you think that could be an issue?
