d3d is pink
category: general [glöplog]
something weird i noticed with recent demos: all of the demos coded for opengl work perfectly on the first try, but most d3d demos just crash, produce weird flickering, set my screen to insane refresh rates, refuse to initialise and do many other weird things instead of producing a decent watchable demo or intro.
now, does that mean that -
a) d3d is a much less stable 3d api than ogl?
b) it is much harder to write stable code for d3d?
c) people coding for d3d are worse coders :) ?
d) any combination of the above?
please enlighten me :)
most probably you are using the debug runtimes. not everybody tests their code with the debug runtimes. but it could be a driver issue as well :)
there are several similar problems with opengl demos. many of them:
a) require setting the screen update mode in the driver settings because they need block transfer/page flipping (including YOUR demos :)
b) set <=60hz modes, run with the taskbar on top of them, don't manage to switch screen modes at all for fullscreen (and display in a window in the upper left of the screen), or any combination of the above
c) crash at the end and/or don't restore the display mode on exit (which is REALLY annoying)
d) don't check whether the extensions they require are supported and just silently crash if they're not (see the sketch below)
e) often don't run on anything but nvidia cards.
or, to put it differently: it's not a question of the api, it just seems that many demo coders don't give a fuck about compatibility, testing, stability, or following the specs.
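To illustrate point d), here is a minimal sketch of such an extension check, assuming the classic GL_EXTENSIONS string query; the extension name and the cleanup helper are placeholders, not anything from a specific demo:

```cpp
// minimal sketch: check for a required extension before using it and bail out
// cleanly instead of silently crashing (GL_ARB_multitexture is just an example)
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

bool hasExtension(const char* name)
{
    // only valid after a GL context has been created
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    if (!ext)
        return false;
    size_t len = strlen(name);
    for (const char* p = ext; (p = strstr(p, name)) != 0; p += len) {
        // match whole space-separated tokens, not prefixes of longer names
        if ((p == ext || p[-1] == ' ') && (p[len] == ' ' || p[len] == '\0'))
            return true;
    }
    return false;
}

// usage:
//   if (!hasExtension("GL_ARB_multitexture")) {
//       fprintf(stderr, "required GL extension missing\n");
//       restoreDisplayModeAndExit();   // hypothetical cleanup helper
//   }
```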
yeah, it's the DirectX debug runtime doing the following:
- It fills all write-only buffers with total garbage before use (if you encounter pink/green flickering, that's it. Same with textures full of noise. Same with a burst of white noise audio at the beginning of a DirectSound app). A good way to punish coders who don't know what "write only" or "initialisation" means ;) (a sketch of filling such a buffer properly follows below)
- You might have set "Break on error" in the Direct3D control panel. This triggers a hard breakpoint whenever the D3D runtime encounters an error. Without a debugger attached, the application will simply crash or exit (with a debugger attached, on the other hand, you get nice debug output about what exactly went wrong, see my recent comment on "Amour" ;)
So, to answer your question: it's b) ;) D3D coders are not worse or better than OpenGL coders; it's just that OpenGL automatically fixes a few errors (at the cost of performance) while D3D just ignores them (retail runtime) or does strange things on purpose (debug runtime).
So please, all D3D coders: use the debug runtime, turn on "Break on Error" and goddamn RTFM, as all this "unexpected" behaviour is nicely documented (with "what will happen" and "why") in the DXSDK.
Demo watchers, on the other hand, should use the retail runtime, as the retail runtime is for watching and playing, ok? ;)
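As a rough sketch of the write-only buffer point above (Direct3D 9 style; the vertex format, pool and counts are invented for the example, not taken from any particular demo):

```cpp
// sketch: create a write-only vertex buffer, fill every byte of it exactly once,
// and check the HRESULTs instead of assuming success
#include <d3d9.h>
#include <cstring>

struct Vertex { float x, y, z; DWORD color; };
const DWORD kFVF = D3DFVF_XYZ | D3DFVF_DIFFUSE;   // matches the struct above

IDirect3DVertexBuffer9* createFilledBuffer(IDirect3DDevice9* dev,
                                           const Vertex* src, UINT count)
{
    IDirect3DVertexBuffer9* vb = 0;
    const UINT bytes = count * sizeof(Vertex);

    if (FAILED(dev->CreateVertexBuffer(bytes, D3DUSAGE_WRITEONLY, kFVF,
                                       D3DPOOL_DEFAULT, &vb, 0)))
        return 0;                         // don't carry on with a null buffer

    void* dst = 0;
    if (FAILED(vb->Lock(0, bytes, &dst, 0))) {
        vb->Release();
        return 0;
    }
    memcpy(dst, src, bytes);              // write the whole buffer, never read
                                          // back from write-only memory
    vb->Unlock();
    return vb;
}
```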
Shiva, big up your yeti mutantis chest. Where were you on x-day?
you all forgot to mention:
* nvidia drivers are full of crap
* stuff that runs with the d3d debug runtime doesn't necessarily work with d3d retail, due to:
a) mostly you run with a debugger attached to read the OutputDebugString messages from the d3d debug runtime.. which tends to initialise pointers to null and stuff like that...
b) some shit that is actually an error is allowed under d3d debug (like setting invalid texture pointers), though you will get an error for it if you run at the *highest* error level.. which tends to be a bit annoying :)
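One way to keep those retail-only surprises visible is to check every HRESULT yourself instead of relying on the debug spew; a small sketch (the macro name is made up):

```cpp
// hypothetical helper: makes D3D failures visible even on the retail runtime,
// where there is no debug output to warn you
#include <d3d9.h>
#include <cassert>
#include <cstdio>

#define D3DCHECK(call)                                                          \
    do {                                                                        \
        HRESULT hr_ = (call);                                                   \
        if (FAILED(hr_)) {                                                      \
            fprintf(stderr, "%s failed: 0x%08lx\n", #call, (unsigned long)hr_); \
            assert(!"D3D call failed");                                         \
        }                                                                       \
    } while (0)

// usage:
//   D3DCHECK(device->SetTexture(0, texture));
//   D3DCHECK(device->DrawPrimitive(D3DPT_TRIANGLELIST, 0, triangleCount));
```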
stefan, A and B are good examples of bad coding practise :)
mnemonix> sure, but it's very easy to make those mistakes when whacking up demos/intros
i mean.. demoscene prods have never been coded to be system friendly ;)
stefan, true. but that's no reason to continue writing faulty code :)
mnemonix> if you use demosystems/introsystems it's quite easy to fix stuff like this once and then recycle it, but not everyone wants to recycle their code all the time :)
shiva, you're simply bored, aren't you?
I know you like that kind of discussions :D
stefan only knows about rgb mixing :)
and about shiva's point, well, i'll second ryg and kb.
What I sadly see is that there are still problems running things on PC. I know from experience that the chance that a new program/demo I have just downloaded won't run is not small. We supposedly left Dos and moved to the new PC standards (Win, DirectX, APIs) because we disliked that an old demo wouldn't support our card, or wouldn't find Vesa, etc. But it's all the same now. That's the way it is with PC, it was always like this, because of their complexity, compatibility with older systems, upgradable hardware.. they are not microcomputers like the Amiga or CPC, I should have understood that a long time ago..
My current PC is a Pentium3 at 600Mhz with a Geforce2MX from an unknown company called Manli(?). I blame myself for buying Manli, since there are still some problems. Halflife shouldn't be so slow in Direct3D (OpenGL doesn't work well, or at all) on this machine (probably a hardware/driver bug, but I updated to new detonators, new drivers for the card and new DirectX and still nothing), and a demo shouldn't tell me that my card is really old because it doesn't support multitexturing (which is wrong, I think..). Though it's much better now with my new PC than with the older AMD K6/2 I had, probably with a problematic motherboard, where no accelerated demo would run well without the sound being slowed down/distorted for reasons I never found out..
"That's the way it is with PC, it was always like this, because of their complexity, compatibility with older systems, updatable hardware,. they are not microcomputers like Amiga or CPC, I should have understood that a long time ago.."
And if you discovered before, would you have stayed in your cave forever ? :)
i have heard of people having difficulties running all kinds of demos on all platforms, so long live the morons.
-r, rasmus
i think that Amiga has something called "custom chips"...
also i think that you were saying that some CPC demo wasn't running on your CPC because you needed a CRTC0 thing or something like that...
what's the difference between not having a CRTC0 for CPC and not having a good GeForce for PC? Answer: you can find a good GeForce more easily... :-)
for once i agree with mr. badsector here :)
...and as ryg said: it's not an API (or platform, if you extend it to your point) problem, but a coder problem, which many coders (including me =)) fall into :-)
Last time I looked on pricewatch, I think the cheapest GF2 was $25, so what's your excuse?
@Scali
i don't think that $25 is a big price if you consider the performance you'll gain. You can buy a simple SVGA card for a lot less, but then you won't have 3D acceleration :-)
And on the other hand, Optimus would need to spend a lot more money to find and buy a CRTC0 than to go to a local store and buy a GF2. The proof is that he already has a GF2 :-).
if you don't have access to many systems to test your window init and related code, then you should NOT be coding it yourself IMO. Instead, use a library which has proven to be fairly reliable.
For OpenGL demos you can use something like SDL; it's quite easy to write a well-behaved application with that. There are other libraries that can be used for OpenGL too, and presumably there's something similar for Direct3D.
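A minimal sketch of what that looks like with SDL (1.2-era API; the resolution and flags are just an example):

```cpp
// sketch: SDL handles the mode switch, the GL context and the mode restore
#include <SDL.h>

int main(int argc, char* argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL | SDL_FULLSCREEN)) {
        SDL_Quit();
        return 1;
    }

    // ... render frames, calling SDL_GL_SwapBuffers() after each one ...

    SDL_Quit();   // restores the original display mode on exit
    return 0;
}
```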
I think even with those it is possible to do something completely wrong, simply because (most) (demo-)coders don't want to read the SDK, don't check return codes etc. :)
there's only one vendor i boycott testing my stuff on: nvidia, since their drivers simply suck..
i tend to test all my stuff on matrox, s3, ati :)
when nvidia starts producing quality cards+drivers, i could *think* about getting an nvidia card again (looking over at my gf2 gts lying in a bag full of crap :) ..
mnemonix: true, but SDL at least resets the video mode back to normal - even if your demo segfaults ;)