direct3d for openglers
category: general [glöplog]
anyone know where i can find something like "direct3d for opengl coders" (whether book, website, laserdisc, whatever)? i.e., i know gl *really* well and want to learn d3d. even just a list of corresponding functions would be nice (e.g., glColor4f() <=> d3d???() )
The two APIs are pretty distant from each other, but in general I can only recommend the Direct3D SDK - the tutorials are pretty straightforward.
As Gargaj said, they are pretty distant from each other. When I learned d3d I realized that it became clearer when I didn't try to compare the two APIs, but instead put all my effort into learning the basics first and then slowly, bit by bit, gained more knowledge. Don't try to make advanced effects and such from the beginning, but learn the api. Perhaps it's not that fun writing spinning cubes, but it's essential if you want to learn the craft. At least that's my two cents...
dude that is just so untrue. they are very similar, just a little different syntactically. for example
D3DXMATRIX matProj;
D3DXMatrixPerspectiveFovLH( &matProj, D3DX_PI/4, 1.0f, 1.0f, 100.0f );
g_pd3dDevice->SetTransform( D3DTS_PROJECTION, &matProj );
<=>
glMatrixMode( GL_PROJECTION );
glLoadIdentity();
gluPerspective( 45.0f, 1.0f, 1.0f, 100.0f );   // fov is in degrees here (radians in the d3dx call)
and
g_pd3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0,0,255), 1.0f, 0L );
<=>
glClearColor( 0.0f, 0.0f, 1.0f, 0.0f );
glClear( GL_COLOR_BUFFER_BIT );
there are a few big diffs like no immediate mode in d3d, but in the end it's all for the same hardware.
it would be nice if there were a quick searchable list of common gl/d3d operations (like the ones above). maybe i will make one as i learn...
also, spinning cubes are fucking hilarious.
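about the "no immediate mode" bit: the closest thing d3d has is DrawPrimitiveUP, which takes vertex data straight from system memory. rough untested sketch (d3d9-style, the Vertex struct is just something i made up for this) of a colored triangle:
glBegin( GL_TRIANGLES );
glColor3f( 1.0f, 0.0f, 0.0f ); glVertex3f( -1.0f, -1.0f, 0.0f );
glColor3f( 0.0f, 1.0f, 0.0f ); glVertex3f(  1.0f, -1.0f, 0.0f );
glColor3f( 0.0f, 0.0f, 1.0f ); glVertex3f(  0.0f,  1.0f, 0.0f );
glEnd();
<=>
struct Vertex { float x, y, z; DWORD color; };   // layout matches D3DFVF_XYZ | D3DFVF_DIFFUSE
Vertex tri[3] =
{
    { -1.0f, -1.0f, 0.0f, D3DCOLOR_XRGB( 255,   0,   0 ) },
    {  1.0f, -1.0f, 0.0f, D3DCOLOR_XRGB(   0, 255,   0 ) },
    {  0.0f,  1.0f, 0.0f, D3DCOLOR_XRGB(   0,   0, 255 ) },
};
g_pd3dDevice->SetFVF( D3DFVF_XYZ | D3DFVF_DIFFUSE );
g_pd3dDevice->DrawPrimitiveUP( D3DPT_TRIANGLELIST, 1, tri, sizeof(Vertex) );   // 1 = primitive count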
Well, perhaps I was a real crack-head when it came to coding back then, but when I switched from OpenGL to dx7 I did find them really different. I've never used d3d8 or d3d9 though, so perhaps I should have kept my mouth shut and kept staring at my ogl code instead! :)
i quite agree with mistarr plague about d3d and gl being really too similar for it to matter. however i'm at a loss as to why he needs a tutorial/book/whatever if he himself is best able to underline these similarities :)
those are just the first two i came across by reading the tutorial - it would be nice to have a searchable index. right now i have to parse all the pertinent info out of pages that start w/ "this is what a texture is..."
i'll work on the searchable index instead of bitching.
i think one of d3d 8/9 was a major departure from previous versions, so d3d7 might have been a lot different
mrtheplague: actually, quite a lot of d3d is WAY different from ogl, so a simple function-by-function reference simply won't work out too well. in d3d, basically all geometry is contained in vertex buffers, and the stream binding in d3d is completely different from gl. i'd suggest simply reading the docs.
kusma, that very example (vertex buffers + their way of binding) is by far the biggest difference that i've found so far, and i wouldn't personally call it "WAY" different. Also, I don't really believe a couple of glVertexPointer-ish calls are that distant from an FVF definition.
And besides the whole vertex streaming thing and a couple of other details (and probably lots of things I've never done, considering I haven't got much of a clue, really), d3d seems to be about as much of a big fat state machine as opengl is.
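to put them next to each other: roughly the same drawing path in both apis (untested d3d9-ish sketch, Vertex/verts/numVerts are just placeholder names):
struct Vertex { float x, y, z, u, v; };
glEnableClientState( GL_VERTEX_ARRAY );
glEnableClientState( GL_TEXTURE_COORD_ARRAY );
glVertexPointer( 3, GL_FLOAT, sizeof(Vertex), &verts[0].x );
glTexCoordPointer( 2, GL_FLOAT, sizeof(Vertex), &verts[0].u );
glDrawArrays( GL_TRIANGLES, 0, numVerts );
<=>
// one-time setup: copy the vertices into a vertex buffer
IDirect3DVertexBuffer9* pVB = NULL;
g_pd3dDevice->CreateVertexBuffer( numVerts * sizeof(Vertex), 0,
                                  D3DFVF_XYZ | D3DFVF_TEX1,
                                  D3DPOOL_MANAGED, &pVB, NULL );
void* pData = NULL;
pVB->Lock( 0, 0, &pData, 0 );
memcpy( pData, verts, numVerts * sizeof(Vertex) );
pVB->Unlock();
// per frame: bind the buffer to stream 0 and draw
g_pd3dDevice->SetStreamSource( 0, pVB, 0, sizeof(Vertex) );
g_pd3dDevice->SetFVF( D3DFVF_XYZ | D3DFVF_TEX1 );
g_pd3dDevice->DrawPrimitive( D3DPT_TRIANGLELIST, 0, numVerts / 3 );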
FVF definitions? Stuck in 1999? :)
there's not much of a significant difference as long as you use the plain multitexturing/fixed function pipeline with no rendering to textures.
but both shaders and render to texture have some big differences in architecture (and philosophy) between the two apis.
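render to texture is a good example: in d3d9 a render target texture is just another surface you bind (rough untested sketch, error checks skipped), whereas in gl you're dealing with glCopyTexSubImage2D, pbuffers or the framebuffer_object extension:
IDirect3DTexture9* pTex = NULL;
g_pd3dDevice->CreateTexture( 256, 256, 1, D3DUSAGE_RENDERTARGET,
                             D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &pTex, NULL );
IDirect3DSurface9* pSurf = NULL;
pTex->GetSurfaceLevel( 0, &pSurf );
IDirect3DSurface9* pBackBuffer = NULL;
g_pd3dDevice->GetRenderTarget( 0, &pBackBuffer );
g_pd3dDevice->SetRenderTarget( 0, pSurf );       // render into the texture
// ... draw the scene ...
g_pd3dDevice->SetRenderTarget( 0, pBackBuffer ); // back to the backbuffer, pTex is now usable as a texture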
ryg, oh, am i?
please elaborate :)
(all d3d code i ever saw does some FVF define and SetFVF and all that)
In 2005, the code sets vertex declarations (which are a superset of FVFs). But that's actually not a _big_ difference.
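To illustrate, a rough sketch (untested) of what D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1 looks like as a d3d9 vertex declaration:
D3DVERTEXELEMENT9 decl[] =
{
    // stream, offset, type, method, usage, usage index
    { 0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0 },
    { 0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
    D3DDECL_END()
};
IDirect3DVertexDeclaration9* pDecl = NULL;
g_pd3dDevice->CreateVertexDeclaration( decl, &pDecl );
g_pd3dDevice->SetVertexDeclaration( pDecl );   // instead of SetFVF( D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1 )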
I think I know OpenGL quite well and I started learning D3D some time ago. Actually, I just grabbed the DirectX 9 SDK, read the tutorial on how to create a window and show something in it, and started writing my own demotool, heavily browsing the SDK documentation at the same time. :) I haven't got to rendering to textures and some other complex things yet, but I don't think it has been too hard so far. :)
Since then, the D3D sample tutorial has evolved into something like this: :)
http://sphere.pl/~krzysiek/files/screens/vc_001.gif
http://sphere.pl/~krzysiek/files/screens/vc_002.gif
(yes, I guess the demotool design is a bit copied from a certain other tool) ;)
"just a little bit".
You should check out http://www.codesampler.com/, they have nice samples written in both GL and DX (I mean, the GL code does exactly the same thing as the DX one).
This has been a good support while developing DevLib (www.devlib-central.org)
ryg: I hope you (and the other fr members) don't mind me reusing good ideas? :)
oh god.. it's been years in coming, but finally - here come the werkzeug clones!
oh come on ryg, we need a werkkzeug clone-tool :(((
ATTAKK OF TEH KKLONES!
well i don't see why FVF shouldn't be used, since it's only one call with one constant, while vertex declarations need some additional calls for validation etc. (if you want to be a nice guy and write code the clean way). still, it's only possible to add tangents and binormals with a vertexdecl, since there are no such bitflags in fvf afaik
vertex declarations become rather essential if you want to use separate streams rather than one buffer containing interleaved data.
KEÄNTÊ: IIRC tangents and binormals are just aliases for some of the texcoords. Anyway, the fixed function pipeline doesn't seem to use tangents, so it's up to you what means what in your vertex shaders. :)
The D3DDECLUSAGE flags are for fixed-function backwards compatibility only anyway. When using shaders, the vertex declaration is matched against the vertex shader code and the vertex values are put into the registers as defined - there's no runtime distinction between the "types" (like position, color, texcoord etc.) at all.
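To make that concrete, a rough sketch (untested; pVBGeometry/pVBTangents are just placeholder names) of a declaration that pulls position/texcoord from stream 0 and tangent/binormal from a second stream - the usage/index pairs only have to match the input semantics declared in the vertex shader:
D3DVERTEXELEMENT9 decl[] =
{
    { 0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
    { 1,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TANGENT,  0 },
    { 1, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BINORMAL, 0 },
    D3DDECL_END()
};
IDirect3DVertexDeclaration9* pDecl = NULL;
g_pd3dDevice->CreateVertexDeclaration( decl, &pDecl );
g_pd3dDevice->SetVertexDeclaration( pDecl );
g_pd3dDevice->SetStreamSource( 0, pVBGeometry, 0, 20 );   // pos + uv, 20 bytes per vertex
g_pd3dDevice->SetStreamSource( 1, pVBTangents, 0, 24 );   // tangent + binormal, 24 bytes per vertex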
kb_: answer your emails already :)
kb : you are so wise :D
Krzysiek-K : as far as i know fvf texcoords are float[2]s while we need to pass float[3]s, so a vertex declaration is the only way, tho i've never even tried to pass data to a vertex shader via FVF. strictly vertexdecl.
yes, when passing data from the vertex program to the fragment program most additional stuff gets passed as texcoords, which is the most obvious way of doing that, but that's not the topic :)