VC++ is slow.
category: general [glöplog]
Of course there is a way to manually evaluate that, but what he's trying to say is that the human capability to evaluate such things decreases as the complexity of the hardware you're working on increases. On consoles, for example, it's often still very doable (although you'll mostly end up evaluating on a more algorithmic or at least slightly higher level, removing the need to write assembly code). It also very much depends on the scope you're targeting: hand-optimizing using special instruction sets like MMX and SSE (audio decoding, you name it..) can be sufficiently predictable and useful.
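To be concrete, here's a minimal sketch of the kind of SSE hand-optimization I mean (a toy example of my own, not from any real prod; it assumes a compiler that ships <xmmintrin.h> and that count is a multiple of 4):

#include <xmmintrin.h>

// Add two float arrays four elements at a time using SSE intrinsics.
// Assumes count is a multiple of 4; unaligned loads/stores keep it simple.
void add_arrays_sse(const float* a, const float* b, float* out, int count)
{
    for (int i = 0; i < count; i += 4)
    {
        __m128 va = _mm_loadu_ps(a + i);            // load 4 floats from a
        __m128 vb = _mm_loadu_ps(b + i);            // load 4 floats from b
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb)); // store the 4 sums
    }
}

The point being: this kind of small, regular inner loop is exactly where hand-vectorization stays predictable.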
But besides that, it is possible to describe the ruleset that applies to the platform at hand and design certain heuristics, even though one would probably not be able to take all these factors into account when manually performing the same task. We have computers to do that for us you know.. :)
And yeah, this is pretty "duh". I know.
In the case of x86, we have processors from different vendors with different architectures. Something tuned for the Pentium 4's 20-stage pipeline probably wouldn't do as well on an Athlon, for example. With enough detailed information you could certainly count cycles on a specific chip, but then your work only applies to that chip. Plus, on a multitasking OS, cache behaviour becomes unpredictable depending on the kernel's task switching.
And that's why PCs suck. I will never be able to put my ass down, learn to optimize from Pentium to Athlon and show you something. And then people would say that on Athlon64 or Athlon128 or Athlon256 or Pentium V/VI/VII it's of no use anymore :P
I guess I will only try to start from my 386, no matter if you think it's stupid, cause it's a personal dream of mine. Perhaps I will move to 486 and Pentium later from that. I'll see..
Something else? You think that optimizing ends at compiling a program and trying to optimize that??? The best code on the 8-bits is so unpredictable, with unrolled loops and data reused in registers; the feeling is like hardwiring your algorithm in assembly. It's pretty much different from what a compiler does. Then you may see differences. But then again, I don't know what happens today with the shitty PCs :P
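Just to illustrate in C++ terms what unrolling and keeping data in registers looks like (a toy sum loop made up for this post, not actual 8-bit code):

// Manual unrolling: sum an array four elements per iteration, accumulating
// into separate variables so the partial sums can stay in registers.
// Assumes count is a multiple of 4.
int sum_unrolled(const int* data, int count)
{
    int s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (int i = 0; i < count; i += 4)
    {
        s0 += data[i + 0];
        s1 += data[i + 1];
        s2 += data[i + 2];
        s3 += data[i + 3];
    }
    return s0 + s1 + s2 + s3; // combine the partial sums
}

On the old machines you'd write the same thing directly in assembly, with the partial sums living in actual registers.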
Why don't you just learn to write code for PCs in C/C++ and optimize there, and worry about asm later?
You rarely need to optimize in asm anyway, for modern (accelerated) PC demos, since the CPU is rarely the bottleneck anymore.
Instead, you should concentrate on how to send the data to the GPU in the most efficient way, and how to make the GPU process it as efficiently as possible.
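As a rough illustration of what "sending the data efficiently" can mean (an OpenGL sketch of my own; it assumes a GL 1.5+ context with the buffer object entry points available, which on Windows normally means an extension loader, and the function names are only for the example): upload the geometry once into a vertex buffer object and draw it with a single call per frame, instead of pushing vertices one by one.

#include <GL/gl.h>

// Upload static geometry into a vertex buffer object once, at load time.
GLuint create_static_vbo(const float* vertices, int floatCount)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    // One big upload instead of many small per-frame transfers.
    glBufferData(GL_ARRAY_BUFFER, floatCount * sizeof(float),
                 vertices, GL_STATIC_DRAW);
    return vbo;
}

// Draw the whole buffer with a single call per frame.
void draw_vbo(GLuint vbo, int vertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, 0);          // xyz, tightly packed, offset 0
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);  // one draw call
    glDisableClientState(GL_VERTEX_ARRAY);
}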
why are you so concerned with "optimization"? why not concentrate on making something original and beautiful instead of boring "plasmas" and "fires" seen a million times? with modern PCs we have enough speed without concerning ourselves with tweaking the hell out of every function we write.
if optimization is so important, do it last, _after_ you have written something worth seeing. use your imagination and artistic ability first. after that, optimizing will get you some extra brownie points, but it's not the most important thing imo.
Scali: I'd guess for _lots_ of modern demos, the CPU is still the primary bottleneck.
Hm, I dare to oppose that. Really.
It's just that Optimus isn't, and probably never will be, a good programmer, so he has to focus on bullshit like this :)
To put it this way: impressing people does not solely mean hand-optimizing functions.
>>The best codes on the 8bits are so unpredictable,
>>with unrolled codes, reusing data on registers,
>>the feeling is like hardwiring your algorithm in
>>assembly.
Can you imagine how much more complicated modern PC effects are compared to the fireplasmas that your beloved 8-bit machines are running? One theoretical clock tick spent in vain here and there because of "bad compiler optimization" has no effect on the FPS in these cases. Even for code of medium size, hand-optimizing would take an incredible amount of time and experience.
Micro-optimizing and smart handling of the architecture/compiler still does a lot of good for you, though. There still *is* something called 'the efficient way', you know.. it's just taken to a higher level.
And not all real-time CG programming consists of over-complicated stuff. Not at all. Certainly not some of the important but trivial tasks.
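One concrete example of what "the efficient way at a higher level" can look like (a toy example, not from any prod): walking a 2D array in the order it is laid out in memory is much kinder to the cache than striding through it column by column, and no assembly is involved at all.

const int W = 1024, H = 1024;

// Both functions compute the same sum, but the row-major version touches
// memory sequentially, which the cache and prefetcher handle far better.
long long sum_row_major(const int grid[H][W])
{
    long long total = 0;
    for (int y = 0; y < H; ++y)        // outer loop over rows
        for (int x = 0; x < W; ++x)    // inner loop walks contiguous memory
            total += grid[y][x];
    return total;
}

long long sum_column_major(const int grid[H][W])
{
    long long total = 0;
    for (int x = 0; x < W; ++x)        // jumps W ints between accesses
        for (int y = 0; y < H; ++y)
            total += grid[y][x];
    return total;
}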
learning your way around decent high-level data structures can lead to an increase in productivity and a decrease in execution time :-)
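a small made-up example of what that can mean in practice: sort the data once and use binary search for lookups, instead of scanning an unsorted list every time (standard C++, nothing exotic):

#include <algorithm>
#include <vector>

// Build step: sort the data once so later lookups can use binary search.
void prepare(std::vector<int>& values)
{
    std::sort(values.begin(), values.end());
}

// O(log n) lookup instead of an O(n) scan through an unsorted container.
bool contains(const std::vector<int>& values, int key)
{
    return std::binary_search(values.begin(), values.end(), key);
}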
Bagpuss: This is just what I'd like to code.
Anyway, this thread had no meaning; I originally opened it in some weird state just for fun, to see your responses..
Is it me, or did Optimus just come out and say he is a troll? o__O
I think he intentionally creates posts like his old "serious" ones to get attention. It just seems that he is not in control of his "talent".
His post "This town is Z" must be a tremendous failure for him. So many letters typed, so few replies.
Class clown behaviour i'd say..
His post "This town is Z" must be a tremendous failure for him. So many letters typed, so few replies.
Class clown behaviour i'd say..
I love Optimus.