pouët.net


Who has coded (asm) demos on the biggest nb of machines?

category: general [glöplog]
maybe it was a 68HC11. remember kids, this was probably around 14 years ago; I remember the "68" and the "H1" part, but I might be mistaken on the latter.

and I'm definitely not an amiga fanboy, so please slaughter me for confusing the chips
added on the 2008-07-25 02:49:17 by thec thec
still coding real functions.
I'll never reach the coding level farbrausch is on :(
added on the 2008-07-25 06:25:02 by the_Ye-Ti the_Ye-Ti
Quote:
if students DO learn the extreme low level stuff as well as Java Algorithms, math and general scientific concepts ... why are only so damn few of them able to do it?

I'm going to blame 3 things, in order: lazy profs, group projects, and entitlement. A hero (read as: competent) coder is 10-100x as productive as a tard. People's smart friends drag them through. Profs don't or can't detect cheating, and lazy entitled fuckers think they can make it big without doing any extra work after graduation. They even make it through the first few layers of interviewers at MS, but our tech interviews catch them. Then they break down and cry.
Seriously. A buddy of mine broke a guy on a simple warm-up question. Generate lotto numbers.
added on the 2008-07-25 07:10:07 by GbND GbND
sounds like good youtube-material!
added on the 2008-07-25 07:14:39 by skrebbel skrebbel
I wish we could find fresh graduates with at least a remote notion of algorithmic complexity. (AND/OR without false preconceptions about optimization)
added on the 2008-07-25 09:43:29 by _-_-__ _-_-__
Optimus: No. The problem is mainly the slow CPU, not the programming language. ASM can't raise the FPS by any visible amount here; in fact, recoding it all in ASM would likely slow everything down, since it's easy not to put full optimizing effort into every piece of the engine. Remember, we're doing this as a hobby.
added on the 2008-07-25 09:51:13 by kusma kusma
Generating lotto numbers is simple now? I can never get mine right. :(
added on the 2008-07-25 10:53:03 by doomdoom doomdoom
As if anyone who could make a winning lotto number generator would bother to work at microsoft :)

Odd.. I'm getting a strange feeling of deja vu...
added on the 2008-07-25 14:37:38 by psonice psonice
Quote:
Remember, we're doing this as a hobby.


Not only that, it's just the most blatant example of what someone not matching the criteria in the post just above yours would do.
added on the 2008-07-25 15:29:28 by superplek superplek
for our atari demos, we use a mix of c and asm
c because of manageability
asm for speed where the c compiler just doesn't cut it

gba isn't that much more powerful than, let's say, a falcon, so the difference must be in compiler quality, or perhaps coder persistence ;D
added on the 2008-07-25 15:37:00 by havoc havoc
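The C/asm split havoc describes often looks like this in practice: a portable C reference implementation next to a hand-written asm replacement selected at build time. A minimal sketch (the function, the file name in the comment, and the USE_ASM flag are made up for illustration):

```c
#include <stdint.h>

/* mix_samples: a hot inner loop. On the target (e.g. a 68030 Falcon) this
 * would be replaced by a hand-written asm routine; the C version stays as
 * the portable reference and fallback. */
#ifdef USE_ASM
extern void mix_samples(int16_t *dst, const int16_t *src, int n); /* in mix.s */
#else
void mix_samples(int16_t *dst, const int16_t *src, int n)
{
    for (int i = 0; i < n; i++) {
        int32_t s = dst[i] + src[i];   /* mix with saturation */
        if (s > 32767)  s = 32767;
        if (s < -32768) s = -32768;
        dst[i] = (int16_t)s;
    }
}
#endif
```

The C version doubles as documentation and as a correctness reference to test the asm routine against.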
AFAIK, the compilers for ARM are usually quite a lot better than the compilers for 68k.
added on the 2008-07-25 15:55:40 by kusma kusma
niels, then again, as convenient as it is to strike constants out of complexity calculations - in the demo/game world the difference between 60 and 40 FPS DOES matter :)

... ok, forget about demos, slow demo code can easily be compensated by waving polish flags on stage and shouting "AMIIIIGAAAA". I should try that with our clients some time.
added on the 2008-07-25 16:48:17 by kb_ kb_
kb, in my experience researchers stop ditching the constants when they're pretty certain no order-of-magnitude optimisation is possible anymore (i.e. after they went from O(n^3) to O(n log n), they continue going from 5n log n/2 to 2.36767676n log n/2, etc; this is where i lose interest but ok).

but yeah, saying that something is optimal because it's in linear time (and obviously can't be done in constant time or sth), no matter the implementation, is rather short sighted. then again, it's obvious that these are in many ways two quite unrelated aspects of optimisation. it's perfectly fine that different people focus on the different aspects. as long as they listen to each other :)
added on the 2008-07-25 17:02:11 by skrebbel skrebbel
Besides, my comment was not about game companies, which I'm in no position to have a say about anyway.

Constant factors do matter when a decent framerate can go bad at any time. The problem is that I see people performing micro-level tricks (following "recipes" which came down to them, untested, from teachers or random internet sources) that would only matter once the whole has been optimized algorithmically (or at least once a space-vs-time tradeoff has been decided upon).

Even worse, it's often hard to get the means to actually measure the results, so people optimize "in the dark", which I find worse than not optimizing at all.
added on the 2008-07-25 18:23:03 by _-_-__ _-_-__
Something which I'm not sure was mentionned recording academic profiles: what's really relevant is whether people with degrees have learned their lessons in a very context-dependent manner or not, i.e. whether, in the end, they are able to apply their training to new situations, or whether their knowledge is restricted to the areas they studied.

There is some evidence that we have difficulty applying our intelligence/skills outside the contexts we were trained for, and revert to intuition or "recipe-based" thinking. That must account for at least part of what makes professionals doubt people with academic degrees.
added on the 2008-07-25 18:40:35 by _-_-__ _-_-__
read "regarding" where I wrote "recording."
added on the 2008-07-25 18:41:05 by _-_-__ _-_-__
Assembler is for pussies. Machine language is where it's at.

>C000 78 A9 00 8D 20 D0 8D 21
>C000 D9 8D 14 03 A9 C1 8D 15

and so on
added on the 2008-07-25 20:23:57 by Calexico Calexico
what scares me most is that i read it and thought "hm, he left out a D0 A9 at the end of the first line" ;) Why the hell do brains remember such stuff?
added on the 2008-07-25 22:19:42 by kb_ kb_
SKREBBELS OF POLITICIANS INC.
added on the 2008-07-25 22:34:23 by Hatikvah Hatikvah

Of course there was a mistake. How embarrassing.

>C000 78 A9 00 8D 20 D0 8D 21
>C008 D0 8D 14 03 A9 C1 8D 15
added on the 2008-07-25 22:45:20 by Calexico Calexico
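For readers who don't decode hex on sight: the corrected dump is a standard C64 raster-interrupt setup, shown here as a C byte table with the 6502 disassembly in the comments. The listing is cut off mid-instruction, so the last entry is incomplete; the IRQ vector presumably ends up pointing at $C100.

```c
/* The 16 bytes from the monitor listing, with their 6502 meaning. */
static const unsigned char code[16] = {
    0x78,             /* SEI         ; disable interrupts             */
    0xA9, 0x00,       /* LDA #$00                                     */
    0x8D, 0x20, 0xD0, /* STA $D020   ; border colour = black          */
    0x8D, 0x21, 0xD0, /* STA $D021   ; background colour = black      */
    0x8D, 0x14, 0x03, /* STA $0314   ; IRQ vector low byte = $00      */
    0xA9, 0xC1,       /* LDA #$C1                                     */
    0x8D, 0x15,       /* STA $0315.. ; high byte (listing ends here)  */
};
```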
no idea about other cs students since i never interviewed them on the subject, but my experience with the academics i've worked with is not at all that they don't care about constant factors. what i did notice is a definitive tendency towards producing ever-better solutions for what is basically toy problems, massive over-engineering and a lack of perspective. some examples:

better solutions for toy problems: nobody really cares about a 1% increase in average vertex cache hit rate over existing mesh optimizers, or about better triangle strip generation. existing solutions are simply good enough in practice. same goes for faster ray-triangle intersection algorithms (with "slight numerical issues" - yep, that sounds enticing).

massive over-engineering: there's a tendency to double the complexity of algorithms to get marginal improvements (...on a few carefully selected test datasets). nobody's ever going to use such algorithms in practice; it's not worth the hassle. similarly, if an algorithm absolutely requires a peculiar data structure layout, that may be fine for a paper, but it's a royal pain for any program that's trying to solve more than one problem at a time. and a complex and slightly-too-clever caching scheme that produces a small total speed gain stops looking like a good idea once weird bugs appear and you have to find out when exactly things start going wrong.

lack of perspective: ties in with the above. a personal favorite is "optimizations" that decrease expected running time, but greatly increase variance - e.g. rendering or collision detection algorithms that intensively rely on frame-to-frame coherence for performance. which seems sensible until you have a small hiccup (couple of page faults or sth. like that) and suddenly every frame takes a bit longer than the previous one. woo-hoo. similarly, a lot of papers ignore "degenerate cases" that abound in practice. small exercise: find how many papers describe mesh data structures. fine. now how many of them mention other vertex attributes besides position? and how many of those that do consider the case where such attributes are discontinuous? it's ridiculous, really. finally, most people sidestep tricky issues such as numerical robustness entirely. nice if you want to write a lot of papers, but a serious problem if you want to actually use them on more than a couple of simple test datasets.

final note, getting back to the discussion at hand: one thing that is common behavior (by academics and non-academics alike) is using average- and worst-case predictions instead of looking at actual performance on relevant problems. understandable, since you need to implement an algorithm to actually get such measurements. but a lot of worst-case estimates are simply useless to predict practical performance. there's a nice paper on the subject.
added on the 2008-07-25 22:59:05 by ryg ryg
Welcome to the academic world, where even mediocre grad students have to produce a given number of papers per year.

added on the 2008-07-25 23:07:37 by Calexico Calexico
Quote:
niels, then again, as convenient as it is to strike constants out of complexity calculations - in the demo/game world the difference between 60 and 40 FPS DOES matter :)


tell me about it - spent night hours trying to optimize tomb raider xbox 1 shadow code (software clipping beats the fillrate hit) to get it like 5% faster and make it drop less in an otherwise 60fps world.

but really, the more you mature as a programmer (also in the game/demo field) the more you learn how bottlenecks are just not 1-dimensional.
added on the 2008-07-26 00:02:21 by superplek superplek
and big kudos to ryg. you are and have been an example all along.
added on the 2008-07-26 00:03:10 by superplek superplek
oh and all, turn on your strict aliasing compiler flag! and have fun on powerpc.
added on the 2008-07-26 00:04:26 by superplek superplek
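superplek's strict-aliasing warning, in one example: with -fstrict-aliasing (enabled by -O2 on gcc) the pointer cast below is undefined behaviour, the kind of thing that may happen to work on x86 and break on a powerpc build. A union (or memcpy) is the usual fix; the function names here are made up:

```c
#include <stdint.h>

/* Undefined behaviour under strict aliasing: reading a float's bits
 * through an incompatible pointer type. The compiler is allowed to
 * assume the float and uint32_t accesses don't alias, and may reorder
 * or drop them. */
static uint32_t bits_bad(float f)
{
    return *(uint32_t *)&f;
}

/* Well-defined alternative: type-pun through a union (sanctioned by
 * C99 TC3/C11, and explicitly supported by gcc). */
static uint32_t bits_ok(float f)
{
    union { float f; uint32_t u; } pun;
    pun.f = f;
    return pun.u;
}
```

memcpy into a uint32_t works just as well and usually compiles to the same single move.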