Better to learn C or ASM first?
category: general [glöplog]
Now I know most will probably say to just choose one depending on what kind of demo I want to make and whether or not I want to do size restricted old school demos, etc.
Of course, my reason for wanting to learn these two languages is for making demos, but to be honest, I'm also interested in learning them because I'm interested in reverse engineering and malware/virus analysis and detection as well.
So basically I will have to learn these two languages regardless... and perhaps even C++ down the road. But learning them will essentially be killing two birds with one stone, as I'll be able to satisfy both of my interests :D
As for my background, I have had some programming experience in the past (though very limited); I studied a little bit of Java a long time ago and have gone through half a book on Python, learning most of the basics... so on one hand I am not completely new to programming concepts, yet I have never really used them for anything or stuck with any language long enough to be really proficient at it.
Now, when it comes to C and ASM, which would you guys recommend learning first? I've heard various opinions. Some said it's better to learn C first and get a solid foundation in all the major programming concepts like pointers and memory management, and that it's extremely hard to jump straight into ASM without being very experienced at programming. Others said it's better to learn ASM first, as it makes learning C easier because you'll have a better understanding of everything under the hood, so to speak.
Lastly, I was wondering if anyone remembers the names of any memorable demos that were written in C or in ASM. A lot of the Amiga and C64 demos were done in ASM, right? But are there any ASM or C demos on x86? Or even C or ASM on Linux? :D
On x86, ASM is typically only used for 256b or smaller intros on DOS; pretty much nobody writes Windows demos in x86 assembly. The only places it typically shows up are a couple of helper functions in intros, for example to exploit the x87 FPU and avoid the C standard library math routines. That's something you write (or copy) once and then never touch again.
I would also say that most people use C++ rather than C on Windows, even if it's only "C with classes" (or "C+", as some people call it ;), i.e. mostly procedural C-like code with only light usage of C++ features. And of course many shader-heavy demos these days contain very little CPU code and mostly consist of GLSL / HLSL shader code.
But you can just use any other language you want. People use Object Pascal, Rust, JavaScript and many other languages for writing demos.
On oldskool platforms it's a different story of course, but even on Amiga you should be able to write large parts of your demos in plain C.
(Accelerated Amiga, that is.)
My short answer: ASM to learn the basics.
You can learn both at the same time!
If you know some Java, C will be rather easy for you.
Quote:
You can learn both at the same time!
Might be a good idea. Especially if there are any aspirations towards reverse engineering etc. since you could combine exercises when trying to create some simple programs. :)
x86-64 has around 3683 possible opcodes (981 if you ignore types and sizes). C has around 33 reserved words.
While you're at it learn Rust as well.
Quote:
You can learn both at the same time!
While you're at it learn Rust as well.
I honestly kinda learned the two in parallel, started with C and kinda picked up ASM along the way as a necessity - it was great because I was able to understand the C mechanisms better when I knew the rough ASM equivalent.
Computerphile - MegaProcessor
https://www.youtube.com/watch?v=lNa9bQRPMB8
This video gives a good overview of how a CPU works.
I wish I had watched it years ago.
Learn C first.
Then, if you are curious to know how it works behind the scenes, learn ASM.
Quote:
but pretty much nobody writes Windows demos in x86 assembly.
Just to nitpick a bit here: there are still a fair amount of 1k and 4k intros written in assembler.
on pc that is
and Amiga...
only Amiga makes it possible
Quote:
x86-64 has around 3683 possible opcodes (981 if you ignore types and sizes)
I might have used 30-50 opcodes in my life. Didn't even know it had so many.
Learning assembly means being able, given such minimal building blocks, to write something you had in mind from scratch without copying it from a listing. Most people fail at that; it's the overwhelming prospect of having to write everything yourself, rather than not being bothered to learn 3683 opcodes.
ferris: My sample might not be representative but I had a feeling that it's mostly the reusable setup code that's typically optimized to death, but a big part of the actual demo logic remains C(++). Though I guess that especially in 1k the focus on ASM might be bigger than in 4k...
Also, very important, C != C++.
for serious debugging asm is a must
neoneye was closest to my opinion.
For serious programming and debugging it is not really a matter of the programming language, actually. Yes, ASM is close to it, but that's because it's (almost) the lowest level you can program the machine at.
For me, the most important and first thing is to learn how the machine you are targeting is designed and how it works. That doesn't mean you have to learn everything about the machine, just about the components you are actually using, if it's a really complex system.
(For example: if you want to write fragment shaders, you first have to get the concept behind them, or at least pick it up while learning the language. Knowing the language itself doesn't help much there.)
It is not really about learning a language and then starting to tell the machine to do awesome stuff. That, sadly, is the approach lots of universities adopted a few years ago: teaching a language like Java in the first semester, then doing a crash course in C, and then, after three semesters, trying to teach low-level and operating system concepts.
If you learn how the machine actually works (at least the parts you are targeting; though to know what you are targeting, you need an overview) while or before learning a programming language, you will most probably find that with that knowledge you can use just about any programming language, and then pick the one whose concepts feel most natural for you or your goal.
For example in debugging (in x86 and similar):
You see:
xor eax, eax
If you have only learned the opcodes and the language, you think:
"It is xoring eax with itself - but why?"
If you know the platform and its conventions, you see:
"Ah, it is xoring the eax register with itself to zero it, because that's shorter than [mov eax, 0] and the machine will treat it the same."
It's not really about knowing how to write code but why you write it.
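For reference, the encodings behind that size argument (standard x86 machine code bytes):

```nasm
xor eax, eax   ; 31 C0            (2 bytes) - eax = 0, also sets flags
mov eax, 0     ; B8 00 00 00 00   (5 bytes) - same result, longer encoding
```

In size-restricted intros those three saved bytes per zeroing add up fast.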
asm is so much more fun to learn :)
Toss a coin and pick one, study it for a month and then ask again. Compared to actually doing something, Pouet is a waste of time if you want to learn basics.
Re: C or asm for 4k intros... For me, the difference between 1k and 4k intros is that in 1k, the music is done with Windows General MIDI synth, and GL uniforms are not used. In 4k, music is Crinkler and uniforms can be used. Apart from the 1k music routine, all of the prod-specific code is in the shader. Using C for the very small and non-prod-specific x86 code in a 4k feels like wasting bytes for no good reason. YMMV
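To make "all of the prod-specific code is in the shader" concrete, here's a hypothetical minimal fragment shader of the sort that carries a whole 4k; the `t` time uniform and the hardcoded resolution are made up for illustration:

```glsl
uniform float t;  // time in seconds, fed from the tiny CPU-side loop

void main()
{
    // normalized pixel coordinate (resolution hardcoded for the sketch)
    vec2 p = gl_FragCoord.xy / vec2(1920.0, 1080.0);
    // a classic plasma-ish color cycle; the whole "demo" lives here
    gl_FragColor = vec4(0.5 + 0.5 * cos(t + p.xyx * 6.2831 + vec3(0.0, 2.0, 4.0)), 1.0);
}
```

The CPU side then shrinks to little more than window setup, a time counter and one uniform upload per frame.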
"music is Crinkler"
THAT :P
@RbR: It's a double-edged sword. If you start low-level you get good insight into the inner workings, but usually have a hard time adapting to high-level abstractions. If you start high-level, you're so steeped in thinking in patterns that you'll likely find it difficult to constantly think about their actual cost. Thus, the best way is probably to do both simultaneously - preferably on the same machine, to keep the perception gap small.
I'm not sure but I may have meant 4klang.
Or Clinkster