OOP Criticism
category: general [glöplog]
Defiance: Seems to be one of the subtle language details. So, would you say that using "in" in technical terms is more appropriate than using "of"?
Like in
foreach x in /*collection*/ y { x.doSth(); }
vs.
foreach x of /*collection*/ y { x.doSth(); }
sorry doom :) haven't even read your post, yet.
Quote:
I think that's a fitting example of a case where you don't want to be tied down by algorithms, even a high-level choice like sort-and-rasterise. E.g. working with modern hardware, sorting and rendering a triangle at a time are two things you mostly don't want to do. In a software engine, per-pixel z-buffering may or may not be faster than sorting if the number of triangles is large enough, and some meshes can't be sorted at all.
And what good is a set of triangles? If you're researching algorithms or experimenting with a demo effect that's the sort of construct you might work with, but I think anywhere high-level design becomes a big issue (like a 3D engine say) the more relevant concept is one of a "3D object". The relevant verbs aren't "sort" and "rasterise" but more like "place yourself into this scene", or "scene, please accept this 3D object".
Sure, somewhere at a fairly low level in the engine you may have those "sort" and "rasterise" verbs, but that's the "algorithm level" I'm talking about, way beneath where the important decisions are made. If you start at that level it can be a very steep climb upwards, and many of the choices you make may cause frustration later on because they force you into design patterns that aren't suited for the overall program.
Since you ask, though, yes, I think triangle is a good candidate for a class (even if it's still little more than a datatype, consider the varieties of triangles there might be: flat, textured, shaded, etc.), as is the set of triangles, and the image.
For that matter, the set of sorted triangles is a good candidate for a sister class to the unsorted set. After all, if what you're looking for is a sorted set, sorting an unsorted set is just one way to get there. Let's say you're doing marching cubes over some scalar field: you could just align the sampling grid with view space. Thinking in terms of a rasteriser class that can do "whatever" is helpful:
Code:
class unsorted_triangle_set { /* triangles in arbitrary order */ };
class sorted_triangle_set { /* triangles in back-to-front order */ };

// Rendering an unsorted set means producing a sorted set first,
// then delegating to the overload that knows how to draw one.
void Renderer::please_render( const unsorted_triangle_set &UT ) {
    sorted_triangle_set ST = UT.produce_sorted_set();
    please_render( ST );
}

void Renderer::please_render( const sorted_triangle_set &ST ) {
    for ( const auto &t : ST )
        rasterise( t );
}
Further abstraction would get rid of the triangle set, or make it derive from a "shape" class (and you'd have to turn that around slightly to do it in C++ anyway). But the point is with that higher-level structure there's a much clearer description of what you want to do, as opposed to how you want to do it. The framework should reflect firstly that you're trying to render a shape, and secondly that you have a way of doing it.
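To make the turn-around concrete, roughly (every name here is made up for illustration; a sketch, not the one true way):
Code:
class Renderer;  // defined elsewhere; has the please_render overloads

class shape {
public:
    virtual ~shape() {}
    // The shape accepts the renderer and dispatches itself to it.
    virtual void render_with( Renderer &r ) const = 0;
};

class triangle_set : public shape {
public:
    // Defined where Renderer is complete; the body just calls
    // r.please_render( *this ).
    void render_with( Renderer &r ) const override;
};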
A better solution would be a renderer that's aware of whole objects and makes decisions on how to construct the final sorted set of triangles (or unsorted, or DX buffer object, or whichever fits the situation) with a broader awareness of the overall scene. That's one of the places where OOP and "interface first" thinking really shines, I think. I've done enough 3D engines the other way around to never want to do it again. ;) Nowadays I prefer to have a scene class, a hierarchy of classes for stuff that goes into a scene, then I have a renderer that attaches to a scene, and so on. And I'm happy to write class definitions for days before anything even compiles, because it really pays off in the end. I think. :)
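As a skeleton, that arrangement might look something like this (hypothetical names throughout; one possible shape out of many):
Code:
#include <memory>
#include <vector>

class scene_object { /* base for the hierarchy of stuff in a scene */ };

class scene {
public:
    void add( std::shared_ptr<scene_object> obj ) { objects.push_back( std::move( obj ) ); }
    const std::vector<std::shared_ptr<scene_object>> &contents() const { return objects; }
private:
    std::vector<std::shared_ptr<scene_object>> objects;
};

class renderer {
public:
    explicit renderer( const scene &s ) : attached( &s ) {}
    // Walks the scene and decides per object how to build the sorted
    // set, unsorted set, DX buffer, or whatever fits the situation.
    void render_frame();
private:
    const scene *attached;
};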
If I may be presumptuous, I think the point of what _-_-__ is saying is that if you want to prove that your code actually works, you need to think about what is going to actually be going through the CPU, ie. the actual sequence of operations that are going to take place. Whether those operations are extremely specific like "mov ax, 50" or very general like "render a triangle". It's that the design of a program ought to be fundamentally a sequence of verbs that carry out the intended behaviour. Then you start thinking about what classes might be used to carry out that behaviour.
The alternative is to think of the program as a collection of objects (like bags and sets) that are not thought of in terms of their actual behaviour, but are thought of as analogies for the mathematical or real-world concepts that they are supposed to represent. You go ahead and assume that certain things will work because they would work on their analogical mathematical/real objects. You can't do this (and have a bug-free program) until you carefully prove that you can (like pg. 2 of that bags/sets article). Otherwise you end up with a horrible buggy mess like wxWidgets.
@ Hermes
yes, you got my point.
Although, it depends on the situation or how you want to implement the piece of code.
(_-_-__ said..)
I thought he meant that it's hard to judge the quality of any given design that has no defined purpose.
Defiance: So, in any fictional programming language, would you rather have the "in" keyword optional (or maybe both "for" and "in")?
Quote:
Basically it's hard for me to ascertain the quality of a design that does not serve any computation or operation.
As for your remark that "algorithms are an implementation concern": algorithms can be as high-level as you make them. At the design stage they don't even have to be described in terms of the minute operations they perform.
(Algorithms are more abstract and generic than classes)
@Hermes
Yes, because the effect could apply to every element in that particular array.
Defiance: Do you think you could give a simple pseudo-code example that shows how you think it should look?
Yesso and hermes got my point across.
Yesso, I also agree with your "alternative" however I prefer mathematical "objects" to "real" objects. Precisely because I don't trust analogies too much ;)
Doom, it appears to me your standpoint is that of a library / API writer. In which case I see why you use this guiding principle of coding to interfaces first. It's also what I would do in that case. Writing an API is a bit like writing a language, and making sure the language works can be done without the implementation. (To a degree)
yesso: Really it's about the context. If what you're doing is essentially a procedure, it's best modelled as a procedure. Like if you look at a sort algorithm, it'd be a horrible mess if you described it in noun terms. It's inherently a verb thing: input -> operation -> output (see the sketch below).
Any larger system such as a 3D engine, GUI framework, game or OS is not essentially a procedure. At least thinking of it that way isn't useful. If it's a system, it's much better modelled as a system, consisting of a bunch of components that have various (let's hope well-defined) relationships to each other.
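Something like this, to make the sort point concrete (a trivial sketch, nothing engine-specific about it):
Code:
#include <algorithm>
#include <vector>

// A procedure: input -> operation -> output. No nouns needed
// beyond the data itself.
std::vector<int> sorted_copy( std::vector<int> v ) {
    std::sort( v.begin(), v.end() );
    return v;
}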
_-_-__: "ack"
maybe I should rephrase yesterday's post "optimizations tend to be a subset of spaghetti code" to "optimization tends to spoil design".
it's always a very delicate balance between fast and maintainable code. at least in my experience.
however, you said "Writing an API is a bit like writing a language" and I can only fully agree. API design is often underrated.
doom: excellent point, and basically what _-_-__ probably meant: you need to take a look at the "big picture" (the system) (systems analysis is really fun, btw ;))
At this stage of the discussion I realize my initial criticism against picking nouns out of nowhere can better be rephrased as a criticism against applying system-design/analysis at a much lower level than necessary.
In particular, I'm currently seeing this in a part of an application I'm responsible for, which newcomers and outsiders may read as a module or system, whereas it's better described as an algorithm. (It's a resource allocation algorithm involving different resource types.)
well, useless optimization is useless, as they say.
knos (right?) I'm currently working on a project where proper systems analysis has boosted the performance by at least 5 times. it's really worth it.
in my experience, really large projects often suffer from a lack of proper direction.. most people only look at their part of the "universe" and don't see the "big picture". conditions like that can easily end in an "epic project fail". well, I could go on and on about this but it's definitely not pouet.net material.
_-_-__: I have sort of the opposite problem, exposed to systems suffering from an acute lack of object-orientation. They become hard to maintain because they treat everything as an input-output situation. The codebase would shrink to about 10% of its size if even a weak attempt were made to model systems as systems, and a swarm of bugs that have cost countless hours of work would just never have been an issue to begin with.
I see the same thing in my personal projects. That is, as I get better at coding I notice it's because I care less about procedures and more about modeling.
But yeah, sort of not pouet.net material anymore. :P
actually, as long as CPUs aren't OO, then OOP < procedural programming.
because CPUs are aimed at functions...
whynot2000- Is an OOP CPU really something that can be created?
If so, would you be able to write an emulator for a regular procedural CPU to emulate that OOP CPU?
If you have two header files, vector.h and point.h, and point.h includes vector.h but vector.h also includes point.h, then why don't #pragma once or IFNDEF-style guards work? I remember using them for a year to resolve stuff like this, but suddenly in a C++ project I get compiler errors because of it. I was looking at it for 2 hours without any conclusion about what the hell was going on..
so your vector.h includes point.h
and your point.h includes vector.h?
how could that be useful?
Uhm. CPUs are not exactly OO but they tend to not just execute a single code thread, either. So that 1960s theory of algorithms is getting a little dated. OO is a perfect way to model a whole computer, though.
Optimus Knight: If two header files include each other there's something wrong with your program. Either you don't need the two-way inclusion or you have circular definitions.
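The usual way out is a forward declaration in at least one of the headers, roughly like this (the class names are just placeholders for whatever is in your vector.h/point.h):
Code:
// point.h -- no #include "vector.h" needed here
#pragma once

class vec3;  // forward declaration breaks the cycle

class point {
public:
    vec3 to_vector() const;  // a declaration only needs the type's name
};

// point.cpp then does #include "vector.h" for the full definition.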
Quote:
whynot2000- Is an OOP CPU really something that can be created?
If so, would you be able to write an emulator for a regular procedural CPU to emulate that OOP CPU?
no, I don't think so.
that's the problem: you lose contact with the system when you do OOP.
there is no real optimisation anymore or anything.
Quote:
that's the problem: you lose contact with the system when you do OOP.
a) Not everything needs to be optimized down to the last clock cycle. In almost all realistic environments, the two biggest problems are the need to be as productive as possible and the need to write bug-free, unhackable code. Most of the work in the field of software engineering these days is devoted to those two problems.
b) OOP can still be pretty optimized, at least with C++. It was designed specifically with performance in mind (for example, you can make methods inline; see the sketch below).
c) Once you actually get to know an OOP language perfectly, you can predict exactly how the code will execute just by reading it. Once you get a bit of practice, it even becomes easy.
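A trivial illustration of b) (toy code; the class here is made up):
Code:
// Methods defined inside the class body are implicitly inline; a
// decent compiler compiles tick() down to a single increment, with
// no function call at all.
class counter {
public:
    void tick() { ++n; }
    int value() const { return n; }
private:
    int n = 0;
};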
Quote:
that's the problem: you lose contact with the system when you do OOP.
Actually no, OOP gives you a way to model how the system actually looks. It's procedural programming that has the awkward perspective on the system. You know a computer is more than a CPU, right? How do you model the fact that you have more than one CPU core running code at the same time? How does that flow chart look when you can walk in two or four directions at once? And how do you model a GPU? How about a SATA controller connecting to a hard drive - how does this become just a series of procedure calls while still reflecting the actual system architecture?
A computer is inherently object-oriented. You "lose contact" with the system by insisting on procedural programming as if a single CPU is still somehow the only significant component you have to deal with. Even as far as that single CPU goes, modern operating systems are best described in OO terms, too.
Quote:
there is no real optimisation anymore or anything.
There is no overhead in calling member functions on objects as opposed to global functions on structures. A class has no overhead compared to a structure. Calling a virtual method is as fast as using a function pointer. There's even still room for assembler coded innerloops without violating any OOP principles. If that's your thing.
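To spell that out, these two cost essentially the same per call (a toy comparison; the widget names are made up):
Code:
// C style: a structure carrying an explicit function pointer.
struct widget_c {
    void (*draw)( widget_c * );
};

// C++ style: a virtual call is the same indirection, through a table
// of function pointers the compiler maintains for you.
class widget {
public:
    virtual ~widget() {}
    virtual void draw() = 0;
};
// wc.draw( &wc ) and w->draw() compile to comparable code.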
Quote:
there is no real optimisation anymore or anything
This is so far from any kind of truth that I'm not sure if I should laugh or cry when I read this. First of all, what Doom said. Second of all I'm just very curious to hear what *your* arguments are to make this bold and false statement? That way we can really deconstruct the false beliefs behind it instead of stating what's true about modern C++ compilers.