For many years I have used object-oriented programming languages. For only a small fraction of that time have I considered the intersection of the language implementation, the conception of OOP, and the desired program behaviour. For an even smaller fraction have I attempted to implement programs according to the proper theory of OOP specifically, rather than producing a jumbled-up mess of knobs and switches that violated every rule in the book at one point or another.
I considered these facts seriously for a long time without any clear answer as to why they obtained. Was it merely a lack of experience that kept me from approaching the problems in those ways? Surely that played a part - after all, it took geniuses many years to come up with all of this, and I would not have realized on my own the solutions to questions about architecture, design patterns, what sorts of classes are appropriate, and so forth. Educating myself about these things through informal study - reading books, listening to lectures, studying others' productions - solved that problem, but it produced a new side effect which became more problematic.
In particular, implementing proper OOP and OOD did not, as I had believed it would, readily improve the readability of my programs, nor did it straightforwardly answer the questions about their design and implementation. Sometimes it did; oftentimes it did not. Fulfilling those best practices could result in the creation of needless classes, the elimination of classes and procedures which provided a coherent picture of the resulting behaviour, and the destruction of simple data structures in favour of an ever-multiplying myriad of more and more abstract entities. Furthermore, the problem of implementing the design patterns (to give one example) pushed the difficulty to a deeper level. Should the architecture of the program be designed so as to best accommodate the design patterns? Maybe so. However, there are times at which this seems like a wooden, almost zombified way of thinking, or at least like putting the cart before the horse.
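To give a concrete (if contrived) sketch of what I mean - the names here are entirely hypothetical, and I'm writing C++ purely for illustration, not from any real codebase:

```cpp
#include <cassert>
#include <map>
#include <string>

// The "simple data structure" version: the whole behaviour is visible at a glance.
int price_with_discount(int base_cents, const std::string& tier) {
    static const std::map<std::string, int> percent_off = {
        {"gold", 20}, {"silver", 10}, {"none", 0}};
    auto it = percent_off.find(tier);
    int off = (it != percent_off.end()) ? it->second : 0;
    return base_cents - base_cents * off / 100;
}

// The "best practices" version: a Strategy hierarchy for the same behaviour.
// Each class is tiny and textbook-correct, but the one rule ("look up a
// percentage") is now spread over four types plus whatever factory picks them.
struct DiscountStrategy {
    virtual ~DiscountStrategy() = default;
    virtual int apply(int base_cents) const = 0;
};
struct GoldDiscount : DiscountStrategy {
    int apply(int b) const override { return b - b * 20 / 100; }
};
struct SilverDiscount : DiscountStrategy {
    int apply(int b) const override { return b - b * 10 / 100; }
};
struct NoDiscount : DiscountStrategy {
    int apply(int b) const override { return b; }
};
```

Both compute the same numbers; only in the second does a simple table become an inheritance hierarchy.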
Or consider another matter: the questions of the design of objects themselves, the hierarchy of classes, the delegation of responsibility, and so on. Here everything becomes even more difficult. If we hold to what I described in the previous paragraph, the problem compounds: a greater number of classes, a more complex architecture, and the sequestering of information within the class hierarchy introduce new issues about actually understanding the code that comes out the other end. It becomes difficult to locate a behaviour because, although it may seem a coherent effect from the user's point of view, what would have been a procedure (or a conjunction of two or three) in a conventional program can become something spread across a number of classes which together create a sort of emergent behaviour. Although classes, the minimization of state, and so on were introduced precisely to reduce the difficulty of debugging these kinds of problems, the problems reintroduce themselves, albeit in a different form.
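A toy illustration of the dispersal I mean (again with made-up names, C++ only for the sake of example) - the procedural version reads top to bottom, while the identical behaviour in the second version emerges from a base class and a subclass that must both be read:

```cpp
#include <cassert>
#include <string>

// Procedural version: "saving" is one procedure you can read in one place.
// (Returning a string of step names just to make the behaviour observable.)
std::string save_record_procedural(const std::string& name) {
    std::string out;
    out += "validate(" + name + ");";
    out += "write(" + name + ");";
    out += "log(" + name + ");";
    return out;
}

// OO version: the same behaviour emerges from a template-method hierarchy.
// To learn what actually happens on save(), you read the base class AND
// every subclass that overrides a step.
struct Persistable {
    virtual ~Persistable() = default;
    std::string save(const std::string& name) {
        return validate(name) + write(name) + log(name);
    }
protected:
    virtual std::string validate(const std::string& n) { return "validate(" + n + ");"; }
    virtual std::string write(const std::string& n) = 0;
    virtual std::string log(const std::string& n) { return "log(" + n + ");"; }
};
struct FileRecord : Persistable {
protected:
    std::string write(const std::string& n) override { return "write(" + n + ");"; }
};
```

Three classes deep, the two do exactly the same thing; the difference is only in how far the reader must travel to see it.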
Both of these questions, I think, serve as examples of what I wish to articulate. To be more explicit: the solutions did not make the problems go away. Rather, the problems forced themselves upon us again, in their own ways, in new forms.
There is more I wish to say; however, I want to keep this post short in the hopes that people will respond. Frankly, I have only just begun trying to "really" use OOP, so I am sure there are plenty of problems with what I've raised.
After falling in and out of love with the object-oriented paradigm several times over the years, I've come to a conclusion very similar to yours. The core of my studies in programming was strictly procedural, so at first I found OOD a wonderful method for deeply considering the implementation of any given program before setting fingers to keyboard; however, this was because, prior to that point, I had not seriously considered any method of planning programs before writing them, and so the substance of OOD quickly occupied the vacuum of knowledge I had in this regard.
At the various peaks of my adherence to the paradigm, even after I became more experienced, I often found myself paralyzed by questions of ownership and scope at the broader level of planning a program, and could never come up with a single good implementation that served every use case. Often when this issue arose it made me wish I could just write the entire thing procedurally, at which point I would ebb back into the comforts of what I know as fundamentals. Some of the best procedural code I've written came immediately after becoming disillusioned with OOD, still using some of those same ideas, albeit modified, while putting hand to plough in the procedural realm.
I would like to add this consideration to your observations as well: in my professional and personal experience, I have not seen any serious improvement in shared codebases that use OOP over a procedural approach or a mixture of both. It is often said of those who write procedurally that each man has his own concept of how best to approach the scope and instantiation of components in that paradigm, but I would be quick to respond that the same issue occurs within the object-oriented realm as often as, if not more often than, it does in the procedural one. I have seen just as much confusion and incongruence arise in even the most well-sculpted object-oriented codebases over a long enough time, as the original intention of the first author(s) inevitably dissolves and others pick up the slack with their own conceptions of how things ought to be. However, because of how rigid OO structures can be, you often see incredibly ugly hacks in such codebases to get some sort of behavior out of a mechanism that was never intended to be used that way, or, failing that, needless repetitions of things that should not need to be repeated in order to achieve the same result. I've dissected and mapped out tens or perhaps hundreds of thousands of lines of the worst spaghetti BASIC code anyone has ever had nightmares about, and I would sooner take that over an object-oriented codebase that has sprawled into an impossible, unruly mess over the years (though that may, of course, just be my prejudice :3).
There are others here who have seen similar things but have spent longer resolving the matter in their own practice, and I would love to hear from them on this - namely flash, who I know still loyally subscribes to the paradigm and uses it very effectively in his projects.
Thanks for your reply. In lieu of an immediate response, I want to say that I have a book lying around somewhere, full of line-by-line TRS BASIC programs. Some of them are absolute spaghetti-code hell of GOTO and GOSUB (for some reason developers seem not to have used many FORs, and there was no WHILE). I want to dig it out and post some on the forum sometime.
I remember hearing a phrase along the lines of "objects are fine, it's the 'orient' that's the problem" and that stuck with me a good bit. Over time I've come to just using objects where they make sense and otherwise writing in the strictly "C with classes" (or "Orthodox C++") style that it seems every seasoned C++ programmer adopts after enough OOP exposure. Basically, objects can work so long as you don't go crazy with inheritance and you aren't forcing yourself to make things into objects that don't really need to be objects. A bad OOP codebase will have you rounding off a square peg to fit the round hole instead of just making yourself a square hole. But ofc not every OOP codebase is like this, and good ones can (and do) exist.
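As a rough sketch of what I mean by that style (an entirely made-up example, not anyone's real code): plain data plus free functions for the logic, and a class only where one actually earns its keep, e.g. to own data and guard an invariant.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Plain data + free functions: no forced "PointManager" objects.
struct Point { double x, y; };

double path_length(const std::vector<Point>& pts) {
    double total = 0.0;
    for (std::size_t i = 1; i < pts.size(); ++i) {
        double dx = pts[i].x - pts[i - 1].x;
        double dy = pts[i].y - pts[i - 1].y;
        total += std::sqrt(dx * dx + dy * dy);
    }
    return total;
}

// A class only where it helps: it owns the points and keeps them in
// insertion order, and its methods just delegate to the free functions.
class Path {
public:
    void add(Point p) { pts_.push_back(p); }
    double length() const { return path_length(pts_); }
private:
    std::vector<Point> pts_;
};
```

No inheritance anywhere; the square peg stays square.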
I guess what it really comes down to is that the more methods you have of abstracting things, the more chaos you're going to introduce in your codebase as the team size grows (so with one developer pretty much anything goes). This goes beyond OOP or programming paradigms in general. The more toys you have to play with, the more chaotic your codebase will be with a large team. C++ is a shining example of this because the language just has too much shit. You'll have a guy who's really into template metaprogramming and "Modern C++", a "C with classes" guy, a "Java with pointers" guy, etc. and all of these people will have a different intuition for abstracting things. So you need to either enforce a specific style with strict guidelines, or use a more restricted language like C (which promotes a specific kind of procedural code), or Java (which promotes a specific kind of OOP code), or whatever else... Y'know how in the Euler thread the other day, Reemo, you mentioned how Perl kinda fell off the face of the earth? I think this is why, it was just too expressive for its own good. If the world wasn't so dependent on it I'd say C++ is headed down that same path.
I once shot a man in Reno, just to watch him die.
> I remember hearing a phrase along the lines of "objects are fine, it's the 'orient' that's the problem"
that's a really interesting quote, I think that's a good way to put it. Having played with both over-the-top C# OOP code and insane spaghetti, I'd also come to the conclusion that the solution is somewhere in the middle. Objects, classes, inheritance, etc. are amazing tools, but in the end that's all they are. Having a really great hammer that does many things really well (or can even be the only real way of doing something) doesn't mean you should be using that hammer to chop down a tree.
With TETR.IO I've kept to using classes, inheritance, etc. where they are useful, and where they end up simplifying the actual business code. If classes add complexity, making it harder to find the actual code, swap it out, etc., then they're just not worth it. It should take at most one or two definition lookups to find what causes something, and that business code should not be broken up over many different classes. (I tend to avoid calling supers in methods for that reason too, unless the two methods are unrelated.)
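The super-call point, sketched in C++ with hypothetical names (not actual TETR.IO code, which is JS): when an override chains to the base method, the effective behaviour is the concatenation of two bodies in two files, so every question costs an extra definition lookup.

```cpp
#include <cassert>
#include <string>

// With a super call, on_click's full behaviour lives in two places:
// half here, half in whatever Button::on_click does.
struct Button {
    virtual ~Button() = default;
    virtual std::string on_click() { return "focus;"; }
};
struct SaveButton : Button {
    std::string on_click() override {
        return Button::on_click() + "save;";  // the other half is elsewhere
    }
};

// Without it, the method body is the whole story: shared steps are
// explicit helpers the reader can see being called right there.
std::string focus() { return "focus;"; }

struct SaveButton2 {
    std::string on_click() { return focus() + "save;"; }
};
```

Both do the same thing; the second answers "what happens on click?" in one lookup.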
In general, my approach to programming (proper programming, not just making a quick thing I know I won't need to touch later) is more about keeping the business code (the stuff you're actually likely to be adding random changes, bugfixes, etc. to) simple, and not burdening it with having to fit a specific shape. Especially with bugfixes you're going to end up overstepping that shape, and the chance of The Laze setting in and making a quick hack to make it work, as opposed to rewriting the entire thing to fit better, is very high - especially considering the latter doesn't tend to improve the code from its previous state.
For that purpose, I've sometimes even gone out of my way to write code that is very hacky and weird, knowing that it's a net benefit because it makes the business code a lot simpler. For example, my new UI framework uses some really stinky regexes to edit JS as it's built, things that other devs would probably shame me for. But in the end it makes all the actual business code (all the UI components) really simple to write and understand. Paradigms and standards are tools, and good ideas to keep in mind, but in the end you mustn't lose sight of what actually matters (code that you can actually work with).
On top of all that, I generally try to remain a little sceptical about hard-forcing any design concept. In the end, I'm still learning, and what I think is well-designed code now I will likely see as dogshit trash in 5 years. Sure, some of that is caused by tacking on bugfixes and small changes, but in the end it's also just the fact that you're learning. So remaining a bit more open helps soothe the pain when you realize you have to live with the fact that you know better now.