...that to describe object-oriented programming in terms of the categories of polymorphism, inheritance, and encapsulation was nearly to paint a trite and oversimplified picture of the discipline. I think he said it was the answer that any job interviewer would probably be looking for, but that it indicated a stunted understanding of the topic. Unfortunately, he was only making a brief comment in passing and didn't have time to elaborate at the moment, so I didn't get to hear what his Grand Vision of Things OO actually was. (I was disappointed at the time, and I say this without irony.) Hence my question: If you had to summarize the essentials of object-oriented programming as concisely and precisely as possible, what would you say? Do you think the textbook answer is useful? To what degree? Everybody (including Paul), take your best shot. Tim P.S. This question is pretty much just for fun. Interested parties only.
Wonderful question! For me, OO programming is more of a dream world. It's something that makes me believe I can write the best code (software solution): absolutely flexible, wonderfully scalable, easily maintainable. I hate finding design/coding bugs. I want OO programming to enable me to write something that is absolutely bug-free and very plug-and-play, something that gives me interface ports where I can mould it in whatever way I want. I want OO to let me write code that never dies, code that is a piece of cake for people to see/eat. And my dreams go on.........
Paul Wheaton is absolutely right. The things he describes are "aspects" of OO, but they are simply learnable concepts (or even just buzzwords) without an understanding of the issues. Most software is over budget and overdue, and the "software crisis", or normal state of affairs (Booch), has continued for many years. OOA&D hopes to address this issue, and things like encapsulation and inheritance will reduce testing and debugging time for a large project. OO also allows programmers to discuss the problems they are solving ON BEHALF OF THE END USERS with those same end users in high-level terms. That is, the user says "we need a booking system for our hotel", and the obvious classes are Room, Hotel, etc. We can discuss the attributes these classes will require with the user, thus modelling (as best we can) real-world objects. It moves us away from the days when programmers worried about whether to use a stack or a queue and had no structures that the average user could relate to. Obviously, in the above example, the user can relate to objects such as rooms. Finally, re-use: we find that re-use of OO expertise comes before re-use of the actual software, and as OO lets us use real-world objects as the base for our solution, we should be designing components that have a use within the greater problem domain. Only with OO is any of this possible. pete (sorry if I'm waffling on and not making sense)
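To make pete's hotel example concrete, here is a minimal Java sketch (all names here are hypothetical, not from any real booking system) of the kind of classes the end user could recognize: a Room that knows whether it is booked, and a Hotel that hands out free rooms.

```java
import java.util.ArrayList;
import java.util.List;

// A Room models the real-world thing the user talks about.
class Room {
    private final int number;
    private boolean booked = false;

    Room(int number) { this.number = number; }

    int getNumber()     { return number; }
    boolean isBooked()  { return booked; }
    void book()         { booked = true; }
}

// A Hotel holds rooms and books the first free one.
class Hotel {
    private final List<Room> rooms = new ArrayList<>();

    void addRoom(Room r) { rooms.add(r); }

    // Returns the room that was booked, or null if the hotel is full.
    Room bookAnyFreeRoom() {
        for (Room r : rooms) {
            if (!r.isBooked()) {
                r.book();
                return r;
            }
        }
        return null;
    }
}
```

The point is not the code itself but that "room" and "hotel" are words the user already uses, so the attributes of each class can be discussed with the user directly.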
Interesting! There is no answer to this beyond the textbook description. In the real world, everybody who writes C++/Java code claims he or she is doing OO programming. However, the meaning of OO varies from person to person. What you actually see is people using a difficult way to write C code, and it's even worse if Java is the chosen language. This leads to code nobody can understand. Why? Because the structure of the application gets lost in the name of OO. Good programmers write for people; bad programmers write for the machine. And if even the machine cannot understand the code, then of course its author is not a programmer at all.
Well, the subject line of this thread caught my attention. Abstraction, polymorphism, inheritance, and encapsulation are the four big words that some folks use to define OO. I think most of the folks who are glued to these words aren't completely comfortable with OO, but they carry the four words with them as proof that they are. To me, the magic of OO is that you get to extend the language with your own types (classes). Suppose you write a good program with quality types. Then you need a different program in the same industry - well, it just got a lot easier, cuz you have a language that has already been extended to be familiar with your industry. You will probably extend it just a little more, and your new program will be 20 lines or so. Granted, this is not a comprehensive description of OO, but I do think it makes OO a lot easier to swallow than the big four words above.
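A tiny Java sketch of the "extending the language" idea: once a domain type exists (Money below is a hypothetical example, not a real library class), later programs in the same industry become short compositions of existing types rather than fresh low-level work.

```java
// A domain type that "extends the language": code elsewhere can now
// say money.plus(other) instead of juggling raw longs everywhere.
class Money {
    private final long cents;   // store cents to avoid floating-point error

    Money(long cents) { this.cents = cents; }

    // Value-style arithmetic: returns a new Money, never mutates.
    Money plus(Money other) { return new Money(cents + other.cents); }

    long inCents() { return cents; }
}
```

With a handful of such types in place, the "new program" Paul describes really can shrink to a few lines, because most of its vocabulary already exists.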
Originally posted by Laojar Chuger: [...] What you actually see is they are using a difficult way to write C codes. [...] Good programmers write for the people, bad programmers write for the machine.
I think you have a good point there, but I think it's mainly up to the programmer. Some people overdo the classes, inheritance, and other features. That makes it very difficult to track down problems and to add features when you've got to trace through several files. I think OO is good in moderation -- used only when it makes sense, and not just because you can.
I like the pasta analogy. Procedural code can be like spaghetti - lots of long strands twisted and knotted around each other and very hard to follow. OO code can be like ravioli - a mess of small lumps with no obvious connection and no indication of what's inside. But just because they can be, doesn't mean they have to be. Good naming practice and package discipline, an understanding of patterns and intelligent refactoring can make either sort of pasta quite digestible.
Let's use Booch's plane analogy. He said that a plane is an example of a "system effect": we combine parts in such a way that the resulting system has features none of its parts have. The pilot, the plane itself, and the fuel cannot fly, while the union of the three can. (Most of the time.) Let's return to programming. We have 1) data and 2) code. The OO paradigm combines them in such a way that the resulting system receives new features (through the mechanisms of polymorphism and inheritance). For example: we have a well-defined "someObject" and also a well-defined someMethod(). So far, so good. Now let's combine them: someObject.someMethod(); and voila - if polymorphism and inheritance did their job, then nobody can say for sure what this piece of code does (without further investigation). Thus, our system has obtained an important new feature: unpredictability. Joke - just to illustrate the main point. Extensibility is probably the main result; unpredictability is just its flip side. Better modelling, re-use, reduced testing and debugging time, etc. are the consequences of such a combination.
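The someObject.someMethod() point can be sketched in Java (Shape, Square, and Circle are hypothetical names chosen for illustration): the call site is fixed, but which method body runs depends on the object's runtime type.

```java
// The call site s.area() never changes, yet its behavior does:
// dynamic dispatch picks the body from the runtime type of s.
abstract class Shape {
    abstract double area();
}

class Square extends Shape {
    private final double side;
    Square(double side) { this.side = side; }
    double area() { return side * side; }
}

class Circle extends Shape {
    private final double radius;
    Circle(double radius) { this.radius = radius; }
    double area() { return Math.PI * radius * radius; }
}
```

Reading only `s.area()` you cannot say which formula runs - that is the "unpredictability" joke, and also exactly the extensibility: a new Shape subclass works with every existing call site.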
I feel that as soon as you start to use terms such as encapsulation, data-hiding, object coupling, etc., you start to *lose* the high-level concept of OOP. These terms are used to differentiate between OO and procedural programming and are better suited to explaining the 'how': "OO is not procedural because 'insert OO buzzword/academic definition here'." My favorite descriptions of OOP relate the overall concept to something innately human, or common enough to be readily understood. When my girlfriend (not a programmer in any regard) asked "What is object oriented, anyway?" I said something like "It's making code chunks that work like real-world things. Like a photocopier - you put the paper on top, hit 'copy' and a copy pops out. Ideally OOP is just like that - you don't need a manual to use the object and you don't need to be a technician to understand it." Now a programmer, once he understands this, can ask 'how', and then you delve into encapsulation, polymorphism, and so on. If you don't have a solid high-level view like this (and you'd be surprised how many programmers really don't) then you're just building houses from the inside out. Take some procedural code, extend some class - is it an object yet? Add an interface, define some public methods - is this OOP? I'd prefer you draw a square on some paper and show one input and one output. Then build the guts. Of course, this is just the starting point, but at least it's from the outside in. There's my 2 cents Sean
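Sean's "draw a square with one input and one output" could be sketched in Java as an interface (Copier and SimpleCopier are hypothetical names): the caller sees only copy(), and the guts get built afterwards, from the outside in.

```java
// The "square on paper": one input, one output, no visible guts.
interface Copier {
    String copy(String original);
}

// The guts, built later. Here trivially a duplicate of the input;
// a real implementation could scale, dither, collate, etc. without
// the caller ever changing.
class SimpleCopier implements Copier {
    public String copy(String original) {
        return new String(original);
    }
}
```

Like the photocopier, the user of a Copier needs no manual for its internals: the interface is the whole contract.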
The best of OO, IMHO, is encapsulation. You have limited access, and the behavior of each object is defined. Personally, I don't like code that changes the values of variables everywhere - have you ever debugged such code? I mean with something like a watch in the debugger. To me it's just a nightmare. And then, I would be really surprised if someone said: - OO is bug-free. Not always, and even procedural code can be bug-free. - OO gives you code that never dies. Well, what about deprecated methods? - OO programming enables me to write something absolutely bug-free. Actually, IMO, even procedural programming allows you to write absolutely bug-free code. regds. - satya
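A minimal Java sketch of the encapsulation point (Account is a hypothetical example): the variable can change only through one defined behavior, so a debugger watch has exactly one place to look instead of "everywhere".

```java
// balance is private: no code outside this class can touch it,
// so every change must pass through deposit().
class Account {
    private long balance = 0;

    void deposit(long amount) {
        if (amount < 0) {
            throw new IllegalArgumentException("negative deposit");
        }
        balance += amount;   // the single point of change
    }

    long getBalance() { return balance; }
}
```

Contrast this with a public field mutated from a dozen files: same data, but the defined-behavior boundary is what makes the object debuggable.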
Thus far, I tend to think of OO as linguistic operationalizing of systems meta-modeling whose underpinning relies on the mathematical theory of classes and whose task it is to optimize the efficiency of the information processing. LOL
Wow, what a metaphysical discussion we have going... IMHO, the true value of OO is that we can model real-world objects. Inheritance, encapsulation, etc. are just icing on the cake. The two primary benefits I see are as follows. First, traditionally, if you had a problem and needed a program, you had to explain the problem to the programmer, who had to come up with a computer solution (Laojar's point about programming to the computer). With OO we can represent the domain in a manner more readily understandable to non-programmers. Consequently, the programmer can rely on the domain expert to solve the problem (which the expert understands better than the programmer to begin with - in theory, anyway), and the programmer can map that solution onto the computer. Second is abstraction. Yes, this is a canonical OO buzzword, but it makes more sense in OO, because the boundaries relate to real-world, often physical, boundaries with which we are familiar. Most half-decent programmers can create a system whose boundaries make sense to them. Rarer are the programmers who create programs with abstraction boundaries that still make sense to another programmer six months after the code was written; many junior and intermediate programmers don't do this well. By modeling the "real world", the boundaries make more sense to everyone.
The thing that gets me in describing and initially teaching OO is that it is all too common to immediately start using terms that you'd only ever use in relation to OO concepts! "Oh, OOP is the use of inheritance and data encapsulation to create abstract data types" (I just turned my own stomach typing that). What does the preceding mean to anyone who doesn't already understand these concepts in the first place? Of course, a programmer eventually needs to understand these things completely, but to define and understand OOP, we should avoid such terms as the foundation for one's initial introduction. Sean
That seems to be a problem with all higher education. Why begin by telling me about parallel metal plates, dielectric constants, and coulombs, when you could just start by saying that a capacitor is like a storage tank? There is no delivery of the big picture. This is Low-Grade Pedagogy, nothing more. Maybe Psychology and Education majors have it better; I don't know. I wonder why this is so. C.S. Lewis once observed that one who can describe an abstruse topic in layman's language is one who truly understands his topic. Having to resort to jargon is an indication of a lack of mastery, though I would stop short of calling it a sufficient condition. Surely our computer science professors know better. (I suppose our graduate student assistants may not!) This is not to say that jargon is useless. One of the reasons I started this thread is that I see value in terms like inheritance and polymorphism, and I wanted to see what could be said about Things OO without them. In their defense, they describe real phenomena at work in OO programming and in real life. Even so, many of the descriptions and metaphors given here show a more comprehensive understanding of the topic than can be shown with mere technical terms. Tim
Just got to chime in here... I'm with Sean; I agree with what he said in his 31-Jan 10:25am post. For me, the 40,000 foot answer to "why is OO good?" is that it supports a style of development in which the code can actually reflect the analysis - the things you discover early in thinking about your program lead you naturally to a design, and the things you create in your design are visibly present in the code. Drop down to 30,000 feet and ask "how?", then you start thinking about P, I, and E. Of course, a lot of procedural code still gets written in OO languages, and it's even harder to understand than if it were written in C. I'm neck-deep in supporting a largely procedural database app that was written last summer - in Java. What an opportunity to practice refactoring... So I'm going to twist the original question Tim asked just a little bit: I'll say the essential heart of OO development is building models - model the real world in your analysis, model the analysis in your design, model the design in your code. Whether working in Java or an assembler, the single critical thing that separates good programmers from average or worse is the ability to apply the appropriate level of abstraction to the problem in front of you today. The advance that OO brings, via languages and modern development practices, is a set of tools that lets people with this ability work magic. Just my opinion, of course. I could be wrong... Jerry
Wonderful! Makes a lot of sense to me. Thank you. Way to go...
Originally posted by Jerry Pulley: [...] I'll say the essential heart of OO development is building models - model the real world in your analysis, model the analysis in your design, model the design in your code. Whether working in Java or an assembler, the single critical thing that separates good programmers from average or worse is the ability to apply the appropriate level of abstraction to the problem in front of you today.
Modeling after the real world - excellent! This really gets to the heart of the matter. It's something I mentioned in a post once, and I think I just made everyone think I was nuts. I suggested that truly gifted coders aren't caught up in the nuts and bolts of language specifics, but rather in more abstract conceptual overviews of problems. For me it's similar to a great musician speaking of being inspired by a great painting. A student will often nod in awe but actually think "How does that make you play scales so fast?" When the technique of playing is no longer the focus for a student, what's left? How do you become better? Answer: go look at great paintings and you'll figure it out. I guess what I'm trying to tie together here is the observation that a high-level understanding equals a deep understanding, plus the purity of natural designs. So many things in nature are rather perfect in design, yet science often contorts solutions into very unnatural things, like an "all-new multi-chopper kitchen assistant do-hickey" and the like. Tools like this make great infomercials, but a well-trained chef will put most of these devices to shame with a good sharp chef's knife. The knife has been around since the Bronze Age (at least), and I doubt its basic design will ever be improved upon, basically because it is a very natural design (like a tooth or a shard of flint). People are all too often designing and building Swiss Army knives when a pair of chopsticks will do the trick.
subject: I once heard the great Paul Wheaton say...