The ability to subtype classes is a consequence of a language defect
How can this statement be true? Without subclassing, how can the language become object oriented?
I am starting a new thread for this because the goal of the original thread was different. I will repost the responses I got below. [ December 06, 2005: Message edited by: jiju ka ]
Reply posted by Ken Blair:
Originally posted by jiju ka:
Can somebody explain the logic behind this statement?
I believe the contrary is true. Inheritance is prevented by declaring a class as final. In nature (among real-world objects), inheritance is not prohibited.
[ November 30, 2005: Message edited by: jiju ka ]
I believe he was suggesting that being able to extend a class is a language defect. Yes, inheritance is prevented by declaring a class as final. You call that the contrary, but it is not contrary; it is irrelevant. What is a "real world object" anyway? If it's anything I've coded, it's final and inheritance is prohibited; there's only one time I've ever used concrete inheritance.
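Ken's preference for final classes can be sketched in Java: behaviour is reused through composition (delegation) rather than by extending a concrete class. The class names below are invented purely for illustration; they come from no library or poster's code.

```java
// Hypothetical sketch: reuse via composition instead of concrete inheritance.
// Both classes are final, so no subclassing is possible.
final class Engine {
    String start() { return "engine started"; }
}

final class Car {
    private final Engine engine = new Engine(); // has-a, not is-a

    String start() {
        // Delegate to the composed object instead of inheriting from it.
        return engine.start() + ", car moving";
    }
}

public class CompositionDemo {
    public static void main(String[] args) {
        System.out.println(new Car().start()); // prints: engine started, car moving
    }
}
```

The design point is that `Car` gets `Engine`'s behaviour without exposing itself to any of inheritance's coupling, yet both classes remain sealed against extension.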
Reply from Tony Morris:
Originally posted by Greg Charles: The ability to subtype classes is a language defect? I'm not following your logic there. Be that as it may, Java and all languages that permit inheritance suffer this defect. Therefore immutable classes should be final.
Yes, concrete inheritance (and interface inheritance as we currently know it) is a language defect, and can be traced to an implicit requirement defect under formal analysis given a set of axioms (one of which, for example, is that time is linear; therefore, the assertion may well fall apart at the relative speed of light). Complete reasoning omitted for brevity, and for now; let's assume it anyway.
Originally posted by jiju ka: Without subclassing, how can the language become object oriented?
To me, the most important feature of an OO language is runtime polymorphism.
Polymorphism doesn't require subclassing, though. Even in Java, you *could* live with only interface inheritance. In other (dynamically typed) languages, such as Smalltalk, Ruby and, I think, Groovy, no inheritance at all is necessary for polymorphism. In those languages, the only reason to use subclassing is for easy code reuse.
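As a sketch of that idea: runtime polymorphism in Java using only interface inheritance, with every concrete class declared final. The types below are hypothetical examples invented for this illustration, not taken from any library.

```java
// Hypothetical sketch: dynamic dispatch through an interface alone.
// No class extends any other class, yet callers remain polymorphic.
interface Shape {
    double area();
}

final class Circle implements Shape {
    private final double r;
    Circle(double r) { this.r = r; }
    public double area() { return Math.PI * r * r; }
}

final class Square implements Shape {
    private final double side;
    Square(double side) { this.side = side; }
    public double area() { return side * side; }
}

public class PolymorphismDemo {
    public static void main(String[] args) {
        // The caller sees only Shape; the concrete class is chosen at runtime.
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape s : shapes) {
            System.out.println(s.area());
        }
    }
}
```

Here the final classes cannot be subclassed at all, yet the loop still dispatches dynamically; this is the sense in which interface inheritance alone suffices for polymorphism.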
Having said that, I don't know what Tony means by "language defect" in this context. The only thing I can report is that subclassing can be overused, but if used wisely, it seems to be a quite valuable feature to me.
To draw an analogy (and to sidestep the issue, since debating it on forums is fruitless), imagine an objective of 'eating a cake' (producing 'valid' (formally defined, but unstated) software). To achieve your objective, you jump aboard your scooter and head to your local hardware store. You have now produced a contradiction in your requirements. You could of course change your requirement to something like 'buy a box of nails', in which case the contradiction goes away, but you'll be left with a box of nails and not a cake. You'll note here an assimilation to the Scientific Method, a method for deriving "truth on some given axiom" (Galileo is my hero... ok, maybe not). The least plausible axiom in the "formally defined, but unstated" definition of valid software is non-zero constant time, which we all know falls apart under certain conditions (changed axioms) such as the relative speed of light.
Since it has been ridiculed before, and it's mildly amusing to observe, I'll state it again. It is reasonable to suggest that since at the speed of light, the formal definition of valid software, of which concrete inheritance (and all inheritance as we know it) is a direct contradiction, falls apart, concrete inheritance may indeed become valid within a new axiom (all other axioms withstanding).
Up to this point, much of the audience for my words has decided to break out the matches and tie me to the stake. I blame propaganda/marketing material ultimately; pure speculation. However, I am in the process of preparing a less formal (and therefore less precise) publication, to hopefully identify with a wider audience. I hope to invoke objective and analytical thought processes on their behalf.
Until then, I can only refer you to the ContractualJ API Specification, which strictly forbids the use of concrete inheritance: "All classes are declared final. Concrete behaviour inheritance is not permitted."
...and attempts to minimise the impact on clients of the flaws of interface inheritance: "All contracts (interfaces) declare one (and only one) operation, unless they are composite interfaces, in which case, they make an effort to approach "Perfect Symbiosis of contractual operations" (to be defined)."
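As a rough illustration of the "one operation per interface" rule, the following sketch uses invented names; they are not taken from the actual ContractualJ API.

```java
// Hypothetical sketch of a single-operation contract: one interface,
// one operation, implemented by final (anonymous) classes that compose
// rather than inherit. Names are invented for illustration only.
interface Transformer<A, B> {
    B transform(A a);
}

public class SingleOperationDemo {
    public static void main(String[] args) {
        // Each behaviour is its own tiny contract; an anonymous class is
        // effectively final, so no concrete inheritance is possible.
        Transformer<String, Integer> length = new Transformer<String, Integer>() {
            public Integer transform(String s) { return s.length(); }
        };
        System.out.println(length.transform("ContractualJ")); // prints 12
    }
}
```

A side effect of the one-operation rule is that implementations stay trivially substitutable: there is no second method whose contract a caller could accidentally depend on.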
Jiju - I think if you look through the source of ContractualJ and JTiger, you will get a good idea of how this stuff can be achieved. My experience is that Tony's code is often more informative and useful than his verbal expositions are. Your mileage may vary, of course. [ December 06, 2005: Message edited by: Jim Yingst ]
I think Tony has laid out some very interesting rules in his ContractualJ pages. You'll agree with the "defect" argument only if you agree with the rules and believe the language should enforce the rules. The designers of Java obviously didn't. That doesn't make either camp right or wrong; it just means they have different priorities and Tony has the advantage of another decade of study.
Even if you don't call the ability to extend concrete classes a defect, it's easy to show that it's a risky practice and many of us avoid it. I'm not quite so ready to adopt all of his other rules ... yet. [ December 06, 2005: Message edited by: Stan James ]
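One concrete way to see the risk Stan mentions is the well-known self-use trap (popularised by Joshua Bloch's Effective Java): `HashSet.addAll` happens to call `add` internally, so a subclass that overrides both silently double-counts. The class name below is the conventional textbook example, reproduced as a sketch.

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;

// Sketch of the fragile-base-class problem: this subclass tries to count
// insertions, but HashSet's own addAll calls add internally, so every
// bulk insertion is counted twice.
class InstrumentedHashSet<E> extends HashSet<E> {
    private int addCount = 0;

    @Override public boolean add(E e) {
        addCount++;
        return super.add(e);
    }

    @Override public boolean addAll(Collection<? extends E> c) {
        addCount += c.size();
        return super.addAll(c); // super.addAll dispatches back to our add
    }

    public int getAddCount() { return addCount; }
}

public class FragileBaseDemo {
    public static void main(String[] args) {
        InstrumentedHashSet<String> s = new InstrumentedHashSet<String>();
        s.addAll(Arrays.asList("a", "b", "c"));
        System.out.println(s.getAddCount()); // prints 6, not the expected 3
    }
}
```

The bug exists only because the subclass depends on an undocumented implementation detail of its concrete parent; with composition (wrapping a `HashSet` behind an interface) the count would be correct regardless of how the parent implements `addAll`.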
My thought processes get slower as they approach the speed of light, but I can tell you that this bakery in Corsicana Texas makes really good fruitcake. Bill
The ContractualJ rules certainly are interesting. Some of them actually reminded me of old discussions at Ward's Wiki, such as MethodsShouldBePublic.
It would certainly be interesting to see what happens when you try to follow those rules. I'm not sure that I'd want them to be hard and fast rules - I'm leaning more towards heuristics in these things. That could be a personal defect, of course...
I hate to admit it, but I still have no idea what all of this has to do with requirements defects or "non-zero constant time". I look forward to more uncoverings in the future...