Posted by Jerry Stuckle on 10/07/07 13:31
Ben C wrote:
> On 2007-10-07, SpaceGirl <nothespacegirlspam@subhuman.net> wrote:
>> Ben C wrote:
>>> On 2007-10-06, SpaceGirl <nothespacegirlspam@subhuman.net> wrote:
>>>> Ben C wrote:
> [...]
>>> For better or for worse.
>> For better, as it really encourages reuse and superclasses (classes of
>> classes etc).
>
> In theory, yes. Where it goes wrong, I think, is when it encourages
> people to design arbitrary class hierarchies, which either get
> overcomplicated or require constant and painstaking "refactoring" when
> the requirements change.
>
OO doesn't "encourage" anything, just as C doesn't encourage spaghetti
code. People can design as well or as poorly as they like.
> One might say, well, that's just bad OO programming, not OO programming
> in general. But that's a cop-out; the real question is how hard or easy
> it is to do good or bad OO programming.
>
It's bad OO design. It is easy to do good OO programming with the
right training and experience.
> OO can encourage people to make too many design decisions up-front,
> before they really know what they want to do yet.
>
Good design (not just OO) dictates that your decisions MUST be made up
front. Can you imagine creating the blueprints after the house is half
built? But that's how a lot of people approach programming problems.
> It's supposed to protect against dreaded type mismatch errors (you pass
> the wrong type of object to a function by mistake), but how often do
> such errors actually happen?
>
Nothing in OO protects against type mismatch errors. That is completely
dependent on how strict the language's type checking is. Pascal, for
instance, is a non-OO language, yet it does not allow type mismatches.
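
As a quick illustration (my own sketch, not something from the original
exchange; the struct and function names are invented): C has no OO
features at all, yet its compiler's static type checking is what
rejects a mismatched argument.

/* Sketch only: static type checking in a non-OO language.
 * Passing the wrong struct type to print_invoice() draws a
 * compile-time diagnostic; no classes or objects are involved. */
#include <stdio.h>

struct invoice  { double amount; };
struct customer { char name[32]; };

static void print_invoice(const struct invoice *inv)
{
    printf("amount: %.2f\n", inv->amount);
}

int main(void)
{
    struct invoice  inv  = { 99.95 };
    struct customer cust = { "Jane" };

    print_invoice(&inv);        /* fine */
    /* print_invoice(&cust); */ /* rejected: incompatible pointer type */

    (void)cust;                 /* silence the unused-variable warning */
    return 0;
}

The same call would sail through in a language with looser checking,
which is the point: the protection comes from the type system, not
from object orientation.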
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================