Fog Creek Software
Discussion Board




Objects Flamewar

I took a dissenting view of the desirability of object
oriented programming in other discussion threads,
and got shot down, so I thought I'd fight back here
rather than polluting other discussions.

My case is that objects = inheritance, and that
inheritance = spaghetti, and therefore that objects
= spaghetti.

People say that object programming is actually
about modularity, abstraction and encapsulation.
This is because the Object Lobby, who think
they are some kind of programming senseis,
have spread the following propaganda:

"Up to now, you foolish mortals have been
suffering under the Wheel Of Fortran77, but
now you may reach Enlightenment under the
Tree Of SmallTalk, if you realize the Noble Truth,
that Programming is Information Hiding!"

However, in the real world we were using things
like Turbo Pascal in pre-object days, and had to
switch to things like Visual C++ when the Pointy
Haired Bosses decided that objects are A Good
Thing.

Now the information hiding facilities of Visual C++
are Classes (compromised by Friends) and #include,
whereas Turbo Pascal has Units (not compromised
by Friends). So actually, object orientation probably
means *worse* modularity, abstraction and
encapsulation.

People will also say that objects are about
polymorphism, but in object languages this means
overriding methods in subclasses. So if you kill off
inheritance (which I want to do) you don't get
polymorphism (because you don't get subclasses)
in the languages that people actually use.
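The mechanism A is describing, polymorphism as method overriding in subclasses, can be sketched in a few lines of Python; the Shape/Circle/Square names are invented purely for illustration:

```python
# Subclass-overriding polymorphism: one call site, many behaviours.
class Shape:
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):              # overrides Shape.area
        return 3.14159 * self.r * self.r

class Square(Shape):
    def __init__(self, s):
        self.s = s
    def area(self):              # overrides Shape.area
        return self.s * self.s

def total_area(shapes):
    # The caller never names a subclass; each object picks its own area().
    return sum(s.area() for s in shapes)
```

On A's account, killing inheritance removes the subclass relationship this dispatch hangs off in C++-style languages.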

Now that I have made my case for "objects =
inheritance", I want to say why I want inheritance
killed off, which is that "inheritance = spaghetti".

This is because your code becomes riddled with
dependencies from the very bottom of your
hierarchy right the way up to the superclass, and
so very easy to break. This is far worse than the
problems with global variables and goto's,
because *they* are at least explicitly visible. But
the dependency chains you get from inheritance
are too opaque because they are complicated
and partially hidden.
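The kind of opaque dependency chain A means can be shown concretely; this is a hypothetical Logger example, not taken from any real code base:

```python
# The subclass silently depends on *how* the base class is implemented,
# not just on its documented interface.
class Logger:
    def __init__(self):
        self.lines = []
    def log(self, msg):
        self.lines.append(msg)
    def log_all(self, msgs):
        for m in msgs:
            self.log(m)          # internal self-call: no documented contract

class CountingLogger(Logger):
    def __init__(self):
        super().__init__()
        self.count = 0
    def log(self, msg):
        self.count += 1          # works only while log_all() funnels through log()
        super().log(msg)

# If Logger.log_all is later "optimized" to append directly to self.lines,
# CountingLogger.count silently goes wrong, with no change at any call site.
```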

At this point, people say "No! This is a feature!
It's your fault for abusing it! The compiler should
Trust The Programmer, and *you* shouldn't code!"

Unfortunately, in the real world, *I* am not
responsible for the garbage I have to maintain.
So this approach will not work:

Boss: Fix this
          (shows SpaghettiHierarchyNightmare.cpp)
Me:    How dare you write such garbage code!
          Go back and rewrite all 300 Megs of it before
          you pass it to me, you cretin!
Boss: Take this instead
          (shows pink slip)

In the real world, we don't want the compiler to
Trust us, we want it to prevent the cretin from being
able to write SpaghettiHierarchyNightmare.cpp in the
first place. We want Control Of The Idiots (C.O.T.I)
that preceeded us.

Someone would have to pull many stunts in a
non-object language in order to produce inheritance.
This means that the cretins could not produce a
spaghetti hierarchy nightmare as casually as they can
with C++, but we know it is still *possible* to do
inheritance without objects when *absolutely*
necessary because of Turing equivalence.

I'm fine with that, because the number of problems
that are inherently (no pun intended) hierarchical
is very small compared to the number to be solved in
total. A spreadsheet is not hierarchical. A word
processor is not hierarchical. An instant message is
not hierarchical. Only oddball things like Bill-Of-Materials
queries or C.A.D. graphs are hierarchical.

So in practice, if you program with objects you get a
small benefit for a *few* oddball problems you have
to solve, and a large decrease of C.O.T.I in *all* cases,
because the cretins start making unnecessary spaghetti
hierarchy nightmares all over the place, making it harder
to maintain, evolve or re-use the code, and lumping
*you* with the responsibility of maintaining the cretins'
garbage.

At this point, the Object Lobby might say "Well, why did
you hire cretins in the first place?". Yes, I admit it, the
recruiting practices of the Fortune 1000 are all *my*
fault.

Back in the real world, the cretin filters weed out
low IQ types. What remains is mere mortals with
800 GREs in analytics and computer science like me,
whom Joel would call "C Players", and a bunch of
autistics ("A Players"). They delight in riddling their
code with stupid unnecessary dependencies, they
*can* keep these all in their heads for maintenance
purposes, and they see no problems in burdening
their team members by dumping on them with
SpaghettiHierarchyNightmare.cpp.

If I didn't know better, I'd think they were deliberately
sabotaging the maintainability (by anyone else) of
the code, in order to render themselves indispensable
and their jobs secure.

It is these "A Players" that *will* get hired, who are the
cretins that the compiler *should* control, in order to
prevent this sabotage. Taking the horrors of objects
and inheritance out of their toy box is a very good step.

A
Saturday, February 21, 2004

Come on, get a grip.  Insulting the "Object Lobby" doesn't make your case any better.  Denigrating object-oriented programming in general is a losing battle.  Obviously it can be done badly, but you would throw out the good with the bad.  In fact, inheritance and polymorphism can be used to great effect even in a word processing program, where document elements share a great deal of data and behavior.

K
Saturday, February 21, 2004

"Now the information hiding facilities of Visual C++
are Classes (compromised by Friends) and #include,
whereas Turbo Pascal has Units (not compromised
by Friends). So actually, object orientation probably
means *worse* modularity, abstraction and
encapsulation."

You're confusing OOP with C++ (a portable macro assembler in evening gown, to quote a poster on the Fortran newsgroup).

The horribly clunky header file mechanism does fly in the face of any concept of encapsulation or data hiding but that's not the fault of OOP -- it's the fault of C, the amazingly primitive non-OOP language on which C++ is based.

Likewise, the "friend" keyword is a C++ specialty that is not mandated by any OOP principles and which does not exist in C#, for example.

Chris Nahr
Saturday, February 21, 2004

What is your alternative? You sound like a person
who thinks any code you did not write is crap
because you don't understand it immediately.

Complex things are complex. They can be made
more complex or less clear through bad programming.
But that is true in any paradigm.

Inheritance is a way to specify interface and
get an implementation for cheap. How would
you do it?  There's nothing hidden. It's all there
in the text.  If it's too much effort for you to
understand then pretty much anything will
be too much effort, because you'll need to
do something equivalent to solve a problem.
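son of parnas's "specify interface and get an implementation for cheap" can be sketched in the template-method style; the Report names here are hypothetical, chosen only for illustration:

```python
# The base class supplies the algorithm once; subclasses fill in the
# one piece that differs.
class Report:
    def render(self):                    # implementation inherited for free
        return self.header() + "\n" + self.body()
    def header(self):
        return "REPORT"                  # sensible default, overridable
    def body(self):
        raise NotImplementedError        # the interface to fill in

class SalesReport(Report):
    def body(self):                      # the only code the subclass writes
        return "sales: 42"
```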

How inheritance fits in with hierarchy is somewhat
mystifying. Inheritance is a way to structure code.
That's it. Anything deeper metaphysically is your
own baggage.

It looks like spaghetti because there
is indirection. If you don't use indirection
how can you reuse stuff you have already built?
How can you call different stuff at runtime?
How can you build anything general?

You don't like encapsulation, modularity,  polymorphism,
but what is your alternative?

And you prove you don't understand by not understanding
what friends means. Friends recognizes that packages
of classes work together, that a class is not standalone,
so you need a way to have access like all the classes are
the same thing, which they are, but provide different
interfaces for clarity.

son of parnas
Saturday, February 21, 2004

Most people misunderstand the true benefit of OOP.

It's not that you can reuse the OBJECT. It's that by building a program around an object, you can change the object and reuse the PROGRAM.

I mean, really, maybe 2% of my programs have been Object(s) code.

Reusing that 2% gives me a 2% productivity boost. Not worth the trouble. But reusing the other 98%? Priceless.

EXAMPLE
I wrote an educational software program. I then wanted to sell a SIMILAR program that differed slightly. (instead of multiple choice answers, this new one did fill in the blank or click and drag).

I just modified the part of the code (5% maybe) that dealt with the "select an answer" part.  Voila... a new program for 5% of the effort.

In my new programs, that "answer selection" will be an object.  Just slip in a new object and... new program.  Then cash the check.

ANALOGY
An object is (IMHO) a program for programs. The benefit of software is that you have this generic device, a computer, into which you can load different software (or, in the old days, punchcards) and REUSE the whole computer for something different.

Objects let you do the same thing with the program.
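The "reuse the PROGRAM" idea above can be sketched briefly in Python; the quiz names are hypothetical stand-ins for the poster's educational software:

```python
# The 98%: a quiz loop that only assumes selector.ask(...).
# Swap the selector object and you have a "new" product.
class MultipleChoice:
    def ask(self, question, answer):
        return answer            # stand-in for a multiple-choice UI

class FillInTheBlank:
    def ask(self, question, answer):
        return answer            # stand-in for a fill-in-the-blank UI

def run_quiz(selector, items):
    # The reusable PROGRAM: unchanged no matter which selector is slipped in.
    score = 0
    for question, answer in items:
        if selector.ask(question, answer) == answer:
            score += 1
    return score
```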

The real Entrepreneur
Saturday, February 21, 2004

Wow, I thought *my* posts were long-winded.  So you hoped to elaborate upon your syllogism:

Premise: OO languages have inheritance
Premise: Inheritance leads to tangled dependencies
Conclusion: OO leads to tangled dependencies

Still easy to address: your second premise is wrong, and therefore does not support your conclusion.  Carelessness or stupidity cause tangled dependencies, not inheritance.

But you admit that momentarily too.  And your solution is to wish everyone would work in handcuffs so your own idiots can do a bit less harm.  Tell us how that works out for you Quixote.  In the meantime the bulk of us will continue to embrace the best tools we know, and many of us will create excellent software with those fine tools.

(Incidentally, insinuating that top programmers must be autistic and that expert OO programmers constitute a malignant cabal who don't live in the real world might help assuage the pain of that 800 GRE, but it didn't help your argument.  Ironic that your budding manifesto to stomp out your own least favorite avenue of complexity swims in a complex swamp of superfluous crap.)

veal
Saturday, February 21, 2004

I don't like OO because it promotes mixing of data and code which manipulates it.

MR
Saturday, February 21, 2004

"It's not that you can reuse the OBJECT. It's that by building a program around an object, you can change the object and reuse the PROGRAM."

Is this a benefit of OOP as a whole, or just a benefit of encapsulation?

Kyralessa
Saturday, February 21, 2004

A,

I thought about what you said in the other threads and decided I agree with you completely. Inheritance leads to class spaghetti and is bad. The only reason it is included is to allow for a sort of polymorphism using C struct syntax, and polymorphism is very good. A desire for it in C brought bad inheritance and even worse multiple inheritance to C++. Neither of these things is present in Smalltalk or ObjC, which also have polymorphism. They solve it with messages. They do it right. C++ does it wrong.

Dennis Atkins
Saturday, February 21, 2004

Actually I guess I don't agree with you, but your comments got me thinking about the class spaghetti issue and I realized that polymorphism was not implemented right in C++ or its friends, which has led to the spaghetti problem.

Dennis Atkins
Saturday, February 21, 2004

Actually I am not sure if I agree with you or not. Maybe you could tell me if you agree with me. That would be easier.

Dennis Atkins
Saturday, February 21, 2004

I hate object reuse, because I LOVE rewriting everything!

A+
Saturday, February 21, 2004

Hoo boy. Apparently, you think C++ is the be all and end all of OO. A common, but incorrect, assumption.

Inheritance does NOT equal polymorphism. Have you ever used Python, Smalltalk or Javascript? In those languages, you can call obj.SomeMethod( ) on any object. If the object supports the method, it gets called, independent of inheritance hierarchy. Inheritance is just a reuse mechanism in those languages.
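What this looks like in Python (the names are invented; the point is that Duck and Robot share no base class at all):

```python
# Duck typing: dispatch by method name at runtime, no inheritance needed.
class Duck:
    def speak(self):
        return "quack"

class Robot:                 # completely unrelated to Duck
    def speak(self):
        return "beep"

def make_it_speak(obj):
    return obj.speak()       # resolved by name, not by hierarchy
```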

As far as the "reusing the rest of the program" is concerned, this is what polymorphism is all about. Procedural programs let you re-use functions that you call. Polymorphism and OO lets you re-use the functions that do the calling.

Yeah, there's a lot of crap code out there, and OO is a lot harder than the original hype made it out to be. But heck, that was ten years ago! Get over it man! :-)

Chris Tavares
Saturday, February 21, 2004

Simple experiment. Go to any job site, whether it be
Monster, Geekfinder or whatever, and do a search for
development jobs, or even object oriented development
jobs. Now, count out the number of C++ and Java jobs,
against the number of jobs using SmallTalk, Python,
Simula67, CLOS, ObjectiveC, Fortran95, OCaml,
object-oriented assembler, object-oriented Perl, and anything
else that I have missed.

You will see that, in practice, object-oriented programming
IS C++ and Java.  C++ has no encapsulation, modularity or
abstraction, and neither language has the kind of polymorphism
that you can get in 1 line of Miranda without inheritance.

So please, let's stop the pretense that objects mean
anything other than inheritance.

A
Saturday, February 21, 2004

I have nothing against polymorphism and know that
it can be useful, though I have rarely found it useful
personally. I am strongly in favour of modularity,
encapsulation and abstraction, and have been
attempting to write programs with these properties
since high school and since before I ever heard of
objects, so hearing these as being the benefits of
OOP sounds to me like complete dishonesty.

You can get these properties from SQL, for goodness
sake!

A
Saturday, February 21, 2004

Sorry for the multiple postings, someone complained
about the long post so I am breaking this up.

son of parnas, I thought the *aim* was to write code
so that someone with a proper background could
understand it immediately.

I wouldn't mind if I had *ever* seen a company where
there is any useful documentation about the code or its
designs. I expect to see that sometime after I see a
company where the developers actually test their work
before checking it in.

Maybe there, inheritance would work.

A
Saturday, February 21, 2004

My own take is that this is the "spaghetti principle", a
variant of the "Peter Principle" for complex systems.
A process will allow something to grow to its level of
absurdity and general spaghettiness.

In the beginning, there were relatively
simple languages like FORTRAN, which scaled up to a
certain level, beyond which they became spaghetti.
After that, came more procedural languages with fancier
types like C, which had a higher spaghetti level.  After
this came C++, which took spaghettiness to another
level of complexity.  Now, we have various new critters
(C#, Java, Python, etc) which allow even more
abstraction of complexity, and which grow to yet
more exalted levels of spaghettiness.

Since I'm no fan of spaghetti, I ran away from OO
languages a long time ago, for the calm world of
embedded systems and OS internals.  We have our
own religious fights here as well, but weird
meta-arguments of the OO variety are chiefly consigned
to those who are writing the 1,000,000,000th iteration
of "get data from db, display on screen".


Saturday, February 21, 2004

Versus, say, get value from sensor, multiply and present on display?  Or get bytes from device buffer, pass to caller?

Nice dismissal of most of the programming that occurs on the planet, but you can reduce almost anything that way.

veal
Saturday, February 21, 2004

Not that I don't respect those who manage to make
the horror-show that is much modern programming work
(since they are often my customers), but that game is not
for me.  After a "decision point" about six years ago,
I decided to abandon this 98% of programming, since I
saw that it was becoming over-standardized, over
routinized, and over-managed, and that any hope that
elegance and simplicity would make business sense
was to be abandoned - for completely excellent and
rational reasons.

So I chose to go into an area where the business model
favors what I think of as simplicity and elegance, and
whose best organizational model is small teams, usually
in small companies, over megalithic organizations whose
bureaucratic tendencies end up being reflected in the
code they produce.


Saturday, February 21, 2004

Veal, very well said.  I agree with all your posts. 

A, I've never had a problem with OO programming.  I've worked on some pretty big systems.  I think the majority of people here haven't had the issues you're talking about either.  Maybe you're just not cut out to be a developer...

vince
Saturday, February 21, 2004

A... first, a couple points on your side rant justifying your use of C++ as a straw man...

First point.  C++ and Java have almost nothing in common but some syntax traits, so even OO == (C++ or Java) cannot be reduced to OO == C++.

Second point.  It's flat wrong to even say OO == (C++ or Java).  I know of plenty of Python, Ruby, Perl, C#, Eiffel and Objective C work going on, even if you don't.  Only a minority of the good jobs ever make it to a job listing board, since personal references fill the bulk of good jobs.  Monster is the career equivalent of a dating service -- you need it only if you have trouble getting dates.  And even if by some freak of probability there wasn't a single *open* job in Objective C on the entire planet at this moment, that says nothing certain about whether there are 9000 Objective C programmers working happily.

Problems with only C++ imply nothing relevant to OO in general.  If you want to battle OO, you need to talk about OO.  You'll make no progress against OO by pointing at problems with a particular language that you happen to have been forced to work with unhappily, or that exhibits some particular unwholesome quality that is not an inherent quality of OO.

Repeating again... OO *does* imply inheritance. Absolutely.  What is this pretense you claim? Nobody who knows OO well would deny that inheritance is one of the defining mechanisms.  Delegation is now favored for many things that could be shoe-horned into an inheritance relationship, but inheritance is still good and powerful in the hands of people who deserve to be employed as programmers.  Like all features, it should be used when it's the best choice.  Hell, I created about 4 inheritance hierarchies today alone between dips into this thread.

You haven't managed to show anything bad about inheritance, except that it's yet another thing that dimwits have difficulty with and that careless people use badly.  [Hey Joel!  Think we should eliminate pointers from C for the same reason?]  You keep waving vague references to the superior polymorphism functional languages support, as if this is evidence that OO got it so wrong that inheritance should be utterly banished.  It's not.

C has polymorphism too.  Doubt that?  What's perhaps the most common use for a function pointer?  Shit, that's not nearly as powerful as the polymorphism of Haskell.  We better dump function pointers from C.  [See Joel!]
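The function-pointer idiom veal is pointing at (the classic case being the comparison callback handed to C's qsort) can be paraphrased in Python, with first-class functions standing in for function pointers; the names are hypothetical:

```python
# The caller's loop is reused unchanged; behaviour is injected through
# the passed-in function, exactly as a C function pointer injects it.
def by_length(s):
    return len(s)

def sort_with(key_fn, items):
    return sorted(items, key=key_fn)
```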

And again you're reduced to pleading that you can get modularity, etc. without OO.  That's great.  We all wrote good modular code before OO.  We can travel by land without machines too, but if you have a great distance to travel in a short time, that's probably not a good idea.

veal
Saturday, February 21, 2004

Egads...you dare to question the holy cow of object oriented programming? You, sir, are one crazy individual.

Dennis Forbes
Saturday, February 21, 2004

Bravo!

you go boyee
Saturday, February 21, 2004

"I don't like OO because it promotes mixing of data and code which manipulates it."

No it doesn't. OO more completely, more effectively and more rigorously separates the data and the code that manipulates it than any other style of programming I have come across.

OO puts the code, in the form of methods (operations), into classes alongside a description of the structure of the data that these operations act on.

It puts the data into objects constructed according to the description in the class.

Finally, it constrains the operations permissible on the objects to those specified in the object's class.
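Those three steps, sketched minimally in Python with a hypothetical Account class (noting that Python enforces the constraint only loosely, by convention and attribute lookup, rather than with a compiler):

```python
class Account:
    # 1. Operations live beside the description of the data they act on.
    def __init__(self, balance):
        self._balance = balance      # 2. Data lives in objects built from the class.
    def deposit(self, amount):
        self._balance += amount
    def balance(self):
        return self._balance

a = Account(100)
a.deposit(50)
# 3. Only the operations the class specifies are available on the object:
has_withdraw = hasattr(a, "withdraw")   # no such operation was defined
```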

David Roper
Saturday, February 21, 2004

It could well be the Peter Principle, or even the
Dilbert Principle, of software.

Let me use the example of an application that
I once maintained, that did a particular
computation of a matrix to produce a vector.
A desktop proof-of-concept of this application
would have taken a day to write, but the
production code was somewhat harder because
the matrix could be gigabytes in size.

It was written by an army of PhDs, who could
not make it work without leaking memory,
producing the wrong numbers and regularly
crashing.

To step through this code was impossible
because of the stunts these guys would pull,
including method pointers, templates, and
multiple inheritance with wild abandon. These
would mean that the code that actually did
anything was miles away and fifteen nesting
levels deep to where it was actually used, and
you often couldn't see the state that you
needed because of the way the code was put
together.

The thing is, what I was maintaining was 99%
code structure, and only 1% actually solving
the problem. There is nothing polymorphic
about a matrix. There is nothing hierarchical
about a matrix. A matrix is *simple*, an array.

The 'A Players' in this company were preventing
'C Players' like me from doing our jobs simply
because they were using these features.
Meanwhile *my* code would not leak, crash or
produce bad data, and I tried my damndest to
make sure that you could understand any line
just by looking at the rest of that 1 page of code
- not because I am a genius, but because you
*shouldn't* have to be a genius to maintain a
damned matrix.

But no, in the software business you have to deal
with whatever pointless garbage is dropped on
you purely arbitrarily, or else you'll get called
anything including 'C Player', 'Technically light',
'Clueless out of College' and 'H1 Bozo' (the jackpot!).

So I hope I will be forgiven for kicking back with
'idiot', 'cretin' and 'autistic'. My first post was
offensive and I apologize.

Maybe I should just drop this all for marketing or
consulting. It seems to be easier, better paid, more
secure and more glamorous work, with prospects
for advancement, '2 chicks at the same time', and
being labelled 'high flying' by the Economist.

A
Saturday, February 21, 2004

Another point on the "spaghetti principle" - several
pundits point out that OO may have run its course, and
that the next programming methodology revolution may
be around the corner.  Like all other methodologies in
the past have done, it'll incorporate the good ideas of
existing methodologies like OO, and will have its own new
stuff.  And like them all, it will expand to its own "spaghetti
level". 

The nasty part is that there'll likely be a dozen languages
between here and there - yet more nastiness on the
"skilz treadmill".  MSFT will make $$$ selling tools for all
of these, and acolytes will wrangle about their minutiae.
And, naturally, HR types will be posting job ads demanding
five years experience in them the instant they pop up...


Saturday, February 21, 2004

veal,

If you want to pick the *best* object languages for
purposes of comparison, then I get to pick things like
Modula, non-object Lisp, standard ML and Datalog.
*All* of these will trump or be equivalent to *any*
object language you can name, in terms of features
for information hiding, ease of programming with,
polymorphism, provable correctness of the code,
and performance of the binaries produced.

By your analogy I have already been travelling by
orbital space plane long before you object people
even thought of using a car rather than walking.
Your points about the job market are similarly
dismissed.

Inheritance is the *only* defining property of OO.
If you start talking about the other properties,
you are implying that procedural and declarative
languages *do not* have modularity, abstraction
or encapsulation, which is very wrong and like
assuming anything that is not object oriented is
written in Fortran77. This is a far worse straw
man than assuming OO = C++. (And also, if
I recall correctly Java does have inheritance in
common with C++, so I don't understand your
point about that.)

So, what I am saying is that OO *is equivalent to* inheritance, and I think you've basically won my
argument for me by admitting that inheritance is
something that will be used carelessly and
dim-wittedly. That's the same as the argument used
against Goto, after all.

Remember Goto? I am sure that smart people like
you can use Goto's appropriately and improve the
maintainability of code that would otherwise be an
extremely unpleasant mess of nested if's, while's,
calls, returns and exceptions. But Goto was
sent to the electric chair nevertheless, because
these other constructs were safer, and better in
*most (not all)* places you would use Goto, and
available in all these places.

So let's go through the exercise of finding things
you should model with hierarchies of behaviour,
against things you can more efficiently model with
something else.

I can come up with simulations, CAD queries,
windowing systems, a few OS mechanisms and
the parser data structure for a compiler.

Anything even vaguely mathematical is best
represented as operators on values, anything to
do with data is best represented relationally,
anything to do with artificial intelligence is best
represented declaratively, and anything low level
is best represented as assembly.

So in your language you would endanger all of
business data processing, scientific computing
and artificial intelligence, plus most of system
programming, to make it easier to write simulators
and a few other things.

And remember, the damage that is caused can
cause breaks all along the object hierarchy, when
the classes must all be bug-compatible with the
base class which you decide to fix. This is havoc
far worse than the minor problems that a Goto
can cause.

So I think that inheritance causes time bombs in
your code, and that people use it a lot mostly not
because it's actually that helpful, but for bad reasons
e.g. they are taught to do things (or shamed into
doing things) that way by the object advocates, or
out of simple force of habit, the same way people
used to use Gotos prolifically before.

A
Saturday, February 21, 2004

>son of parnas, I thought the *aim* was to write code
>so that someone with a proper background could
>understand it immediately.

If something is complex how will you understand
it immediately? DNA sucks because I can't grok
it in the time it takes me to take a dump.

If it would take me 30 minutes to explain it to you
then you probably won't understand it immediately. 
I should glance at MS Word and understand it
immediately?

We break down complexity with abstraction.
Abstraction at once makes things easier and
harder to understand. You can build more functionality
into the infrastructure, like SQL or Prolog, but
you'll find things not to like there too.

You have not said what your alternative is.
You like relational databases, then I guess you love
the simplicity that is SQL and the zillions of lines
of code written around it. You like declarative languages
but I'd like to see MS Word in such a language. You
have operators on values, which is pretty vague.

In short, you have no idea how to build programs, only
what you don't like. That you can't find stuff not to
like in the other approaches is a joke, because there is stuff
not to like.

Inheritance makes life easier and harder at the same
time. If you can't see the pros and the cons then you
aren't trying. What are the other ways you get the same
effect as inheritance? Evaluate the pros and cons.
Inheritance has a lot of pros, some of the cons can be
addressed with a better IDE.

son of parnas
Saturday, February 21, 2004

>And remember, the damage that is caused can
>cause breaks all along the object hierarchy, when
>the classes must all be bug-compatible with the
>base class which you decide to fix. This is havoc
>far worse than the minor problems that a Goto
>can cause.

Reuse sets up a dependency in any approach.
I don't usually write my own strcpy or OS, but
maybe you do.  What is your strategy for
not suffering from broken code you use?

Plus, I would use interfaces so you don't have to use
an implementation if you don't want to.


>So I think that inheritance causes time bombs in
>your code, and that people use it a lot mostly not
>because it's actually that helpful, but for bad reasons
>e.g. they are taught to do things (or shamed into
>doing things) that way by the object advocates, or
>out of simple force of habit, the same way people
>used to use Gotos prolifically before.

What a pot of piss.  I would like to respond, but there's
nothing concrete enough to respond to. But I am
curious how in other paradigms people somehow
are purged of the urge to do things for bad reasons
or habit?

son of parnas
Saturday, February 21, 2004

A says "Inheritance is the *only* defining property of OO. If you start talking about the other properties, you are implying that procedural and declarative languages *do not* have modularity, abstraction or encapsulation"

Hmmmm... I might see the problem now. If you have this much trouble with characterization and typing, I can understand your disdain for inheritance.

So if I define mammals as having hair, warm blood, and feeding their young with milk, that means I'm implying that birds *do not* have warm blood.  Maybe I better be safe by just saying that the *only* defining property of a mammal is having hair.  But now I suppose I've just categorized a peach as a mammal.

The argument against Goto was that it is simply too primitive a construct, and you can achieve all the good uses of Goto with higher level language structures, like conditional clauses and repetition clauses.  Further, the charge against Goto would have fallen flat had important uses been lost by replacing goto with a suite of better primitives.  If anything, your argument, being retrograde, is more analogous to one that would remove conditional clauses because the same thing can be achieved with Goto.

Perhaps you've noticed (if you know something about machine instructions) that Goto didn't actually go anywhere.  We all still use it extensively, every day.  It's still right there, but now it's *below* the language level in the implementation of that conditional clause.  Perhaps if you looked more earnestly at OO, you'd find all your pre-OO Pascal tricks for modularity, abstraction and encapsulation smiling up from below the language level, albeit more crisply defined.

Having very successfully applied OO to more than half the categories in your litany, I'll happily dismiss your unsupported assertions about which problem spaces benefit from OO.  This admittedly goes only an inch farther than your unexplained dismissal of my points about the job market, and how it's only your poor read of the professional landscape that makes you think OO == C++.

Your observation that a change within the hierarchy propagates along the hierarchy is a very good one.  Of course you didn't mention that this fact cuts both ways, and that one of the great qualities about inheritance is that a change propagates down the hierarchy, so you need only apply it in one place and all the similar types below will benefit.  You see, that's a great purpose of inheritance: you decide what's different and what's alike, and locate the like behaviors up the hierarchy.  As some in the functional programming world like to say: once and only once.
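veal's "cuts both ways" point, sketched with hypothetical names: state the shared policy once in the base class, and only the differences below it.

```python
# Fix or change the shared rule in one place; every subclass follows.
class Vehicle:
    tax_percent = 5                       # shared policy, stated once

    def tax(self, price):
        # Integer arithmetic for an exact illustrative result.
        return price * self.tax_percent // 100

class Car(Vehicle):
    pass                                  # inherits the policy unchanged

class Truck(Vehicle):
    tax_percent = 10                      # only the *difference* is stated here
```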

veal
Saturday, February 21, 2004

Possibly useful links:
http://venus.va.com.au/suggestion/sapir.html
http://www.dreamsongs.com/Essays.html pro/con debate under the heading "Resolved: Objects Have Failed".
http://mumble.net/jar/articles/oo.html
fnord

I think it's important to know what use the language is put to.  A researcher might want a purer language to gain focussed data.  My type of general programmer would usually want the opposite, a language that has lots of resources but doesn't get in the way.

Tayssir John Gabbour
Saturday, February 21, 2004

veal,

If I have a bird and you tell me I should trade whatever
I have for a mammal because it has warm blood, your
argument is laughable. It is your error for not noticing
that I have the bird already, and assuming that I have
a peach.

Let me educate you on my "tricks":
There is a keyword "module" that defines a set of
functions and data in non-object languages. These are
not accessible from outside unless they are marked
with a keyword "export". Data is exported as an
opaque type, the intention being that it is changed
only by exported functions, called "accessors".
In your design you are supposed to put the data
together with the functions that modify it in the
same module. This is known as "structured" or
"procedural" programming. What I get from object
oriented programming is the chance to rename:

module   => class or component
export   => public
accessor => method

Object programming gives me no better modularity,
abstraction, or encapsulation features (MAE) than
procedural programming. I could also get polymorphism
using something called a "variant". So we see that
in language features, the only difference between
procedural programming and object oriented
programming is inheritance.
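A hedged sketch of that renaming, with both halves written in C++ for comparison (names like `counter_module` are invented for illustration): a procedural "module" whose opaque data is touched only through exported accessors, next to the class it maps onto.

```cpp
#include <cassert>

// Procedural style: a "module" exporting accessors over an opaque type.
// Callers are expected (by convention) to go through make/bump/read only.
namespace counter_module {
    struct Counter { int value; };                    // opaque by convention
    Counter make()                 { return Counter{0}; }  // "export"ed
    void    bump(Counter& c)       { ++c.value; }          // "accessor"
    int     read(const Counter& c) { return c.value; }     // "accessor"
}

// The same thing after the renaming described above:
// module => class, export => public, accessor => method.
// Here the convention is enforced by the compiler instead.
class CounterClass {
public:
    void bump()       { ++value_; }
    int  read() const { return value_; }
private:
    int value_ = 0;   // genuinely inaccessible from outside
};
```

The observable behavior is identical; the class version merely turns the module convention into a compiler-checked rule.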

What has happened is that the Object Lobby have
said "use objects, not procedural because you get
MAEP". That is like saying "Have a mammal, not a bird,
because a mammal is warm blooded".

There are people on other threads being made to feel
guilty for doing procedural programming, and that
is as ludicrous as the pro-object propaganda.

A
Saturday, February 21, 2004

veal,

You may also wish to consider the case of a conditional
expression nested 7 levels deep crossing over 10 pages.
This thing has so many brackets around it that you can't
even see them all on the screen, so you have no hope
of fixing them correctly. You can greatly aid the
maintainability of such a piece of c... ode by removing
the nesting, for example by use of Gotos. (I have had to
deal with many of these).
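For concreteness, a minimal sketch of the cleanup idiom being alluded to (the function name and resources are invented): forward jumps to a single exit path replace one level of nesting per acquired resource.

```cpp
#include <cstdio>
#include <cstdlib>

// Hypothetical routine acquiring two resources. Each failure jumps
// forward to the cleanup appropriate for what has been acquired so far,
// instead of nesting every acquisition another level deeper.
int process_file(const char* path) {
    int result = -1;
    char* buffer = nullptr;
    FILE* f = std::fopen(path, "rb");
    if (!f) goto out;                   // nothing acquired yet

    buffer = static_cast<char*>(std::malloc(4096));
    if (!buffer) goto close_file;       // only the file needs releasing

    // ...real work on f and buffer would go here...
    result = 0;

    std::free(buffer);
close_file:
    std::fclose(f);
out:
    return result;
}
```

The same shape written with nested ifs grows one indentation level per resource, which is exactly the 7-levels-deep problem described above.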

If you wish for a more vigorous defense of Gotos
I suggest you email the Linux Kernel Mailing List, where
you may be flamed by Mr Torvalds himself about where
you SHOULD use Goto, as happened to someone recently.

I only picked Goto because, contrary to the propaganda,
there are a FEW places where it is correct to use, and
where the alternatives DO NOT work as well. That is my
position on inheritance also.

I also suggest that you modify your condescending
tone, I have been trying to watch my language, but you
seem to be getting nastier.

A
Saturday, February 21, 2004

veal,

I dismissed your points about the job market because
you insisted somehow that we do not consider the
bulk of object programming being done, because they
are not "good" jobs, and that we dismiss the bulk of
the programmers as well, because they "don't deserve"
to be.

Since in any case, my selection of non-practical
non-object languages bested any selection of
non-practical object languages that you could make,
I didn't need to argue the point anyway, and do not
like being called an unemployable dateless straw man
who is poor at reading etc etc.

Putting your nasty phrasing aside, could you please
tell me which half of my categories you have applied
OO to? Does this mean, say, that I ought to use SmallTalk
rather than ProLog for my next expert system?
Should I use Simula rather than SQL for my next
database? Should I use Eiffel rather than MatLab
for my next signal processor?

Or are you claiming that you will guarantee that there
will be no problems in the inheritance hierarchies that
I will ever have to maintain? To make this guarantee stick
will you personally re-write for me the object oriented
code written originally by someone less stellar than you?

Or is my mis-reading of the professional landscape so
grave that I can't see that everybody else is absolutely
infallible?

Just because you make inheritance work, it doesn't
mean that Joe Programmer does. I have never seen a
need for inheritance or objects outside the classroom,
but I sure have seen Joe's masterpiece
SpaghettiHierarchyNightmare.cpp, and I have had to
maintain it.

A
Saturday, February 21, 2004

A:

You are a blathering idiot. You dismiss every argument posed to you with a wave of the hand saying "not relevant". Well, they are relevant. Deal with it.

You have said nothing of substance, and your assertions are meaningless.

Go back to the COBOL days and get stoned on salad cream, let us real programmers be.

Yeah, it's a flame. Whoopty fucking do. Take the bait, I dare you!

Asbestos sucks
Saturday, February 21, 2004

son of parnas,

I didn't say that the other paradigms are infallible, just
that objects are the worst.
How about this for suggestions:

90% of CPUs in use are for microcontrollers.
        Program these in low level fashion using assembly.

1%  of CPUs in use are for high performance numerics.
        Program these in structured fashion using a
        specialized mathematical language.

Of the remaining 9%, 60% of that work is business data
processing. Do this in SQL. Relational.

Of the remaining 4%, 60% is GUI programming. Do this
in VB. Count that as Object Oriented.

Of the remaining non-GUI programming, only the bits
that actually need a behaviour hierarchy, such as CAD,
need objects and inheritance. Other things, such as
accounting programs, gain nothing from OO so do those
procedurally, say in Modula, or declaratively, say in ML.

And what makes you think a Word processor would be
harder to write in ML than in C++?

A
Saturday, February 21, 2004

I'm mostly not going to pick on the big stuff, as I expect others will

Let me pick on the small stuff

1. A variable that is UNIT level (confined to one implementation module) in TP, is *more* global and *more* spaghetti-like than one that is confined to one class in C++.  For non-TP think of a static outside a function in C for an equivalent.

The UNIT level variable can be seen by many functions (all the functions in the UNIT)

The private/protected-type class member can only be seen by functions in the same class [or derived classes]

Assuming 1 module usually corresponds to >= 1 class, the class member normally has narrower scope.
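A small illustrative sketch of that scope difference (all names invented): file-scope state that every function in the translation unit can touch, versus class-private state that only the class's own members can touch.

```cpp
#include <cassert>

// File-scope state: the moral equivalent of a Turbo Pascal UNIT variable
// (or a C file-level static). Every function in this translation unit
// may read or clobber it.
namespace {
    int unit_level_count = 0;
}
void any_function_here()  { ++unit_level_count; }    // allowed
void any_other_function() { unit_level_count = 99; } // also allowed

// Class-private state: the set of functions that could possibly have
// corrupted count_ is exactly the members of Gauge, and nothing else.
class Gauge {
public:
    void tick()       { ++count_; }
    int  read() const { return count_; }
private:
    int count_ = 0;   // invisible outside Gauge
};
```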


2. I can not remember the last time I used friend, although I have used it occasionally.

friend is necessary because there are certain cases where you have two classes working together with a chicken-and-egg type problem.

It might very occasionally be appropriate as a quick fix.

But I agree it's much over-used


3. Do programming problems naturally decompose into objects/classes?

Actually I think they often do.  The problem is many people take a way too simplistic approach or go a little object crazy.

A spreadsheet as a whole might not be hierarchical (your example), but certain aspects are.

Consider just the first step of reading the spreadsheet into memory from a file.

(i) There are lots of different file formats you might support, you could handle these with different classes of "File Parsing objects"

(ii) Some of the formats have sub-variants, e.g. you might need a more specialized "File Parsing Object" for a particular sub-variant of a format

(iii) All the "File Parsing Objects" have a common set of capabilities.  This suggests they all derive ultimately from some common (abstract) base class

(iv) The class inheritance hierarchy here enforces rule (iii). Whereas if you have separate modules for parsing/reading each format, there is no enforcement of this type of rule. And the enforcement is a GOOD THING(tm).
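Points (i)-(iv) can be sketched roughly like this (all class names invented; a toy CSV split stands in for real format parsing):

```cpp
#include <cassert>
#include <string>
#include <vector>

// (iii) A common abstract base enforcing the shared capability set.
struct Sheet { std::vector<std::string> cells; };

class FileParser {
public:
    virtual ~FileParser() = default;
    virtual Sheet parse(const std::string& raw) = 0;  // must be provided
};

// (i) One parser class per supported file format.
class CsvParser : public FileParser {
public:
    Sheet parse(const std::string& raw) override {
        Sheet s;
        std::string cell;
        for (char c : raw) {
            if (c == ',') { s.cells.push_back(cell); cell.clear(); }
            else          { cell += c; }
        }
        s.cells.push_back(cell);
        return s;
    }
};

// (ii) A more specialized parser for a sub-variant of a format,
// reusing the parent's logic after normalizing the delimiter.
class SemicolonCsvParser : public CsvParser {
public:
    Sheet parse(const std::string& raw) override {
        std::string commas = raw;
        for (char& c : commas) if (c == ';') c = ',';
        return CsvParser::parse(commas);
    }
};
```

Any code that forgets to implement `parse` fails to compile — that is the (iv) enforcement a pile of sibling modules cannot give you.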

Some of the (many other) problems in spreadsheets which are hierarchical:
- Different types of cells
- Different types of graphs (e.g. pie chart vs line chart vs bar chart)
- And so on

S. Tanna
Saturday, February 21, 2004

And fix your damned word wrap.

Asbestos sucks
Saturday, February 21, 2004

S. Tanna,

1. I wouldn't use it that way.
2. Skip this one.
3. I disagree, but your example is good. Go ahead.
    I would avoid inheritance myself, or I'd end up
    with something like the Java IO libraries.
    It's just a file.

But I'm just giving up on this thread now. In my
experience OO is a failure because it gets used by
normal programmers to make code less maintainable
not more, and I'm not having OO shoved down my
throat because of dogma. Apparently this heresy
angers people. Well, goodnight to you.

A
Saturday, February 21, 2004

If you just mentioned "software crisis" and "1968", everyone'd be in agreement.  http://en.wikipedia.org/wiki/Software_crisis  I believe the software crisis is in effect and we just don't notice it because uncomplicated apps are so useful, but I could be wrong.

Don't worry though.  XML replaced OOP as the reigning monster.  Brace for the crimes committed in the name of data-driven design.

Tayssir John Gabbour
Sunday, February 22, 2004

>I'm not having OO shoved down my
>throat because of dogma. Apparently this heresy
>angers people. Well, goodnight to you.

Nobody cares about heresy. Nobody is shoving
anything down your throat. The problem is you
show a lack of perspective and technical
skill that doesn't match your positions.
So you retreat to the persecution card.
Sweet dreams.

son of parnas
Sunday, February 22, 2004


Forgive me, I haven't read all the replies in this post, but I had gone through enough to be amazed, disgusted and dismayed that no one had said:

"FAVOUR COMPOSITION OVER INHERITANCE"

http://c2.com/cgi/wiki?CompositionInsteadOfInheritance

The assertion that OO is inheritance is just plain wrong.
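A minimal sketch of the principle, with invented names: the composed version holds a part and forwards to it, so behavior is changed by swapping the part rather than by re-rooting a hierarchy.

```cpp
#include <cassert>
#include <string>

// The part that varies, behind a small interface.
class Engine {
public:
    virtual ~Engine() = default;
    virtual std::string start() const { return "vroom"; }
};

class ElectricEngine : public Engine {
public:
    std::string start() const override { return "hum"; }
};

// Composition: a Car is not an Engine, it HAS one, and delegates.
class Car {
public:
    explicit Car(const Engine& e) : engine_(e) {}
    std::string start() const { return engine_.start(); }
private:
    const Engine& engine_;   // the composed part
};
```

The inheritance-based alternative (Car deriving from Engine) would fix the engine choice at the point of class definition; here it is a constructor argument.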

Walter Rumsby
Sunday, February 22, 2004

How about "Favor Common Sense Over Dogma"

There are situations where inheritance is exactly what is called for by certain design problems.  In those situations, use it.  There are other situations where inheritance is entirely uncalled for.  Of course, a prerequisite is being able to identify and distinguish these situations. 

Instead of spouting, why not actually go study the problem?

As far as the misuse of inheritance (or any other tool in the toolbox), eliminating it doesn't eliminate the problem, it just shifts it to some other place.  The only effective strategy against this is to educate your fellows and yourself, and to evaluate each others' work honestly.

Rameses
Sunday, February 22, 2004

Ramses,

I don't disagree that there are times where inheritance is warranted, however:

* the original poster stated that OO is inheritance which is plainly incorrect

* the principle "favour composition over inheritance" grew out of the observation that designs based on composition tend to be more flexible than ones based on inheritance. A synopsis of the GoF book I read said that in a way it is a guidebook for replacing inheritance with composition.

I found it pretty bizarre that few people seemed to be objecting to the idea that OO is inheritance: OO is more than inheritance, inheritance should only be used where appropriate, inheritance is (probably) used too often and composition provides greater reusability and flexibility.

I have found this to be true in my own programming, and gee shucks, it seems to agree with the principles outlined by many authors, mentors, trainers, etc.

Walter Rumsby
Sunday, February 22, 2004

Sorry Walter,

Didn't mean that to sound like a swipe at you.  Actually, I was trying to build on your lead-in and one-up you :P

What surprises me most about this entire discussion is the almost complete focus on tactics while avoiding what is the real core of O-O.  O-O hypothesizes that the objects of the world are the stable concepts and everything else is subordinated to this hypothesis.  Once you make this assumption, you seek a mechanism that gives "object" an operational definition.

The most successful mechanism for this so far has been some variant of the abstract data type (or at least a practical approximation to it).  All this other stuff, inheritance, polymorphism, encapsulation, and so on are just natural derivatives from this one central notion.  Properties of the "O-O method", almost.  So the correct phrasing is probably more like "inheritance is OO" rather than "OO is inheritance."  The relationship is not a symmetric one.  It also might be good to qualify the word, "O-O inheritance" or "ADT inheritance" for example, just to be clear.  Other models admit other approaches that could rightfully be called "inheritance" in their own context.

Identification of the central mechanism of O-O, the abstract data type, gives a lot of insight into the proper use of inheritance.  Uses which are consistent with a theory of abstract data types are, for lack of a better phrase, "O-O uses."  Uses inconsistent with this definition are not.  These uses shouldn't even be called "inheritance" because they are not.

When I see inheritance used correctly (as defined above), it is obviously a powerful tool - without peer for certain classes of problems.  Because of its power, though, when it is put to ill use, it can create a scene of utter chaos.  People that know what they are about can tell the difference.

Try the education and evaluation route.  It works.

Ramses
Monday, February 23, 2004

You wrote, "The most successful mechanism for this so far has been some variant of the abstract data type (or at least a practical approximation to it).  All this other stuff, inheritance, polymorphism, encapsulation, and so on are just natural derivatives from this one central notion."

Are you sure?  Foldoc and Wikipedia take pains to separate adt's and oop.  Also you remind me of this longwinded argument: http://lambda.weblogs.com/discuss/msgReader$7355

But not like I care.  My operational def is something like oop = adt's + the big 3 oop buzzwords.  Tim Budd's book says it's about small computers that communicate with each other, which is kind of what Java's like in a weird way, but.. whatever.

Inheritance can probably be divided into subtyping and implementation inheritance.  I think most people hate the latter and in reality argue against that specific thing when they say they're arguing against oop.  To defend posters on this thread, it seemed to me that it was against hype machines in general and how the overfocus on a silver bullet leads to shiny methodologies eating up Moore's law kinds of advances.  A healthy program should probably use oop in a bunch of little places, like you'd use while() loops, but not as this huge thing with UML-directed design.
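That subtyping-versus-implementation-inheritance split can be illustrated with a small hedged sketch (names invented): a pure interface captures only the subtype contract, while deriving from a concrete class reuses its code.

```cpp
#include <cassert>

// Subtyping: Shape is a pure contract. Square promises behavior,
// inherits no code.
class Shape {
public:
    virtual ~Shape() = default;
    virtual double area() const = 0;
};

class Square : public Shape {
public:
    explicit Square(double s) : s_(s) {}
    double area() const override { return s_ * s_; }
private:
    double s_;
};

// Implementation inheritance: LoggedSquare derives mainly to REUSE
// Square's code (area, constructor), adding a counter on top. This is
// the variant most complaints about "inheritance" are really about.
class LoggedSquare : public Square {
public:
    using Square::Square;            // inherit the constructor
    int times_asked = 0;
    double logged_area() { ++times_asked; return area(); }
};
```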

Tayssir John Gabbour
Monday, February 23, 2004

BTW I hope that doesn't come across as a flame.  Your redefinition of oop is similar to mine, and I don't like normal oop terminology because they're often loaded words.  I do think though that implementation inheritance is sometimes a neat technique, like with guis and possibly Java's servlet api.

Tayssir John Gabbour
Monday, February 23, 2004

Walter and Ramses - thank you.  Key here is "favored".  Composition is usually (not always) better than inheritance. As for the idea that OO = inheritance, nonsense of course.  OO = objects.  The OP clearly doesn't get that.  Normally I am appalled by the "he just doesn't get it" argument, but in some cases it is right on. Fine, don't do OO.  I'm sorry it's being forced down your throat, but does it occur to you that maybe the rest of us aren't deluding ourselves?

Asbestos sucks- no, asbestos does not suck.  It is a useful and unjustly maligned material.  The fact that at high levels of exposure it causes a very rare cancer led to its involvement in the huge pseudo-scam of asbestos removal, and for that, it apologizes.  It wants you to know, however, that it supports OO programming and design.

name withheld out of cowardice
Monday, February 23, 2004

Troll.

GOTV
Monday, February 23, 2004

Troll?  What because of the asbestos comment?

name withheld out of cowardice
Monday, February 23, 2004

> 2. I can not remember the last time I used friend, although I have used it occassionally.

Typical case is an "iterator" class being declared a friend of its related container class.

One rule of thumb is "no long-distance friendship", i.e. that friend classes should be defined in the same header file as the class of which they are a friend.
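A minimal sketch of that idiom, with invented names, keeping the friend declaration next to the container as the rule of thumb suggests:

```cpp
#include <cassert>

class IntBox;

// The iterator needs to walk IntBox's private buffer, so IntBox
// names it a friend below -- defined in the same "header", so the
// friendship is not long-distance.
class IntBoxIterator {
public:
    explicit IntBoxIterator(const IntBox& b) : box_(b) {}
    bool done() const;
    int  next();
private:
    const IntBox& box_;
    int pos_ = 0;
};

class IntBox {
public:
    void push(int v) { if (size_ < 8) data_[size_++] = v; }
    friend class IntBoxIterator;   // grants access to the private buffer
private:
    int data_[8] = {};
    int size_ = 0;
};

inline bool IntBoxIterator::done() const { return pos_ >= box_.size_; }
inline int  IntBoxIterator::next()       { return box_.data_[pos_++]; }
```

Without the friend declaration, IntBox would have to expose its raw storage publicly for the iterator to work, which is the worse trade.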

Christopher Wells
Monday, February 23, 2004

Count another heretic here. I won't say much, as A pretty much nailed it for me.

But I'll say I lost count of the times I had to look at some library, and it was such a mess of a thousand classes, methods, and inheritance that I thought how much easier it could have been done otherwise. My goodness, the guy(s) who designed the Java IO library need(s) a kick in the butt. Talk about productivity — in decent languages it takes one function to open a file, and you probably know it better than your own age (mine at least, I do); while here you need to query the help for every godessdamned method you need to call on every godessdamned object you need to create, because you can't possibly learn all of that unless you're using it every 10 minutes, which is not my case.

The main issue I have with OOP is that with inheritance, "generalizing integrists" will want to start inheriting from the Big Bang, "just in case this word processor needs to be converted into a fractal compression program someday". To these people, all I have to say is their children won't inherit too much money unless they start being productive for a change.

(And yes, I know about how easily you can transform your word processor into a fractal compression program. And no, I don't think you'll be compressing fractals on your old word processor. And yes, I think you've wasted your time.)

But it looks like we'll have to cope with objects, because that's what decision makers like. And we all know most big decision makers lack a neuron or two, and know less about Computer Science than my grandma, but they always force their teams to use OOP because they've heard it "somewhere" (in a lobby, maybe?) that OOP is so uber kewl. Something fairly similar is happening with XML. It's just a fad.

And I know I'm gonna get bashed for this, so before you start, let me just say I'm not telling you to drop your beloved OO and XML stuff; YMMV; and good ol' C is just my fetish.

Wiseman
Wednesday, June 23, 2004

LOL, and I said I wouldn't be writing much...

Wiseman
Wednesday, June 23, 2004
