From: ········@bayou.uh.edu
Subject: Re: C++ briar patch (Was: Object IDs are bad)
Date: 
Message-ID: <5lj5vk$3t2$1@Masala.CC.UH.EDU>
Mukesh Prasad (·······@polaroid.com) wrote:
: ········@bayou.uh.edu wrote:

: > It wouldn't be called a class because it wouldn't be a class.
: > No information hiding, no inheritance, no polymorphism, it's
: > nothing more than a data structure with some function pointers
: > as fields.

: Yes, you got the idea -- that's how object oriented programming
: was initially done in languages like C.  However, with liberal
: pre-processor use thrown in, some decent programming methodologies
: could be implemented.  You could implement inheritance and
: polymorphism, though for information hiding you would just
: make your objects opaque, e.g. something like

[Snip]

Ok, with some strategic application of those techniques you
could turn it into something resembling a class.  My objection
stemmed from my (mis)understanding that you were implying that
a structure plus function pointers alone was a substitute for
a class.
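
For the benefit of anyone following along, here's a minimal
sketch of the pattern being described -- an opaque struct with
function pointers as fields.  The Shape/circle names are my own
invention, not from the snipped example:

    /* shape.h -- the opaque "class"; clients never see the fields */
    typedef struct Shape Shape;
    Shape  *shape_make_circle(double r);
    double  shape_area(const Shape *s);
    void    shape_free(Shape *s);

    /* shape.c -- hidden representation plus "method" dispatch */
    #include <stdlib.h>
    struct Shape {
        double (*area)(const Shape *s); /* function pointer as method slot */
        double radius;
    };
    static double circle_area(const Shape *s)
    {
        return 3.14159265358979 * s->radius * s->radius;
    }
    Shape *shape_make_circle(double r)
    {
        Shape *s = (Shape *)malloc(sizeof *s);
        s->area   = circle_area;        /* "bind" the method at creation */
        s->radius = r;
        return s;
    }
    double shape_area(const Shape *s)
    {
        return s->area(s);              /* dispatch through the pointer */
    }
    void shape_free(Shape *s)
    {
        free(s);
    }

    /* client code gets polymorphism without seeing the fields */
    #include <stdio.h>
    int main(void)
    {
        Shape *s = shape_make_circle(2.0);
        printf("area = %f\n", shape_area(s));
        shape_free(s);
        return 0;
    }

Since clients only ever hold a Shape *, the representation can
change without touching them -- that's the "opaque" information
hiding, and the function pointer gives you the polymorphic call.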

[Snip]

: So what language do you want to do Windows programming in?
: Smalltalk?  Actor?

Smalltalk's ok, as is ACL (Allegro Common Lisp), and various
other languages.  I haven't tried Actor (though I've heard of
it), so I can't comment on it.


: Smalltalk has been very popular in the press, and Wall Street
: banks have been impressed enough by it to bank on it.
: But other than some mild successes, for more than two
: decades now, it hasn't gone anywhere.

Well, from what I've been reading, it seems to be gaining more
and more support, particularly in niches previously occupied
by Cobol, so I wouldn't say it hasn't gone anywhere.  It just
isn't a language that's in your face, taking over the world
wherever you look -- and that's a far cry from stagnation.


: Until recently, Windows had not been standardized enough for serious
: programming in any language at a "higher level" than C or C++.
: There were too many things you could do, too many ways to
: enable this button when the user moved his mouse _this_ way
: or popup this tooltip when _that_ menu was active, or
: automatically change the status-bar when the mouse was
: over this location, or have your own home-grown list-boxes
: or tree-controls or drawing-pad object...

Well, VB is at a higher level and it seems to be used for
serious user interface development.  Various other languages
like ACL, which are also at a higher level, seem to do fine
in Windows.  I fail to see how Windows' lack of
standardization affects the level at which you can program
in it.


: It is easy to form opinions from a bit of theory, but if
: you have actually had to implement production quality
: large scale windows programs, I am sure you know what
: I mean -- something that allowed the level of detailed
: access C provided was necessary for Windows programming.
: C++ just made it possible for MFC or OWL to simplify
: things for the user, while still allowing the massive
: amount of experimentation going on with Windows
: programming.

Again, this issue does not hinge on the level of the language,
but rather on how thoroughly and properly it allows access
to Windows.  Again, serious development goes on in VB
for the user interface (which is what Windows programming
is all about), and again that is a higher-level (and simpler)
language than C/C++.


: (I am discounting Unix a little bit, because though
: X-Windows popularized Windows in general, Windows never
: did overthrow the entrenched Unix model of computer usage.
: To this day, most Unix users use their desktops as a
: bunch of prettified shell terminals.)

Well, there's also the fact that the primary means of writing
code for X is still C and not C++ (at least that's been my
experience), which sorta leaves it out of the running.


: Windows are standardized enough now that Java can have
: window support built in.  But still, you have to buy
: into a model of standardization.  This would not
: have been possible in the early days of Windows,
: when "popup" versus "pulldown" were barely defined,
: and it was not clear which one was more useful when.

Java -- there's another language I forgot to mention :)


: > Wrong.  Objective-C only added object orientation to
: > C (and followed a different model at that), while C++
: > added a whole slew of features, of which object orientation
: > was but one.

: I think the success of C++ was tied to the object-orientation.
: Overloaded functions, operator overloading, etc. were an extra
: good/bad thing depending upon your philosophy, but were not crucial.

I wasn't arguing about what caused the success of C++, but
rather objecting to what I felt was an overgeneralization in
comparing Objective-C to C++.  Given that the latter had
broader goals than the former, the comparison didn't seem
quite right.
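
For concreteness, here's a toy illustration (entirely my own,
not from the thread) of the distinction being drawn -- the OO
core on one hand, and an "extra" like operator overloading on
the other:

    #include <cstdio>

    // The OO core: inheritance plus a virtual function,
    // giving polymorphic dispatch.
    struct Animal {
        virtual void speak() const { std::printf("...\n"); }
        virtual ~Animal() {}
    };
    struct Dog : Animal {
        void speak() const { std::printf("Woof\n"); }
    };

    // The "extra": operator overloading -- convenient, but a
    // separate feature from the object orientation itself.
    struct Money {
        long cents;
    };
    Money operator+(Money a, Money b)
    {
        Money sum = { a.cents + b.cents };
        return sum;
    }

    int main()
    {
        Dog d;
        const Animal &a = d;
        a.speak();                  // dispatches to Dog::speak

        Money m1 = { 150 }, m2 = { 250 };
        Money total = m1 + m2;      // calls operator+
        std::printf("%ld cents\n", total.cents);
        return 0;
    }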
 

: The new stuff -- templates, exceptions, and all -- is useful
: and has already been tried out in other languages.  Does it
: make an already large language too big?  We will find out in
: a few years...  C++ certainly has enough momentum that the
: companies implementing C++ compilers have the resources to
: implement all of this.

Features like these have proven useful in other languages, so
C++ really had no choice but to adopt them.  Increasing the
size of a language should not be a concern when there is a
substantial gain in functionality to be had.  A big, bloated
language with lots of functionality is more useful than a
small, elegant language with nothing.
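
To put something concrete behind "templates and exceptions",
here's a toy sketch (my own, not from the thread) of the kind
of thing they buy you:

    #include <stdexcept>
    #include <cstdio>

    // One generic routine instead of a family of per-type
    // copies or an unsafe macro; works for any T with operator<.
    template <typename T>
    T max_of(T a, T b)
    {
        return (a < b) ? b : a;
    }

    // Exceptions move error reporting out of the normal return
    // path instead of overloading the return value with an
    // error code.
    double divide(double num, double den)
    {
        if (den == 0.0)
            throw std::invalid_argument("division by zero");
        return num / den;
    }

    int main()
    {
        std::printf("%d\n", max_of(3, 7));      // instantiated for int
        std::printf("%g\n", max_of(2.5, 1.5));  // and again for double
        try {
            divide(1.0, 0.0);
        } catch (const std::invalid_argument &e) {
            std::printf("caught: %s\n", e.what());
        }
        return 0;
    }

Compare that with the usual C route -- a MAX macro with its
double-evaluation surprises, and errno-style return codes --
and you can see the functionality being bought.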


: > I say it's entrenchment.  You've got something that's already
: > widespread, and you release something that's both better and
: > compatible, and people are more apt to take it.  I mean,
: > C++ did not force C programmers to change their ways, it was
: > like an optional expansion pack -- "Hey you can use me or
: > forget it and stick with C, no hurry, no risk".  That is
: > something a change-phobic industry would accept with open
: > arms.

: Yeah.  That change-phobic industry is built from you and me.

Not me :).  If I had my way, I would change once I had
adequate evidence that the benefits would justify the cost
of the switchover.  From what I'm seeing, moving from C/C++
to Lisp is something I'd do without hesitation.  Of course,
I don't have my way :(


: I am not going to spend five years perfecting my Esperanto
: unless I know it is going to be of some use to me.  Nor am I
: going to throw away my Qwerty and carry Dvorak
: keyboards everywhere I go.  (Some people do, but
: they are usually change-phobic in other ways.)

Right, but in your case you are worried about whether the
benefit will outweigh the cost once everything is taken into
account.  Organizations do a similar thing, but often their
eye is on the short-term bottom line and not the long-term
benefit.  Something could clearly pay off in the long run,
but when the powers that be see the short-run cost, things
like pink slips and unemployment lines drift into their
heads.  Of course there's also the fear of something not
working after a switch.  At least if something breaks under
the status quo, the manager can say "hey, don't look at me,
we've been doing this for ages, something musta broke way
down the line".  Try passing off that story when the break
occurs after a change YOU made.  Factor in the effects of
disinformation in a monolithic corporate "grapevine" and you
can see that decision making in the corporate world is quite
a different beast from decisions made by you or me.


: Inertia exists in physics.  It also exists in societies
: and cultures.  Some would say it is a good thing
: and protects us from spending all our time on
: trying out every new thing that comes along!
: Only some people, "the early adopters", can be
: counted on to try new things and bring back
: the good or bad news.

Well, obviously there is a cutoff point, and yes, inertia does
exist -- that's why many companies are _STILL_ using Cobol
and Fortran.  Is inertia good?  Taken in conservative doses
it isn't all that bad, but when it ends up ruling your life
it's a bother.

If we don't try out new things every once in a while, we will
stagnate.  Think of where we'd be if nobody had wanted to
waste time on those new-fangled "light bulbs" or "computing
machines".  This is just as true for programming as for other
pursuits.  The existence of legacy code does make the inertia
argument more compelling, but it is still folly to live in
the past.  Eventually the benefits gained from a superior
language will offset the cost of avoiding any modification or
rewrite of legacy code -- especially if your competitors adopt
that language and end up eating your lunch.

--
Cya,
Ahmed