From: William Paul Vrotney
Subject: Re: another take on "C is faster than lisp" (alias C sickness)
Date: 
Message-ID: <vrotneyCvDs6A.4B4@netcom.com>
In article <··········@info-server.bbn.com> ·····@labs-n.bbn.com writes:

I've hinted at ending this thread, but since Clint has renamed it "another
take" and more importantly he has brought out what I consider to be one of
the more insightful observations related to this thread, I couldn't resist.

> 
> the problem is that managers think one language fits all, and that
> language is now C++. this is likely to drive me from being a s/w
> engineer into being a s/w manager.

First, YES YES YES. Why should managers be allowed to dictate the choice of
language for a project? I would not want surgery from a doctor whose tools
were chosen by the hospital's staff manager. And yes, the programmer is as
important to a successful program as a doctor is to a successful surgery,
even if the manager has to go all the way to India to find him (Dig dig :-)

> 
> I fail to understand why it is that the implementation language even
> needs to be mentioned.  we bid, we win, we implement, we support.  would
> you go over to Computer City and ask what language Microsoft Word is
> written in?  if they said ALGOL 60, would you buy something else
> instead? (you'd probably be more amazed that the words ALGOL 60 even
> came out of their mouth)
> 
>  -- clint
> 

Second, YES YES YES YES. It is kind of like one of those insight
problems. The people in this thread who argue that the application should
dictate the language should also consider a different way of looking at
things, namely the one Clint suggests above. I want to be able to write
applications in Lisp and not even tell the user, as long as he is happy
using my program.

After all, it is all ones and zeros in the final image. Yes I know, the
"Application Dictation" people are going to come back and say "Yeah, but the
language you choose will have a lot to do with the configuration of those
ones and zeros". But as Jeff Dalton has so eloquently argued in this thread
(or so it seemed to me), the translated configuration of ones and zeros is
mostly, and possibly entirely, determined by the language implementation.

And just to give you another insight example. Since I like Lisp syntax
better than C's (easier to edit in Emacs, easier to parse; yes, I know about
the C modes in Emacs), I was at one time going to write a fast preprocessor
for the C compiler so that I could write my C programs using Lisp
syntax. So that instead of writing

        foo(a, b, c)

I would write

        (foo a b c)


Now if I also add a "Lisp-like" library, you will see stuff in my C code
file like

        (apply foo (cdr list))

and note that the resulting semantics after translation are still strictly C.
I hope you see where I am leading. At this point, at least to me, the STRICT
argument of the application determining the language is starting to look
more and more silly.

Don't get me wrong, I think C is an elegant language for writing operating
systems and related utilities, and that's what it was designed for.  BUT,
excuse me if I am wrong, to solve the software crisis BY FAR the more
important issues are (1) software productivity and (2) software hardness
(fewer bugs). It seems to me we should not only be using very high level
languages like Lisp for applications but, in addition, even higher level
tools for building applications: INTERLISP-originated tools, interface
builders, visual programming languages, component-based software, etc.  And
here we are in 1994, still trying to conserve bytes and microseconds. I call
it C sickness.
-- 
Bill Vrotney - ·······@netcom.com