From: Adolfo Socorro
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <1227@culhua.prg.ox.ac.uk>
At Oxford and SRI International, we use AKCL for the development of the OBJ
family of languages and systems.

I was at UMass-Amherst before, and there Common Lisp is used in the
implementation of an extension of the Boyer-Moore theorem prover, and for the
development of a (prototype of a) database language that is connected to the
prover. The AI groups use a combination of Common Lisp and C.

Adolfo

From: Martin Weigele
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <weigele.666261076@bosun2>
···@prg.ox.ac.uk (Adolfo Socorro) writes:

>At Oxford and SRI International, we use AKCL for the development of the OBJ
>family of languages and systems.

[Hi Adolfo -]
I think the crucial question about any programming language is the way of
thinking and the attitude towards programming problems that it encourages.

Historically, Lisp was the language of choice for AI people because
of its then-exceptional abilities in symbol manipulation and
list processing.
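(A small illustration of what that meant in practice; this is a generic
sketch, not code from any of the systems mentioned. Expressions are just
lists, so a symbolic differentiator takes only a few lines:)

```lisp
;; Symbolic differentiation over list-structured expressions: the kind
;; of symbol manipulation that made Lisp the natural choice for AI work.
;; Handles only variables, constants, and binary +, as an illustration.
(defun deriv (expr var)
  (cond ((eq expr var) 1)
        ((atom expr) 0)
        ((eq (first expr) '+)
         (list '+ (deriv (second expr) var)
                  (deriv (third expr) var)))))

;; (deriv '(+ x 1) 'x)  evaluates to (+ 1 0)
```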

As a least common denominator, Common Lisp was then created to be as
compatible as possible with the existing lisp families.

Nowadays, I think that Common Lisp has become a dinosaur because of the
incredible number of features built in (also known as "creeping featurism")
as a result of the desire to be as compatible as possible.  Languages
like Scheme seem much "cleaner" and "nicer".  But even Scheme lacks the
kind of type support available in the Modula/Pascal/Oberon language family,
or in languages like ML.

On the other hand, the widespread use of Common Lisp in the AI community,
and the fact that with some programming self-discipline you can write
decent and powerful programmes, might still justify its use.  Also, the
more that specification languages are actually used in programming
projects, the less the features of the programming language may matter,
at least at the stage of system analysis, and the less incentive there is
to throw away the built-up know-how of an existing programming language.

Martin Weigele
FB Informatik, Uni Hamburg, Germany
From: Phil Stubblefield
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <1991Feb11.204514.19880@Neon.Stanford.EDU>
In article <·················@bosun2> ·······@bosun2.informatik.uni-hamburg.de
 (Martin Weigele) writes:

>As a least common denominator, Common Lisp was then created to be as
>compatible as possible with the existing lisp families.
>
>Nowadays, I think that Common Lisp has become a dinosaur because of the
>incredible number of features built in (also known as "creeping featurism")
>as a result of the desire to be as compatible as possible.  Languages
>like Scheme seem much "cleaner" and "nicer".

Recently I've been rereading Tracy Kidder's _The_Soul_of_a_New_
_Machine_.  (For the uninitiated, it's about Data General's attempt to
develop their first 32-bit machine, partly in response to the then-new
VAX.)  At one point, a microcoder for the Eagle project was talking
about how wonderful the VAX's complex and powerful instruction set
was.  Having taken a computer architecture class from one of the
Stanford RISC proponents, my thoughts were, "Microcode?  CISC?
Bleah!"

What does this have to do with Lisp, you ask?  Well, it occurred to me
that there are some parallels between the two stories.  The VAX was
the logical outgrowth of some of the theories of the day about machine
design.  A quick glance through my VAX-11 Programming Card reveals 325
separate assembler mnemonics, including thirty "Branch on <condition>"
instructions, fifty type conversion instructions, and even four
instructions for performing queue operations using hardware locking!
The table titled "Assembler Notation for Addressing Modes" contains
sixty entries, including "@L^12(R10)[R11]", for "forced longword
displacement deferred indexed".  Talk about "creeping featurism"!  In
terms of size, this was (and is) truly a Common-Lisp-sized
architecture.

In addition, the VAX was designed to be as compatible as possible
across the entire family line, and also upwards compatible with the
PDP-11, at least for the VAX-11's.  I think this is an interesting
comparison to Common Lisp's desire "to be as compatible as possible
with the existing lisp families."

In contrast, it seems that Scheme is to Common Lisp what the new RISC
wave is to the VAX.  Scheme seems like a cry of "Enough already!" to
the complexities of Common Lisp, although I know far too little about
Scheme to ascribe motives to its designers.

(BTW, I program in Common Lisp almost exclusively, and would hate to
have to switch to a non-Lispish language.)
-- 

Phil Stubblefield                                        (415) 325-7165
Rockwell Palo Alto Laboratory                    ····@rpal.rockwell.com
From: Andrew L. M. Shalit
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <ALMS.91Feb12093856@ministry.cambridge.apple.com>
In article <······················@Neon.Stanford.EDU> ····@Neon.Stanford.EDU (Phil Stubblefield) writes:

   In contrast, it seems that Scheme is to Common Lisp what the new RISC
   wave is to the VAX.  Scheme seems like a cry of "Enough already!" to
   the complexities of Common Lisp, although I know far too little about
   Scheme to ascribe motives to its designers.

This is an interesting comparison, but I think there's an important
difference to keep in mind.  In theory, no one is supposed to have to
write much RISC assembler code.  Everyone is supposed to write in a
"higher level language" (read as: "C").  Only the compiler authors
need to know about the arcane simplicity of the instruction set.

In contrast, Lisp (whether Common Lisp or Scheme) is supposed to be a
language that people actually use day to day for programming.  The
common complaint about Scheme is that it is too small for real work,
that it needs lots of libraries.

I think the problem with Common Lisp is a combination of size,
redundancy, and inconsistency.  So, maybe there *is* a parallel
after all :-)

   (BTW, I program in Common Lisp almost exclusively, and would hate to
   have to switch to a non-Lispish language.)

Congratulations!

   -andrew
--
From: Jeff Dalton
Subject: Size of languages vs libraries
Date: 
Message-ID: <4120@skye.ed.ac.uk>
In article <··················@ministry.cambridge.apple.com> ····@cambridge.apple.com (Andrew L. M. Shalit) writes:

>In contrast, Lisp (whether Common Lisp or Scheme) is supposed to be a
>language that people actually use day to day for programming.  The
>common complaint about Scheme is that it is too small for real work,
>that it needs lots of libraries.

This would be fine, if it had the libraries.

A comparison with Prolog is instructive.  Prolog is, in a sense,
even smaller than Scheme and more extreme about putting things in
libraries.  Almost everyone uses append/3, for example, but it
isn't normally built in.  However, there is a large and widely
available Prolog library.  The library more or less started at
Edinburgh but was later maintained at su-score. 

An important reason why it exists is that programmers involved in
large Prolog projects in Edinburgh made an effort to write reusable
routines in the form of libraries and to collect them in one place.

Similar things happen for Lisp, but in a less coherent way.  The
"Scheme is too small" complaint would be defeated if there were
a standard or semi-standard library that was used with Scheme
as a matter of course.

>I think the problem with Common Lisp is a combination of size,
>redundancy, and inconsistency.  So, maybe there *is* a parallel
>[to CISC architectures] after all :-)

I am not sure what the "inconsistency" in CL is supposed to be.
Can you elaborate?

When we come to redundancy, I think this presupposes that a good
language ought to direct the programmer towards a single good way
of doing things.  I think there are often a number of good ways
to do something and that programmers are capable of making up
their own minds which to use.  If we start saying "no redundancy",
then we can never use such things as Dick Waters's Series macros
because we already have a way of doing loops.  Indeed, I think
it's a Good Thing that Lisp allows programmers to define new
ways of doing things without leaving the language.
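(To make the redundancy point concrete, here is one small computation
written three ways; all three forms are standard Common Lisp, and none
is "the" right one:)

```lisp
;; Three equivalent ways to double every element of a list.
;; CL deliberately offers all of them; programmers pick whichever
;; reads best to them.
(defun double-all-1 (xs)
  (mapcar #'(lambda (x) (* 2 x)) xs))

(defun double-all-2 (xs)
  (loop for x in xs collect (* 2 x)))

(defun double-all-3 (xs)
  (let ((result '()))
    (dolist (x xs (nreverse result))
      (push (* 2 x) result))))

;; All three return (2 4 6) for the input (1 2 3).
```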

This brings us to size.  As I've said before, I think there is a
problem with _implementations_ that do not let you produce a small
system when you don't need a big one.  For example, if you're doing
iteration only one way, you shouldn't have to have all the other ways
around when you don't need them.  A small language with libraries
tends to accomplish this automatically, but for historical reasons
Common Lisp wasn't presented that way.

-- jeff

--
N.B. This is followup-to: comp.lang.lisp, because it branched
off from a discussion there, and because I wanted to avoid adding
to several newsgroups.  So you may want to edit the newsgroups
line.
From: Mott Given
Subject: Re: Size of languages vs libraries
Date: 
Message-ID: <2555@dsac.dla.mil>
From article <····@skye.ed.ac.uk>, by ····@aiai.ed.ac.uk (Jeff Dalton):
> isn't normally built in.  However, there is a large and widely
> available Prolog library.  The library more or less started at
> Edinburgh but was later maintained at su-score. 

       Where is su-score?  

       Is it on the INTERNET?

       Are the PROLOG programs on it accessible via anonymous FTP or
       any other way?
-- 
Mott Given @ Defense Logistics Agency Systems Automation Center,
             DSAC-TMP, Bldg. 27-1, P.O. Box 1605, Columbus, OH 43216-5002
INTERNET:  ······@dsac.dla.mil   UUCP: ...{osu-cis}!dsac!mgiven
Phone:  614-238-9431  AUTOVON: 850-9431   FAX: 614-238-9928 I speak for myself
From: Rob MacLachlan
Subject: Size of Common Lisp
Date: 
Message-ID: <1991Feb15.230707.22142@cs.cmu.edu>
So what about the size of Common Lisp?  Is Lisp a VAX?  Consider the
similarities:
 -- Big (too big to run on a VAX)
 -- Old (older than a VAX)
 -- Slow
 -- Sold by DEC

What lessons (if any) can language designers draw from the victory of RISC?
 -- Choose primitive primitives and migrate complexity upward.
 -- At any given level, implement only those features that are cost-effective.
 -- Optimize operations with a high dynamic frequency in favor of rare
    operations. 

Note that the lesson of RISC is *not* "choose simple systems."  Considering
the additional complexity of the compiler (due to the optimization a RISC
processor demands), a RISC-based programming system is more complicated
overall.  A solution should be as simple as possible, and no simpler.

Consider the minimalist-maximalist dimension of the system design space:
 -- A minimalist design is as simple as possible (in hopes of low production
    costs.) 
 -- A maximalist design is as complex as is useful (in hopes of maximizing
    flexibility and ease of use.)

RISC was initially publicized using largely minimalist arguments:
  "Our chip is faster because it has fewer instructions."

Although the "RISC philosophy" has won over almost all hardware architects,
the minimalist argument has been largely rejected as simplistic.  This is
why you see RISC machines with hundreds of instructions.  The key point is
that although a simple CPU design is inherently cheaper to design and
manufacture, it is not true that the simplest CPU chip invariably produces
the most cost-effective computer *system*.

A minimalist programming system is based on a small set of cheap operations.
Whenever you need another operation, you implement it using these cheap
operations.  At least for simple problems, this results in programs that use
about as little hardware as possible.  There are two major problems with
minimalist programming systems:
 -- The minimalist design goals totally ignore the cost of programming.
 -- The hardware efficiency advantage of minimalist systems is only clear-cut
    with minimalist programming problems.

Today there is no such thing as a minimalist programming system.  All
programming systems (assembler, C, whatever) have accepted some reduction
in hardware efficiency in exchange for an increase in ease of programming.
This is widely accepted as a "good thing".  [But programming systems
still do differ in their position on the minimalist-maximalist continuum.
More later...]

It is less widely acknowledged that minimalist systems aren't that much more
efficient for solving complex problems.  Could you write X much more
efficiently in assembler than its current C implementation?  Before you answer
"Hell yes!", I point out that I mean X11, xlib, resources, widgets, xtk and
motif, with total software pin-for-pin compatibility with the C version.

Of course, nobody would attempt to do such a thing.  If someone wanted a
maximally efficient window system, they would start by throwing out huge
amounts of "unnecessary" functionality.  Only after they have finished
crafting a minimal window system for their application would they consider
anything drastic like assembly coding.

Everybody agrees that X contains lots of unnecessary stuff.  But given the
evident truth of this, it is amazingly difficult to find features that almost
everyone agrees are unnecessary.  This is "design by committee" in a nutshell.
If you were writing a window system for just you to use, then you really could
throw out all the junk.  But if you want your office-mate to help you, then
you may find he has his own ideas.  Pretty soon you have keymaps and
environment variables.  And he may want to draw splines.

The bottom line is that today's programming systems are used by many people.
This means that a programming system must please many people's tastes and
cater to many people's needs.  This means there will be some mismatch between
your needs and the available offerings, but there is *no better way*.  If you
want to buy a car, but you hate arm-rests, it is cheapest to buy a Ford off
the show-room floor and cut off the arm-rests.

Any system over a certain small complexity has to be a multi-person effort.
At this point, the minimalists are no longer offering an alternative -- they
are merely bitching that systems that complex *should not be*.  There are some
things man was not meant to know...  God's will, Dijkstra's will, whatever.

But suppose you did decide to reimplement on top of a minimalist system.  If
so, there is a real risk that reimplementing "just what you need" will result
in a less efficient program.  To be sure, if you were as wizardly as the X
implementors and spent as much time tuning as they did, then you could get the
same or better efficiency for your operation subset.  But if you say, "2-d
trees are too hairy, I'll use a linear search", then you may be in for a
surprise.

I think that in reality, the biggest efficiency advantage of minimalist
programming systems is not any inherent advantage of reimplementing all
the time, but rather that:
    **** Minimalist operations provide a good Efficiency Model ****

An efficiency model is a tool the programmer uses to tell how expensive a
particular code fragment will be.  In the current state of the art,
application of efficiency models is purely manual: you stare at the code.
If you see a lot of operations or a lot of repetitions, then you know the
code is expensive.  

In C, the efficiency model is so straightforward that Jon Bentley can
publish an article telling you how to predict the performance of your C
program with 80% accuracy.  In contrast, most of today's Lisp programmers
are working with internal efficiency models having perhaps 5% accuracy.  And
every time somebody writes a Lisp program 20x slower than a C program,
people conclude "Lisp is terribly inefficient."

O.k., now I've stopped baffling you with bullshit about window systems, and
we're back to the Common Lisp vs. Scheme vs. C wars.  And RISC vs. CISC
too!  Way to go...

I'll restate my RISC lessons here so that you don't have to page back:
 -- Choose primitive primitives and migrate complexity upward.
 -- At any given level, implement only those features that are cost-effective.
 -- Optimize operations with a high dynamic frequency in favor of rare
    operations. 

Remember those 100 instruction multi-million transistor RISCs?  Well, the
reason that new instructions creep into RISC processors is one of the reasons
that new functions creep into Lisp.  Integer multiplication was added to the
SPARC because some people use it a lot, and it is a pain for each of them to
design their own hardware multiplier and hook it into the motherboard.
Similarly, Common Lisp has hash-tables because some people use them a lot, and
it is a pain for everyone to have to design their own and hook them into the
garbage collector.
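(A minimal sketch of the point: the built-in hash tables come ready-made,
with the GC integration already done for you:)

```lisp
;; Counting word occurrences with a built-in, GC-aware hash table.
;; The keys and values are managed by the collector automatically;
;; no one had to "hook them into the garbage collector" themselves.
(defun count-words (words)
  (let ((counts (make-hash-table :test #'equal)))
    (dolist (w words counts)
      (incf (gethash w counts 0)))))

;; (gethash "the" (count-words '("the" "cat" "the")))  returns 2
```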

With either integer multiplication or hash-tables it is possible to come up
with high-level software solutions that don't require a soldering iron, but
you will pay a big efficiency hit.

The *other* reason why functions creep into Common Lisp is a strictly
maximalist one: to get economies of scale by standardizing the interfaces to
useful functionality.  In a word, reuse.  In fact, this is a lot of the reason
that CISCs got so complicated: not because the hairy instructions were much
faster, but because they were easier to use in assembly language.

RISC has to some degree punted on the interface standardization issue by
observing that only compilers and assemblers need to know the interface to the
hardware.  In other words, they pushed the interface up a level.  Programming
systems can do this too, but at some point, there has to be a top level that
programmers can recognize.  This strongly encourages standardization of the
intermediate layers as well (to avoid duplication of effort).

Common Lisp is definitely a maximalist programming system.  Does this mean
that it is a total loss for efficiency?  I think not, but there are some
major efficiency challenges:
 1] Make better efficiency models for Common Lisp.  Add automatic compiler
    efficiency notes that flag simple problems, and document advanced compiler
    capabilities so that programmers can exploit them.
 2] Eliminate unnecessary inefficiencies in Common Lisp by adding complexity 
    to the implementation:
     - Compiler optimizations provide graceful performance degradation for
       less-tuned programs.
     - Generational GC makes consing cheap and eliminates offensive GC pauses.
 3] Make simple Common Lisp programs comparably efficient to simple C
    programs.  This is the "delivery" problem.

The Python compiler for CMU Common Lisp has already addressed 1 and 2.  A
case study is Scott Fahlman's "export" connectionist AI simulator.  This is
a medium-size float-intensive application written by someone with a good
Lisp efficiency model.  It was written well before Python's number-crunching
capabilities were designed, and was tuned to perform well in the
then-available Lisp systems.

When first compiled with Python, there were more than 100 efficiency notes.
Even with all these problems, the program ran 2x faster than the commercial
Lisp on the same machine (I won't say whether it was Franz or Lucid.)
After fixing type declaration problems revealed by these notes, it was 5x
faster.  Interestingly, a version of this program had been recoded in C for
speed.  The Python version took only 1.6 times as long as the MIPS C compiler.

Floating point?  Who cares?  That's "unnecessary"...

The point is, that with sufficient effort and complexity in the Lisp
implementation, users can fairly easily come within a factor of 2 of C, even
for problems that seem "un-Lispy".  In other words, the efficiency model has
been improved from 5% to 50% accuracy.
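(The kind of change involved looks something like the following; this is
an illustrative sketch of standard CL type declarations, not Fahlman's
actual code:)

```lisp
;; Declared float code of the sort an optimizing CL compiler can turn
;; into unboxed arithmetic.  The declarations are the "fix" that
;; compiler efficiency notes typically point at.
(defun dot-product (a b)
  (declare (type (simple-array single-float (*)) a b)
           (optimize (speed 3)))
  (let ((sum 0.0f0))
    (declare (type single-float sum))
    (dotimes (i (length a) sum)
      (incf sum (* (aref a i) (aref b i))))))
```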

As for 3 above (the delivery problem), there is still a lot of work to be done
there.  Both Lucid and Franz are working on it, and I have some ideas too.
The problem is how to get a relatively small binary for a delivery system.
The difficulty is in figuring out how to do this without throwing the baby out
with the bath-water.  People don't use Lisp instead of C because they like its
funky syntax.  They use Lisp instead of C because it has lots of useful stuff
built in, and is easy to extend.  Unfortunately, those are the exact same
reasons that "ld" doesn't work well on Lisp programs.

What I find more interesting than the prospect of making FORTRAN-in-Lisp as
fast as FORTRAN or C-in-Lisp as small as C is the prospect of making
Lisp-in-Lisp as fast as possible.  Now we have FORTRAN efficiency in Lisp, but
how about making efficient use of all the stuff that's in Common Lisp but not
FORTRAN?  Although that's lots of stuff, I assure you that there are few
useless features.  I recently read through the index of functions, macros and
special forms in CLtL, and it was rare to find an operation I had never used
(and none whose purpose I didn't know).

The real answer to "Why is Common Lisp so big?" is "Compared to what?"
If your system is very small compared to Common Lisp, then maybe you should
attack a more complex problem.  Big systems easily exceed the size of
Common Lisp, making the penalty for having all that stuff around relatively
small.

As systems become more complex, the original CLtL I Common Lisp will come to
seem less and less complex.  Remember ALGOL with no I/O?  Fortunately, Common
Lisp is growing to keep up with progress in computer science.

  Robert A. MacLachlan (···@cs.cmu.edu)
From: Jeff Dalton
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <4112@skye.ed.ac.uk>
In article <······················@Neon.Stanford.EDU> ····@Neon.Stanford.EDU (Phil Stubblefield) writes:
>In addition, the VAX was designed to be as compatible as possible
>across the entire family line, and also upwards compatible with the
>PDP-11, at least for the VAX-11's.  I think this is an interesting
>comparison to Common Lisp's desire "to be as compatible as possible
>with the existing lisp families."
>
>In contrast, it seems that Scheme is to Common Lisp what the new RISC
>wave is to the VAX.  Scheme seems like a cry of "Enough already!" to
>the complexities of Common Lisp, although I know far too little about
>Scheme to ascribe motives to its designers.

I like the CL : CISC :: Scheme : RISC analogy.  However, I think
it's worth bearing in mind that Common Lisp wasn't meant to include
everything from every Lisp or to be a union of (all good features
of) all Lisps.  Indeed, some people have criticized the CL designers
for making a MacLisp successor and not paying enough attention to the
rest of the Lisp world (e.g., InterLisp, PSL, Lisp/VM), although they
aren't necessarily arguing for CL to be bigger.
From: Guillermo J. Rozas
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <JINX.91Feb12132549@chamarti.ai.mit.edu>
    In contrast, it seems that Scheme is to Common Lisp what the new RISC
    wave is to the VAX.  Scheme seems like a cry of "Enough already!" to
    the complexities of Common Lisp, although I know far too little about
    Scheme to ascribe motives to its designers.

Although reducing the size of the instruction set and addressing modes
seems to be a performance win in architecture, it is not clear that
reducing the size of a programming language is desirable.  The real
question is what the expressive power of the language is and whether
Scheme "gives up" some of Common Lisp's expressive power.  I don't
think the answer is clear.

In other words, richer languages don't necessarily mean slower or
worse languages, while it seems that richer instruction sets mean
slower machines.

I am firmly in the Scheme community but often envy Common Lisp
constructs and features.
From: David Boles
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <44103@ut-emx.uucp>
In article <··················@chamarti.ai.mit.edu> ····@zurich.ai.mit.edu writes:
>In other words, richer languages don't necessarily mean slower or
>worse languages, while it seems that richer instruction sets mean
>slower machines.

Not necessarily: the IBM RIOS processor delivers more performance
per Hz than any other processor used in a box, and it has more than
180 instructions.  RISC indeed :)

--
David Boles
·······@hermes.chpc.utexas.edu
From: Piercarlo Grandi
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <PCG.91Feb15143111@odin.cs.aber.ac.uk>
On 12 Feb 91 18:25:49 GMT, ····@zurich.ai.mit.edu (Guillermo J. Rozas) said:

jinx> The real question is what the expressive power of the language is
jinx> and whether Scheme "gives up" some of Common Lisp's expressive
jinx> power.  I don't think the answer is clear.

I beg to disagree. To me it is *clear* that the expressive power of
Scheme, with all the options, is superior to that of CL, as Scheme can
do things that CL cannot do in any way. Note that I am not saying that
Scheme can do everything that CL can do *in the same way*; some
rewriting may be needed. But there are things that Scheme can do that CL
cannot do under any rewriting.

jinx> In other words, richer languages don't necessarily mean slower or
jinx> worse languages, while it seems that richer instruction sets mean
jinx> slower machines.

If they are richer, not just bulkier because they include, like the
abominable ANSI C, the libraries in the language definition.

Just look at this: the Scheme report for the core language, one that
even includes (optional) rationals and complexes (arrgggh!), is well
under a hundred pages; the CL specification is well over a thousand.
More than an order of magnitude.  Does this make CL an order of
magnitude "richer" or "more useful", whatever your metric is, than
Scheme?  Hardly...

jinx> I am firmly in the Scheme community but often envy Common Lisp
jinx> constructs and features.

Maybe you just envy the CL libraries -- well, many Scheme
implementations do have libraries as rich as those of most any CL
system; T and, obviously, CScheme (which you are probably using) come
to mind.  What little is missing can usually be implemented quite
easily and quickly, with much less verbiage than in CL.

CLOS? Well, I have just reread Abelson & Sussman (generics and
packages), and there is something similar there. Not as fanatically
overspecified as CLOS, but, incredibly, not that distant from it.
--
Piercarlo Grandi                   | ARPA: ·················@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcsun!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: ···@cs.aber.ac.uk
From: Jeff Dalton
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <4143@skye.ed.ac.uk>
In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:
>jinx> The real question is what the expressive power of the language is
>jinx> and whether Scheme "gives up" some of Common Lisp's expressive
>jinx> power.  I don't think the answer is clear.
>
>I beg to disagree. To me it is *clear* that the expressive power of
>Scheme, with all the options, is superior to that of CL, as Scheme can
>do things that CL cannot do in any way. 

What Scheme has that CL lacks is (full) call/cc and the exact/inexact
distinction for numbers.  Or is there something else I've forgotten?

Here are some things I can do in CL but not Scheme.  I don't think
it matters that much whether they're part of expressive power or not.

  1. Macros (soon to be fixed). 
  2. Definition of a new, distinct data type.
  3. Eval.
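(Item 2 refers to things like DEFSTRUCT, which mints a genuinely new
type; a minimal sketch:)

```lisp
;; DEFSTRUCT creates a distinct type, disjoint from all others.
;; TYPEP and the automatically generated predicate both recognize it,
;; unlike, say, a Scheme vector being used as a record.
(defstruct point x y)

;; (let ((p (make-point :x 3 :y 4)))
;;   (and (point-p p) (typep p 'point) (point-x p)))
;; All three tests succeed; POINT-X returns 3.
```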

>                                          Note that I am not saying that
>Scheme can do everything that CL can do *in the same way*; some
>rewriting may be needed. But there are things that Scheme can do that CL
>cannot do under any rewriting.

Well, you can do a rewriting to get call/cc.  It's not a simple
rewriting, but it's a rewriting.  (I'm thinking of CPS conversion.)
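(The flavor of that rewriting, as a toy sketch rather than a full CPS
transform: every function takes an extra continuation argument, and an
escape is just a call that ignores it:)

```lisp
;; Direct style:
(defun product (xs)
  (if (null xs) 1 (* (car xs) (product (cdr xs)))))

;; The same function after a by-hand CPS conversion.  K plays the role
;; of the captured continuation, so "aborting" on a zero is simply a
;; return that never invokes K.
(defun product-cps (xs k)
  (cond ((null xs) (funcall k 1))
        ((zerop (car xs)) 0)          ; escape without using k
        (t (product-cps (cdr xs)
                        #'(lambda (r) (funcall k (* (car xs) r)))))))

;; (product-cps '(2 3 0 4) #'identity)  returns 0 immediately
;; (product-cps '(2 3) #'identity)      returns 6
```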

>jinx> I am firmly in the Scheme community but often envy Common Lisp
>jinx> constructs and features.
>
>Maybe you just envy the CL libraries -- well, many Scheme
>implementations do have libraries as rich as those of most any CL
>system; T, and obviously CScheme, that you probably are using, come to
>mind. What little is missing can be usually implemented quite easily and
>quickly, with much less verbiage than in CL.

There's all kinds of stuff in CL that isn't in T.  And CLOS, for
example, cannot be implemented quite easily and quickly.

(To be fair, there are also things in T that are not in CL and
that could not be implemented easily and quickly.)

>CLOS? Well, I have just reread Abelson & Sussman (generics and
>packages), and there is something similar there. Not as fanatically
>overspecified as CLOS, but, incredibly, not that distant from it.

Maybe you don't like CLOS.  Ok.  But don't confuse that with other
issues.  CLOS cannot be implemented easily and quickly in Scheme
(nor, for that matter, in a Common Lisp that doesn't already have
it).  The stuff in Abelson and Sussman can be implemented easily
and quickly in Common Lisp.
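(For the record, the Abelson & Sussman style is essentially an operation
table plus type tags, which a few lines of CL reproduce.  This is an
illustrative sketch; the names PUT-OP and APPLY-GENERIC are made up here:)

```lisp
;; Data-directed dispatch in the Abelson & Sussman style: a table
;; keyed by (operation type), with tagged data as (type . contents).
(defvar *op-table* (make-hash-table :test #'equal))

(defun put-op (op type fn)
  (setf (gethash (list op type) *op-table*) fn))

(defun apply-generic (op datum)
  (funcall (gethash (list op (first datum)) *op-table*)
           (rest datum)))

(put-op 'area 'circle
        #'(lambda (args) (* pi (first args) (first args))))

;; (apply-generic 'area '(circle 2))  returns 4*pi, about 12.57
```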

-- jd
From: Robert Marti
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <25595@neptune.inf.ethz.ch>
In article <················@disuns2.epfl.ch> ········@disuns2.epfl.ch
(Emmanuel Baechler) writes:

> Here, we are still using Common Lisp for all our work, and there is no
> consideration to "go back" to a classical language like C++.

I'm amazed to learn that C++ is a "classical" language.  It certainly
isn't a classical language in my book, so what exactly is your definition
of a classical language?  A strongly-typed language?  An imperative
language?  A language for which (almost) no interpreters exist?
(Note that all of this is true for C++, even though C++ is more of
an object-oriented than an imperative language -- if used correctly.)


> I know that, theoretically, all languages are equivalent,

An interesting statement, especially in conjunction with your remark
on the expressive power of Common Lisp vs Scheme below.


> but I am convinced that without lisp our programs would be unmanageable.

What evidence do you have to support the above statement?  Have you
actually compared at least one nontrivial Lisp application (say >10000
lines of code) with the same application written in another language?
(You may want to look at "C++ versus Lisp: A Case Study", by H. Trickey,
SIGPLAN Notices Vol.23, No.2, February 1988, pp.9-18.  Note that I do
not claim that Trickey's findings generalize to _all_ applications,
though.)


> In addition, the environment of a Lisp Machine helps us a lot.

I don't doubt this for one minute.  However, there are quite a few
impressive programming environments for other languages running on
"stock" hardware as well.


> I don't know whether Scheme has really more expressive power than CL.

Didn't you just tell us that "all [sic!] languages are equivalent"?!


While I readily agree that Lisp is a very useful language for certain
kinds of applications -- after all, I teach Scheme in one of my lectures
-- I object to the above sweeping generalizations which appear to be
rampant among _some_ people in the AI community.  

--

Robert Marti                      |  Phone:    +41 1 254 72 60
Institut fur Informationssysteme  |  FAX:      +41 1 262 39 73
ETH-Zentrum                       |  E-Mail:   ·····@inf.ethz.ch
CH-8092 Zurich, Switzerland       |
From: Jeff Dalton
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <4330@skye.ed.ac.uk>
In article <·····@neptune.inf.ethz.ch> ·····@mint.inf.ethz.ch (Robert Marti) writes:
>> but I am convinced that without lisp our programs would be unmanageable.
>
>What evidence do you have to support the above statement?  Have you
>actually compared at least one nontrivial Lisp application (say >10000
>lines of code) with the same application written in another language?
>(You may want to look at "C++ versus Lisp: A Case Study", by H. Trickey,
>SIGPLAN Notices Vol.23, No.2, February 1988, pp.9-18.  Note that I do
>not claim that Trickey's findings generalize to _all_ applications,
>though.)

But you think they're true of _his_ application?  That is, if someone
else performed the same experiment they'd get the same result?  

Indeed, when someone says "our programs would be unmanageable"
you need some evidence that those very programs would be manageable,
not that someone once found that they preferred C++ for some other
programs.

What Trickey says is extremely misleading as a guide to what's true,
as opposed to how things appear to someone who prefers C++ to Lisp.
As far as Trickey was concerned, compile-time type checking was the
most important difference, and it appears that he would decide
against Lisp no matter what Lisp implementations do, so long as
that difference remained.

A common pattern in his observations is to say Lisp has feature X,
but he could code up something in C++ that was nearly as good and
after this initial effort the advantage to Lisp was essentially
removed.  Ok, so in his view print functions are just as good as
an inspector, but this is hardly something with which everyone
will agree.

Moreover, he is strongly influenced by problems with particular
implementations of Lisp (eg, the (real) superiority of dbx over
the Lisp debugger).  He tried to gain a measure of objectivity
by trying two implementations, but he was nonetheless comparing
implementations rather than languages.

I have no objection to people preferring one language over another,
and Trickey has good reasons (from his point of view) for preferring
C++.  What I do object to is the attempt to universalize this
judgement.  Trickey doesn't do this explicitly, but his article
is easily used to that end and goes some way to encourage it.
From: Emmanuel Baechler
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <1991Mar20.110700@disuns2.epfl.ch>
In article <····@skye.ed.ac.uk>, ····@aiai.ed.ac.uk (Jeff Dalton) writes:
> In article <·····@neptune.inf.ethz.ch> ·····@mint.inf.ethz.ch (Robert Marti) writes:
> >> but I am convinced that without lisp our programs would be unmanageable.
> >
> >What evidence do you have to support the above statement?  Have you
> >actually compared at least one nontrivial Lisp application (say >10000
> >lines of code) with the same application written in another language?
> >(You may want to look at "C++ versus Lisp: A Case Study", by H. Trickey,
> >SIGPLAN Notices Vol.23, No.2, February 1988, pp.9-18.  Note that I do
> >not claim that Trickey's findings generalize to _all_ applications,
> >though.)
> 
> But you think they're true of _his_ application?  That is, if someone
> else performed the same experiment they'd get the same result?  
>

What I am doing is qualitative reasoning applied to mechanism kinematics.
My code is bigger than 1 Meg. It was (and is) difficult to develop. However,
Lisp proved to be *VERY* helpful in doing it. I still claim that doing it with
a "classical" language (i.e. imperative; even with an object-oriented
extension, C++ remains imperative) would have been several orders of magnitude
more difficult.

 Common Lisp is not only functional, object-oriented, and so on; it is
EXTENSIBLE, which allows one to describe something in a way that corresponds
to the problem.
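As a minimal sketch of what this extensibility looks like in practice, here is a small Common Lisp macro in the spirit of Baechler's point. (The WITH-JOINT name and the joint representation are invented for illustration; they are not from his actual kinematics system.)

```lisp
;; A hypothetical WITH-JOINT macro: it binds a name to a joint
;; description so that kinematics code reads in the vocabulary of the
;; problem domain rather than in raw data-structure bookkeeping.
(defmacro with-joint ((name type &key (angle 0)) &body body)
  `(let ((,name (list :type ',type :angle ,angle)))
     ,@body))

;; Usage: the domain notation becomes part of the language itself.
(with-joint (elbow :revolute :angle 90)
  (getf elbow :angle))   ; => 90
```

Because the macro runs at compile time, such domain notation costs nothing at run time; this is the kind of problem-level description that an imperative language without macros cannot offer directly.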

  While all programming languages have THEORETICALLY equivalent expressive
power, experience has shown me that there is a strong difference between
theory and practice, in the sense that some languages allow one to express
things more easily than others. This is where Lisp definitely wins. As this
difference between theory and practice has already been discussed a lot, I
won't go into it any more.

  I have read that some people are extremely happy to build AI programs in
C or whatever else. In a sense, they could also do it in assembly language,
which wouldn't be very different. In my opinion this is like people spending
their time writing huge programs in assembly language because it can be done,
and because one can win 1 femtosecond on a loop which might never be executed.
This is simply not the thing to do, because things are not described at the
proper level.
 
E. Baechler
AI Lab
Ecole Polytechnique Federale de Lausanne
MA - Ecublens
1015 Lausanne			Switzerland
From: Jeff Dalton
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <4105@skye.ed.ac.uk>
In article <·················@bosun2> ·······@bosun2.informatik.uni-hamburg.de (Martin Weigele) writes:
>Nowadays, I think that Common Lisp has become a dinosaur because of the
>incredibly many features built in - also known as "creeping featurism" -
>as a result of the desire to be as compatible as possible. Languages
>like scheme seem much "cleaner" and "nicer". But even scheme lacks the
>kind of type support available in the modula/pascal/oberon language family,
>or in languages like ML.

I'm glad it doesn't have the kind of "type support" available in
Pascal.

ML is reasonably flexible as far as types are concerned, but lacks too
many things (eg, much in the way of I/O, a good macro mechanism).

In short, Lisp still wins.  If you want something smaller and cleaner
than CL, try Scheme.

BTW, the idea that CL has too many features out of a desire to be
compatible with older dialects isn't quite right.  For example, the
scoping rules in CL are a cleanup and generalization of those in
(compiled) MacLisp, not (as some have supposed) a mix of things from
different dialects.
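For instance, the cleaned-up scoping jd mentions is what makes closures behave predictably in CL, where old dynamically scoped interpreted Lisps would not. A small sketch:

```lisp
;; Under CL's lexical scoping, each call to MAKE-COUNTER closes over
;; its own binding of N.  In a dynamically scoped Lisp, the LAMBDA
;; would instead see whatever N happened to be bound at call time.
(defun make-counter ()
  (let ((n 0))
    (lambda () (incf n))))

(let ((c (make-counter)))
  (funcall c)    ; => 1
  (funcall c))   ; => 2
```

This is the generalization of compiled MacLisp's behaviour: what the compiler did for local variables became the language's default rule everywhere.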

-- jd
From: Guillermo J. Rozas
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <JINX.91Feb12132111@chamarti.ai.mit.edu>
    Nowadays, I think that Common Lisp has become a dinosaur because of the
    incredibly many features built in - also known as "creeping featurism" -
    as a result of the desire to be as compatible as possible. Languages
    like scheme seem much "cleaner" and "nicer". But even scheme lacks the
    kind of type support available in the modula/pascal/oberon language family,
    or in languages like ML.

Your statement seems to imply that there is consensus that such type
systems are desirable.  Some of us in the Scheme community, and I bet
many in the Lisp community, view such type systems as a step
backwards, not forwards.  I realize that this is a matter of taste,
but there are good arguments for both kinds of type systems.  Rather
than choosing one over the other, it's even better to have a choice,
so don't look for the ultimate language, because it would probably be
a dinosaur or boring.
From: Harley Davis
Subject: Type systems: What's the hitch?
Date: 
Message-ID: <DAVIS.91Feb13144756@barbes.ilog.fr>
In article <··················@chamarti.ai.mit.edu> ····@zurich.ai.mit.edu (Guillermo J. Rozas) writes:

       like scheme seem much "cleaner" and "nicer". But even scheme lacks the
       kind of type support available in the modula/pascal/oberon
       language family, or in languages like ML.

   Your statement seems to imply that there is consensus that such type
   systems are desirable.  Some of us in the Scheme community, and I bet
   many in the Lisp community, view such type systems as a step
   backwards, not forwards.

What problem do the Scheme people have with the CL type system?  Is it
the complex type combination language, CLOS, some basic issue
concerning extensible types, or something else?

-- Harley 
--
------------------------------------------------------------------------------
nom: Harley Davis			ILOG S.A.
net: ·····@ilog.fr			2 Avenue Gallie'ni, BP 85
tel: (33 1) 46 63 66 66			94253 Gentilly Cedex, France
From: Harley Davis
Subject: Re: So who's really using LISP?
Date: 
Message-ID: <DAVIS.91Feb13145728@barbes.ilog.fr>
In article <·················@bosun2> ·······@bosun2.informatik.uni-hamburg.de (Martin Weigele) writes:

   Nowadays, I think that Common Lisp has become a dinosaur because of the
   incredibly many features built in - also known as "creeping featurism" -
   as a result of the desire to be as compatible as possible. Languages
   like scheme seem much "cleaner" and "nicer". But even scheme lacks the
   kind of type support available in the modula/pascal/oberon language family,
   or in languages like ML.

I'd like to point out EuLisp, under development here in Europe.  It is
a multi-layered, Scheme-like language with an extensible class system,
generic functions, and modules.  At its lowest level, level-0, it is
small like Scheme.  At higher levels, more interesting and complex
CommonLisp-like functionality is added, including a metaobject
protocol for the object system, and many libraries.

You can get more information about EuLisp, including the latest
version of the specification (subject to change at any time) and a PD
implementation, by sending mail to ······@maths.bath.ac.uk.

EuLisp is not yet finished, but I think it points the way toward a
plausible future for Lisp, supporting both the complex, high-level
applications with rapid development cycles we are used to in Lisp, and
commercially viable delivery systems.

-- Harley
--
------------------------------------------------------------------------------
nom: Harley Davis			ILOG S.A.
net: ·····@ilog.fr			2 Avenue Gallie'ni, BP 85
tel: (33 1) 46 63 66 66			94253 Gentilly Cedex, France