From: David J. Braunegg
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <127724@linus.mitre.org>
>Lack of demand due to Common LISP's enormous size, complexity, resource
>requirements, training, etc.
>
>Common LISP effectively died from obesity.



OK.  What are the problems preventing a smaller, more efficient Lisp
so that we aren't forced to use the almost-a-programming-language C?

Dave

From: Dean NEWTON
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <1991Jan14.180050.519@cs.mcgill.ca>
In article <······@linus.mitre.org> ···@babypuss.mitre.org (David J. Braunegg) writes:
>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.
>
>
>
>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?
>
>Dave

Nothing.  It's called Scheme.

Kaveh Kardan
Taarna Systems
Montreal, Quebec, Canada
(posting from a friend's account)
From: Mark Ahlenius
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <5569@turquoise.UUCP>
···@babypuss.mitre.org (David J. Braunegg) writes:

>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.

>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?

There is a smaller, faster dialect of CL out there, and as far as I
know it is being taught in some of the major universities
 - it's Scheme.

 Quite a while back I read that Scheme was taught as the "first programming"
 language at MIT (via Structure and Interpretation of Computer Prg.)
 Is this still the case?

 There appears to be some renewed interest in Scheme lately.

 Although it lacks many features that CL has, it is small, compact,
 and fairly quick.

	'mark
-- 
===============	regards   'mark  =============================================
Mark Ahlenius 		  voice:(708)-632-5346  email: uunet!motcid!ahleniusm
Motorola Inc.		  fax:  (708)-632-2413
Arlington, Hts. IL, USA	 60004
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <3954@skye.ed.ac.uk>
In article <····@turquoise.UUCP> ········@motcid.UUCP (Mark Ahlenius) writes:
>···@babypuss.mitre.org (David J. Braunegg) writes:

>>OK.  What are the problems preventing a smaller, more efficient Lisp
>>so that we aren't forced to use the almost-a-programming-language C?
>
>There is a smaller, faster dialect of CL out there and as far as I
>know it is being taught in some of the major universities - its Scheme.

I'm not sure the Scheme folk think of their language as a "dialect
of CL".  A dialect of Lisp, yes, but not a Common one.  Nor am I
convinced that Scheme will be faster.  Standard Scheme, at least,
lacks many of the efficiency tricks (e.g., declarations) available
in CL.  However, Scheme is smaller and, consequently, easier to
implement and (for the most part) easier to understand fully.
It is still difficult to implement an efficient Scheme, despite
its size, but at least the effort will be concentrated on fewer
constructs.

Much of the reason Common Lisp appears to be so big is, in my opinion,
a matter of organization and presentation.  If we start with C and
then add in various libraries, it starts to look fairly large too.  On
the other hand, it's easier with C to distinguish the essential core
from the rest.  There isn't any reason, other than historical, why
Common Lisp couldn't be presented, and even implemented, in a more
C-like way, as a language plus libraries of procedures and data types.
The core language would still be larger than C, but it would also 
have greater capabilities.

A different Lisp design that tries to get some of the advantages of
both Common Lisp and Scheme is EuLisp, a Lisp being developed mostly
in Europe (hence the name).  The conceptual and implementational
complexity of EuLisp is controlled by the use of two mechanisms:
levels and modules.  

There are 3 levels in EuLisp, each an extension of the one below.
Level 0 is a "kernel" Lisp, not too far from Scheme.  Level 1 is about
the size and scope of Le Lisp or Franz Lisp.  Level 2 is close to
Common Lisp.  The advantage of levels over three separate languages
is, of course, that they fit together in a coherent way.

In addition, constructs with related functionality (a data type and
procedures for operating on its instances, for example) are often
packaged together in a module.  If your program makes use of such
facilities, you must request the appropriate module, just as you must
include the appropriate ".h" file in C.

Modules are a finer division than levels.  An implementation aims
at a particular level, and each level has certain modules as standard.
An example of a difference between levels is that level 0 has only
the most basic mechanisms for defining new classes, while higher
levels have capabilities similar to those of CLOS.

-- Jeff
From: Ozan Yigit
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <20544@yunexus.YorkU.CA>
In article <····@skye.ed.ac.uk> ····@aiai.UUCP (Jeff Dalton) writes:
> ...... Nor am I
>convinced that Scheme will be faster.

I know life is too short to even attempt to convince you, :-) so let me
just say that if you ever want to find out for sure, at least try out an 
industrial-strength implementation (such as Chez etc.) for your tests.
Who knows, you may be surprised.

>	...  Standard Scheme, at least,
>lacks many of the efficiency tricks (e.g., declarations) available
>in CL.

Scheme literature thus far available (Steele [1], Dybvig [2], Kranz et al.
[3] just to mention a few) seems to suggest that Scheme may not need much in
the way of efficiency tricks (except perhaps to indicate to the compiler
that built-in functions will not be re-defined) to be compiled and
optimized properly.  [On the other hand, arguably a case may be made for
additional constructs for even *better* results]

>It is still difficult to implement an efficient Scheme, despite
>its size ...

Give me a unit of measure for your understanding of *efficient*, so that
we'll know what this new claim is all about.

oz
---
[1]  Guy Lewis Steele Jr., Rabbit: a Compiler for Scheme, MIT AI Memo
     474, Massachusetts Institute of Technology, Cambridge, Mass.,
     May 1978.

[2]  R. Kent Dybvig, Three Implementation Models for Scheme,
     Department of Computer Science Technical Report #87-011 (Ph.D.
     Dissertation), University of North Carolina at Chapel Hill,
     Chapel Hill, North Carolina, April 1987.

[3]  David Kranz, Richard Kelsey, Jonathan A. Rees, Paul Hudak, James
     Philbin and Norman I. Adams, Orbit: An Optimizing Compiler for
     Scheme, Proceedings of the SIGPLAN '86 Symposium on Compiler
     Construction, June 1986, 219-233.

---
Where the stream runneth smoothest,   | Internet: ··@nexus.yorku.ca 
the water is deepest.  - John Lyly    | UUCP: utzoo/utai!yunexus!oz

  
From: Barry Margolin
Subject: Scheme optimization (was Re: Is this the end of the lisp wave?)
Date: 
Message-ID: <1991Jan17.054800.9036@Think.COM>
In article <·····@yunexus.YorkU.CA> ··@yunexus.yorku.ca (Ozan Yigit) writes:
>Scheme literature thus far available (Steele [1], Dybvig [2], Kranz et al.
>[3] just to mention a few) seem to suggest that scheme may not need much in
>the way of efficiency tricks (except perhaps to indicate to the compiler
>that built-in functions will not be re-defined) to be compiled and
>optimized properly.  [On the other hand, arguably a case may be made for
>additional constructs for even *better* results]

Most of the papers I've seen about optimizing Scheme compilers have
concentrated on optimizing the *control* structures (the simplest of which
is the basic transformation of a tail call into a jump).  Common Lisp
optimizations are geared towards removing some of the overhead of dynamic
typing, i.e. allowing generic functions (arithmetic and array operations,
in particular) to be compiled into type-specific code.
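
For example, a declared summing loop might look something like this (a
minimal sketch with an invented function name; the actual gain depends
entirely on the implementation):

    (defun sum-elements (v)
      ;; The declarations let the compiler open-code SVREF and fixnum
      ;; addition instead of calling the generic AREF and + entry points.
      (declare (type simple-vector v)
               (optimize (speed 3) (safety 0)))
      (let ((sum 0))
        (declare (fixnum sum))
        (dotimes (i (length v) sum)
          (incf sum (the fixnum (svref v i))))))

Standard Scheme has no comparable declaration mechanism, which may be part
of why the published work concentrates on the control side instead.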


--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Ozan Yigit
Subject: Re: Scheme optimization (was Re: Is this the end of the lisp wave?)
Date: 
Message-ID: <20598@yunexus.YorkU.CA>
In article <·····················@Think.COM> ······@think.com
(Barry Margolin) writes:

>Common Lisp
>optimizations are geared towards removing some of the overhead of dynamic
>typing ...

Ah, I see what he means by optimization tricks now. I presume it can pay
off for scheme compilers as well, given that some implementations already
use something similar, i.e. (integrate-usual-procedures) to inform the
compiler that it can make the assumption that the built-in procedures will
not be redefined or assigned, so it is *ok* to go ahead and (for example)
inline them.

oz
---
We only know ... what we know, and    | Internet: ··@nexus.yorku.ca 
that is very little. -- Dan Rather    | UUCP: utzoo/utai!yunexus!oz

  
From: John Gateley
Subject: Re: Scheme optimization (was Re: Is this the end of the lisp wave?)
Date: 
Message-ID: <GATELEY.91Jan17162628@datura.rice.edu>
In article <·····@yunexus.YorkU.CA> ··@yunexus.yorku.ca (Ozan Yigit) writes:
   In article <·····················@Think.COM> ······@think.com
   (Barry Margolin) writes:
   >Common Lisp
   >optimizations are geared towards removing some of the overhead of dynamic
   >typing ...

   Ah, I see what he means by optimization tricks now. I presume it can pay
   off for scheme compilers as well

Yes it can; I have worked on a type inferencer used for both Scheme
and Common Lisp. The main difference is that Common Lisp declarations
provide hints to the inferencer (inferrer?) in a concise form, while
Scheme has no such mechanism.

John
·······@rice.edu

--
"...Yes, I've got some questions that are guaranteed to shake you up. How
much marriage urges a windmill to paint infinity? Is a magic hide-a-bed
the vile home of spanish fire? Is firm corn merrier under gifts of less
important love? We wonder ..." The Residents
From: Scott K. Walker
Subject: Common Lisp for MS-DOS
Date: 
Message-ID: <1991Jan17.232827.24545@eng.umd.edu>
I have been searching eagerly for the replies to the earlier message
by someone else looking for a Common Lisp implementation on a 386 PC.
I am also looking for the same thing, but it does not have to be 
386-specific.  Is there ANY Lisp implementation for my PC at ALL?
(Common Lisp preferred...)

Thank you very much.
Scott King Walker 

······@eng.umd.edu
·······@record1.umd.edu
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <3965@skye.ed.ac.uk>
In article <·····@yunexus.YorkU.CA> ··@yunexus.yorku.ca (Ozan Yigit) writes:
>In article <····@skye.ed.ac.uk> ····@aiai.UUCP (Jeff Dalton) writes:
>> ...... Nor am I
>>convinced that Scheme will be faster.
>
>I know life is too short to even attempt to convince you, :-) so let me
>just say that if you ever want to find out for sure, at least try out an 
>industrial-strength implementation (such as Chez etc.) for your tests.
>Who knows, you may be surprized.

Suppose I use T.  Would that count?  (I am willing to try your test
some time, if I can do it for free.  There is no money for Lisp here
these days.)

>>	...  Standard Scheme, at least,
>>lacks many of the efficiency tricks (e.g., declarations) available
>>in CL.
>
>Scheme literature thus far available (Steele [1], Dybvig [2], Kranz et al.
>[3] just to mention a few) seem to suggest that scheme may not need much in
>the way of efficiency tricks (except perhaps to indicate to the compiler
>that built-in functions will not be re-defined) to be compiled and
>optimized properly.  [On the other hand, arguably a case may be made for
>additional constructs for even *better* results]

I have no problem with the idea that Scheme's control structures,
including call/cc, can be implemented efficiently.  Ditto lists,
function calls, ...  The things I was thinking of were more like:

  *  Fixnum arithmetic.

  *  Dynamic extent declarations.
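
For instance (a sketch of the kind of declaration I have in mind, with a
made-up function, not a benchmark), a CL compiler is allowed to
stack-allocate a &rest list that is declared to have dynamic extent:

    (defun sum-args (&rest args)
      ;; ARGS is promised not to survive the call, so the implementation
      ;; may build the rest list on the stack and generate no garbage.
      (declare (dynamic-extent args))
      (apply #'+ args))

Standard Scheme gives you no portable way to say either of these things.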

>>It is still difficult to implement an efficient Scheme, despite
>>its size ...
>
>Give me a unit of measure for your understanding of *efficient*, so that
>we'll know what this new claim is all about.

By efficient, I mean something like: as fast as C.

But I think the "difficult" is more important.  An efficient Scheme
requires a lot of attention to garbage collection and compiler
technology, and this tends to make the small size of the language
less significant.
From: Ozan Yigit
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <20650@yunexus.YorkU.CA>
In article <····@skye.ed.ac.uk> ····@aiai.UUCP (Jeff Dalton) writes:
> ...  The things I was thinking of were more like:
>
>  *  Fixnum arithmetic.
>  *  Dynamic extent declarations.

Yes, I realized that later. The first would be very useful; the second,
for the purposes of (fluid-let ...), can possibly be implemented with
minimum cost *and* without interfering with the correct optimization of
tail recursion, as in [1].

>By efficient, I mean something like: as fast as C.

Ah, I thought you probably meant only as fast as an average CommonLipth
with all tricks turned on. ;-)

>... An efficient Scheme
>requires a lot of attention to garbage collection ...

No more or no less than anything else that has a need for GC ...

>... and compiler technology, 

Aw, come on. You well know that there is a lot of mileage on this one.
Writing an average compiler for R3.99RS/IEEE Scheme is essentially
trivial [in comparison to something like C, for example]. Further, it is
my experience that some low-cost common-sense optimizations (that don't
require a detailed side-effect analysis as in Rabbit and others) still pay
off handsomely. It just depends how far you want to go. (!!)

>and this tends to make the small size of the language less significant.

See (!!). The ever-growing number of Scheme implementations seems to suggest
that the small size and reduced complexity of the language *is*
significant. It *is* possible to have a small, std-compliant, useful and
fast scheme.

oz
---
[1] B.F. Duba, M. Felleisen, D. P. Friedman, Dynamic Identifiers can
    be Neat, Computer Science Technical Report 220, Indiana University,
    Bloomington, Indiana, April 1987.

---
We only know ... what we know, and    | Internet: ··@nexus.yorku.ca 
that is very little. -- Dan Rather    | UUCP: utzoo/utai!yunexus!oz
From: Andrew L. M. Shalit
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <ALMS.91Jan17145321@ministry.cambridge.apple.com>
In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) writes:

   There isn't any reason, other than historical, why
   Common Lisp couldn't be presented, and even implemented, in a more
   C-like way, as a language plus libraries of procedures and data types.

Offhand, I disagree with this.

It's true, Common Lisp has many features.  But these features are
often used to implement other features.  In other words, a CL
implementation has a very tangled call tree.  It's hard to find
portions of the language which could be removed.  If you put APPEND,
ASSOC, MEMBER, REVERSE, and MAPCAR into a separate module (as EuLisp
does, I believe) chances are that every implementation is going to
have them in the kernel anyway.  Hash-tables are used to implement
packages.  Format is used for error messages and other system io.
Sequence functions are used all over the place, etc.  The only real
candidate for separability I can think of is non-integer numerics.

     -andrew
--
From: John Gateley
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <GATELEY.91Jan17163302@datura.rice.edu>
In article <··················@ministry.cambridge.apple.com> ····@cambridge.apple.com (Andrew L. M. Shalit) writes:
   In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) writes:
      There isn't any reason, other than historical, why
      Common Lisp couldn't be presented, and even implemented, in a more
      C-like way, as a language plus libraries of procedures and data types.
   Offhand, I disagree with this.
   It's true, Common Lisp has many features.  But these features are
   often used to implement other features.  In other words, a CL
   implementation has a very tangled call tree.

By choosing an appropriate set of primitives, you can get a small core
library with the property that the majority of functions in the CL
library will call only members of the core library (or the core
library plus a small set of others). This gives you the needed
untanglement.
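
A crude sketch of what I mean (the MY- names are purely illustrative):
much of the list library bottoms out in a handful of core primitives
such as CAR, CDR, CONS, NULL, EQL and FUNCALL.

    (defun my-member (item list &key (test #'eql))
      ;; Library-level function; it needs only the core primitives.
      (cond ((null list) nil)
            ((funcall test item (car list)) list)
            (t (my-member item (cdr list) :test test))))

    (defun my-reverse (list)
      ;; Likewise: just CONS, CAR, CDR and NULL (plus the DO macro).
      (do ((rest list (cdr rest))
           (result '() (cons (car rest) result)))
          ((null rest) result)))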

John
·······@rice.edu

--
"...Yes, I've got some questions that are guaranteed to shake you up. How
much marriage urges a windmill to paint infinity? Is a magic hide-a-bed
the vile home of spanish fire? Is firm corn merrier under gifts of less
important love? We wonder ..." The Residents
From: Tim Bradshaw
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <TIM.91Jan18130158@kahlo.cstr.ed.ac.uk>
>>>>> On 17 Jan 91 22:33:02 GMT, ·······@rice.edu (John Gateley) said:

> By choosing an appropriate set of primitives, you can get a small core
> library with the property that the majority of functions in the CL
> library will call only members of the core library (or the core
> library plus a small set of others). This gives you the needed
> untanglement.

And as well as this it is relatively easy to disentangle the language
at a coarser level: leave CLOS, the new loop macro and various other
big chunks of CL mentioned in CLtL2 out of the core of CL.  In fact I
would be fairly surprised & disappointed if CL implementations did *not*
do this!

--tim
Tim Bradshaw.  Internet: ···········@nsfnet-relay.ac.uk
UUCP: ...!uunet!mcvax!ukc!cstr!tim  JANET: ···@uk.ac.ed.cstr
"...wizzards & inchanters..."
From: Gregor Kiczales
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <GREGOR.91Jan18104953@spade.parc.xerox.com>
In article <·················@kahlo.cstr.ed.ac.uk> ···@cstr.ed.ac.uk (Tim Bradshaw) writes:

   And as well as this it is relatively easy to disentangle the language
   at a coarser level: leave CLOS, the new loop macro and various other
   big chunks of CL mentioned in ClTL2 out of the core of CL.  In fact I
   would be fairly surprised & disappointed if CL implemtations did *not*
   do this!

Actually, I would think there were better things to leave out of the
core implementation.  In fact, I would think that using CLOS in the core
is a good idea.  Its runtime can be quite small, and using it there can
provide a foundation for extensibility that many users want.

What I would leave out of the kernel implementation is stuff like the
hairy sequence functions, format and the like.  I think of these as
libraries, which can easily be separated.  It would seem that, in most
implementation strategies, these things would have a larger runtime and
be more intertwined with one another.
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <3971@skye.ed.ac.uk>
In article <··················@ministry.cambridge.apple.com> ····@cambridge.apple.com (Andrew L. M. Shalit) writes:

>   There isn't any reason, other than historical, why
>   Common Lisp couldn't be presented, and even implemented, in a more
>   C-like way, as a language plus libraries of procedures and data types.

>Offhand, I disagree with this.

Well, for one thing, I don't think you're taking history enough
into account.  The large, monolithic Common Lisp systems we all
know and love were constructed that way for reasons that I would
say are essentially historical.  Implementors from a different
background and with different experience of Lisp and different
expectations of how it would be used might well have implemented
Common Lisp (the very same language) in a significantly different
way.

You later say that Common Lisp implementations have a "very tangled
call tree".  Now, maybe that is true.  But it's not necessary to
implement CL that way, and it looks like you haven't looked at any
implementations to see if it's even _been_ implemented that way.  (Not
that that would settle the matter, since I was talking about how the
language could be implemented, not how it was implemented.)

Of course, there's no doubt that the kernel of CL is bigger than that
of C.  But that doesn't mean that a separation between kernel and the
rest can't be made.

>It's true, Common Lisp has many features.  But these features are
>often used to implement other features.  In other words, a CL
>implementation has a very tangled call tree.  It's hard to find
>portions of the language which could be removed.

There seem to be a fair number of people (especially in the UK?) who
think this is so and, moreover, must be so.  However, after citing
format, packages, and maybe multiple values they tend to run out of
examples.  Perhaps there are many more examples, but it requires some
careful investigation to determine just how far the problem extends.

My feeling is that the impression of a tangled language is due at
least in part to how CL has been presented, and -- I claim -- it could
be presented differently.  It is possible to extract a coherent subset
of CL, at least conceptually.  (Since I've done this a couple of
times, I think I have fairly good reason to think it _can_ be done.)
Of course, the language would be better in this respect if it had been
one of the goals of the design, but it is nonetheless possible to go
further in this direction than one might suppose.

>                                                 If you put APPEND,
>ASSOC, MEMBER, REVERSE, and MAPCAR into a separate module (as EuLisp
>does, I believe) chances are that every implementation is going to
>have them in the kernel anyway.  

That may well be so (except for ASSOC), but you need it to be so
for more than five functions for this to be significant evidence 
for your claim that it is difficult to untangle the language.
Moreover, the kernel needn't use functions in their full generality.
Most likely, MEMQ is used rather than MEMBER, for example (and
the MEMQ calls perhaps compiled away).

BTW, Prolog implementations have append/2 in them, and yet users have
to write it themselves or load it from a library if they want to use
it.  So it's certainly possible to think of a language as not having
append even though it's "really" built-in.

>Hash-tables are used to implement packages.

Not necessarily.

>Format is used for error messages and other system io.

This is the most cited example, and it's not a very good one.  Many
format calls can be compiled away into calls on much simpler functions.
Moreover, system messages can be (and almost certainly are) written to
use only a subset of format's capabilities.
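
To make that concrete (a hand expansion, not any particular system's
output), a call with a constant control string such as

    (format *error-output* "Unbound variable: ~S~%" name)

can be open-coded as

    (progn (write-string "Unbound variable: " *error-output*)
           (prin1 name *error-output*)
           (terpri *error-output*))

so the system's error messages never need the full format interpreter.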

>Sequence functions are used all over the place, etc. 

Really?  Remove-if, perhaps?  There are a number of functions (and not
just sequence functions) that aren't used all over the place.  Look at
KCL, for example, where the more central functions are written in C
while others are written in Lisp.  And it would be possible to reduce
the size of the C kernel if this were felt to be sufficiently
desirable.

Remember too that libraries can be implemented, or connected to the
Lisp system, in various ways that reduce the impact of procedures not
actually in use: e.g., autoloading, shared libraries.

-- JD
From: Aaron Sloman
Subject: Re: Is this the end of the lisp wave? (core + penumbra)
Date: 
Message-ID: <4255@syma.sussex.ac.uk>
····@cambridge.apple.com (Andrew L. M. Shalit) writes:

> In-reply-to: ····@aiai.ed.ac.uk's message of 16 Jan 91 19:01:04 GMT
>
>    There isn't any reason, other than historical, why
>    Common Lisp couldn't be presented, and even implemented, in a more
>    C-like way, as a language plus libraries of procedures and data types.
>
> Offhand, I disagree with this.
>
> It's true, Common Lisp has many features.  But these features are
> often used to implement other features.  In other words, a CL
> implementation has a very tangled call tree.  It's hard to find
> portions of the language which could be removed.
    ......

Jeff Dalton then replied
>
> You later say that Common Lisp implementations have a "very tangled
> call tree".  Now, maybe that is true.  But it's not necessary to
> implement CL that way ......

Here's some indirect evidence in support of Jeff's claim.

Pop-11 is a language that has much of the functionality of Common
Lisp, though with a Pascal-like syntax and a slightly different
execution model using an open stack (like Forth). It supports nearly
all the Lisp data-types plus a few of its own (lightweight
processes, partially applied procedures, external function
closures).

The largest implementation of Pop-11 is in Poplog (which also has
Common Lisp, ML and Prolog), and the latest version includes
mechanisms for interfacing to X windows by linking in widget sets.
This implementation of Pop-11 comes in the form of a core language
plus autoloadable extensions, plus additional incrementally loadable
utilities. Unlike most Common Lisp systems, the core Poplog Pop-11
executable image on a Sun-3, including the basic X facilities and
the integrated editor VED, takes about 1 Mbyte (1103264 bytes) and
slightly more on a Sun-4 (1471496 bytes) (as revealed by "size").
This includes the incremental compiler, and compiler building tools
that form the basis for the other language implementations. There's
no interpreter.

(The previously released version was about 200Kbytes smaller,
before the addition of new external load, callback, signal handling
mechanisms, and external function closures.)

I suspect that if this degree of compactness is possible for a core
Pop-11 system (including sophisticated editor) with the remaining
facilities available when required, then it should also be possible
for Common Lisp.

In fact Poplog Common Lisp requires an additional 742 KB on a Sun-3
and about 1 extra Mbyte on a Sun-4. I.e. it requires about 2Mbytes,
including the editor and Pop-11 system. (Prolog and ML each add
somewhat less).

It is possible to build a somewhat smaller version of Pop-11 by
leaving out various components, e.g. the editor, the compiler,
big integer arithmetic, complex arithmetic, etc.

However, because there are so many facilities (e.g. generic
arithmetic functions) that have to be able to operate on many
data-types it was a non-trivial exercise to implement the system in
such a way as to make it possible to relink with various components
missing.

But I expect the same thing could be done for Common Lisp. This
suggests that Jeff is correct. I don't know what the minimum size of a
functioning Common Lisp able to compile and run user programs would
be, with the ability to be expanded to a full, reasonably efficient
Common Lisp by compiling library files. In the case of Pop-11 I
suspect it would be somewhere between 300 and 600 Kb, leaving out
the editor, pattern matcher, tracing facilities, etc.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
    EMAIL   ······@cogs.sussex.ac.uk
From: Eliot Handelman
Subject: Poplog (was: Re: Is this the end of the lisp wave? (core + penumbra))
Date: 
Message-ID: <5538@idunno.Princeton.EDU>
In article <····@syma.sussex.ac.uk> ······@syma.sussex.ac.uk (Aaron Sloman) writes:

;The largest implementation of Pop-11 is in Poplog (which also has
;Common Lisp, ML and Prolog), and the latest version includes
;mechanisms for interfacing to X windows by linking in widget sets.


I keep hearing about Poplog from our friends in the UK, but I'm still
unclear as to what exactly it means "to have CL, ML and Prolog" in this 
language. Could some poplogger post an example that shows how these 
three languages might be intertwined in a poplog program? 


--eliot
From: Chris Dollin
Subject: Re: Poplog (was: Re: Is this the end of the lisp wave? (core + penumbra))
Date: 
Message-ID: <KERS.91Jan21100538@cdollin.hpl.hp.com>
Eliot Handelman writes:

   I keep hearing about Poplog from our friends in the UK, but I'm still
   unclear as to what exactly it means "to have CL, ML and Prolog" in this 
   language. Could some poplogger post an example that shows how these 
   three languages might be intertwined in a poplog program? 

Poplog is not a language; it is a system in which the four mentioned languages
(Pop11, Prolog, ML, and Common Lisp) share the same execution environment.
Hence they share the same data-type representations and call-stack structures,
hence any of the languages is callable from any of the others [*1].

Consequently a Poplog program can have different components coded in different
languages; for example, I once wrote a Prolog program that needed its input
tokenised; rather than writing a Prolog tokeniser, I just used Pop to write one
(using the Pop tokeniser and hash tables). It's a horses-for-courses approach,
allowing the exploitation of the strengths of each language where they're most
beneficial [*2].

[*1] Sometimes a little glue is needed to connect disparate language concepts,
eg Prolog backtracking, and the ML <-> other link started off weaker than the
others because of ML's strong typing and peculiar approach to redefinition of
identifiers in an interactive environment.

[*2] Actually, I almost never use Prolog, CL, or ML inside Poplog - but that's
a matter of linguistic preference rather than anything else.
--

Regards, Kers.      | "You're better off  not dreaming of  the things to come;
Caravan:            | Dreams  are always ending  far too soon."
From: Steve Knight
Subject: Re: Poplog (was: Re: Is this the end of the lisp wave? (core + penumbra))
Date: 
Message-ID: <1350034@otter.hpl.hp.com>
> I keep hearing about Poplog from our friends in the UK, but I'm still
> unclear as to what exactly it means "to have CL, ML and Prolog" in this 
> language. Could some poplogger post an example that shows how these 
> three languages might be intertwined in a poplog program? 

The Poplog system includes compilers for four "intrinsic" languages.  Each
language has extensions for accessing the other compilers.  (The compilers
are incremental and interactive and play the role of interpreters.)  
It's normal to organise a multi-language program so that the usages of
the different languages are isolated in different files.  The example I give
below puts the program all in one file, which is less usual.

What makes inter-language working feasible in Poplog is that the four
intrinsic languages share the same underlying datatypes.  For example, Prolog's
numbers, atoms, and lists are identical to those of Common Lisp.  Structures
which are special to Common Lisp (such as vectors) are treated as atoms
by Prolog.  Common Lisp is supplied with a number of functions for manipulating
Prolog terms.

For example, from inside Prolog you can call out to Lisp using
    lisp_apply(Func, Arglist, Result)
or, for multiple values, 
    lisp_mv_apply(Func, Arglist, Resultlist)
There are many other ways to access Lisp entities from Prolog, of course.

The Lisp/Prolog interface utilises the following read macros:
     {functor arg1 arg2 ... argN}    notates a goal in prefix form
     $word                           notates a prolog atom
     ?var                            notates a prolog variable
To illustrate:
     {$append ?x ?y '(a b c d e)}
This creates a Prolog 'goal', a data structure comprising a Prolog term
and an association list mapping Lisp symbols to Prolog variables. Note
that the forms inside the {} brackets are evaluated as normal Lisp forms
(hence the use of $ to notate the functor).

Prolog goals can be invoked as boolean functions, by using:
     (PLOG-GOAL-CALL goal)
There's a special-form DO-PLOG that can be used to iterate over all the 
solutions to a Prolog goal.  e.g.

     (do-plog {$append ?x ?y '(a b c)} (x y)
         (pprint x)
         (pprint y)
         (terpri))

     NIL
     (A B C)

     (A)
     (B C)

     (A B)
     (C)

Finally, here's a tiny example showing Prolog interworking with Pop11 to write
a predicate 'shuffles' that creates random permutations of a list.  I'd do it 
in Common Lisp, but I write Common Lisp so rarely I'd be bound to make a 
mistake!

    /* Note that the RHS of an 'is' can call Pop11 functions -- exploiting
       the syntactic similarity of Pop11 expressions and Prolog terms in order
       to look natural.
    */
    shuffles( L, S ) :- repeat, S is shuffle( L ).

    /* Drop into Pop11 to define the shuffle function.  It's easier to write
       shuffle in Pop11.
    */
    :- prolog_language( 'pop11' ).

    define shuffle( L ); lvars L;
        ;;; copies the list L to a simple vector v
        lvars v = L.destlist.consvector;
        ;;; iterate over decreasing segments of v, swapping the ith element
        ;;; for any item in the segment.
        lvars i;
        for i from v.datalength by -1 to 2 do
            lvars r = random( i );
            ;;; this expression exploits the open stack
            ( v( r ), v( i ) ) -> ( v( i ), v( r ) );
        endfor;
        ;;; returns v copied back to a list
        v.destvector.conslist;
    enddefine;
    
    /* Switch back to prolog */
    :- prolog_language( 'prolog' ).
    
I hope these small examples give you a feel for what Poplog folks are talking
about.  It's only a tiny glimpse, of course, but it may enable you to relate
it to previous work.  

You should also bear in mind that the implementations of
Common Lisp, Prolog, and ML in Poplog do not currently perform as well as
standalone implementations of the same languages -- perhaps half the 
performance, sometimes much worse, sometimes much better.  This is a 
consequence of the underlying compiler backend, which is optimised for
speed of compilation rather than speed of execution of final code.  
(Compilation speed is excellent, though.  There's no need for an interpreter.)

Steve 
From: Aaron Sloman
Subject: Poplog (was: Re: Is this the end of the lisp wave? (core + penumbra))
Date: 
Message-ID: <4333@syma.sussex.ac.uk>
·····@phoenix.Princeton.EDU (Eliot Handelman) writes:

> Date: 21 Jan 91 04:25:21 GMT
> In article <····@syma.sussex.ac.uk> ······@syma.sussex.ac.uk (Aaron Sloman)
    writes:
>
> ;The largest implementation of Pop-11 is in Poplog (which also has
> ;Common Lisp, ML and Prolog), and the latest version includes
> ;mechanisms for interfacing to X windows by linking in widget sets.
>
>
> I keep hearing about Poplog from our friends in the UK, but I'm still
> unclear as to what exactly it means "to have CL, ML and Prolog" in this
> language. Could some poplogger post an example that shows how these
> three languages might be intertwined in a poplog program?
>
Actually, Poplog is not a language: it is a development environment,
which includes several languages.

As for what it means, I'll try to answer in enough detail for you
to imagine how you could implement it and use it.

When the core Poplog starts up it includes a working version of
Pop-11, including store manager, garbage collector, facilities for
interfacing to the operating system, arithmetic, data-structuring
facilities, and external language interface. It also includes the
Poplog editor VED and the incremental compiler for Pop-11.

This compiler reads Pop-11 code from the terminal, from files, or
from the editor buffer, and compiles it to machine code procedures,
as you might expect an incremental compiler for Lisp to do. This
works by reading in a character stream, breaking it into tokens,
analysing them, and then planting instructions for the "Poplog
virtual machine". These Poplog VM instructions are then translated
to a low level virtual machine, and from there a host-specific
translator generates executable machine code (in procedure records).

The core Poplog also includes many built in procedures, including
the procedures that are used to define the Pop-11 compiler. These
procedures are also available to define other compilers.

So among the Poplog libraries are (in source code) compilers for
other languages. These usually include a portion written in Pop-11
to get the language going, and the rest written in the new language.

For example, one of the prolog predicates could be defined in Pop-11
thus:

define:predicate retract/1(Clause);
	lvars Clause;

    prolog_deref(Clause) -> Clause;

	if isprologvar(Clause) then
        bad_goal(Clause, "retract", 1)
    endif;

	prolog_retract(Clause);
enddefine;


And later in the same file you could find a call of a Pop-11 macro
to switch to the Prolog compiler, followed by a prolog definition
using the above Pop-11 procedure:

retractall(Clause) :-
	retract(Clause),
	fail.
retractall(Head) :-
	retract(Head :- Body),
	fail.
retractall(_).


All the Poplog compilers essentially use the same mechanisms as
Pop-11 to read in a text stream, but do different analyses, and then
call the built-in Pop-11 procedures for planting Poplog VM
instructions, which then get compiled in a language independent
manner.

In this way the core Poplog can be extended with compilers for
Common Lisp, Prolog or ML. To avoid having to compile the compilers
every time you use them, they can be pre-compiled and stored in a
saved image, which can then be started up as if it were a standalone
Common Lisp, or Prolog, or ML, etc. If desired a saved image can
include more than one of the extra languages, though Pop-11 is
always there even for people who don't use it explicitly (except
perhaps to extend the editor). Similarly, users can define other
languages using the same mechanism. One user claimed that in a
few weeks of his spare time he implemented Scheme in Poplog. It
then ran fully compiled on all the machines that Poplog runs on.

For all this to be possible, the Poplog VM, and its compiler, had to
include facilities that are not necessarily required for Pop-11. For
example, there is a mechanism for handling a stack of continuations,
used by the Prolog compiler. Also, because Pop-11 has a distinct
boolean data-type, unlike Lisp (which uses NIL for False), there are
special facilities for optimising Lisp conditionals. There are also
some built in sub-routines (defined in the system in Pop-11) that
are required only for particular languages and could not be defined
efficiently in libraries. E.g. procedures for creating and
manipulating Prolog terms and prolog variables, are built in, and
the Prolog unifier is built in as a Pop-11 procedure, as are some
list-processing facilities needed for Lisp. (So the Poplog Pop-11
user has a small overhead of unused facilities, though Poplog
could be linked without them if required.)

Although not strictly necessary, it was decided to use exactly the
same internal data-structures for the different languages wherever
possible. So for example, the integers, bigintegers, floats, double
floats, complex numbers, and rationals, are the same entities for
all the languages (for which they are defined). Hence the same
system routines are used to operate on them. Similarly, Lisp, Pop-11
and Prolog by default use the same data-structures for lists (hence
the problem that arises because Lisp uses NIL instead of a boolean
data type).

Prolog atoms are Pop-11 words, and this makes it possible for Prolog
and Pop-11 to use the same symbol table. Also the Pop-11 "section"
mechanism can then be used to implement Prolog modules.

Pop-11 and ML use the same data-types for strings. Lisp and Pop-11
use the same structures for arrays. And so on.

However some things have to be done differently. E.g. the
requirements for symbol tables in Pop-11 and Lisp are different.
(Pop-11's sections are very different from Lisp's packages). So all
Pop-11 identifiers are accessed via a special Lisp package. Lisp
identifiers are accessed from Pop-11 by prefixing them with a
special prefix (which acts as a Pop-11 macro).

The sharing of data-structures between the languages means that a
list, or array, or string, etc constructed in one language can be
given as argument to a procedure written in another language. (In
the case of Prolog you have to be careful about lists containing
variables.)

Moreover, Pop-11 procedures, Lisp procedures, ML procedures and
compiled Prolog predicates are all the same kinds of Poplog
datastructures, and they all use the same procedure call stack when
invoked. Hence they can all be called in the same way, and any one
of them can call one of the others, provided that the language is
extended with syntax that can be compiled into such a call. This
sort of unofficial extension is provided in all the Poplog versions
of these languages. So for example, to call the Pop-11 function
"last" from Lisp, with the list '(a b c)

    (pop11::last '(a b c))

This will return the Lisp atom 'c.

Mechanisms are also provided for switching compilers in the same
file, so that, for example, a prolog program can define some prolog
code, then switch to pop-11 to define some procedures, then switch
back to prolog to define some predicates that make use of those
procedures. E.g. Some Prolog programmers struggling with assert and
retract for storing global information have found that they can
speed up their programs by a few orders of magnitude by using Pop-11
properties (hash tables) instead.

Poplog also provides an external language interface, implemented
in Pop-11. So a Prolog user wanting to invoke C programs can use
this via Pop-11. Libraries that call the relevant Pop-11 utilities
would normally hide the interface from the user who does not want
to learn a new syntax.

I doubt that any user has seriously employed more than two of the
Poplog languages in one application, though people combining Lisp
and Prolog will be implicitly using Pop-11, and since the Poplog
editor VED is implemented in Pop-11, it is generally most convenient
to use Pop-11 to extend and tailor it.

Examples of applications mixing the languages have been described at
Poplog user group conferences (so far, there are no printed
proceedings). One of the earliest was a medical image processing
system that used Pascal for the lowest level image processing,
Pop-11 for intermediate level processing and Prolog for an expert
system advisor.

A big project at Sussex combines Prolog and Pop-11 in a system that
takes in shorthand typed English reports of traffic incidents as
recorded at police stations, interprets them, makes plans for
distributing the information, and provides a variety of interactive
graphical displays showing what's going on. I don't happen to know
which bits are in Prolog and which in Pop-11, but I expect Prolog is
used for parsing and planning and Pop-11 for managing the graphical
display, inter-process communication, etc.

Sometimes special-purpose languages are added. For example, RESCU
the "Real Time Expert System Club" project in the UK Alvey programme
developed a plant control system based on a mixture of Pop-11 and
a special purpose language implemented for expressing rules for
the plant control system. That prototype has been extended in a
system called COGSYS and a company called COGSYS Ltd has been set up
to market it.


You can find out more in
James A.D.W. Anderson (ed)
    Pop-11 comes of age: the advancement of an AI programming language
    Ellis Horwood, Chichester, 1989

though I fear the price of the book is horrendous.

I hope this helps to answer the question. Apologies for the length.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
    EMAIL   ······@cogs.sussex.ac.uk
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave? (core + penumbra)
Date: 
Message-ID: <3995@skye.ed.ac.uk>
Re: It's true, Common Lisp has many features.  But these features are
    often used to implement other features.  In other words, a CL
    implementation has a very tangled call tree.  It's hard to find
    portions of the language which could be removed.

One reason why it may be hard to see some underlying simplicity in
Common Lisp is that it is sometimes hidden by being "below" the level 
described in CLtL.

Streams are perhaps the easiest example to dissect.  The various
stream types (broadcast-stream, echo-stream, etc.) appear as primitives,
but only a few of them _have_ to be primitives.  The others could be
built up by defining new structures and extending the existing
operations (read, print, etc) to recognize the new stream types.  That
would be a fairly straightforward thing to do if the stream operations
were generic functions (as in CLOS).
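
A toy sketch of what that would look like (STREAM-WRITE-CHAR is an
invented generic here, not part of the standard, and the sketch assumes
the built-in streams are classes one can specialize on):

    (defclass my-broadcast-stream ()
      ((streams :initarg :streams :reader broadcast-targets)))

    (defgeneric stream-write-char (stream char))

    (defmethod stream-write-char ((s stream) char)
      ;; Ordinary streams just fall through to the primitive operation.
      (write-char char s))

    (defmethod stream-write-char ((s my-broadcast-stream) char)
      ;; The "library" stream type is layered entirely on top of it.
      (dolist (target (broadcast-targets s) char)
        (stream-write-char target char)))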

In an implementation with a well-integrated CLOS, streams might even
be implemented that way.  However, when CL was defined there wasn't
a standard object system and so most implementations have used
internal object-like mechanisms of their own.

Here we can see a break in the history leading to CL that has
been partially repaired by the additions of CLOS.  In (ITS) MacLisp,
a number of stream operations were implemented using a simple object-
like facility called "software file arrays" (SFAs).  In Lisp Machine
Lisp, it was done with Flavors.  The reason the repair is partial
is that CLOS was a sufficiently late addition that the stream
operations were not specified as generic functions.  They may be
in particular implementations, but portable code can't rely on it.

-- jd
From: ········@cc.utah.edu
Subject: Re: mistaken idea of C? (was end of lisp wave?)
Date: 
Message-ID: <106389@cc.utah.edu>
In article <····@skye.ed.ac.uk>, ····@aiai.ed.ac.uk (Jeff Dalton) writes:
 (some stuff omitted for brevity)

Oh no!  Here we go again...  It is bad enough for FORTRAN users
to mistake C's libraries for being the same thing as FORTRAN's, and
Ada/Pascal/Modula-2 folks to mistake the C header files for being
the same thing as Pascal's {INCLUDE ....} directives.  But
as fellow functional language advocates, I expect LISPers to know
better.
>
>  If we start with C and
> then add in various libraries, it starts to look fairly large too.
> 

In C, you do not add in the libraries.  You add in ONLY those
functions from that library that you requested.  For example,
if you have the log() function in your code, the -lm directive
to the link loader extracts only the log() function, not every
function in the library!  In other words, you pay for only what
you need, no more.  I would say that most C executable programs
are 1/2 the size of a LISP program (including the overhead of the
LISP system) for any moderately large system.

> 
> In addition, constructs with related functionality (a data type and
> procedures for operating on its instances, for example) are often
> packaged together in a module.  If your program makes use of such
> facilities, you must request the appropriate module, just as you must
> include the appropriate ".h" file in C.
> 

You do not have to #include the header file, and a header file is
NOT a module.  All that header files (at least system header files) contain
are type definitions, the return type of a function (and with ANSI C,
possibly the types of its arguments), and perhaps some data structures.
Or at least that is all they SHOULD contain.  I cringe every time I am
reminded of a teacher at a University (who shall remain unnamed) telling
his students to write code they use over and over and put it in a header
file!  This is NOT the way to write C!  For example, the following is a
legitimate C program, and note, not one header file.  Just be sure to say
(assuming the file the program is in is named prog.c) -lm on Unix systems,
ie, "cc -o prog prog.c -lm"; in other words, direct the linker to be sure
to look for the log() function in the Math library.

/*
	prog.c
	note that since we did not include any header files, we MUST
	declare the return type of any function that is not of the
	default type (int for standard C).
*/
double	atof(), log();

int main(argc, argv)
int	argc;
char	*argv[];
{
	double	arg, result;

	if (argc < 2)
	{
		puts("usage: prog <number you want log of>");
		exit(0);
	}
	arg = atof(argv[1]);
	result = log(arg);
	printf("The log of %f\nis: %f\n\n", arg, result);

	exit(0);
}

Hope this clears up ANY confusion about the language C.  Personally,
I like both LISP and C, so please don't flame me.
From: Harley Davis
Subject: Re: mistaken idea of Lisp?
Date: 
Message-ID: <DAVIS.91Jan22133951@barbes.ilog.fr>
In article <······@cc.utah.edu> ········@cc.utah.edu writes:

   >  If we start with C and
   > then add in various libraries, it starts to look fairly large too.

   In C, you do not add in the libraries.  You add in ONLY those
   functions from that library that you requested.  For example,
   if you have the log() function in your code, the -lm directive
   to the link loader extracts only the log() function, not every
   function in the library!  In other words, you pay for only what
   you need, no more.  I would say that most C executable programs
   are 1/2 the size of a LISP program (including the overhead of the
   LISP system) for any moderately large system.

I think it was clear that Jeff was talking about the size of the
language itself, as represented, for example, in the weight of the
ANSI C specification, not the size of applications developed in the
language.  There is no special reason why some Lisp dialect using
modules could not use the same kind of linkers which C uses to limit
final application size.

   Hope this clears up ANY confusion about the language C.  Personally,
   I like both LISP and C, so please don't flame me.

I like both Bordeaux and beer.  To each his own.

-- Harley
--
------------------------------------------------------------------------------
Harley Davis			internet: ·····@ilog.fr
ILOG S.A.			uucp:  ..!mcvax!inria!ilog!davis
2 Avenue Gallie'ni, BP 85	tel:  (33 1) 46 63 66 66	
94253 Gentilly Cedex		
France
From: Jeff Dalton
Subject: Re: mistaken idea of C? (was end of lisp wave?)
Date: 
Message-ID: <3990@skye.ed.ac.uk>
In article <······@cc.utah.edu> ········@cc.utah.edu writes:
>In article <····@skye.ed.ac.uk>, ····@aiai.ed.ac.uk (Jeff Dalton) writes:
>
>Oh no!  Here we go again...  It is bad enough for FORTRAN users
>to mistake C's libraries to be the same thing as FORTRAN's, and
>Ada/Pascal/Modula-2 folks to mistake the C header files for the
>being the same thing as Pascal's {INCLUDE ....} directives.  But
>as fellow functional language advocates, I expect LISPers to know
>better.

Sigh.  I was assuming that in this newsgroup I wouldn't have to
spell out every detail and wouldn't have to explain everything
at length just to prove I knew what I was talking about.  I
also expected people to try to interpret my statements as true
ones before concluding that they must be false.

>>  If we start with C and
>> then add in various libraries, it starts to look fairly large too.

And indeed it does start to look large.  Common Lisp looks big
because, for one thing, there are over 700 functions in it.  The
whole point of my message was that (1) if more of the C libraries
were considered part of the language, it would start to look big
just as CL does, and (2) if CL were implemented more like C, its
size wouldn't be such a burden in practice.

But for some reason, you have decided that I think all the C libraries
might be included in every a.out:

>In C, you do not add in the libraries.  You add in ONLY those
>functions from that library that you requested.

Not quite.  You get the ones you call plus whatever they call
and so on.

Moreover, Lisp could be implemented in a similar way.  Indeed, some
implementations are moving in that direction.  But they work by
throwing out what's not needed rather than by putting in only what is.
There's nothing to stop them from doing it the other way, though.

>> In addition, constructs with related functionality (a data type and
>> procedures for operating on its instances, for example) are often
>> packaged together in a module.  If your program makes use of such
>> facilities, you must request the appropriate module, just as you must
>> include the appropriate ".h" file in C.

Note: this is what we call an analogy.  I am not saying "header files
_are_ modules".  Nor am I assuming the header files contain the code.
The header file (used in a certain rather common way) is analogous to
an interface description.  If you want to use the stdio library, for
example, you will (usually) #include <stdio.h>.  I assumed that this
was common knowledge.  Guess not.

>You do not have to #include the header file, 

Depends on whether you want to write the declarations yourself
or not.

>and a header file is NOT a module.

See above.

>  All header files contain (at least system header files)
>are type definitions, Or at least that is all they SHOULD contain.

Actually, header files can include anything you want.  However
there are certain conventions ...  And system header files contain
declarations (not just type definitions) and macro definitions.

>                                            I cringe every time I am
>reminded of a teacher at a University (who shall remain unnamed) telling
>his students to write code they use over and over and put it in a header
>file!  This is NOT the way to write C!  For example, the following is a
>legitimate C program, and note, not one header file.

I hope you don't think it's good practice to not, for example,
include stdio.h when using stdio.

>Hope this clears up ANY confusion about the language C.  Personally,
>I like both LISP and C, so please don't flame me.

Well, sorry, but I couldn't let so complete a misunderstanding
go uncorrected and a flame is what came out.

-- jeff
From: Mark Friedman
Subject: Re: mistaken idea of C? (was end of lisp wave?)
Date: 
Message-ID: <MARKF.91Jan22142355@montreux.ai.mit.edu>
In article <······@cc.utah.edu> ········@cc.utah.edu writes:

   In C, you do not add in the libraries.  You add in ONLY those
   functions from that library that you requested.  For example,
   if you have the log() function in your code, the -lm directive
   to the link loader extracts only the log() function, not every
   function in the library!  In other words, you pay for only what
   you need, no more.  I would say that most C executable programs
   are 1/2 the size of a LISP program (including the overhead of the
   LISP system) for any moderately large system.

This is generally not the case for C linkers. Usually the entire object
module containing the function that you need is extracted from the
library and linked into your executable. Run nm (which lists external
symbol names) on your executable and you'll see what I mean. I would
have included the output from nm on your small test program in this
message, but it was 209 lines long :-)

   Hope this clears up ANY confusion about the language C.  Personally,
   I like both LISP and C, so please don't flame me.

Hope this clears up ANY confusion about the language C.  Personally, I
like both LISP and Scheme, so please don't flame me :-)

-Mark
--

Mark Friedman
MIT Artificial Intelligence Lab
545 Technology Sq.
Cambridge, Ma. 02139

·····@zurich.ai.mit.edu
From: Mike Clarkson
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <17550@ists.ists.ca>
In article <····@turquoise.UUCP> ········@motcid.UUCP (Mark Ahlenius) writes:
>···@babypuss.mitre.org (David J. Braunegg) writes:
>
>>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>>requirements, training, etc.
>>>
>>>Common LISP effectively died from obesity.
>
>>OK.  What are the problems preventing a smaller, more efficient Lisp
>>so that we aren't forced to use the almost-a-programming-language C?
>
>There is a smaller, faster dialect of CL out there and as far as I
>know it is being taught in some of the major universities
> - its Scheme.

I'm not sure if Scheme is really smaller.  I am sure that, in my benchmarking
of almost every dialect of Scheme available, Franz and Lucid CLs are faster.
And this is without declarations, which are used regularly in CL.

As for being smaller, it depends on what you mean by smaller.  Yes, the
languages as defined by their respective standards make Scheme a smaller
dialect of Lisp.  But bear in mind that the Scheme standard does not define
many things lisp programmers consider essential, such as macros.  MIT makes
probably the most complete Scheme working environment, which includes
important elements of a Lisp programming environment such as macros,
packages, an inspector, etc.; compiled, it is over 35 Mbytes.
Not small by my measure. 

> There appears to be some renewed interest in Scheme lately.

There is a lot of interest in Scheme, for very good reasons.  But speed and
size are not two of them.

Mike.


--
Mike Clarkson					····@ists.ists.ca
Institute for Space and Terrestrial Science	uunet!attcan!ists!mike
York University, North York, Ontario,		FORTRAN - just say no. 
CANADA M3J 1P3					+1 (416) 736-5611
From: Ozan Yigit
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <20709@yunexus.YorkU.CA>
In article <·····@ists.ists.ca> ····@apl.ists.ca (Mike Clarkson) writes:

>As for being smaller, it depends on what you mean by smaller. 

Sure. It all depends. I, for example, like this definition: "smaller" as
it relates to the time it takes an average 16-year-old high-school student
with a Mac Classic and a C compiler to implement the entire language. ;-)

>But bear in mind that the Scheme standard does not define
>many things lisp programmers consider essential, such as macros.

That depends on what you mean by essential.  If I were to take some of the
discussion in this newsgroup as a crude measure, I would probably conclude
that anything in the silver book is "essential", with all that implies.
As for macros, the issue has never been "doing it" but rather "doing it
right", just to reiterate in case that point hasn't been made often enough.
A macro facility is included in just about every implementation of the
language, and most of those also support the more unified extend-syntax
(Dybvig) facility.  A proposal for a very powerful and hygienic macro
facility is being completed for some edition of the revised report.
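
To spell out what "doing it right" is about, here is the classic capture
problem, written defmacro-style since that is what most readers here
know (the names are invented for illustration):

(defmacro my-or (a b)
  `(let ((tmp ,a))          ; TMP belongs to the expansion...
     (if tmp tmp ,b)))

;; (let ((tmp 'mine)) (my-or nil tmp))  =>  NIL, not MINE:
;; the user's TMP has been captured by the expansion's TMP.

A hygienic facility rules such capture out by construction, rather than
relying on the macro writer to remember gensym.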

> MIT makes probably the most complete Scheme working environment, ...

which depends on what you mean by a most complete working environment,
which turns out to change in some circles just about every week. ;-)

... oz
---
We only know ... what we know, and    | Internet: ··@nexus.yorku.ca 
that is very little. -- Dan Rather    | UUCP: utzoo/utai!yunexus!oz
From: Dan Corbett
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <18944@ultima.socs.uts.edu.au>
···@babypuss.mitre.org (David J. Braunegg) writes:

>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.


>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?

We had those Lisps and we threw them away.  People got obsessed with big and
powerful languages.  Here's how you too can help bring back useful Lisps.

1) Read McCarthy's paper, in which he describes the whole purpose of
	inventing Lisp. ("Recursive Functions of Symbolic Expressions,"
	CACM 3,4, April 1960)  

2) Compare McCarthy's description to Common Lisp, and see where the authors
	of CL have completely deviated from the original intent of Lisp.

3) Look at the older implementations of Lisp and see the beauty of a simple,
	well-defined language.  You don't have to go back to 1.5, take a
	look at Franz or UCI Lisp.

-----------------------------------------------------------------------------
Dan Corbett
Department of Computer Science
University of Technology, Sydney
Australia
········@ultima.socs.uts.edu.au
-----------------------------------------------------------------------------
From: Eliot Handelman
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <5420@idunno.Princeton.EDU>
In article <·····@ultima.socs.uts.edu.au> ········@phobos.socs.uts.edu.au (Dan Corbett) writes:

;3) Look at the older implementations of Lisp and see the beauty of a simple,
;	well-defined language.  You don't have to go back to 1.5, take a
;	look at Franz or UCI Lisp.


Yes, I miss this style of programming:

(defun grev (lis)
  (cond ((null (eval (caaadddadr lis))) (eval (caddadddaaadr lis)))
	((gremfrnk1 (caddadadr lis) 'glork 'fzt nil nil nil 'ftzwk)
	  (append (eval (cons (concat 
			  (cons 'flm (cadr (explode (cadddadar lis)))) )
				 (eval (cons 'list (caddadr lis)))))
		  (list (cadadadddar lis) 'brkvt)))
        (t (grev1 (list 'grk 'frmp nil nil nil nil nil nil nil)))))

Simple, yet elegant.
From: Mark Rosenstein
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <MBR.91Jan16073223@ponape.flash.bellcore.com>
In article <····@idunno.Princeton.EDU> ·····@phoenix.Princeton.EDU (Eliot Handelman) writes:

   From: ·····@phoenix.Princeton.EDU (Eliot Handelman)
   Newsgroups: comp.lang.lisp
   Date: 16 Jan 91 07:22:16 GMT
   References: <······@linus.mitre.org> <·····@ultima.socs.uts.edu.au>
   Sender: ····@idunno.Princeton.EDU
   Organization: Princesspool University, New Jersey
   Lines: 19

   Yes, I miss this style of programming:

   (defun grev (lis)
     (cond ((null (eval (caaadddadr lis))) (eval (caddadddaaadr lis)))
	   ((gremfrnk1 (caddadadr lis) 'glork 'fzt nil nil nil 'ftzwk)
	     (append (eval (cons (concat 
			     (cons 'flm (cadr (explode (cadddadar lis)))) )
				    (eval (cons 'list (caddadr lis)))))
		     (list (cadadadddar lis) 'brkvt)))
	   (t (grev1 (list 'grk 'frmp nil nil nil nil nil nil nil)))))

   Simple, yet elegant.

Nah.
"Our goal is that students who complete this subject should have a
good feel for the elements of style and the aesthetics of programming.
They should have command of the major techniques for controlling
complexity in a large system. They should be capable of reading a
50-page-long program, if it is written in an exemplary style. They
should know what not to read, and what they need not understand at any
moment.  They should feel secure about modifying a program, retaining
the spirit and style of the original author."
      from: "Structure and Interpretation of Computer Programs"
             Abelson, Sussman

You can write bad code in any language.  Maybe even Snobol?  I believe
that with the macro facility of Lisp, keyword args, and the string stuff,
it is possible to write an understandable and, more importantly,
maintainable version of the above.  To get a fundamental feel for the
difference between C and Lisp, I would argue, look and think hard about
the differences between CLX and the C version of Xlib.  The use of
keyword args, the use of objects, a major part of the decomposition of
the problem is different, yet the functionality is the same.  Look and
decide which fits your style.  Also note which drops you into a break
loop on a wrong argument, and which gives you a segmentation fault, core
dumped (oops, sorry, an environment issue accidentally cropped up :^))
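
To show the flavour of the keyword-args point (a made-up function, not
the actual CLX interface):

(defun make-window (&key parent (x 0) (y 0)
                         (width 100) (height 100) (border-width 1))
  "Illustrative only, not the real CLX call."
  (list :parent parent :x x :y y :width width
        :height height :border-width border-width))

;; Call sites name only what they care about and get defaults for the
;; rest, e.g. (make-window :parent root :width 640 :height 480).
;; The C equivalent tends to be a long positional argument list or a
;; structure you fill in field by field.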

More fundamentally, I don't understand the "If I understand it, I'd
build it in C" argument or the obesity argument.  If I were programming
in C, I'd be using C++ and X, and maybe some other stuff.  Look at how
big Xclock is.  Real programmers don't need window systems?  (Or hash
tables?  Or string manipulation?  Or object systems?  Or io?)

If you want the functionality, you have to pay a price.  Not that I
don't want my lisp vendor to try to slim things down, but a C program
with the full functionality isn't going to be small either.

Maybe it's "if I understand it, and it doesn't do much, I build it in
C"?  That isn't fair, and I know it.  But I think the overhead of Lisp
is smaller when you are building bigger things.

I dunno.  I honestly have no idea why lisp isn't more widely used, or
whether, if we played history back and Perq had won the workstation
contest instead of Sun, lisp would be more likely to be widely used.
Maybe there'd be hundreds of people declaring how wonderfully clear
variant records are?  I dunno.

Mark.
-----
"C is a flexible programming language that gives a great deal of
freedom to the programmer. This freedom is the source of much of its
expressive power and one of the main strengths of C, making it
powerful, versatile and easy to use in a variety of application areas.
However, undisciplined use of this freedom can lead to errors."
      from: "C An Advanced Introduction"
             Gehani
From: Ozan Yigit
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <20535@yunexus.YorkU.CA>
In article <····@idunno.Princeton.EDU> ·····@phoenix.Princeton.EDU
(Eliot Handelman) writes:

>Simple, yet elegant.

Still is. ;-)

;;; fib with peano arithmetic and call/cc (by Kent Dybvig)

(define addc
   (rec addc
      (lambda (x y k)
         (if (zero? y)
             (k x)
             (addc (1+ x) (1- y) k)))))

(define fibc
   (rec fibc
      (lambda (x c)
         (if (zero? x)
             (c 0)
             (if (zero? (1- x))
                 (c 1)
                 (addc (call/cc (lambda (c) (fibc (1- x) c)))
                       (call/cc (lambda (c) (fibc (1- (1- x)) c)))
                       c))))))


And just in case you are not convinced, here is another expert
opinion. 

The wonderful thing about Scheme is:
Scheme is a wonderful thing.
Complex procedural ideas
Are expressed via simple strings.
Its clear semantics, and lack of pedantics,
Help make programs run, run, RUN!
But the most wonderful thing about Scheme is:
Programming in it is fun,
Programming in it is FUN!

                        ······@hundred-acre-wood.milne.disney
                        forwarded by ········@linus.uucp


cheers...	oz
---
Where the stream runneth smoothest,   | Internet: ··@nexus.yorku.ca 
the water is deepest.  - John Lyly    | UUCP: utzoo/utai!yunexus!oz  
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <3955@skye.ed.ac.uk>
In article <·····@ultima.socs.uts.edu.au> ········@phobos.socs.uts.edu.au (Dan Corbett) writes:

>2) Compare McCarthy's description to Common Lisp, and see where the authors
>	of CL have completely deviated from the original intent of Lisp.

This is an interesting claim.  Can you flesh it out a bit so I don't
have to do all that textual analysis?
From: John R. Dudeck
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <279533fb.52e8@petunia.CalPoly.EDU>
>>2) Compare McCarthy's description to Common Lisp, and see where the authors
>>	of CL have completely deviated from the original intent of Lisp.
>
>This is an interesting claim.  Can you flesh it out a bit so I don't
>have to do all that textual analysis?

When I first tried to learn Lisp, somebody loaned me a copy of McCarthy's
book on 1.5.  It was about 1/4" thick, and I think most of that was
background.  I think you can describe the whole language in about 2 pages.

Sorry I can't be more helpful.  I wish I had that book!

-- 
John Dudeck                                        "Communication systems are
·······@Polyslo.CalPoly.Edu                              inherently complex".
ESL: 62013975 Tel: 805-545-9549                                 -- Ron Oliver
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <4010@skye.ed.ac.uk>
In article <·············@petunia.CalPoly.EDU> ·······@polyslo.CalPoly.EDU (John R. Dudeck) writes:

>>>2) Compare McCarthy's description to Common Lisp, and see where the authors
>>>	of CL have completely deviated from the original intent of Lisp.

>>This is an interesting claim.  Can you flesh it out a bit so I don't
>>have to do all that textual analysis?

>When I first tried to learn Lisp, somebody loaned me a copy of McCarthy's
>book on 1.5.  It was about 1/4" thick, and I think most of that was
>background.  I think you can describe the whole language in about 2 pages.

1. It's fairly easy to find that book in libraries, and many people
   (eg, me) have a copy.

2. It didn't describe the whole language in about two pages.  It is
   possible to describe the core in a few pages, and something
   like that is done in part of the book.  However, a description of
   all the built-in functions, the program feature (prog), the
   compiler, etc. takes a bit more.

3. "McCarthy's description" might not refer to the 1.5 book but to
   his CACM article or to one of his other short descriptions of the
   language.

Nonetheless, there's no doubt that the CL description is much bigger.
Even an interpreter for CL in CL that handles all the special forms,
the lambda-list keywords (&optional, &rest, &key), etc., won't fit in
two pages, though a Scheme-in-Scheme or a Lisp-1.5-in-Lisp-1.5
probably would.
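
To give an idea of what such a small interpreter looks like, here is a
back-of-the-envelope sketch (CL syntax, covering only quote, if, lambda,
and application; the names are mine, and real Lisp 1.5 had rather more):

(defun tiny-eval (form env)
  (cond ((symbolp form) (cdr (assoc form env)))
        ((atom form) form)                        ; numbers, strings, ...
        ((eq (car form) 'quote) (cadr form))
        ((eq (car form) 'if)
         (if (tiny-eval (cadr form) env)
             (tiny-eval (caddr form) env)
             (tiny-eval (cadddr form) env)))
        ((eq (car form) 'lambda) (list 'closure form env))
        (t (tiny-apply (tiny-eval (car form) env)
                       (mapcar #'(lambda (a) (tiny-eval a env))
                               (cdr form))))))

(defun tiny-apply (fn args)
  (if (and (consp fn) (eq (car fn) 'closure))
      (let ((lambda-exp (cadr fn)) (env (caddr fn)))
        (tiny-eval (caddr lambda-exp)
                   (pairlis (cadr lambda-exp) args env)))
      (apply fn args)))                           ; primitives

The full CL evaluator has to handle some two dozen special forms, the
lambda-list keywords, multiple values, and so on, which is where the two
pages go.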

But none of this amounts to showing that CL has completely deviated
from the original intent of Lisp (except, perhaps, in size -- if that
was part of the original intent).

Moreover, the original poster seemed to think that, once one had
compared McCarthy's description to CL, it would be obvious where
CL completely deviates from the original intent.  Well, I've read
various McCarthy descriptions and I know CL and it's not obvious
to me.

On the other hand, I have heard a number of claims about why CL
has strayed from the One True Path of Lisp.  Some people think
that it was a big mistake to have both dynamic and lexical scoping
(forgetting, perhaps, that many Lisps have had both, although
the lexical scoping took a more limited form).  One person said
that CL was far from the center of Lisp because it had taken on
too much from procedural languages.
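
For readers who haven't met the distinction, here is a minimal sketch of
what having "both" means (the names are made up):

(defvar *depth* 0)              ; special, hence dynamically scoped

(defun report () *depth*)       ; sees whatever binding is current

(defun demo ()
  (let ((depth 1))                  ; lexical: visible only in this body
    (let ((*depth* (+ *depth* 1)))  ; dynamic: visible to callees too
      (list depth (report)))))      ; => (1 1)

Lexical DEPTH is invisible to REPORT; dynamic *DEPTH* is not.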

So there's quite a range of things someone making such a claim
might have in mind.

-- JD
From: Drew Adams
Subject: CL has something for everyone to knock (WAS: Is this the end of the lisp wave?)
Date: 
Message-ID: <1991Feb18.111348.3498@aar.alcatel-alsthom.fr>
In article <····@skye.ed.ac.uk> ····@aiai.UUCP (Jeff Dalton) writes:
>On the other hand, I have heard a number of claims about why CL
>has strayed from the One True Path of Lisp.  Some people think
>that it was a big mistake to have both dynamic and lexical scoping
>(forgetting, perhaps, that many Lisps have had both, although
>the lexical scoping took a more limited form).  One person said
>that CL was far from the center of Lisp because it had taken on
>too much from procedural languages.
>
>So there's quite a range of things someone making such a claim
>might have in mind.

Agreed.  One person objects to iterative, procedural constructs,
preferring stream processing and functional mappings; another is
oppositely inclined.

"Les gouts et  les couleurs,  ca ne  se discute  pas!"
(One person's meat is another's poison.)

Perhaps partly because CL has something for everyone, it has
something for everyone to knock.

Is such all-inclusiveness itself something negative?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This is a reasonable subject for debate in general, and also in the
narrower context of CL and its design goals.

In the latter case, I don't think CL's designers did so badly, *given
their goals*.  Among others, a principal goal was to come up with
something `largely compatible' with the major existing dialects of
the time.  This, in itself, would tend to lead more to a stew than a
clear broth.
-- 
Drew ADAMS:     ·····@aar.alcatel-alsthom.fr       Tel. +33 (1) 64.49.11.54 
            ALCATEL ALSTHOM Recherche, Route de Nozay, 91460 MARCOUSSIS, FRANCE
From: Michael H Bender
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <MIKEB.91Jan17125624@wdl31.wdl.loral.com>
David J. Braunegg writes:

   >>Lack of demand due to Common LISP's enormous size, complexity, resource
   >>requirements, training, etc.
   >>
   >>Common LISP effectively died from obesity.

Having just taken a Lisp class I will voice an outsider's opinion:

If Lisp dies it will not be from size but from REDUNDANCY and LACK OF
CONSISTENCY.  Let's face it, Common Lisp was designed by a committee and
looks like it!  (Remember the camel -- the horse that was designed by
a committee?!)  Although the format and syntax of functions are consistent
and straightforward, the format and syntax of macros and special forms are
completely inconsistent.  And there are too many different
functions/macros/special forms/etc. for doing the exact same thing.
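
A small example of what I mean: four ways (there are more) to do the
same trivial thing, each with a shape of its own (the function names
are mine):

(defun show1 (lst) (dolist (x lst) (print x)))
(defun show2 (lst) (mapc #'print lst))
(defun show3 (lst) (loop for x in lst do (print x)))
(defun show4 (lst)
  (do ((l lst (cdr l)))
      ((null l))
    (print (car l))))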

It would have come out a much cleaner language if it had been designed
by a single person, I expect.

Mike Bender
From: Scott "TCB" Turner
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <97216@aerospace.AERO.ORG>
It's been my perception that LISP gets used to build vaguely-defined,
evolving systems.  Much AI fits in that category.  LISP and the
programming environment that comes with it is amenable to a tinkering,
incremental approach to problem solving.

On the other hand, when the problem and its solution are well-defined,
a language like C is a more likely choice.  The code is written, the 
executable delivered, and then set aside until a round of bug fixes.

Obviously I'm over-simplifying, but I think the difference is valid.

						-- Scott Turner
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <3953@skye.ed.ac.uk>
In article <·····@aerospace.AERO.ORG> ···@aerospace.aero.org (Scott "TCB" Turner) writes:
>It's been my perception that LISP gets used to build vaguely-defined,
>evolving systems.  Much AI fits in that category.  LISP and the
>programming environment that comes with it is amenable to a tinkering,
>incremental approach to problem solving.

But that's not all Lisp is good for.  I don't think there's anything
in the nature of Lisp that means it must be worse than C at the tasks
for which C is used.

>On the other hand, when the problem and its solution are well-defined,
>a language like C is a more likely choice.  The code is written, the 
>executable delivered, and then set aside until a round of bug fixes.

But better programming environments are needed for C and are being
built.  That is, C programming will get more Lisp-like, so far as the
environment is concerned.  And this will make other differences
between the languages more important.

I think you are right to suggest that it's much more straightforward
to deliver the executable when using C.  But C also has advantages
when delivering the source.  Because more machines come with C
compilers than with Lisps, C is in practice more portable (even
though, as a language, it seems to provide more opportunities for
machine-dependence).  C also tends to be much more efficient at
certain tasks, such as processing text files, and tends to produce
smaller executables.  C technology is often fairly primitive, or at
least simple.  But it works well enough.  For example, Lisp systems
seem to have to go to a lot of effort to get rid of procedures that
will not be used, while in C they tend not to be included in the first
place.  Lisp's ability to load new procedures at run time, etc, is
more sophisticated, but is often more than is needed.
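
By "more sophisticated" I mean things like the following, all standard
CL, done to a running image (FROB and the file name are invented):

;; (load "patch.lisp")                        ; load new code now
;; (compile 'frob)                            ; compile a definition in place
;; (setf (symbol-function 'frob) #'new-frob)  ; redefine on the fly

A C program gets none of that without a great deal of extra machinery,
but then it rarely needs it, and it pays nothing for not having it.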

-- jeff
From: Jamie Zawinski
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <JWZ.91Jan19014625@sunvalleymall.lucid.com>
In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) wrote:
>
> I think you are right to suggest that it's much more straightforward
> to deliver the executable when using C.  But C also has advantages
> when delivering the source.  Because more machines come with C
> compilers than with Lisps, C is in practice more portable (even
> though, as a language, it seems to provide more opportunities for
> machine-dependence).

Not to restart the Language War to End All Language Wars again, but...  I
really disagree with you that C source is more portable than Lisp!  How many
bits is an "int"?  A "short"?  What kind of padding and alignment nonsense is
inserted into structures?  In Lisp these sorts of issues almost never matter,
but in C they almost always do.  If something is written in Common Lisp,
you're pretty much guaranteed it will work in any CL.  If something is 
written in K&R C, all bets are off.

		-- Jamie
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <3976@skye.ed.ac.uk>
In article <·················@sunvalleymall.lucid.com> ···@lucid.com (Jamie Zawinski) writes:
>In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) wrote:

>> ... delivering the source.  Because more machines come with C
>> compilers than with Lisps, C is in practice more portable (even
>> though, as a language, it seems to provide more opportunities for
>> machine-dependence).
>
>Not to restart the Language War to End All Language Wars again, but...  I
>really disagree with you that C source is more portable than Lisp!  How many
>bits is an "int"?  A "short"?  What kind of padding and alignment nonsense is
>inserted into structures?  In Lisp these sorts of issues almost never matter,
>but in C they almost always do. 

I understand that (hence my "more opportunities for machine-dependence").
Perhaps I should have gone into greater detail.

Nonetheless, despite all those problems with C, C programmers can
write code that is portable to a wide variety of compilers/operating
systems/machines.  Moreover, they can expect that these operating
systems and machines will have C compilers.  They cannot expect so
many of them to have Common Lisp.  Even when CL can be obtained,
they can't expect that someone will have actually obtained it.

Indeed, there are machines here that have Common Lisp only because
Kyoto Common Lisp has been ported to them.  KCL can be ported so
easily because it is written in C and compiles Lisp to C.  When
someone wants to make a "portable" Scheme or other Lisp, they do
it in C.  And C works fairly well for this despite all the problems
with ints and the like.

>                               If something is written in Common Lisp,
>you're pretty much guaranteed it will work in any CL.  If something is 
>written in K&R C, all bets are off.

In C, many bets are off, but not all of them.

And, unfortunately, it isn't true that anything written in Common Lisp
will work in any CL.  You have to write "portable" Common Lisp, and it
takes some experience of different Common Lisps and a careful reading of
Steele in order to have a good feel for what portability requires.

-- Jeff
From: Brett G. Person
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <7796@plains.NoDak.edu>
In article <······@linus.mitre.org> ···@babypuss.mitre.org (David J. Braunegg) writes:
>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.
>
This is very right.  How many different functions do the same thing in 
LISP?  LISP is beginning to look like something written by the government.
Which - I suppose - with the ANSI standards, it is :-)
>
>
>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?
>

Simple: get rid of the redundancies, force people to re-write old code in
the new style, and make the new lisp powerful enough - i.e. give me a cheap
compiler that won't bring the machine I'm on to its knees every time I use
it.

-- 
Brett G. Person
North Dakota State University
uunet!plains!person | ······@plains.bitnet | ······@plains.nodak.edu
From: Jeff Dalton
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <4026@skye.ed.ac.uk>
In article <····@plains.NoDak.edu> ······@plains.NoDak.edu (Brett G. Person) writes:

>>>Common LISP effectively died from obesity.

I don't think CL has died, though it may not be as successful as one
might like.

>This is very right.  How many different functions do the same thing in 
>LISP? 

I don't think this is a very big problem.  There are very few
functions that do exactly the same thing (car and first come
to mind).  It's true that there is often more than one way to
do things (e.g. to get the third element of a list one might
use caddr, third, nth, or elt).  On the other hand, a fair 
amount of potential duplication has been avoided by having
sequence functions (e.g., length) that can be applied to both
vectors and lists.
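
Spelled out, for anyone keeping score (results as any CL should give
them):

;; (car '(a b c))    => A        (first '(a b c))  => A
;; (caddr '(a b c))  => C        (third '(a b c))  => C
;; (nth 2 '(a b c))  => C        (elt '(a b c) 2)  => C
;;
;; ... while one sequence function covers both kinds of sequence:
;; (length '(a b c)) => 3        (length #(a b c)) => 3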
From: David A Keldsen
Subject: Re: Is this the end of the lisp wave?
Date: 
Message-ID: <1991Feb10.204123.21394@sq.sq.com>
···@babypuss.mitre.org (David J. Braunegg) writes:

>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.

>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?

Ah, well, there is such an animal.  It's called Scheme, and it has just
been standardized by the IEEE.  It's a small, lexically-scoped member
of the LISP family.  (Small in that the standard is only about 50
pages; compare this to _Common Lisp, The Language_ (Second Edition) at
just over 1000 pages.  (Yep, apples vs. oranges, but you get the point.))

Scheme even has a newsgroup, which is available as a mailing list digest
as well.  Have a look at comp.lang.scheme.

I'm not a spokesperson for SoftQuad or the IEEE, but I sure do like
their tastes in programming languages :-)

Dak
-- 
David A. 'Dak' Keldsen of SoftQuad, Inc. email: ···@sq.com  phone: 416-963-8337
"Sort of?  _Sort of_ the end of the world?  You mean we won't be certain?
We'll all look around and say 'Pardon me, did you hear something?'?"
	-- _Sourcery_ by Terry Pratchett