From: Scott "TCB" Turner
Subject: Memory Management in Lisp?
Date: 
Message-ID: <1991Feb15.191259.20090@aero.org>
In building large systems in Lisp, I am consistently torn between (1)
making free use of Lisp's memory allocation and automatic garbage
collection to make my life as a developer easy and (2) handling 
garbage explicitly in hopes of creating a system that doesn't spend
all its effort thrashing garbage.

In T, I was aided somewhat in this conflict by weak references ("pools"),
which I could use to recycle objects freely, knowing that they could
still be GCed if they had no other reference.  Common Lisp lacks weak
references, so this is no longer possible.

Will weak reference be added to Common Lisp?  Or even better, will some
kind of user memory management be added?

						-- Scott Turner

From: Barry Margolin
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <1991Feb15.223520.17267@Think.COM>
In article <······················@aero.org> ···@aero.org (Scott "TCB" Turner) writes:
>Will weak reference be added to Common Lisp?  

Not in the current round.  Only a few Common Lisp implementations currently
have such mechanisms, and there isn't yet a consensus on the right
interface to them.  Maybe a future, follow-on standard (Common Lisp 2000?)
will have them, because many of us acknowledge that they are useful.

>					       Or even better, will some
>kind of user memory management be added?

There have been no proposals made to X3J13 for any such mechanism, and it's
too late now for new features.  I'm not even sure how good a portable
memory management mechanism could be in a Lisp environment.  Could you
describe the kind of interface you're hoping for?  Something reasonably
simple like resources (pools of similar objects, useful when you're
repeatedly allocating and deallocating objects, as you reuse objects rather
than letting them become garbage), or something more complicated like areas
(sections of memory with different garbage collection parameters)?
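
For concreteness, the simple end of that spectrum comes down to a free
list behind an allocate/free pair.  A sketch (the names are mine, for
illustration only, and it's written as portable Scheme rather than as
any vendor's actual resource facility):

(define (make-pool constructor)
  (let ((free-list '()))
    ;; Reuse a pooled object if one is available, else make a fresh one.
    (define (allocate)
      (if (null? free-list)
          (constructor)
          (let ((obj (car free-list)))
            (set! free-list (cdr free-list))
            obj)))
    ;; Return an object to the pool instead of letting it become garbage.
    (define (free obj)
      (set! free-list (cons obj free-list)))
    (list allocate free)))

;; Example: a pool of 100-element scratch vectors.
(define scratch-pool   (make-pool (lambda () (make-vector 100))))
(define grab-scratch   (car scratch-pool))
(define return-scratch (cadr scratch-pool))

A real resource facility adds initialization, parameterized sizes, and
so on, but reuse-instead-of-cons is the core of it.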

Note that Common Lisp currently specifies virtually nothing about memory
management.  CLtL doesn't use the term "garbage collection" at all, and
CLtL2 only mentions it in passing (in the description of the DYNAMIC-EXTENT
declaration).  An implementation without a garbage collector would conform
to the spec; in fact, for several years most MIT Lisp Machine users
disabled the garbage collector because the original implementation was very
buggy -- they simply rebooted when they ran out of memory.

--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Scott "TCB" Turner
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <1991Feb18.191549.7575@aero.org>
(Barry Margolin) writes:
>I'm not even sure how good a portable memory management mechanism
>could be in a Lisp environment.  Could you describe the kind of
>interface you're hoping for?

I haven't really given it much thought.  But it has always seemed
strange to me that Lisp in general gives little control over memory
management to the user, when it is so obviously critical to system
performance.

I've been thinking about the subject since my original post, and one
facility I think might be interesting is a garbage collection advisor.
Currently gc collects an object if it has no references.  The idea
here is to let gc collect an object if the (user-supplied) advisor
predicate returns true.
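
Roughly, and purely as an illustration of the interface (none of these
names exist anywhere, and a real version would need cooperation from
the collector, since a registry like this one itself keeps its objects
reachable):

(define *advised* '())              ; list of (object . advisor) pairs

(define (advise-collection! obj collectible?)
  (set! *advised* (cons (cons obj collectible?) *advised*)))

;; The collector would do something like this at collection time,
;; dropping its reference to any object whose advisor returns true.
(define (run-advisors!)
  (let loop ((entries *advised*) (kept '()))
    (cond ((null? entries)
           (set! *advised* (reverse kept)))
          (((cdar entries) (caar entries))  ; advisor says "collectible"
           (loop (cdr entries) kept))
          (else
           (loop (cdr entries) (cons (car entries) kept))))))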

						-- Scott Turner
From: Barry Margolin
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <1991Feb19.030719.1137@Think.COM>
In article <·····················@aero.org> ···@aero.org (Scott "TCB" Turner) writes:
>I haven't really given it much thought.  But it has always seemed
>strange to me that Lisp in general gives little control over memory
>management to the user, when it is so obviously critical to system
>performance.

Traditionally, one of the most important differences between Lisp and most
other languages has been the fact that memory management is automatic.
Even EVAL isn't as fundamental to Lisp as I once thought, since Scheme
doesn't have it (although many *implementations* of Scheme add it as an
extension), but it does have automatic memory management.

>I've been thinking about the subject since my original post, and one
>facility I think might be interesting is a garbage collection advisor.
>Currently gc collects an object if it has no references.  The idea
>here is to let gc collect an object if the (user-supplied) advisor
>predicate returns true.

An interesting idea, but potentially very dangerous.  An object may have
references that the application isn't aware of.  For instance, the user
could have assigned an object to a global variable while in the debugger.
Or in a Dynamic Windows or CLIM environment, just printing out the object
creates a new reference to the object from the window's output history
structure.

A similar caveat could be made about objects allocated on the stack (as
ANSI CL will permit, using the DYNAMIC-EXTENT declaration).  In that
case, however, the routine that saves away the value could check whether
it is stack-allocated, and move it to the heap first (that's what Dynamic
Windows does).  In your example, by contrast, the GC advisor runs *after*
the reference has been made.
--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Piercarlo Grandi
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <PCG.91Feb23162955@odin.cs.aber.ac.uk>
On 19 Feb 91 03:07:19 GMT, ······@think.com (Barry Margolin) said:

barmar> In article <·····················@aero.org> ···@aero.org (Scott
barmar> "TCB" Turner) writes:

srt> I haven't really given it much thought.  But it has always seemed
srt> strange to me that Lisp in general gives little control over memory
srt> management to the user,

It is a fact of life that in many important applications or subsystems
thereof the lifetimes of values can be easily calculated, as they are
implicit in the nature of the data structures being used.

In these cases it is extremely easy to cut garbage generation to zero by
using a 'make' operation, a 'disp' operation, and a 'reuse' list: 'make'
only invokes 'cons' if 'reuse' is empty, else it pops an element from
'reuse', and 'disp' pushes an element onto 'reuse'. All this is fairly
easy to arrange for certain container structures and using OO flavoured
programming.
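
For plain cons cells, for example, the whole technique is a few lines,
sketched here in Scheme (a Common Lisp version would use the 'rplac'
operations mentioned below):

(define reuse '())                   ; the free list of recycled cells

(define (make a d)
  (if (null? reuse)
      (cons a d)                     ; 'reuse' empty: really allocate
      (let ((cell reuse))            ; else pop and overwrite a dead cell
        (set! reuse (cdr reuse))
        (set-car! cell a)
        (set-cdr! cell d)
        cell)))

(define (disp cell)
  (set-cdr! cell reuse)              ; the dead cell itself links the list
  (set! reuse cell))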

The win is that many applications reach steady states in which, thanks
to the above technique, garbage generation rates can become zero or
nearly so.

Clearly one needs, especially with copying collection, the references in
the 'reuse' list to be weak, that is, when a collection finally is
triggered it should also collect whatever reusable objects have been
cached. But this is already fairly common, and for reasons not too
dissimilar from managed reclamation.

So, I don't think that it is true that Lisp gives little control over
storage management to the user; it just provides a storage allocator,
and a worst-case storage reclamation technology. Most users simply
cannot see anything else, and are happy to go with the default.

To rely on garbage collection, which is a worst-case mechanism for
reclaiming storage when nothing is known a priori about value lifetimes,
in cases where lifetimes are known is clearly wasteful. An analogy is
using dynamic type dispatching when the type of a value is well known by
design. Maybe we need sophisticated, checkable lifetime declarations
(better than DYNAMIC-EXTENT), not just sophisticated, checkable type
declarations.

srt> when it is so obviously critical to system performance.

This is a very untraditional concern. It used to be fairly true, as one
quote puts it, that "Lisp programmers know the value of everything but
the cost of nothing" :-).

barmar> Traditionally, one of the most important differences between
barmar> Lisp and most other languages has been the fact that memory
barmar> management is automatic.

Unfortunately this has traditionally encouraged sloppiness.  For example,
many functions where, even without this simple technique, garbage
generation should be zero are sloppily coded. Part of it may be a desire
to avoid using 'set' or 'rplac' variants; part of it is just
carelessness.

This is simply because Lisp programmers have always been offered,
because they were doing "research", the largest machines around, and
resource consumption was never a problem until they started having
to do delivery. But unfortunately, just as the nerds in Unixland don't
know anything about locality of reference, the hackers in Lispland don't
even think about garbage generation rates. I am sure that you, an old
Multics fan, can weep with me for hours about both attitudes.

You will probably also find it ironic that the large benefits, when
applicable, of automatic reclamation (which is just a part of storage
management, let me nitpick) are almost as unknown outside the Lisp (and
similar languages) world as those of managed reclamation are inside it.

barmar> Even EVAL isn't as fundamental to Lisp as I once thought, since
barmar> Scheme doesn't have it [ ... ]

The entire Scheme report is really about defining the semantics of
'eval'. It does not have a way to invoke 'eval' recursively, but
that is probably best left as an implementation issue. Also, Scheme is
not a Lisp -- it is an Algorithmic Language :-).
--
Piercarlo Grandi                   | ARPA: ·················@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcsun!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: ···@cs.aber.ac.uk
From: Jeff Dalton
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <4234@skye.ed.ac.uk>
In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:

>This is a very untraditional concern. It used to be fairly true, as one
>quote puts it, that "Lisp programmers know the value of everything but
>the cost of nothing" :-).

That must be why they never bothered to write compilers :-).

>barmar> Traditionally, one of the most important differences between
>barmar> Lisp and most other languages has been the fact that memory
>barmar> management is automatic.
>
>Unfortunately this has traditionally encouraged sloppiness.  For example,
>many functions where, even without this simple technique, garbage
>generation should be zero are sloppily coded. Part of it may be a desire
>to avoid using 'set' or 'rplac' variants; part of it is just
>carelessness.

Well, of course many functions are sloppily coded.  This is a problem
in every programming language.  But I don't count a lack of attention
to object lifetimes as part of the sloppiness except in certain cases.
More often automatic storage management makes it easier to write code
that is clearer and in a sense more abstract.

>This is simply because Lisp programmers have always been offered,
>because they were doing "research", the largest machines around and
>resource consumption has never been a problem, until they started having
>to do delivery. 

I think this is true to a large extent, although resource consumption
_has_ often been recognized as a problem.

>                 But unfortunately just like the nerds in Unixland don't
>know anything about locality of reference, the hackers in Lispland don't
>even think about garbage generation rates. I am sure that you, an old
>Multics fan, can weep with me for hours about both attitudes.

A number of Lisp hackers _did_ think about garbage generation rates
when it was appropriate to do so.

>barmar> Even EVAL isn't as fundamental to Lisp as I once thought, since
>barmar> Scheme doesn't have it [ ... ]
>
>The entire Scheme report is really about defining the semantics of
>'eval'. 

I hope you're not supposing that Barmar didn't know this.

>It does not have a way to invoke 'eval' recursively, but
>that is probably best left as an implementation issue.

There are programs that can use it and they cannot be written in
portable Scheme.  There are good reasons for leaving EVAL out of
Scheme, but they are not so good that all Lisps should do the same.
I don't see how any of this makes it an implementation issue.

>Also, Scheme is not a Lisp -- it is an Algorithmic Language :-).

Single-inheritance thinking rides again.  [Common Lisp is a Lisp but
not supposedly an "Algorithmic language".  Scheme is an Algorithmic
language.  Since Algorithmic language is not a subcategory of Lisp,
and Lisp is not a subcategory of Algorithmic language (because of
such Lisps as Common Lisp), it must be that Scheme is not a Lisp.]

-- jd
From: Piercarlo Grandi
Subject: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <PCG.91Feb26194909@odin.cs.aber.ac.uk>
On 25 Feb 91 15:12:05 GMT, ····@aiai.ed.ac.uk (Jeff Dalton) said:

jeff> In article <·················@odin.cs.aber.ac.uk>
jeff> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:

barmar> Traditionally, one of the most important differences between
barmar> Lisp and most other languages has been the fact that memory
barmar> management is automatic.

pcg> Unfortunately this has traditionally encouraged sloppiness.

jeff> More often automatic storage management makes it easier to write
jeff> code that is clearer and in a sense more abstract.

Mostly agreed, but clearer and more abstract code should be modularized
so that explicit storage reclamation can be slipped in easily. When this
cannot be done, one has a tradeoff between making it easier to write
code that is slightly clearer and more abstract and using a machine
which costs ten times less, and the choice is often loaded :-). We all
already know this, but I'd like to emphasize it.

jeff> A number of Lisp hackers _did_ think about garbage generation
jeff> rates when it was appropriate to do so.

Ah yes, they did, the good ones and those with vast experience and
serious problems, but not everybody is a Dalton or Margolin, and garbage
generation rates are not even mentioned, just like locality of reference
in C/Unix books, in any Lisp/Scheme book I have seen -- please don't
post counterexamples! Every gross generalization like this has them, but
I hope you get the message, which is that garbage generation rates are
not thought to be an issue worth mentioning, while, for example, time
complexity, but not space complexity, is often prominently addressed
(entire books devoted to it!).

Even research in these issues has become unfashionable, like research in
locality of reference, and I can think of precious few contributions in
either field for the past ten years or so.

barmar> Even EVAL isn't as fundamental to Lisp as I once thought, since
barmar> Scheme doesn't have it [ ... ]

pcg> The entire Scheme report is really about defining the semantics of
pcg> 'eval'. 

jeff> I hope you're not supposing that Barmar didn't know this.

Your hope is well founded. I do occasionally restate the obvious just to
make it clearer what the shape of my reasoning is, if any, and not null
and void where prohibited by relevant statutes :-).

pcg> It does not have a way to invoke 'eval' recursively, but
pcg> that is probably best left as an implementation issue.

jeff> There are programs that can use it and they cannot be written in
jeff> portable Scheme. [ ... ] I don't see how any of this makes it an
jeff> implementation issue.

Well, my reasoning goes as follows: the Scheme RR defines the semantics
of 'eval', but does not define how to *invoke* it, because invoking it
*is* an implementation dependent issue. It may be something like 'scheme
<file>', for example, in most implementations. Or the implementation may
be a Scheme machine and all you have to type is '<expr>', 'eval' being
implicit.

pcg> Also, Scheme is not a Lisp -- it is an Algorithmic Language :-).

jeff> Single-inheritance thinking rides again.  [Common Lisp is a Lisp but
jeff> not supposedly an "Algorithmic language".  Scheme is an Algorithmic
jeff> language.  Since Algorithmic language is not a subcategory of Lisp,
jeff> and Lisp is not a subcategory of Algorithmic language (because of
jeff> such Lisps as Common Lisp), it must be that Scheme is not a Lisp.]

Now, *you* are underestimating me a bit. The serious point I was making
is that a Lisp is historically a symbolic List Processing language,
while, except for the survival of 'cons', 'car', and 'cdr', Scheme is
more of something in the Algol 60 tradition, with a superficially
Lisp-like syntax.

Unfortunately I was not using, mea culpa, Algorithmic Language in the
proper sense of alternative to Programming Language, but in the narrower
and less proper sense of Algol-like language. Scheme is an Algorithmic
Language that is Algol-like, while there are non Algol-like algorithmic
languages (e.g. APL or ML); Common Lisp is a Programming Language that
is Lisp-like (in a cancerous way :->), while there are Lisp-like
languages that are not programming languages (the old UW or Vincennes
Lisps; Forth and TRAC (TM) maybe qualify as Lisp-like too).

The lack of an explicit 'eval' gives the show away, I am afraid (just
like the consequence of having 'quasiquote' & Company as primitives
does).

Scheme cannot be used, portably, to "reason" about programs, inasmuch as
it cannot *directly* build and execute *program fragments*, like every
Lisp-like language is supposed to do. Some say this is *the*
distinguishing characteristic of Lisp-like languages, however rarely it
is used in practice.

Scheme can "reason" more powerfully than most other languages (including
Common Lisp), thanks to lexical closures and to continuations, and to
explicit environments supported by many implementations, about *program
states*, in both the context and the control graph, but this really is
in the Algol-like language tradition, where 'own' is the root of all
such powerful technology. OO technology, which is related to it, is
after all a consequence of 'own', of Simula I and Simula 67, all
Algol-like languages.

The Lisp-like language tradition had to fumble with the upward and
downward funarg issue almost forever (until Steele and Baker, more or
less), even if as of now funargs are more mainstream in the Lisp-like
language camp than in the Algol-like language camp.

After all this, IMNHO it is not so far out to say that Scheme is an
alternative syntax (modulo the latent type issue) for an Algol 68
subset, if Algol 68 had upward funargs (and some Algol 68
_implementations_ had them, as an extension to the standard!).

I think that the provision of *excessive* and *regrettable* (complex and
rational!) numeric facilities in Scheme is also designed to give the
impression that it is meant to be more of an Algol-like language than
a Lisp-like language.

It is true however that most Scheme *implementations* have actually been
like Lisp implementations, that is workspace based, and with 'eval'; but
I am sure that a 'schemec' compiler that generated object code like the
'cc' compiler could be done, and actually some imperfect realizations do
exist (scheme2c from DEC actually is mostly used to produce objects to
be loaded in a Scheme environment; PreScheme from MIT is better geared,
I seem to understand, to generating standalone executables to be linked
with a library).

Actually, let me say, I think that one of the fatal mistakes in Scheme,
like it was for Pascal, is that there is no standard way to address
separate compilation! Seriously. Think carefully about the
implications...
--
Piercarlo Grandi                   | ARPA: ·················@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcsun!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: ···@cs.aber.ac.uk
From: Guillermo J. Rozas
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <JINX.91Feb26213028@chamarti.ai.mit.edu>
    Now, *you* are underestimating me a bit. The serious point I was making
    is that a Lisp is historically a symbolic List Processing language,
    while, except for the survival of 'cons', 'car', and 'cdr', Scheme is
    more of something in the Algol 60 tradition, with a superficially
    Lisp-like syntax.

Well, what makes a Lisp?  Let me suggest a few requirements.
1. List processing.
2. Programs that look like data.
3. Objects with unbounded extent and automatic storage management.
4. EVAL (making use of 2).
5. Powerful syntactic extension facilities that make use of 2.
6. Call by value.
7. Imperative constructs.
8. Functional style encouraged, or at least not discouraged.
9. Latent types.
10. Generic arithmetic, with integers not restricted to the word size
of the machine.
11. Interactive development environments.

You are right to claim that Scheme is not a Lisp because of its lack
of 4 and 5, but

- Every implementation of Scheme that I know of has both.

- It is very much the intent of the RnRS group to agree on a portable
way to do 5, and we have not yet agreed on 4 not because we don't want
EVAL in the language, but primarily because we cannot completely agree
on its arity.

- There are dialects of Lisp out there that don't satisfy some of
those requirements, yet no one thinks they are not Lisp.


Of course, Scheme is also Algol-like because it is a lexically scoped,
procedural language with imperative constructs, but so is CL.  As you
probably know, the original claim that Scheme was Algol-like, as
opposed to other Lisps, was because other Lisps were dynamically
scoped at the time.

    Scheme cannot be used, portably, to "reason" about programs, inasmuch as
    it cannot *directly* build and execute *program fragments*, like every
    Lisp-like language is supposed to do. Some say this is *the*
    distinguishing characteristic of Lisp-like languages, however rarely it
    is used in practice.

Well, not quite right about Scheme.  R3RS, the last report published,
includes LOAD, and an adequate EVAL can be written in terms of it:

(define (eval S-expression)
  ;; WRITE emits a re-readable external representation; LOAD then
  ;; reads it back and evaluates it.
  (with-output-to-file "<some temp file>"
    (lambda ()
      (write S-expression)))
  (load "<some temp file>"))

    I think that the provision of *excessive* and *regrettable* (complex and
    rational!) numeric facilities in Scheme is also designed to give the
    impression that it is meant to be more of an Algol-like language than
    a Lisp-like language.

Hmm.  That's interesting.  My reading of the various reports and the
draft IEEE standard, and my understanding of their intent, must be
different from yours.  Scheme does not require ratnums or recnums; it
merely requires that built-in operators handle them transparently if
they are provided by an implementation.

You are confusing rational numbers with ratnums, an implementation
technique for representing arbitrary rationals.  You are also
confusing complex numbers with recnums, a representation technique for
numbers with non-zero imaginary parts.

The built-in operators should handle rationals (whether implemented as
ratnums or floats) and complex numbers (whether they have a non-zero
imaginary part or not) correctly, but implementations are free not to
supply any way to construct non-float rationals (nor even floats for
that matter) nor any way to construct non-real complex numbers.

In other words, the requirement is one of integration if the features
are present, not a requirement on the presence!
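
A two-line example of what the integration requirement means in
practice (the procedure is illustrative, not from any report):

(define (average lst)
  (/ (apply + lst) (length lst)))

;; With ratnums:    (average '(1 2)) => 3/2, an exact rational.
;; Without ratnums: (average '(1 2)) => 1.5, a float.
;; The caller's code is the same either way; only the representation
;; of the result differs.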

Furthermore, I can't see why you would say that these features put
Scheme in the Algol camp instead of the Lisp camp.  Algol-60 (the
Algol meant by Steele and Sussman) did not have generic arithmetic,
but Lisps typically do.

    Actually, let me say, I think that one of the fatal mistakes in Scheme,
    like it was for Pascal, is that there is no standard way to address
    separate compilation! Seriously. Think carefully about the
    implications...

More on this along the way, we hope.

Scheme is not a finished language.  It is finished enough for some
purposes, but it is seriously lacking in others.  The lack of other
facilities is not because they are not considered important by the
authors of the report, but because they have not yet agreed on what
they should look like.

I think you are reading too much into the Scheme reports.
From: Aubrey Jaffer
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <13573@life.ai.mit.edu>
>> 4. EVAL (making use of 2).
>> 5. Powerful syntactic extension facilities that make use of 2.
   ...
>> You are right to claim that Scheme is not a Lisp because of its lack
>> of 4 and 5, but
>> - Every implementation of Scheme that I know of has both.

SCM has neither facility.  Eval makes compilation impossible without
including an interpreter or compiler in the run-time support.

The lack of macros makes pure scheme code easily readable.  I would
not object to macros if they were required to be syntactically
differentiated from variable names in order to preserve readability.
I find when reading common-lisp code that I often don't know if a user
defined symbol is a macro or a function.

To make a more radical suggestion I think that scheme might do very
well to NOT include macros.  Introducing new syntactic constructs
(macros) in order to avoid typing a few lambdas is bad programming
style in that the code becomes unreadable.  If a syntactic construct
provides a programming paradigm not already supported by scheme then
it should be added to the language.
From: Jeff Dalton
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <8761@castle.ed.ac.uk>
In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:
>jeff> More often automatic storage management makes it easier to write
>jeff> code that is clearer and in a sense more abstract.
>
>Mostly agreed, but clearer and more abstract code should be modularized
>so that explicit storage reclamation can be slipped in easily. When this
>cannot be done, one has a tradeoff between making it easier to write
>code that is slightly clearer and more abstract and using a machine
>which costs ten times less, and the choice is often loaded :-). We all
>already know this, but I'd like to emphasize it.

I would disagree that the code is only "slightly clearer" and that
the machine would cost 10 times less.  Let's suppose Lisp is a factor
of 2 slower due to garbage collection, which in some cases is an
overestimate.  I bought a 386 machine a few months ago, and now for
more or less the same price I could get a 486 that's twice as fast.
So it looks like I could get my clearer and more abstract code
without paying anything more.  This estimate is no doubt off,
but I don't think it's off by a factor of 10.

Of course, if someone came up with a language in which I could slip in
explicit storage management (1) without having to write more awkward
code in order to allow the slipping-in and (2) without having slower
automatic storage management, I would be happy to use it. 

>jeff> A number of Lisp hackers _did_ think about garbage generation
>jeff> rates when it was appropriate to do so.
>
>Ah yes, they did, the good ones and those with vast experience and
>serious problems, but not everybody is a Dalton or Margolin, and garbage
>generation rates are not even mentioned, just like locality of reference
>in C/Unix books, in any Lisp/Scheme book I have seen -- please don't
>post counterexamples! Every gross generalization like this has them, but
>I hope you get the message, which is that garbage generation rates are
>not thought to be an issue worth mentioning, while, for example, time
>complexity, but not space complexity, is often prominently addressed
>(entire books devoted to it!).

Although books clearly have some impact on programming practice,
I don't think they necessarily give a true picture of what that 
practice is.

Since you won't let me use counterexamples, let me reply with some
gross generalizations.  Almost all Lisp texts are introductions and
consequently treat efficiency as a relatively minor concern.  It might
be argued that they're wrong to do this, but I think it's understandable.
Experienced programmers think differently, and many User Manuals for
implementations discuss efficiency at length in recognition of this. 

>Even research in these issues has become unfashionable, like research in
>locality of reference, and I can think of precious few contributions in
>either field for the past ten years or so.

I will leave this to people who have a stock of references nearer to
hand.  There are certainly things I've heard about only in the last 10
years, but of course that's not the same thing.

>barmar> Even EVAL isn't as fundamental to Lisp as I once thought, since
>barmar> Scheme doesn't have it [ ... ]
>pcg> The entire Scheme report is really about defining the semantics of
>pcg> 'eval'. 
>jeff> I hope you're not supposing that Barmar didn't know this.
>
>Your hope is well founded. I do occasionally restate the obvious just to
>make it clearer what the shape of my reasoning is, if any, and not null
>and void where prohibited by relevant statutes :-).

Yes, but I think you end up addressing the wrong point.  Barmar was
clearly talking about a sense of "have EVAL" in which Lisp normally
has it but Scheme does not.  All Lisps, including Scheme, define the
semantics of 'eval'.  So I would say Barmar must be talking about
something else, namely that Scheme does not make eval available as
a function.  Since this is a well-known and much-discussed issue,
Barmar didn't need to state it explicitly in order to make his meaning
clear.  Or so I would have thought.

>pcg> It does not have a way to invoke 'eval' recursively, but
>pcg> that is probably best left as an implementation issue.
>
>jeff> There are programs that can use it and they cannot be written in
>jeff> portable Scheme. [ ... ] I don't see how any of this makes it an
>jeff> implementation issue.
>
>Well, my reasoning goes as follows: the Scheme RR defines the semantics
>of 'eval', but does not define how to *invoke* it, because invoking it
>*is* an implementation dependent issue. It may be something like 'scheme
><file>', for example, in most implementations. Or the implementation may
>be a Scheme machine and all you have to type is '<expr>', 'eval' being
>implicit.

Well, all that is true, but it is still not what Barmar was talking
about.

I will leave the "Scheme isn't a Lisp" for another message while I
go look for a better editor on this random machine I'm using (the
usual one being broken).

-- jeff
From: Guillermo J. Rozas
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <JINX.91Feb27114702@chamarti.ai.mit.edu>
In article <·····@life.ai.mit.edu> ······@gerber.ai.mit.edu (Aubrey Jaffer) writes:

   >> 4. EVAL (making use of 2).
   >> 5. Powerful syntactic extension facilities that make use of 2.
      ...
   >> You are right to claim that Scheme is not a Lisp because of its lack
   >> of 4 and 5, but
   >> - Every implementation of Scheme that I know of has both.

   SCM has neither facility.  Eval makes compilation impossible without
   including an interpreter or compiler in the run-time support.

I did not know that SCM did not include EVAL or macros.  Now I do.

Your comment about EVAL is no different from saying that compilation
of code that does IO is impossible without including WRITE (or even
more apropos, FORMAT) in the run-time support.
It is perfectly possible to build a Scheme system that links only
those run-time modules needed, and thus code not using EVAL would
never need it.

   The lack of macros makes pure scheme code easily readable.  I would
   not object to macros if they were required to be syntactically
   differentiated from variable names in order to preserve readability.
   I find when reading common-lisp code that I often don't know if a user
   defined symbol is a macro or a function.

   To make a more radical suggestion I think that scheme might do very
   well to NOT include macros.  Introducing new syntactic constructs
   (macros) in order to avoid typing a few lambdas is bad programming
   style in that the code becomes unreadable.  If a syntactic construct
   provides a programming paradigm not already supported by scheme then
   it should be added to the language.

I agree that macros can be abused and often are.  But I disagree with
your suggestion that all common and useful constructs should be
included in the language.  A language can't be (and shouldn't be) that
comprehensive.  Furthermore, as large as the design group is, it will
not include the complete community, nor will it be able to envision
all paradigms.

In particular, a language designed by consensus, such as the RnRS
dialects of Scheme, will have a hard time adding new special forms if
part of the design group feels strongly against adding them (I and
many others in the MIT Scheme community feel this way).  Yet I would
like to be able to use WHILE, FLUID-LET and perhaps even DO* even
though I might not want them in the language.

It is also the case that code can become much more readable with
judicious use of syntactic sugar.  I find code written with FLUID-LET
easier to read than code that passes many additional almost-constant
parameters around, or that open-codes FLUID-LET where it would be
used.  WHILE is often appropriate for imperative programs, and many
other special forms can be similarly defended.
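
For those who haven't seen it, FLUID-LET temporarily rebinds an
existing variable for the dynamic extent of its body and restores it on
the way out, even on non-local exit.  A single-binding sketch, assuming
the syntax-rules facility of the R4RS appendix and DYNAMIC-WIND
(neither is required by R3RS, but both are widely available):

(define-syntax fluid-let
  (syntax-rules ()
    ((fluid-let ((var val)) body1 body2 ...)
     (let ((new val) (old #f))
       (dynamic-wind
        ;; on entry: save the outer value, install the new one
        (lambda () (set! old var) (set! var new))
        (lambda () body1 body2 ...)
        ;; on exit: remember the body's value (for possible re-entry
        ;; through a continuation), then restore the outer one
        (lambda () (set! new var) (set! var old)))))))

The open-coded equivalent of this save/install/restore dance is exactly
the noise that the special form hides.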

I don't know how to resolve the tension between providing powerful
(and easily misused) tools and discouraging their abuse.  Perhaps the
best solution is to follow the guidelines that I once heard from Alan
Bawden (my paraphrasing): 
- No Lisp programmer should be allowed to write macros unless s/he's
been granted a license.
- Alan Bawden has a license and grants all other licenses.
I think this would have the properties that I (and perhaps you)
desire, but is, unfortunately, infeasible and politically incorrect.
From: Mark Friedman
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <MARKF.91Feb27121721@montreux.ai.mit.edu>
In article <·····@life.ai.mit.edu> ······@gerber.ai.mit.edu (Aubrey Jaffer) writes:

   I find when reading common-lisp code that I often don't know if a user
   defined symbol is a macro or a function.

I find when reading code that I often don't know what a procedure does
or what its arguments are supposed to be.  My point is that there are
other things that one needs to know in order to understand code. The
issue of whether a combination is a macro call or a syntactic
form or a procedure call is only one of them.

   If a syntactic construct provides a programming paradigm not
   already supported by scheme then it should be added to the
   language.

You've obviously never been to a Scheme standards or RNRS meeting :-)
Seriously, macros ARE a way to add to the language. Procedures are
another. Why discriminate against one of them?

-Mark
--

Mark Friedman
MIT Artificial Intelligence Lab
545 Technology Sq.
Cambridge, Ma. 02139

·····@zurich.ai.mit.edu
From: Jeff Dalton
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <8768@castle.ed.ac.uk>
In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:
>On 25 Feb 91 15:12:05 GMT, ····@aiai.ed.ac.uk (Jeff Dalton) said:

>pcg> Also, Scheme is not a Lisp -- it is an Algorithmic Language :-).
>
>jeff> Single-inheritance thinking rides again.  [Common Lisp is a Lisp but
>jeff> not supposedly an "Algorithmic language".  Scheme is an Algorithmic
>jeff> language.  Since Algorithmic language is not a subcategory of Lisp,
>jeff> and Lisp is not a subcategory of Algorithmic language (because of
>jeff> such Lisps as Common Lisp), it must be that Scheme is not a Lisp.]
>
>Now, *you* are underestimating me a bit. The serious point I was making
>is that a Lisp is historically a symbolic List Processing language,
>while, except for the survival of 'cons', 'car', and 'cdr', Scheme is
>more of something in the Algol 60 tradition, with a superficially
>Lisp-like syntax.

1. Scheme aims to be both a Lisp and an Algol-like language.  The
   former ought to be clear on a variety of grounds.  Evidence for
   the latter includes the name of the TeX style file for the Scheme
   reports (ie, "algol60").

   The idea that Scheme is not a Lisp is promulgated mostly by those
   who do not like Lisp but want to, somehow, make an exception of
   Scheme.  Due to single-inheritance thinking, they think this ought
   to be done by placing Scheme outside of Lisp.

   I would also disagree with the suggestion that Scheme is more an
   Algol than a Lisp.

2. The idea that Scheme is further away from list processing than
   other Lisps is somewhat bizarre, because many other Lisps have a
   greater range of data types than Scheme.  Scheme is different
   from traditional Lisp in large part because it was developed
   later.  Lisp is not now, if it ever was, a language in which
   "everything is a list".  

3. The Scheme syntax is not _superficially_ Lisp-like; it is a Lisp
   syntax.

>Unfortunately I was not using, mea culpa, Algorithmic Language in the
>proper sense of alternative to Programming Language, but in the narrower
>and less proper sense of Algol-like language. Scheme is an Algorithmic
>Language that is Algol-like, while there are non Algol-like algorithmic
>languages (e.g. APL or ML); Common Lisp is a Programming Language that
>is Lisp-like (in a cancerous way :->), while there are Lisp-like
>languages that are not programming languages (the old UW or Vincennes
>Lisps; Forth and TRAC (TM) maybe qualify as Lisp-like too).

"Proper sense"?  Give me a break.

I will agree that you have now departed from single-inheritance, but
only to the extent of having two hierarchies instead of one.  I do not
find your elaborated classification scheme any more convincing.  ML is
closer to Algol in many ways than Scheme is.  Forth and TRAC may be
in some class that also includes Lisp, but there's surely a better
name for it than "Lisp-like".

Scheme, on the other hand, is a variety of Lisp.  (It's often said to
be a "dialect" of Lisp, but that may imply too strongly that Lisp is a
single language.)  Indeed, you have the bizarre notion that "Lisp-like"
includes some Lisps (eg, Common) but not others (Scheme).  (Or perhaps
you just have the bizarre notion that "Lisp-like" is a good name for
this category.)

The idea that Common Lisp is a cancerous departure from "old UW Lisp"
is, moreover, just wrong.  The evolution from Lisp 1.5 to the main
Lisps of the 70s, MacLisp and InterLisp, was a fairly natural one.
Common Lisp is the unification of several successors to MacLisp.  It
is, clearly, larger than MacLisp, but it is not larger than InterLisp.
Moreover, many things that may seem peculiar to Common Lisp (format,
defstruct, dotimes, etc) were already used in MacLisp.

Some critics of Common Lisp don't want to accept this.  They want
Common Lisp to be a uniquely bad language.  (Incidentally, when I talk
about "some critics" I do not want to imply that they necessarily
include pcg.  Maybe they do, but I don't know.)

>The lack of an explicit 'eval' gives the show away, I am afraid (just
>like the consequence of having 'quasiquote' & Company as primitives
>does).

What?  I'll address eval below, where you elaborate on this point.
However, Common Lisp could just as well have quasiquote.  It isn't
that significant a difference.  (I hope you aren't counting it as part
of the claim that Scheme cannot be used to "*directly*" build program
fragments.)

>Scheme cannot be used, portably, to "reason" about programs, inasmuch as
>it cannot *directly* build and execute *program fragments*, like every
>Lisp-like language is supposed to do. Some say this is *the*
>distinguishing characteristic of Lisp-like languages, however rarely it
>is used in practice.

Some say it, but they're wrong.  There isn't a single distinguishing
feature, and EVAL is not first on my list.  But this goes back to the
notion that there is a useful category "Lisp-like", characterized (it
now seems) by the ability to build and execute program fragments at
run time.  This way of thinking makes sense only if you escape single-
inheritance thinking and let this ability be one characteristic among
others (and change the name).  

>Scheme can "reason" more powerfully than most other languages (including
>Common Lisp), thanks to lexical closures and to continuations, and to
>explicit environments supported by many implementations, about *program
>states*, 

I'd stay away from "supported by many implementations" if I were you,
since virtually all implementations let you directly execute program
fragments by calling EVAL and construct them via macros if by nothing
else.

>in both the context and the control graph, but this really is
>in the Algol-like language tradition, where 'own' is the root of all
>such powerful technology. OO technology, which is related to it, is
>after all a consequence of 'own', of Simula I and Simula 67, all
>Algol-like languages.

If you start tracing good ideas back to Algol, virtually every "good"
language will be "Algol-like".  The principal root of such things in
Scheme is the lambda calculus and the first-class status of functions.

>The Lisp-like language tradition had to fumble with the upward and
>downward funarg issue almost forever (until Steele and Baker, more or
>less), even if as of now funargs are more mainstream in the Lisp-like
>language camp than in the Algol-like language camp.

Most of the programming language world managed to "fumble" with
functions, if you want to call it that.  Lisp 1.5 had upward
funargs.  Where it went wrong was in not having lexical scoping.
The reason the upward/downward distinction became important
was that the downward ones were easier to implement efficiently.

>After all this, IMNHO it is not so far out to say that Scheme is an
>alternative syntax (modulo the latent type issue) for an Algol 68
>subset, if Algol 68 had upward funargs (and some Algol 68
>_implementations_ had them, as an extension to the standard!).

Well, yes, let's confine the type question to parentheses.  That way
we can treat it as unimportant.

And let's try this alternative syntax + some implementations trick
more generally.  Let's see.  Because of the KCL compiler, Common Lisp
is demonstrably an alternative syntax for C, plus a library of
procedures.  And C is very like a subset of Algol68 (or else the
KCL compiler could just as well emit Algol68), so it must be that
Common Lisp is an Algol-like language.  If anything doesn't quite fit,
we can just say "modulo" in parentheses.

>I think that the provision of *excessive* and *regrettable* (complex and
>rational!) numeric facilities in Scheme is also designed to give the
>impression that it is meant to be more of an Algol-like language than
>a Lisp-like language.

It was MacLisp (more or less) that started the emphasis on having
good numeric facilities in Lisp.  Common Lisp and Scheme have *fortunately*
provided exact rational arithmetic.  (N.B. not "exact" in the technical
Scheme sense.)  Other languages (eg, Haskell) have been influenced by
Common Lisp and Scheme.  

>It is true however that most Scheme *implementations* have actually been
>like Lisp implementations, that is workspace based, and with 'eval'; but
>I am sure that a 'schemec' compiler that generated object code like the
>'cc' compiler could be done, and actually some imperfect realizations do
>exist (scheme2c from DEC actually is mostly used to produce objects to
>be loaded in a Scheme environment; PreScheme from MIT is better geared,
>I seem to understand, to generating standalone executables to be linked
>with a library).

And the same can be done with (surprise!) Common Lisp.  The same sort of
imperfect realizations already exist, eg KCL.

-- jeff
From: Barry Margolin
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <1991Feb28.051911.846@Think.COM>
In article <····@castle.ed.ac.uk> ····@castle.ed.ac.uk (Jeff Dalton) writes:
>In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:
>>On 25 Feb 91 15:12:05 GMT, ····@aiai.ed.ac.uk (Jeff Dalton) said:
>
>The idea that Common Lisp is a cancerous departure from "old UW Lisp"
>is, moreover, just wrong.  The evolution from Lisp 1.5 to the main
>Lisps of the 70s, MacLisp and InterLisp, was a fairly natural one.
>Common Lisp is the unification of several successors to MacLisp.  It
>is, clearly, larger than MacLisp, but it is not larger than InterLisp.
>Moreover, many things that may seem peculiar to Common Lisp (format,
>defstruct, dotimes, etc) were already used in MacLisp.

Quite true.  The most notable difference between Maclisp and Common Lisp in
this regard is the first-class standing given to many of these packages.
In Maclisp, one had to load lots of optional libraries to get these
facilities; consequently, the Maclisp manual was much smaller than the
Common Lisp manual.  The problem was that there was sometimes more than
one version of some of the facilities (Bernie Greenberg did a DEFVAR for
Multics Emacs, but its syntax was (defvar (var1 init1 doc1) ... (varN initN
docN))).  Common Lisp incorporates most of the popular facilities, so that
users would be able to port their programs without having to copy all these
auxiliary libraries.

>>Scheme cannot be used, portably, to "reason" about programs, inasmuch as
>>it cannot *directly* build and execute *program fragments*, like every
>>Lisp-like language is supposed to do. Some say this is *the*
>>distinguishing characteristic of Lisp-like languages, however rarely it
>>is used in practice.
>
>Some say it, but they're wrong.  There isn't a single distinguishing
>feature, and EVAL is not first on my list.

I just wanted to mention that my original comment, about realizing that
EVAL isn't fundamental to Lisp-like languages, was based on my previous
belief in the above statement about "reasoning" about programs.
calculus was designed to allow reasoning about algorithms, and Lisp was
intended to be a realization of the lambda calculus, then it seemed that
Lisp should be able to reason about itself.  Further, many AI researchers
thought Lisp was appropriate for machine learning, because learning could
be implemented by rewriting programs.

However, I don't think the original Lisp had EVAL.  Recall that Lisp was
originally a Fortran library for symbolic manipulation of data structures.
All it had was conses, symbols, and numbers.  Lambda calculus and predicate
calculus could be represented using these structures, but I suspect that
the programs that did this treated them abstractly (i.e. to write theorem
provers).  At some point one of McCarthy's students realized that an
*executable* program could be represented using the same data structures as
lambda calculus, and wrote EVAL, which allowed the interactive Lisp
interpreter to be written.  While EVAL is important to the Lisp family of
languages, it seems that symbolic manipulation is more fundamental.

--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Ken Dickey
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <451@data.UUCP>
[WARNING: I have not seen the previous postings.  Some brain-dead
comments may result].

···@cs.aber.ac.uk (Piercarlo Grandi) writes:
pcg>On 25 Feb 91 15:12:05 GMT, ····@aiai.ed.ac.uk (Jeff Dalton) said:


AUTOMATIC STORAGE MANAGEMENT:	================================

>jeff> More often automatic storage management makes it easier to write
>jeff> code that is clearer and in a sense more abstract.

pcg>Mostly agreed, but clearer and more abstract code should be modularized
pcg>so that explicit storage reclamation can be slipped in easily. When this
pcg>cannot be done, one has a tradeoff between making it easier to write
pcg>code that is slightly clearer and more abstract and using a machine
pcg>which costs ten times less, and the choice is often loaded :-).

This strikes me as very similar to the argument against virtual memory.

>jeff> A number of Lisp hackers _did_ think about garbage generation
>jeff> rates when it was appropriate to do so.

pcg>Ah yes, they did, the good ones and those with vast experience and
pcg>serious problems, but not everybody is a Dalton or Margolin, and garbage
pcg>generation rates are not even mentioned, just like locality of reference
pcg>in C/Unix books, in any Lisp/Scheme book I have seen -- please don't
pcg>post counterexamples! Every gross generalization like this has them, but
pcg>I hope you get the message, which is that garbage generation rates are
pcg>not thought to be an issue worth mentioning, while, for example, time
pcg>complexity, but not space complexity, is often prominently addressed
pcg>(entire books devoted to it!).

I don't ever remember a book on C discussing file sizes, but many Unix
systems have disks ~98% full at all times.  Perhaps some topics
are considered "advanced" for programming language texts?

Are you particularly fond of "dumb" storage managers?  [I have to tell
it everything, even tell it to clean up after itself!].


EVAL: 	================================

>jeff> There are programs that can use it and they cannot be written in
>jeff> portable Scheme. [ ... ] I don't see how any of this makes it an
>jeff> implementation issue.

My recollection is that R^NRS defined LOAD.  Thus EVAL may be written
as a function which writes an expression to a file and LOADs it.

pcg>The lack of an explicit 'eval' gives the show away, I am afraid (just
pcg>like the consequence of having 'quasiquote' & Company as primitives
pcg>does).

pcg>Scheme cannot be used, portably, to "reason" about programs, inasmuch as
pcg>it cannot *directly* build and execute *program fragments*, like every
pcg>Lisp-like language is supposed to do. Some say this is *the*
pcg>distinguishing characteristic of Lisp-like languages, however rarely it
pcg>is used in practice.

I find it interesting that you argue for making storage management
explicit, yet seem to ignore implementation issues such as linking
together applications with minimal runtime systems [which is what EVAL
is about--keeping a full runtime because one might have to EVAL
anything].  I have never seen a Scheme system without some form of
EVAL (although it may be a compiler invocation).

As Scheme has READ, one can certainly reason well about programs
without using EVAL.  Again, however, I know of no Scheme
implementation which lacks the ability to "*directly* build and
execute *program fragments*".


PROGRAMMING ENVIRONMENTS:	================================

pcg>Actually, let me say, I think that one of the fatal mistakes in Scheme,
pcg>like it was for Pascal, is that there is no standard way to address
pcg>separate compilation! Seriously. Think carefully about the
pcg>implications...

Now that there is a Scheme Language standard, perhaps the Scheme
community will do more work on development environment/invocation
standards.  Note that other languages have typically failed in this
area.


================================

Ken Dickey			····@data.uucp
From: David Vinayak Wallace
Subject: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <GUMBY.91Feb27172643@Cygnus.COM>
   Date: 26 Feb 91 19:49:09 GMT
   From: ···@cs.aber.ac.uk (Piercarlo Grandi)

   [discussion of garbage generation profiles]

   Even research in these issues has become unfashionable, like research in
   locality of reference, and I can think of precious few contributions in
   either field for the past ten years or so.

Just check out the last few L&FP proceedings.  The last one was even
in Europe!  And that's just for starters...

Not to mention that several modern lisps provide stack-consing and
areas, allowing manual storage management if desired.
From: Paul Wilson
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <1991Feb28.212626.23340@uicbert.eecs.uic.edu>
·····@Cygnus.COM (David Vinayak Wallace) writes:


>   Date: 26 Feb 91 19:49:09 GMT
>   From: ···@cs.aber.ac.uk (Piercarlo Grandi)

>   [discussion of garbage generation profiles]

>   Even research in these issues has become unfashionable, like research in
>   locality of reference, and I can think of precious few contributions in
>   either field for the past ten years or so.

>Just check out the last few L&FP proceedings.  The last one was even
>in Europe!  And that's just for starters...

References to relevant recent research can be found in my paper in the
March SIGPLAN Notices, "Issues and Strategies in Heap Management and
Hierarchical Memories," which is a position paper from the GC workshop
at OOPSLA/ECOOP '90.

Being a resolutely unfashionable person, I'm doing research on garbage
collection, locality of reference, and their interactions.  My next
paper, to be presented at the next SIGPLAN conference, is on improving
virtual memory performance by using better copying traversal algorithms,
to cluster data.  (Yes, it's been tried before, but I do some different
things, and it works very well...)

Some other people doing relevant research include Ben Zorn at the
U. of Colorado at Boulder, Doug Johnson at TI, and the MUSHROOM
group at the U. of Manchester.

>Not to mention that several modern lisps provide stack-consing and
>areas, allowing manual storage management if desired.

Right.  On the other hand, the holy grail for gc folk is to avoid
having to do that sort of stuff explicitly the vast majority of the
time.  There have been great strides toward this with generational
garbage collectors -- having a bunch of short-lived data isn't nearly
as expensive as it used to be, though it's still more expensive than
stack allocation.  I suspect a bit of lifetime analysis in compilers
could help a lot too.

   -- PRW


-- 
Paul R. Wilson                         
Software Systems Laboratory               lab ph.: (312) 996-9216
U. of Illin. at C. EECS Dept. (M/C 154)   ······@bert.eecs.uic.edu
Box 4348   Chicago,IL 60680 
From: Kiran Wagle
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <kiran.667905260@copper>
······@gerber.ai.mit.edu (Aubrey Jaffer) writes:

>The lack of macros makes pure scheme code easily readable. 

No it doesn't. (Who is doing the reading?) I find a named procedure
_much_ easier to handle conceptually, and often name things just for
this reason. Macros allow one to say things using words whose meanings
are immediately obvious (at least to the writer of the macro) 
and thus are easier to reason about.

 
>To make a more radical suggestion I think that scheme might do very
>well to NOT include macros.  Introducing new syntactic constructs
>(macros) in order to avoid typing a few lambdas is bad programming
>style in that the code becomes unreadable.  

Doesn't this argument apply to all special forms and procedures?
All we need is if & lambda--should we get rid of and & or,
cond, etc.? Or are these tools that allow us to focus on the rest of
the program? I use named procedures and macros to avoid the
lambda--but also to help me conceptualize what's going on here. 
Other syntactic sugar likewise--are you willing to say that all 
code should be written at the lowest level possible? 
Why not code in binary?

Never underestimate the power of a name.

--
		·····································@copper.ucs.indiana.edu

"There may be two people in the world		Kiran Wagle 
who agree with each other on everything,	405 E. 8th St. #7
but *I* am not one of them...."			Bloomington, IN 47408-3788
		--David Friedman		(812) 331-1710
_______________________________________________________________________________
From: Aubrey Jaffer
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <13645@life.ai.mit.edu>
>>Doesn't this argument apply to all special forms and procedures?
>>All we need is if & lambda--should we get rid of and & or,
>>cond, etc.? Or are these tools that allow us to focus on the rest of
>>the program? I use named procedures and macros to avoid the
>>lambda--but also to help me conceptualize what's going on here. 
>>Other syntactic sugar likewise--are you willing to say that all 
>>code should be written at the lowest level possible? 
>>Why not code in binary?

No.  As I said in the beginning of the article, the problem stems from
not being able to syntactically differentiate between procedure calls
and special forms.  Scheme has fewer than 20 special forms.  I can
remember that small number.  A requirement that macro (or special
form) symbols start with * or some other marker would make me happy.

The radical suggestion I made was prompted by the realization that
Scheme's 16 special forms seem to cover almost all the ways I write
code (control structure).  No one seems to share that observation with
me.

Someone suggested that `while' should be added to Scheme.  To my mind
`while' is not different enough from `do' to be useful:

(while <CONDITIONAL> <CODE ...>)  ==>
(do () ((not <CONDITIONAL>)) <CODE ...>)
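
The rewrite is mechanical; as a macro (a sketch, using the
syntax-rules pattern language from the R4RS appendix) it is a single
rule:

	(define-syntax while
	  (syntax-rules ()
	    ((while conditional code ...)
	     (do () ((not conditional)) code ...))))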
From: Richard A. O'Keefe
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <4881@goanna.cs.rmit.oz.au>
In article <·····@life.ai.mit.edu>, ······@gerber.ai.mit.edu (Aubrey Jaffer) writes:
> Someone suggested that `while' should be added to Scheme.  To my mind
> `while' is not different enough from `do' to be useful:

> (while <CONDITIONAL> <CODE ...>)  ==>
> (do () ((not <CONDITIONAL>)) <CODE ...>)

This may be a matter of taste and background.  I find DO inordinately
hard to read.  The fact that the loop condition is negated is sometimes
an advantage, but not always.  I have learned to like T's ITERATE, which
appears in Scheme as named-LET (let's face it, that's what a Prolog
programmer _expects_ a loop to look like), but even so WHILE is clearer
at times.
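
For instance, a WHILE-style read-and-echo loop as named-LET (a small
sketch):

	(let loop ((item (read)))
	  (if (not (eof-object? item))
	      (begin
		(write item)
		(newline)
		(loop (read)))))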

-- 
The purpose of advertising is to destroy the freedom of the market.
From: Guillermo J. Rozas
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <JINX.91Mar2204854@chamarti.ai.mit.edu>
In article <·····@life.ai.mit.edu> ······@gerber.ai.mit.edu (Aubrey Jaffer) writes:

   Someone suggested that `while' should be added to Scheme.  To my mind
   `while' is not different enough from `do' to be useful:

   (while <CONDITIONAL> <CODE ...>)  ==>
   (do () ((not <CONDITIONAL>)) <CODE ...>)

+ is not different enough from - to be useful:

(define (+ x y)
  (- x (- 0 y)))

yet you wouldn't want to do away with it, right? :-)

The language should be flexible enough to adapt to the thinking of the
programmer, not the other way around.
From: Jeff Dalton
Subject: Re: Scheme as an Algol-like, not Lisp-like, language
Date: 
Message-ID: <4258@skye.ed.ac.uk>
In article <··················@chamarti.ai.mit.edu> ····@zurich.ai.mit.edu writes:
>    Now, *you* are underestimating me a bit. The serious point I was making
>    is that a Lisp is historically a symbolic List Processing language,
>    while, except for the survival of 'cons', 'car', and 'cdr', Scheme is
>    more of something in the Algol 60 tradition, with a superficially
>    Lisp-like syntax.
>
>Well, what makes a Lisp?  Let me suggest a few requirements.
>1. List processing.
>2. Programs that look like data.
>3. Objects with unbounded extent and automatic storage management.
>4. EVAL (making use of 2).
>5. Powerful syntactic extension facilities that make use of 2.
>6. Call by value.
>7. Imperative constructs.
>8. Functional style encouraged, or at least not discouraged.
>9. Latent types.
>10. Generic arithmetic, with integers not restricted to the word size
>of the machine.
>11. Interactive development environments.
>
>You are right to claim that Scheme is not a Lisp because of its lack
>of 4 and 5, but

I'd say he is _wrong_ to claim Scheme is not a Lisp because it doesn't
have 4 and 5.  As you note later in your article:

>- There are dialects of Lisp out there that don't satisfy some of
>those requirements, yet no one thinks they are not Lisp.

The idea that we judge whether something is of a certain kind by
saying it must have _all_ properties in a list is philosophically
dubious to say the least [references omitted].  This has long been
recognized explicitly by the AI community and at least implicitly by
the Lisp community.  That's why the Algol-syntax language used with
Reduce was called RLISP, for example.  RLISP didn't have all of the
typical Lisp properties, but it did have enough of them.  [Exactly
what counts as "enough" may depend on many things.]

Remember too that some people used to (and maybe still do) write Lisp
in a syntax based on the Meta-language described in the Lisp 1.5 book.
If the Lisp 2 project had been a success, dual-notation Lisp (such as
RLISP / PSL) might have become the norm, and our idea of what Lisp is
typically like might be very different.

-- jd
From: Guillermo J. Rozas
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <JINX.91Feb26214634@chamarti.ai.mit.edu>
    To rely on garbage collection, which is a worst case mechanism to use to
    reclaim storage when nothing is known a priori about value lifetime, for
    known lifetime cases is clearly wasteful. An analogy is using dynamic
    type dispatching when the type of a value is well known by design. Maybe
    we need sophisticated, checkable lifetime declarations (better than
    DYNAMIC-EXTENT), not just sophisticated, checkable, type declarations.


Hmm.  So you would argue for programming with overlays rather than
using a virtual memory system?  Or having programmers think about disk
sectors, head motion, and queueing operations on the disk rather than
using the OS-supplied facilities that hide such details?

Of course, there are applications where performance is so critical
that you cannot use virtual memory, or ignore details of disk access,
etc., but the vast majority can.  Programmer productivity is greatly
enhanced and the likelihood of bugs is greatly decreased by using the
system-supplied facilities, albeit at a cost in performance.

I think automatic storage management has the same characteristics as
these OS services, and should be avoided only under duress.  That does
not mean, however, that other facilities should not be made available
as well.
From: Ozan Yigit
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <21797@yunexus.YorkU.CA>
In article <··················@chamarti.ai.mit.edu> ····@zurich.ai.mit.edu writes:
     [quoting Piercarlo Grandi]
>
>    To rely on garbage collection, which is a worst case mechanism to use to
>    reclaim storage when nothing is known a priori about value lifetime, for
>    known lifetime cases is clearly wasteful. An analogy is using dynamic
>    type dispatching when the type of a value is well known by design. Maybe
>    we need sophisticated, checkable lifetime declarations (better than
>    DYNAMIC-EXTENT), not just sophisticated, checkable, type declarations.
>
>
>Hmm.  So you would argue for programming with overlays rather than
>using a virtual memory system?  Or having programmers think about disk
>sectors, head motion, and queueing operations on the disk rather than
>using the OS-supplied facilities that hide such details.

This must be the usual gang-up-on-Piercarlo week ;-)

I think your interpretation makes Piercarlo's views appear much more
radical than they actually are. Heck, you know the type of analysis that
goes into Scheme compilers [for example] to avoid heap allocation
[McDermott, Dybvig etc] better than I do. Aren't memory allocation
strategies based on lifetimes [Hanson] and generational GC mechanisms in
essence using exactly this idea of exploiting some knowledge about the
extent of things? [Do weak references count as a similar mechanism?]

So, what is so offensive about his suggestions?

oz
---
In seeking the unattainable, simplicity  |  Internet: ··@nexus.yorku.ca
only gets in the way. -- Alan J. Perlis  |  Uucp: utai/utzoo!yunexus!oz
From: Guillermo J. Rozas
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <JINX.91Mar2180826@chamarti.ai.mit.edu>
In article <·····@yunexus.YorkU.CA> ··@yunexus.yorku.ca (Ozan Yigit) writes:

   In article <··················@chamarti.ai.mit.edu> ····@zurich.ai.mit.edu writes:
	[quoting Piercarlo Grandi]
   >
   >    To rely on garbage collection, which is a worst case mechanism to use to
   >    reclaim storage when nothing is known a priori about value lifetime, for
   >    known lifetime cases is clearly wasteful. An analogy is using dynamic
   >    type dispatching when the type of a value is well known by design. Maybe
   >    we need sophisticated, checkable lifetime declarations (better than
   >    DYNAMIC-EXTENT), not just sophisticated, checkable, type declarations.
   >
   >
   >Hmm.  So you would argue for programming with overlays rather than
   >using a virtual memory system?  Or having programmers think about disk
   >sectors, head motion, and queueing operations on the disk rather than
   >using the OS-supplied facilities that hide such details.

   This must be the usual gang-up-on-Piercarlo week ;-)

   I think your interpretation makes Piercarlo's views appear much more
   radical than they actually are. Heck, you know the type of analysis that
   goes into Scheme compilers [for example] to avoid heap allocation
   [McDermott, Dybvig etc] better than I do. Aren't memory allocation
   strategies based on lifetimes [Hanson] and generational GC mechanisms in
   essence using exactly this idea of exploiting some knowledge about the
   extent of things? [Do weak references count as a similar mechanism?]

   So, what is so offensive about his suggestions?

I certainly want Lisp systems to provide quality memory management
just like I want my OS to provide good virtual memory and disk IO.

That does not imply, however, that most (or any) users should be doing
memory management themselves any more than it implies that users
should be bypassing the virtual memory system or the system's supplied
IO facilities.

I agree that Lisp memory management needs to improve, but I doubt that
the way to do this is to throw it out the window and revert to C-style
explicit memory management.  Explicit memory management is very
error-prone, and (in my case) reduces my productivity significantly.
I am much less likely to trust a program that manages memory
explicitly than one that does not.

Having said this, I will repeat what has already been said.  It is
trivial to do ``malloc/free'' memory management in Lisp.  No one stops
you from doing it.  It is also trivial to make it extent dependent:

(define-macro (with-dynamic-pair name x y . body)
  (let ((var (gensym)))
    `(let ((,name (pair/malloc ,x ,y)))	; allocate or recycle a pair
       (let ((,var (begin ,@body)))
	 (pair/free ,name)		; put it back on the free list
	 ,var))))			; return the value of the body

(define pair/malloc)
(define pair/free)
		
(let ((free-list '()))
  (set! pair/malloc
	(lambda (x y)
	  (if (null? free-list)
	      (cons x y)		; free list empty: really cons
	      (let ((pair free-list))	; recycle the head of the free list
		(set! free-list (cdr free-list))
		(set-car! pair x)
		(set-cdr! pair y)
		pair))))
  (set! pair/free
	(lambda (pair)
	  (if (not (pair? pair))
	      (error "pair/free: Not a pair" pair)
	      (begin
		(set-cdr! pair free-list)  ; the free list is threaded
		(set-car! pair #f)	   ; through the freed pairs themselves
		(set! free-list pair))))))

Note that if weak pairs are used to hold the free list, then this
method mixes well with garbage collection.
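
For instance (a made-up call, using the definitions above):

	(with-dynamic-pair p 1 2
	  (+ (car p) (cdr p)))		; => 3; p goes back on the free list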
From: Piercarlo Antonio Grandi
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <PCG.91Mar4211533@aberdb.cs.aber.ac.uk>
[ apologies if this appears more than once; previous attempts at posting
may not have been successful because of a full partition ]

I had written:

pcg> To rely on garbage collection, which is a worst case mechanism to use to
pcg> reclaim storage when nothing is known a priori about value lifetime, for
pcg> known lifetime cases is clearly wasteful. An analogy is using dynamic
pcg> type dispatching when the type of a value is well known by design. Maybe
pcg> we need sophisticated, checkable lifetime declarations (better than
pcg> DYNAMIC-EXTENT), not just sophisticated, checkable, type declarations.

On 27 Feb 91 02:46:34 GMT, ····@zurich.ai.mit.edu (Guillermo J. Rozas)
commented:

jinx> Hmm.  So you would argue for programming with overlays rather than
jinx> using a virtual memory system?

Not at all; a well-designed VM system is more flexible than overlays.

But I certainly think that over-reliance on the default demand policies of
a VM subsystem is catastrophic for performance, as amply demonstrated by
any major application you can find around nowadays. When SunOS 3 became
SunOS 4, a colossal performance hit was experienced because the default
demand-loading policy of VM is simply inappropriate to file access. GNU
Emacs is another major and catastrophic example.

jinx> Or having programmers think about disk sectors, head motion, and
jinx> queueing operations on the disk rather than using the OS-supplied
jinx> facilities that hide such details.

Yes, most definitely yes. As long as the latency or bandwidth of disk
and RAM are separated by some orders of magnitude, most programmers have
to structure their programs carefully around this simple fact.

I have this idea that a very large, very large part of Computer Science
is simply the consequence of dealing with this gap. Certainly most of OS
and DBMS design is centered around it. Minimizing IO operations and
memory usage is a large and important worry of any programmer's work.

The OS-supplied facilities hide the ugly details of dealing with such
facts; but the unpleasant facts remain: application growth keeps
pace with memory growth, and disks are slow, and they bite hard.

jinx> Of course, there are applications where performance is so critical
jinx> that you cannot use virtual memory, or ignore details of disk
jinx> access, etc., but the vast majority can.

Here we are really on different planets. You sound like a vendor, or an
agent of the Imperial MITI DRAM Service.

Nearly any "serious" application cannot be wholly memory resident
and cannot be run with frequent collections; nearly any serious
application writer has to be damn careful about locality of reference,
and has to be damn careful about garbage generation rates. Relying on
default replacement and reclamation policies simply doesn't cut it.

The definition of "serious" is more or less the same as "mainstream". If
one is unwilling to let Lisp/Scheme out into the mainstream then
everything is simpler. But when you start considering them, for example,
as languages in which to do databases, numerical applications, compilers,
operating systems, text processors and the like, caring about locality
of reference and garbage generation rates pays off a lot.

In short, do we want Lisp/Scheme to remain a ghetto language for
doing (even large scale!) AI demos, or do we want to use Lisp/Scheme for
programming? :-)

If the latter is desired, engineering considerations become important.

As for me, I have been considering writing an OS kernel in Scheme, and so
on. Current Scheme implementations are in the Lisp mould, unfortunately,
but I can easily imagine Scheme being usable for all the above
applications (something that is hard to imagine, even if it is feasible,
for Common Lisp), even on a PC/AT class machine.

jinx> Programmer productivity is greatly enhanced and the likelihood of
jinx> bugs is greatly decreased by using the system-supplied facilities,
jinx> albeit at a cost in performance.

I remember the old saw that says "an engineer can do for a dime what any
fool can do for a dollar". If you have no resource constraints, more
power to less prepared people, or more power to people who are more
prepared in other fields.

Admittedly as long as one has to write programs that are a few thousand
lines long and run on immense personal workstations, there are no
effective resource constraints. GNU Emacs is popular because its users
all have 4-8MB each in their workstations. XyWrite is another story...

In another newsgroup there was a remark that the exceptional success of
Xenix/286 (yes, the 16 bit version) is due to the fact that a lot of
VARs and their customers have discovered that a $1,000 286 gizmo can
support 12 terminals doing business applications simply because it is
the leanest and most efficient of its breed. Orders of magnitude away
from Lisp, but amusing to note nonetheless.

Now that Lisp images are dozens of megabytes long and people aspire to
deliver them as products running on inexpensive machines, things are no
longer that simple.  Some Lisp programmers have had, much to their
annoyance, to start reading books about minimizing IO operations written
by database people.  Garbage generation considerations have had to be
rediscovered.  A promising start has been made towards putting Lisp
programs out of their workspace ghetto and making them modular standalone
applications that can interact with things like databases and other
tools.
--
Piercarlo Grandi                   | ARPA: ··············@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcsun!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: ···@aber.ac.uk
From: Barry Margolin
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <1991Mar5.040348.25396@Think.COM>
In article <················@aberdb.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Antonio Grandi) writes:
>I have this idea that a very large, very large part of Computer Science
>is simply the consequence of dealing with this gap. Certainly most of OS
>and DBMS design is centered around it. Minimizing IO operations and
>memory usage is a large and important worry of any programmer's work.

And if we didn't have OSes hiding these details then a large part of
Computer Science would be programming I/O and memory management routines.
And we'd still have to worry about minimizing I/O operations, because they
are still orders of magnitude slower than in-memory operations.

If the programs don't get written, it doesn't matter how little memory they
run in.

--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Piercarlo Antonio Grandi
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <PCG.91Mar6215111@aberdb.cs.aber.ac.uk>
On 5 Mar 91 04:03:48 GMT, ······@think.com (Barry Margolin) said:

barmar> In article <················@aberdb.cs.aber.ac.uk>
barmar> ···@cs.aber.ac.uk (Piercarlo Antonio Grandi) writes:

pcg> I have this idea that a very large, very large part of Computer Science
pcg> is simply the consequence of dealing with this gap. Certainly most of OS
pcg> and DBMS design is centered around it. Minimizing IO operations and
pcg> memory usage is a large and important worry of any programmer's work.

barmar> And if we didn't have OSes hiding these details then a large
barmar> part of Computer Science would be programming I/O and memory
barmar> management routines.

Just a moment, this is not a fair comment; I don't regret that OSes
and other software layers give programmers a higher-level view of certain
things, because what is being abstracted over is details.

In the view of some, this is indeed the same thing as abstracting over
essential things like the performance profile of the available assets,
but surely not to me.

barmar> And we'd still have to worry about minimizing I/O operations,
barmar> because they are still orders of magnitude slower than in-memory
barmar> operations.

No, no. With more abstract interfaces (like VM and memory-mapped files)
we can *concentrate* just on minimizing IO operations. Unless we are
Lisp programmers, that is :-).

I really don't see why so many people would die tomorrow for reducing by
10% the time complexity of their code, but are indifferent to reducing
the space or IO complexity by an order of magnitude.

If your attitude were right, then why bother with any sort algorithm
other than bubble sort? It gets the job done, and if it is too slow, add
more CPUs (actually a quadratic number of CPUs :->).
--
Piercarlo Grandi                   | ARPA: ··············@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcsun!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: ···@aber.ac.uk
From: Barry Margolin
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <1991Mar8.000215.21918@Think.COM>
In article <················@aberdb.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Antonio Grandi) writes:
>If your attitude were right, then why bother with any sort algorithm
>other than bubble sort? It gets the job done, and if it is too slow, add
>more CPUs (actually a quadratic number of CPUs :->).

[Look at what company I work for -- we *do* add more CPUs :->]

It's been a long time since I've had to write a sort subroutine, but I
think 90% of the ones I've written have been bubble sorts.  I depend on the
environment providing a sorting function that has reasonable performance.

Similarly, I've never written a routine that optimizes disk r/w head
motion; if the OS doesn't optimize it, it doesn't get done.

If it's easy to optimize something in my code and it doesn't obscure the
structure, I'll do it.  For instance, I use NREVERSE and NCONC when it's
clear that the old structure of the list can be reused (generally when it
is a newly-consed intermediate value).  I'll use Symbolics's STACK-LET (in
ANSI Common Lisp this will be the DYNAMIC-EXTENT declaration) for local
temporaries in an inner loop.  Other than simple stuff like this, I don't
believe in heavy source optimizing except for major bottlenecks.

--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Richard A. O'Keefe
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <4936@goanna.cs.rmit.oz.au>
I note that T has "pools", and that if you have "weak references" it is
easy to implement them in user code, and if you haven't, it's not _that_
hard to tell a garbage collector how to flush them.

The basic idea of a pool (from memory, so the interface is _wrong_):
	(define my-pool (create-pool ALLOCATOR))
	;; ALLOCATOR is a function.  my-pool is a "weak set" of objects.
	(pool-allocate my-pool)
	;; if my-pool is not empty, remove and return any one of the
	;; objects in it.  if my-pool is empty, call (ALLOCATOR) and
	;; return its result.
	(pool-release my-pool object)
	;; adds the object to my-pool, so that pool-allocate may return it
When a garbage collection happens, every pool is emptied, but an object in
a pool will not be reclaimed if there are any "hard" references to it.

By using pools, you can recycle objects faster than the garbage collector;
the price is that you may accidentally release an object that is still in
use, so that it can be handed out again while some other part of the
program still holds it.

I further note that Lisp Machines have had "areas" for a long time, so that
providing some kind of control over locality is not entirely out of the
question.  I suppose the real question is whether anyone succeeded in
getting useful speedups by using it.

I note that the compilers on the Burroughs B6700 made it easy for a
programmer to control which Algol procedures, Fortran program units,
or COBOL paragraphs were placed together in which segments, but I never
knew anyone to bother much, even though the concept of grouping was clear
and the means simple and clearly documented; it was rare for people to
know which program units belonged together.

Should good programmers spend a lot of time thinking about storage
allocation and placement, or should they spend their time thinking
about how to reduce the amount of storage they need so that memory
management ceases to be a bottle-neck?

> I really don't see why so many people would die tomorrow for reducing by
> 10% the time complexity of their code, but are indifferent to reducing
> the space or IO complexity by an order of magnitude.

My attitude is that reducing the amount of space used *IS* how I make
my programs run faster, but that the way to do that is not to give
myself more scope for mistakes by explicitly managing the intermediate
objects, but to design my algorithms so that they create as few
intermediate objects as possible in the first place.
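
A small sketch of what I mean: summing the squares of a list by first
building a list of the squares, versus accumulating directly:

	;; conses an intermediate list of squares, all of it garbage:
	(apply + (map (lambda (x) (* x x)) xs))

	;; conses nothing beyond the original list:
	(let loop ((rest xs) (sum 0))
	  (if (null? rest)
	      sum
	      (loop (cdr rest) (+ sum (* (car rest) (car rest))))))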
-- 
The purpose of advertising is to destroy the freedom of the market.
From: Scott "TCB" Turner
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <1991Mar11.173156.24976@aero.org>
(Richard A. O'Keefe) writes:
>I note that T has "pools", and that if you have "weak references" it is
>easy to implement them in user code, and if you haven't, it's not _that_
>hard to tell a garbage collector how to flush them.

The circular discussion is an infamous Usenet phenomenon.  (The "Memory 
Management" discussion started when I mentioned T's weak reference and
asked if anything similar would ever be in CL.)

Perhaps you could post some code showing how to implement weak reference
or pools in Common Lisp?
From: Richard A. O'Keefe
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <4993@goanna.cs.rmit.oz.au>
In article <······················@aero.org>, ···@aero.org (Scott "TCB" Turner) writes:
> (Richard A. O'Keefe) writes:
> >I note that T has "pools", and that if you have "weak references" it is
> >easy to implement them in user code, and if you haven't, it's not _that_
> >hard to tell a garbage collector how to flush them.
> 
> The circular discussion is an infamous Usenet phenomenon.  (The "Memory 
> Management" discussion started when I mentioned T's weak reference and
> asked if anything similar would ever be in CL.)
> 
> Perhaps you could post some code showing how to implement weak reference
> or pools in Common Lisp?

Common Lisp, by design, does not include all the low-level operations
one would need.  Given any Common Lisp *implementation*, I doubt that
it would be hard to do.  Have I any evidence for this claim?

It took me all of one hour to add weak references to Elk,
and a further half hour to implement pools on top of weak references.
This was the very first time I had ever added a data type to Elk.

Alas, my operations are not quite the same as the ones in T.  If anyone
is really interested, I am willing to post my two new files
	elk/lib/weak.c (143 lines, about 40 are comments)
	elk/scm/pool   (47 lines, about half of that is comments)
	[no changes are required to any existing file]

Now, Elk is unusually easy to augment.  (Was that "extension language
kit" or "extensible language kit" (:-)?)  But T doesn't need a lot of
code to implement weak references either.  The secret in Elk is a
function that looks like

	void Visit_Weak_Ref(hp, f)
	    Object *hp;
	    int (*f)();
	    {
		struct S_Weak *p = WEAK(*hp);
		p->curval = p->defalt;
		(*f)(&(p->defalt));
	    }

When the Elk garbage collector is traversing the non-garbage, user-
defined types are traversed by a user-defined function.  When you
create a new type, you specify the function.  This is the only part
of my implementation of weak references which differs significantly
from that of pairs.  (The printing function is different too, of course.)
-- 
Seen from an MVS perspective, UNIX and MS-DOS are hard to tell apart.
From: Richard A. O'Keefe
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <5008@goanna.cs.rmit.oz.au>
In article <····@goanna.cs.rmit.oz.au>,
I wrote, in defence of a claim that pools were easy to add,
that it had taken me very little time to write the code to add
weak references (not weak pointers!) and pools to Elk, and I
offered to post the code.  I have received several E-mail requests,
and the code _is_ small, so here goes.  The code _has_ been tested,
and appears to work.

#!/bin/sh
# This is a shell archive, meaning:
# 1. Remove everything above the #!/bin/sh line.
# 2. Save the resulting text in a file.
# 3. Execute the file with /bin/sh (not csh) to create the files:
#	scm/pool
#	lib/weak.c
cat >scm/pool <<'------ EOF ------'
;;; -*-Scheme-*-
;;; 
;;; Scheme provides automatic garbage collection.  However, sometimes
;;; you know early that an object of a particular type will not be
;;; used again, so you would like to make it available for re-use.
;;; 
;;; This file provides the following functions:
;;;	(make-pool allocator) => pool
;;;	(pool? object) => #t or #f
;;;	(allocate pool) => object
;;;	(release pool object) => unspecified
;;; The idea is that a pool consists of a list of available objects and
;;; a function (the allocator) for allocating and initialising new ones.
;;; When you try to allocate an object from the pool, if there are any
;;; available objects it will return one of them.  If there aren't any,
;;; it will call the allocator to make a new one.
;;; When you have finished with an object, you can add it to the pool
;;; by calling release.
;;; When a garbage collection occurs, every pool is forcibly emptied.
;;; If there are other references to an object in a pool, it will
;;; survive, so this is quite safe.
;;; Using this package can save a fair bit of garbage collection.
;;; You will never get your hands on invalid pointers.  On the other
;;; hand, you had better be *sure* that you have finished with an
;;; object before putting it back in a pool.

;;; The representation of a pool is a pair
;;;	(<allocation function> . <weak reference to list of objects>)

(define (make-pool allocator)
    (cons allocator (cons-weak-ref '() '()) ))

(define (pool? object)
    (and (pair? object)
	 (procedure? (car object))
	 (weak-ref? (cdr object))
	 (null? (weak-default (cdr object)) )) )

(define (allocate pool)
    (let ((available (weak-contents (cdr pool))))
	(if (null? available) ((car pool))
	    (begin (weak-set-contents! (cdr pool) (cdr available))
		   (car available)) )))

(define (release pool object)
    (weak-set-contents! (cdr pool)
	(cons object (weak-contents (cdr pool)) )))

------ EOF ------
ls -l scm/pool
cat >lib/weak.c <<'------ EOF ------'
#include <scheme.h>

/*  weak.c defines a type "weak-reference" with operations
    (cons-weak-ref [default [initial]])
	-- if initial is omitted, it is the same as default
	-- if default is omitted, it is #F
    (weak-ref? object)
    (weak-contents weak-ref)
	-- returns the current value of the weak-ref
    (weak-default weak-ref)
	-- returns the default value of the object
    (weak-set-contents! weak-ref value)
	-- updates the current value of the object
    (weak-set-default! weak-ref value)
	-- updates the default value of the object
    A weak reference is just like a pair except that when a garbage
    collection occurs, the current value is replaced by the default
    value.  The point of this is to let you define "pools".
*/

static int T_Weak;

#define WEAK(x)   ((struct S_Weak *)POINTER(x))

struct S_Weak {
    Object defalt;
    Object curval;
};

static Object P_Weak_Cons(argc, argv)
    int argc;
    Object *argv;
    {
	Object defalt = argc < 1 ? False : argv[0];
	Object curval = argc < 2 ? defalt : argv[1];
	register char *p;
	Object h;
	GC_Node2;

	GC_Link2(defalt, curval);
	p = Get_Bytes(sizeof (struct S_Weak));
	SET(h, T_Weak, (struct S_Weak *)p);
	WEAK(h)->defalt = defalt;
	WEAK(h)->curval = curval;
	GC_Unlink;
	return h;
    }

static Object P_Weakp(x)
    Object x;
    {
	return TYPE(x) == T_Weak ? True : False;
    }

static Object P_Weak_Contents(h)
    Object h;
    {
	Check_Type(h, T_Weak);
	return WEAK(h)->curval;
    }

static Object P_Weak_Default(h)
    Object h;
    {
	Check_Type(h, T_Weak);
	return WEAK(h)->defalt;
    }

static Object P_Weak_Set_Cont(h, val)
    Object h, val;
    {
	Check_Type(h, T_Weak);
	WEAK(h)->curval = val;
	return h;
    }

static Object P_Weak_Set_Dflt(h, val)
    Object h, val;
    {
	Check_Type(h, T_Weak);
	WEAK(h)->defalt = val;
	return h;
    }


static int Weak_Eqv(a, b)
    Object a, b;
    {
	return EQ(a, b);
    }

static int Weak_Equal(a, b)
    Object a, b;
    {
	return	Equal(WEAK(a)->defalt, WEAK(b)->defalt)  &&
		Equal(WEAK(a)->curval, WEAK(b)->curval);
    }

static Weak_Print(h, port, raw, depth, length)
    Object h, port;
    int raw, depth, length;
    {
	Printf(port, "#[weak-ref %u: ", POINTER(h));
	Print_Object(WEAK(h)->defalt, port, raw, depth-1, length);
	Printf(port, "]");
    }

static Weak_Visit(hp, f)
    Object *hp;
    int (*f)();
    {
	struct S_Weak *p = WEAK(*hp);
	p->curval = p->defalt;
	(*f)(&(p->defalt));
    }

init_lib_weak()
    {
	T_Weak = Define_Type(0, "weak-ref", NOFUNC, sizeof (struct S_Weak),
			     Weak_Eqv, Weak_Equal, Weak_Print, Weak_Visit);
	Define_Primitive(P_Weak_Cons,	  "cons-weak-ref",      0, 2, VARARGS);
	Define_Primitive(P_Weakp,	  "weak-ref?",	        1, 1, EVAL);
	Define_Primitive(P_Weak_Contents, "weak-contents",      1, 1, EVAL);
	Define_Primitive(P_Weak_Default,  "weak-default",       1, 1, EVAL);
	Define_Primitive(P_Weak_Set_Cont, "weak-set-contents!", 2, 2, EVAL);
	Define_Primitive(P_Weak_Set_Dflt, "weak-set-default!",  2, 2, EVAL);
    }

------ EOF ------
ls -l lib/weak.c
exit
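
By way of illustration, a session with the pool interface might look
like this (hypothetical, assuming weak.c has been linked in and
scm/pool loaded):

	(define line-pool (make-pool (lambda () (make-string 80))))
	(define buf (allocate line-pool))  ; pool empty: calls the allocator
	(release line-pool buf)            ; buf is now available for re-use
	(eq? buf (allocate line-pool))     ; => #t, unless a GC emptied the pool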
-- 
Seen from an MVS perspective, UNIX and MS-DOS are hard to tell apart.
From: Oliver Laumann
Subject: Re: Memory Management in Lisp?
Date: 
Message-ID: <2910@kraftbus.cs.tu-berlin.de>
In article <····@goanna.cs.rmit.oz.au> ··@goanna.cs.rmit.oz.au (Richard A. O'Keefe) writes:
> Now, Elk is unusually easy to augment.  (Was that "extension language
> kit" or "extensible language kit" (:-)?)

When I had to invent a name for the beast, I also considered "eel"
(extensible extension language) :-)