From: Erann Gat
Subject: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb12.233157.20820@elroy.jpl.nasa.gov>
This is a (probably) vain plea to see call-with-current-continuation
included in Common Lisp.  There are a number of reasons that call/cc
should be included.

1.  It cannot be implemented in terms of existing CL constructs, so you
can't build it even if you want to.  You can build a downward-only call/cc,
but see the note below.

2.  It is not that hard to implement.  Including call/cc would not make
compilers that much more complex.  All that call/cc does is copy the
stack onto the heap.

3.  Call/cc makes it INFINITELY less painful to write coroutines.  For
example, here's a short piece of code that allows you to interleave
the execution of an arbitrary number of arbitrary functions:  (This
is written in T, which allows continuations to take an arbitrary number
of arguments and return them as multiple values.)

(define (scheduler . fns)
  (set *queue* (append *queue* fns))   ; enqueue any newly submitted coroutines
  (if (null? *queue*)
    (escape)                           ; nothing left to run: escape the scheduler
    (let ( (fn (pop *queue*)) )
      (fn)                             ; run the next coroutine
      (scheduler))))

(define-syntax (define-coroutine name&args . body)
  `(define ,name&args (call/cc scheduler) ,@body))

(call/cc (lambda (esc) (define escape esc)))

There.  Coroutines in ten lines of fairly transparent code, but it
requires upward continuations so you can't do this sort of thing in
CL as it stands now.
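
For concreteness, here is a minimal standard-Scheme sketch (illustrative
names, not part of the coroutine code above) of what an "upward"
continuation is: one that survives the dynamic extent of the call/cc that
created it and can be re-entered later.

```scheme
;; SAVED-K outlives the call/cc that created it; re-entering it resumes
;; the old extent.  CATCH/THROW in CL cannot express this.
(define saved-k #f)

(define (gen)
  (call-with-current-continuation
    (lambda (k)
      (set! saved-k k)   ; the continuation escapes "upward"
      0)))

(display (gen))          ; first, normal entry: displays 0
(if saved-k
    (let ((k saved-k))
      (set! saved-k #f)
      (k 1)))            ; re-entry: (gen) now "returns" 1, so 1 is displayed
```

At a REPL this displays 0 and then 1: invoking k a second time re-runs
everything downstream of the original call to gen.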

BTW: T is NOT just another dialect of Scheme.  It provides an object
primitive which cannot be implemented in pure Scheme.  (You can come
very close, but you can't implement it exactly.)  The T object construct,
besides providing an elegant vehicle for doing object-oriented programming,
also has the pleasant side effect of elegantly solving the SETQ/SETF
problem.  It also lets you do things like redefine a function to give it
a different name and still have (SET (new-name ...) ...) work.  T also
has a structure primitive which lets you access all the slot names and
accessor functions in a very neat way.  It also extends call/cc as described
above.

Unfortunately, no one is commercializing T, so it's hard to get support
or good development environments.  So, if we can't have T, can't we at least
get call/cc with our CL?  (And STYPES?  And Objects?  And...  :-)

E.

From: Barry Margolin
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb13.055938.22853@Think.COM>
In article <······················@elroy.jpl.nasa.gov> ···@robotics.jpl (Erann Gat) writes:
>This is a (probably) vain plea to see call-with-current-continuation
>included in common lisp.  There are a number of reasons that call/cc
>should be included.
>
>1.  It cannot be implemented in terms of existing CL constructs, so you
>can't build it even if you want to.  You can build a downward-only call/cc,
>but see the note below.

I wasn't in the original Common Lisp design group.  However, it seems
obvious that they explicitly decided against allowing upward closures to
capture control state.  Had they not made this decision, call/cc could be
implemented in terms of closures or vice versa.  I haven't read the
archives of the CL design email for about six years, so I don't remember
their justification, but I expect it was mostly along the lines of "there
isn't enough experience with it, and it doesn't really seem necessary."

>2.  It is not that hard to implement.  Including call/cc would not make
>compilers that much more complex.  All that call/cc does is copy the
>stack onto the heap.

There are some architectures where operations on the stack pointer register
are restricted, so you can't just set the stack pointer to point to a
heap-allocated activation record.

And what about the fact that call/cc makes unwind-protect more complicated;
don't you have to provide code that runs when re-entering the environment?
This means that the addition of upward control closures would not be
transparent to implementors of many macros (e.g. WITH-OPEN-FILE would have
to be changed to save its state in the closure and reopen the file and seek
to the saved place upon re-entry).
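
A minimal sketch of what such entry code could look like, using Scheme's
dynamic-wind (which pairs an entry thunk with the exit thunk; the names
here are illustrative, and dynamic-wind itself is a Scheme answer to this
question, not a CL proposal):

```scheme
;; DYNAMIC-WIND generalizes UNWIND-PROTECT: the before-thunk runs on
;; every entry into the extent (including re-entry via a continuation),
;; the after-thunk on every exit (including continuation escapes).
(define (with-logging thunk)
  (dynamic-wind
    (lambda () (display "enter "))   ; also runs when a continuation re-enters
    thunk
    (lambda () (display "exit ")))) ; also runs when a continuation escapes
```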

>3.  Call/cc makes it INFINITELY less painful to write coroutines.  

So does the Lisp Machine's stack group facility, and I expect some other
Common Lisp implementations provide similar mechanisms (Lucid and Franz
support multitasking within Lisp, in a manner very similar to Symbolics's).

What if we were to add a coroutine facility, rather than a primitive for
building coroutines?  See my previous posting, where I mentioned that we
prefer to define user-oriented facilities, rather than implementor-oriented
primitives.  Additionally, by defining the high-level facility rather than
the low-level mechanism, we allow implementors to tailor the implementation
to the environment (e.g. Lisp Machine coroutines would use the
hardware-supported stack group mechanism), rather than forcing the user to
implement the facility using suboptimal primitives.
--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Piercarlo Grandi
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <PCG.91Feb14184024@odin.cs.aber.ac.uk>
On 13 Feb 91 05:59:38 GMT, ······@think.com (Barry Margolin) said:

barmar> I wasn't in the original Common Lisp design group.  However, it
barmar> seems obvious that they explicitly decided against allowing
barmar> upward closures to capture control state.  Had they not made
barmar> this decision, call/cc could be implemented in terms of closures
barmar> or vice versa.

This would be a misdesign.  The point about call/cc is that it allows
you to actualize an arbitrary control graph, instead of a control tree
linearized onto a stack (usual case) or on multiple stacks (coroutines
as stack groups).

Closures are completely orthogonal to continuations; closures allow you
to actualize an arbitrary context graph, instead of a context tree
linearized onto a stack or multiple stacks. Incidentally, fully general
closures also obviate the need for packages.

barmar> I haven't read the archives of the CL design email for about six
barmar> years, so I don't remember their justification, but I expect it
barmar> was mostly along the lines of "there isn't enough experience
barmar> with it, and it doesn't really seem necessary."

This would be really surprising. The technology for arbitrary control
and context graphs has been known for a long time (twenty five years at
least, and it has been fifteen years since the more or less definitive
works of Baker and Steele). Closures and continuations have also been
extensively used in the AI community, under various guises. It can be
argued that objects and actors and frames are just some of these guises.

As to being necessary, both are simply indispensable; you need both
arbitrary context and control graphs in all algorithms that deal with
non trivial graph walking, or equivalents.

Many programmers do not realize this; they keep reimplementing closures
and continuations "manually", simulating closures usually with bunches
of global variables and continuations with state machines or "markers".

barmar> And what about the fact that call/cc makes unwind-protect more
barmar> complicated; don't you have to provide code that runs when
barmar> re-entering the environment?

Of course! When you have a control graph and a context graph, you want
fully general entry and exit functions, not just exit functions.
Actually you want them in general, except that they are hidden as
"initialization" code. Fully general entry and exit functions are
incredibly useful abstractions, because they allow clean and flexible
modularization and "advising" of code. I remember having read (either in
Allen's book or in Baker's paper on shallow binding) that the only known
language with fully general entry and exit functions is TECO, of all
things :-).

barmar> What if we were to add a coroutine facility, rather than a
barmar> primitive for building coroutines?  See my previous posting,
barmar> where I mentioned that we prefer to define user-oriented
barmar> facilities, rather than implementor-oriented primitives.

In other words you are saying that CL is by design a "programming"
language, like PL/1 or PERL, not an algorithmic language like Scheme or
C or the Bourne shell.

This philosophy is one of the reasons why CL is perceived as a jumble
sale of features, a giant and monolithic entity. Coupled with the weak
or otherwise vastly underexploited library facilities, and with the
tradition of implementing Lisp (and even more regrettably, Scheme) in a
workspace-based way, with little interaction with other host system
modules, this makes a fatal combination.
--
Piercarlo Grandi                   | ARPA: ·················@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcsun!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: ···@cs.aber.ac.uk
From: Barry Margolin
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb15.081259.797@Think.COM>
In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:
>Closures are completely orthogonal to continuations; closures allow you
>to actualize an arbitrary context graph, instead of a context tree
>linearized onto a stack or multiple stacks. 

All I know is that I've implemented call/cc using Common Lisp closures.
It looks something like this:

(defun call/cc (function)
  (funcall function #'(lambda (x) (return-from call/cc x))))

It would even work if Common Lisp allowed returning from a block multiple
times; in that case the implementation of closures would have to include
continuations.
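
A short Scheme sketch (illustrative, with an assumed variable k-cell) of
what the emulation above cannot do, namely re-enter a continuation after
the block has already returned:

```scheme
(define k-cell #f)

(+ 1 (call-with-current-continuation
       (lambda (k)
         (set! k-cell k)   ; smuggle the continuation out
         1)))              ; normal return: the expression is 2

;; Later, re-entering K-CELL makes the whole (+ 1 ...) return again,
;; this time as 42.  RETURN-FROM cannot do this: once the block has
;; been exited, its exit point is gone.
(k-cell 41)
```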

>					     Incidentally, fully general
>closures also obviate the need for packages.

I've seen all the arguments to this effect, and remain unconvinced.  Common
Lisp packages can be used for other things besides bindings.  Symbols are
often used as data objects, e.g. as indicators on property lists, and
packages can be used to prevent these from colliding.

>barmar> I haven't read the archives of the CL design email for about six
>barmar> years, so I don't remember their justification, but I expect it
>barmar> was mostly along the lines of "there isn't enough experience
>barmar> with it, and it doesn't really seem necessary."
>
>This would be really surprising. The technology for arbitrary control
>and context graphs has been known for a long time (twenty five years at
>least, and it has been fifteen years since the more or less definitive
>works of Baker and Steele). Closures and continuations have also been
>extensively used in the AI community, under various guises. It can be
>argued that objects and actors and frames are just some of these guises.

And in the mid-80's, when Common Lisp was being designed, all these were
still hot *research* topics.  They hadn't yet passed into common day-to-day
use by Lisp programmers.  None of MacLisp, Lisp Machine Lisp, nor Interlisp
had them.  Yes, I realize that this is also true of lexical closures, but I
guess these were felt to be so obviously right and relatively easy to
implement that they were added.

>In other words you are saying that CL is by design a "programming"
>language, like PL/1 or PERL, not an algorithmic language like Scheme or
>C or the Bourne shell.

You've got it.  The charter of X3J13 is to produce a specification for an
industrial-strength Lisp dialect.  Usability for research into programming
paradigms is not a goal of the Common Lisp project; it *is* a high priority
of the Scheme designers.  Different strokes for different folks.
--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Jeff Dalton
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <4138@skye.ed.ac.uk>
In article <·················@odin.cs.aber.ac.uk> ···@cs.aber.ac.uk (Piercarlo Grandi) writes:

>barmar> I wasn't in the original Common Lisp design group.  However, it
>barmar> seems obvious that they explicitly decided against allowing
>barmar> upward closures to capture control state.  Had they not made
>barmar> this decision, call/cc could be implemented in terms of closures
>barmar> or vice versa.
>
>This would be a misdesign.  The point about call/cc is that it allows you
>to actualize an arbitrary control graph, instead of a control tree
>linearized onto a stack (usual case) or on multiple stacks (coroutines
>as stack groups).
>
>Closures are completely orthogonal to continuations; closures allow you
>to actualize an arbitrary context graph, instead of a context tree
>linearized onto a stack or multiple stacks. Incidentally, fully general
>closures also obviate the need for packages.

I agree that closures and continuations should be regarded as 
distinct.  The spaghetti stack view of things, where the access
link and the control link seem very similar things, may be one
reason for the "closures subsume continuations" view.

However, fully general closures do not obviate the need for
packages, because (1) packages are also used for data structuring
(ie, controlling the potentially huge number of symbols), (2)
it's much easier to make macros work reasonably with packages
than with modules that are based on closures / environments.

And let's not pretend that there are no costs to call/cc, only
benefits.  That is simply false.

I think the right question to ask is whether a Lisp that has lexical
closures and coroutines needs to go further and add call/cc (and take
out coroutines or reimplement them with call/cc).  My view is that we
have by then reached the point of substantially diminishing returns.
Moreover, the coroutines that one can implement oneself with call/cc
tend to be less usable than those that are provided with the language.

>As to being necessary, both are simply indispensable; you need both
>arbitrary context and control graphs in all algorithms that deal with
>non trivial graph walking, or equivalents.
>
>Many programmers do not realize this, they keep reimplementing closures
>and continuations "manually", simulating closures usually with bunches
>of global variables and continuations with state machines or "markers".

There are much better "simulations", sometimes of both together.  For
example, agendas are often used to implement control.  And a typical
agenda would have entries that each contained a function and some
arguments to apply it to (a simulated closure, if you will).

Other better simulations for non-trivial graph-walking are streams
(ie, lazy lists) and explicit continuation functions (eg, the caller
passes a function to be called on success).
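
A sketch of that explicit-continuation style (the names find-even,
succeed, and fail are illustrative): the caller passes a success
procedure and a failure thunk instead of capturing any continuation.

```scheme
;; Search a list for an even number, in explicit-continuation style.
(define (find-even lst succeed fail)
  (cond ((null? lst) (fail))                    ; exhausted: invoke failure thunk
        ((even? (car lst)) (succeed (car lst))) ; found: invoke success continuation
        (else (find-even (cdr lst) succeed fail))))

(find-even '(1 3 4 5)
           (lambda (x) (list 'found x))   ; success continuation
           (lambda () 'none))             ; failure continuation
;; => (found 4)
```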

>barmar> What if we were to add a coroutine facility, rather than a
>barmar> primitive for building coroutines? ... prefer to define 
>barmar> user-oriented facilities, rather than implementor-oriented 
>barmar> primitives.
>
>In other words you are saying that CL is by design a "programming"
>language, like PL/1 or PERL, not an algorithmic language like Scheme or
>C or the Bourne shell.

This is pretty amusing.  CL is like PL/1 because it aims to be
useful for programming.

I agree with Barmar about coroutines (though not quite about the
user/implementor distinction).  And this is in part because I think
different things are appropriate for different languages, and that
more than one of these combinations can be good.  Some people seem to
feel that there is essentially only one good way to do things and that
any language that does something else must therefore be worse.  These
people are invited to define the One True Language and then set about
convincing everyone to use it.

The slightly more generous view in which there is a class of
good languages (the "algorithmic languages", say) and a class
of bad ones (those that can be claimed to be like PL/1) has
similar problems.  In particular it still fails to recognize that
there can be more than one good way of doing things and so
ends up regarding everyone who does things the "bad" way as
misinformed, misguided, or worse.

Let several language design philosophy flowers bloom, at least.

>This philosophy is one of the reasons why CL is perceived as a jumble
>sale of features, a giant and monolithic entity. 

A giant and monolithic entity and a jumble sale of features seem
rather different to me, but I would agree that such perceptions
often go together.

I think that one of the most important reasons why CL is sometimes
perceived as a jumble sale is that it is mistakenly supposed that CL
was designed by taking every feature from every Lisp and putting it in
Common Lisp.  But that isn't true.  CL is a cleaner and simpler design
in many ways than, say, InterLisp; and many of the features that
appear most in complaints (loop, format, dotimes, multiple values)
were already used, in one form or another, in MacLisp.

Indeed, a fairly common view (at least in the UK) is that the scoping
rules for CL (where lexical/static and special/dynamic are both
supported) are as they are because of some union of features approach.
However, what they actually are is a cleanup and generalization of the
rules in MacLisp.

-- jd
From: Aaron Sloman
Subject: Re: In defense of call/cc (and a plug for T)(and Pop-11)
Date: 
Message-ID: <4561@syma.sussex.ac.uk>
···@cs.aber.ac.uk (Piercarlo Grandi) writes:

> barmar> And what about the fact that call/cc makes unwind-protect more
> barmar> complicated; don't you have to provide code that runs when
> barmar> re-entering the environment?
>
> Of course! When you have a control graph and a context graph, you want
> fully general entry and exit functions, not just exit functions.
> Actually you want them in general, except that they are hidden as
> "initialization" code. Fully general entry and exit functions are
> incredibly useful abstractions, because allow clean and flexible
> modularization and "advising" of code. I remember having read (either in
> Allen's book or in Baker's paper on shallow binding) that the only known
> language with fully general entry and exit functions is TECO, of all
> things :-).

Another is the version of Pop-11 available in Poplog. Poplog allows
the construction of "processes" which combine a function with a
control stack, values of local dynamic variables, etc. and
facilities for suspending or resuming a process. These can be used
for co-routines, and have been used, for instance, for teaching
beginners about operating systems.

Along with the process mechanism, Pop-11 supports "dynamic local
expressions" which are run when a procedure is entered or exited.
These can distinguish the following contexts (extract from online
documentation on the Poplog virtual machine follows):

-------------------------------------------------------------------
    To enable the access or update code to determine in which context it
is being called, an integer context value (1 - 4) is available via the
special active variable -dlocal_context- (this variable may be
referenced ONLY in local expression code, and nowhere else).  The
contexts and their applicability are as follows:

        value   context         applies to
        -----   -------         ----------
          1     normal entry      access
                normal exit       update

          2     abnormal exit     update

          3     suspend           access
                                  update

          4     resume            access
                                  update
-------------------------------------------------------------------

This addition to the Poplog virtual machine was designed and
implemented by John Gibson at Sussex University. It handles Common
Lisp's "unwind-protect" as a special case.

Access to these mechanisms is provided in Pop-11. The relevant
features of the Poplog virtual machine are also available to people
using it to implement other languages. At least one user has claimed
that he implemented Scheme in Poplog in a few weeks in his spare
time.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
    EMAIL   ······@cogs.sussex.ac.uk
or:
            ························@nsfnet-relay.ac.uk
From: Erann Gat
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb14.223202.11475@elroy.jpl.nasa.gov>
In article <······················@Think.COM> ······@think.com (Barry Margolin) writes:
>What if we were to add a coroutine facility, rather than a primitive for
>building coroutines?  See my previous posting, where I mentioned that we
>prefer to define user-oriented facilities, rather than implementor-oriented
>primitives.

Common Lisp tries to be 1) powerful, 2) portable, 3) efficient.  The trouble
is that these three goals are often in conflict.  Portability is usually
served best by a small language with a few well-defined primitives that
everything else is built on top of.  Efficiency is usually served best by
having a large number of less general constructs which can make lots of
assumptions about the way they will be used.  Power (e.g. lots of data
types and built in procedures to compute with them) is often at odds with
efficiency.  (Canonical examples of languages which cater to portability,
efficiency and power are Scheme, Fortran and Ada respectively.)

CL has actually gone above and beyond the call for being all things to all
people.  However, its philosophy of catering to users rather than implementors
leads to one extremely annoying side effect.  If the thing you want to do
just happens not to be one of the ten zillion things that CL does, then you
are just screwed.  Examples of such things appear in this group regularly -
things like coroutines or accessing the slot names of a structure.  More
often than not people just write implementation-dependent hacks to do what
they want.  ("Well," says Joe Programmer, "type-of returns the right thing
on this machine, and so I'll go ahead and use it in this non-portable way
and fix it up later.")

What I do not understand is why you can't do both: first start with a set
of primitives.  Define the hell out of them so everyone agrees what they do.
Then build all the user features on top of the primitives.  It seems to me
that this gives you the best of all possible worlds.  Because the whole
language is based on a small set of primitives, you can have complete
implementations that consist of a small kernel and a huge (maybe PD)
library of the user features (in source-code form).  On the other hand,
nothing prevents a manufacturer from implementing a compiler to optimize
all the user features into tight code.  Portability is enhanced because if
some implementation has a bug, that feature can simply be redefined (albeit
less efficiently) in terms of the primitives.

Therefore, the answer to the original question is that I would rather
have call/cc than coroutines.  Tomorrow I might want to build engines
(a really cool construct that somehow got lost in the shuffle).  If I
have call/cc I can do it myself.  If I don't have call/cc then I have
no recourse other than to go begging to LISP manufacturers or write my own
compiler.  (Ah!  Maybe I have answered my own question there!)

E.
From: Barry Margolin
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb15.082301.2154@Think.COM>
In article <······················@elroy.jpl.nasa.gov> ···@forsight.jpl.nasa.gov (Erann Gat) writes:
>What I do not understand is why you can't do both: first start with a set
>of primitives.  Define the hell out of them so everyone agrees what they do.

The Common Lisp designers didn't have the luxury of starting from first
principles.  Read the history chapter of CLtL.  The goal of Common Lisp was
to unify many of the Lisp dialects that were proliferating by the early
80's.  They needed to produce something relatively quickly, or users would
get too dependent on their incompatible dialects.  A reasonable amount of
compatibility with existing dialects, or at least the ability to coexist
with them and/or automatically translate from them to CL, was also
important.  These goals all had higher priority than producing a
conceptually beautiful language design.  The Scheme people were doing the
"clean" Lisp; there was no need to duplicate their work, since the goals
were completely different.
--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Jeff Dalton
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <4140@skye.ed.ac.uk>
In article <······················@elroy.jpl.nasa.gov> ···@forsight.jpl.nasa.gov (Erann Gat) writes:

>CL has actually gone above and beyond the call for being all things to all
>people.  However, its philosophy of catering to users rather than implementors
>leads to one extremely annoying side effect.  If the thing you want to do
>just happens not to be one the ten zillion things that CL does, then you
>are just screwed.  Examples of such things appear in this group regularly -
>things like coroutines or accessing the slot names of a structure.

But this is always a problem.  There are things I want to do in
Scheme that I can't do (eg, define a distinct data type).  But
the situation is usually much worse in languages outside the Lisp
family.  Common Lisp is already better than most languages in
this respect, and I don't think it's actually much worse than
Scheme.

Moreover the examples that appear in this group regularly seem fairly
few in number.  

>What I do not understand is why you can't do both: first start with a set
>of primitives.  Define the hell out of them so everyone agrees what they do.
>Then build all the user features on top of the primitives.  It seems to me
>that this gives you the best of all possible worlds. 

I agree that this is a good approach.  But (as I've been saying with
tiresome regularity), Common Lisp can be seen that way to a surprising
extent.  It's just that lots of user features are presented, implemented,
etc as "part of the language".  

>Therefore, the answer to the original question is that I would rather
>have call/cc than coroutines.  Tomorrow I might want to build engines
>(a really cool construct that somehow got lost in the shuffle).  If I
>have call/cc I can do it myself. 

You can?  That's not what happens in Kent Dybvig's book.  He needs
an additional primitive, timer interrupts.

However, I agree that it's important to have powerful primitives
on which other abstractions can be built.  This is one reason I
don't quite agree with the user/implementor distinction.  In Lisp,
users often _are_ implementors.

-- jd
From: Erann Gat
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb18.011213.7962@elroy.jpl.nasa.gov>
In article <····@skye.ed.ac.uk> ····@aiai.UUCP (Jeff Dalton) writes:
>>Tomorrow I might want to build engines
>>(a really cool construct that somehow got lost in the shuffle).  If I
>>have call/cc I can do it myself. 
>
>You can?  That's not what happens in Kent Dybvig's book.  He needs
>an additional primitive, timer interrupts.

Granted, you need timer interrupts to do it properly, but you can fake
it to a pretty high degree of fidelity using call/cc alone.

I think it's worth relating a theory once told to me by Jim Firby when
everyone was griping about a graphics package he had written.  He said
that the more people complain about a piece of development software, the
better it is because if it were really bad people just wouldn't use
it and so there would be no complaints.  I have been a very vocal critic
of CL, but that's only because I use it a lot, and the fact that it's as
good as it is makes me wish even more that it could be just that much
better.  

That said, CL really needs call/cc, and engines, and stypes, and...  :-)

E.
From: Matthias Felleisen
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb18.041425.5764@rice.edu>
Let me try to reformulate Erann's plug for call/cc. 

During the design of a programming language, it is impossible to
anticipate all needs for functions. Therefore we build in LAMBDA so
that programmers can define their own functions. It is also impossible
to foresee the kinds of data programmers will have to deal with.
Languages therefore include tools for structuring data in some form or
another. And, in the same vein, it is just as impossible to project
the sort of control abstractions a programmer would like to have
around.

The most natural solution is the inclusion of a powerful control
construct such that programmers can build a set of control
abstractions for specific problems with a set of macros.  With call/cc
and first-class continuations (for example), it is easy to implement
  <> generators,
  <> Prolog-style backtracking, 
  <> non-blind backtracking, 
  <> all kinds of coroutine models, 
  <> system trap doors (see the SML/NJ compiler),
  <> engines (with a timer or by redefining lambda and set!),
  <> operating systems (in conjunction with engines), 
and a few more. And that on top of all the control constructs in
Common Lisp. 
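
As a concrete instance of the first item, a generator can be sketched
with call/cc alone (a minimal sketch with illustrative names, assuming a
Scheme where continuations may be invoked more than once):

```scheme
;; MAKE-GEN turns a list into a generator: each call to the returned
;; procedure resumes the suspended walk and yields the next element.
(define (make-gen lst)
  (define return #f)                       ; where to deliver the next value
  (define (walk)
    (for-each
     (lambda (x)
       (call-with-current-continuation
        (lambda (k)
          (set! walk (lambda () (k #f)))   ; save the resume point
          (return x))))                    ; yield x to the caller
     lst)
    (return 'done))
  (lambda ()
    (call-with-current-continuation
     (lambda (r) (set! return r) (walk)))))

(define next (make-gen '(a b c)))
(next)  ; => a
(next)  ; => b
(next)  ; => c, then 'done on subsequent calls
```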

The disadvantage of building a specific high-level control construct
into a language is two-fold. First, and again, there are too many.
Every recent language from Icon via ML to Prolog has its own (or
several). Clearly, all of them were thought to be useful; otherwise
they wouldn't be in these languages. Second, a specific control
construct comes with a specific programming paradigm, and when
thinking about something else it is difficult to think of using it in
non-standard ways. Yes, FIRST-CLASS coroutines can be used to define
call/cc and therefore other control constructs.  BUT, who would think
of doing so?  Schemers, on the other hand, are trained to think of
call/cc as the tool to implement the control construct that is best
for a given problem and their way of thinking about it.

Restricting the programmer's tool for thinking about problems is
wrong: Life is too short to waste hours and days just because some
committee wanted to save some cycles and some pain for implementors.

I'd love to see the CL committee correct this mistake by the time CLtL3
comes around. Perhaps we could get call/cc PLUS a set of high-level
control constructs just like we got lambda PLUS a huge library of
functions, def-macro PLUS a number of standard macros, cons and
vectors and types PLUS CLOS, etc. Guy, any chances?

-- Matthias Felleisen
From: Jeff Dalton
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <4122@skye.ed.ac.uk>
In article <······················@elroy.jpl.nasa.gov> ···@robotics.jpl (Erann Gat) writes:

>3.  Call/cc makes it INFINITELY less painful to write coroutines. 

I would rather have coroutines (a.k.a. stack groups) directly,
as several CL implementations have done.

I think call/cc is too general and powerful (kind of like goto,
only more so).  It's right for a language such as Scheme that
tried to have a small number of powerful constructs, but that
doesn't make it right for other languages.

BTW, T is excellent.

-- jd
From: Steve Knight
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1350035@otter.hpl.hp.com>
It's interesting to relate the discussion of call/cc to the experience of
using "processes" in Poplog.  (The "process" facilities in Poplog are a dual
of the Scheme call/cc facilities in that each can be used to (fairly)
directly implement the other.)  Since the two facilities are very closely
related -- I used Poplog "processes" to implement call/cc in my
Scheme compiler -- the comparison is useful.

From the viewpoint of implementation the key problem is dealing with what
in Poplog is called dynamically-localised constructs -- which is a small
generalisation of specials & sys-unwind-protect.  As Barry points out in an
earlier message: "And what about the fact that call/cc makes unwind-protect 
more complicated; don't you have to provide code that runs when re-entering 
the environment?"  The interaction between thread switching and dynamic
localisation really does complicate implementation, at least it does in Poplog.

The issue to be decided is what happens when a process-switch occurs: the
dynamic environment is exited temporarily (or permanently), and dynamic
localisations may or may not have to be reversed.  Special variables typically
should be restored and sys-unwind-protect typically should be left alone.  

The solution adopted in Poplog is that by default, dynamic localisations are
restored (i.e. put back to the state on entry) at thread-switch.  However,
there is a simple, rarely used, mechanism for localisations to inspect the 
kind of switching going on and to make a decision on the basis of that 
data.  This is packaged up for sys-unwind-protect.

A third strand to the overall solution that Poplog employs is "destroy 
actions".  In Poplog, it is possible to associate an action with an object
that is triggered when that object becomes eligible for garbage-collection.
In particular, system resources such as file-handles or X-windows have
associated clean-up actions so that if they are "lost" they are quietly
shut down at the next garbage-collection (and actually collected next time
around).  Furthermore, file-opening code can cause a garbage-collection to
happen if file-opening fails, in case the GC can reclaim the existing open
files.  Destroy actions are a slightly complicated topic -- which I've rather
skimped on here to keep things short -- but they play an important role
in reducing the necessity for sys-unwind-protect and other protective macros.

Barry continues:
> What if we were to add a coroutine facility, rather than a primitive for
> building coroutines?  See my previous posting, where I mentioned that we
> prefer to define user-oriented facilities, rather than implementor-oriented
> primitives.  

I certainly find programming using the Poplog "process" facilities to be
a great deal more intuitive than call/cc.  I instinctively find myself using
call/cc to build a process-like layer.  Since the two are interchangeable in
expressive power, I'm inclined to think that the higher-level solution is more
suitable for CL.

Steve Knight
HP Labs, Bristol
From: Matthias Felleisen
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1991Feb16.215055.6177@rice.edu>
Here is a semi-formal treatment of the puzzle that was posed to
comp.lang.lisp by Steve Knight.

GIVEN: An Indiana-style definition of coroutines in Scheme, using
call/cc.

  (extend-syntax (COROUTINE RESUME)
    [(COROUTINE x body)
     (letrec ([lcs (lambda (x) body)]
	      [RESUME (lambda (dest val)
			(call/cc (sigma (lcs) (dest val))))])
       (lambda (x) (lcs x)))])

LOOKING FOR: A definition of call/cc in terms of the COROUTINE syntax.

SOLUTION: 

  (define callcc
    (lambda (f)
      (letrec ([c (COROUTINE x (RESUME f c))]) (c 13))))

If you are not interested in a semi-formal proof that callcc is really
related to call/cc, skip the rest. The proof informally uses the
extended lambda calculus for Scheme that several co-workers and myself
have developed over the last few years. -- Matthias Felleisen

P.S. A more readable version is contained in public/puzzle.dvi @
titan.rice.edu, which you may obtain via anonymous ftp. 





-------------------------- cut here -----------------------------------

PROOF: The following is a proof in (a variant of) the
Felleisen-Hieb(*) calculus of Scheme.

[1] Replace the use of the extended syntax by its definition. 

(define callcc
  (lambda (f)
    (letrec ([c (letrec ([lcs (lambda (x) (RESUME f c))]
			 [RESUME (lambda (dest val)
					 (call/cc (sigma (lcs)
					    (dest val))))])
			(lambda (x) (lcs x)))])
      (c 13))))

[2] Flatten the letrec definitions.

(define callcc
  (lambda (f)
    (letrec ([c (lambda (x) (lcs x))]
	     [lcs (lambda (x) (RESUME f c))]
	     [RESUME (lambda (dest val)
		       (call/cc (sigma (lcs) (dest val))))])
      (c 13))))
   ; Expand the non-rho use of letrec to see that this is correct.

[3] Dereference c and use a beta-value step to discharge the
resulting beta-value redex.

(define callcc
  (lambda (f)
    (letrec ([c (lambda (x) (lcs x))]
	     [lcs (lambda (x) (RESUME f c))]
	     [RESUME (lambda (dest val)
		       (call/cc (sigma (lcs) (dest val))))])
      (lcs 13))))

[4] Dereference lcs and use a beta-value step to discharge the
resulting beta-value redex. 13 disappears: lucky us :-). 

(define callcc
  (lambda (f)
    (letrec ([c (lambda (x) (lcs x))]
	     [lcs (lambda (x) (RESUME f c))]
	     [RESUME (lambda (dest val)
		       (call/cc (sigma (lcs) (dest val))))])
       (RESUME f c))))

[5] Dereference RESUME and use multi-beta-value redex to discharge
the resulting beta-value redex.

(define callcc
  (lambda (f)
    (letrec ([c (lambda (x) (lcs x))]
	     [lcs (lambda (x) (RESUME f c))]
	     [RESUME (lambda (dest val)
		       (call/cc (sigma (lcs) (dest val))))])
      (call/cc (sigma (lcs) (f c))))))

[6] Expand call/cc such that it takes a lambda-argument: this is one
form of continuation-eta rule.

(define callcc
  (lambda (f)
    (letrec ([c (lambda (x) (lcs x))]
	     [lcs (lambda (x) (RESUME f c))]
	     [RESUME (lambda (dest val)
		       (call/cc (sigma (lcs) (dest val))))])
      (call/cc (lambda (k)
	 ((sigma (lcs) (f c)) k))))))

[7] Swap letrec and (call/cc (lambda k. ***)).

(define callcc
  (lambda (f)
    (call/cc (lambda (k)
	       (letrec ([c (lambda (x) (lcs x))]
			[lcs (lambda (x) (RESUME f c))]
			[RESUME (lambda (dest val)
				  (call/cc (sigma (lcs) (dest val))))])
		 ((sigma (lcs) (f c)) k))))))

[8] Perform the sigma-assignment.

(define callcc
  (lambda (f)
    (call/cc (lambda (k)
	       (letrec ([c (lambda (x) (lcs x))]
			[lcs k]
			[RESUME (lambda (dest val)
				  (call/cc (sigma (lcs) (dest val))))])
		 (f c))))))

[9] Garbage collect RESUME.

(define callcc
  (lambda (f)
    (call/cc (lambda (k)
	       (letrec ([c (lambda (x) (lcs x))] [lcs k])
		 (f c))))))

[10] Dereference c and garbage collect. (Since f is non-assignable it
is in an evaluation context.)

(define callcc
  (lambda (f)
    (call/cc (lambda (k)
	(letrec ([lcs k])
	  (f (lambda (x) (lcs x))))))))

[11] Dereference lcs via "fancy garbage collection". (It is no longer
assignable so we can substitute its uses via (potentially recursive)
values).

(define callcc
  (lambda (f)
    (call/cc (lambda (k) (f (lambda (x) (k x)))))))

[12] "Expand" call/cc lift in empty context.

(define callcc (lambda (f) (call/cc f)))

And this is as close as we can get, since call/cc is an assignable
identifier in the global environment, i.e., replacing the right-hand
side with call/cc would not be correct.

In summary this derivation shows that callcc will always have the
same value as call/cc in the global environment.

(*) {Felleisen, M. and R. Hieb}.  The revised report on the syntactic
theories of sequential control and state.  Technical Report 100, Rice
University, June 1989.
From: Jeff Dalton
Subject: PopLog this, PopLog that, ...
Date: 
Message-ID: <4152@skye.ed.ac.uk>
I would like to make a general point about PopLog and Pop-11 versus
Common Lisp.  Pop-11 does have some advantages as a language, and I
don't want to imply that it doesn't.  But in many cases where Pop-11
or PopLog has some nice feature that isn't in Common Lisp it's worth
remembering that it's much easier to add lots of nice things if you
are doing it for one (or even for several) implementation rather than
for a language that a number of different people will implement.

This is also why several Common Lisp implementations have stack
groups, light-weight processes, and weak pointers even though Common
Lisp doesn't.

-- jd
From: Steve Knight
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <1350036@otter.hpl.hp.com>
> If I don't have call/cc then I have
> no recourse other than to go begging to LISP manufacturers or write my own
> compiler.

As I mentioned in a previous posting, call/cc and various other coroutine
models (Poplog processes are just one example) are equivalent and can be used to
implement each other.  The difference lies in the implementation technology.
(Of course, one may still prefer call/cc over other multi-threading models.)

So, provided language designers have anticipated the need for multi-threading
in some shape or form, there is a good chance the begging bowl can stay at
home.  It's a bit of a shame CL provides no such facilities -- but it gives
those of us who find CL the least acceptable Lisp around a really big
stick to beat up our managers with :-)

Steve

PS. My own personal gripes with CL are [1] the type system is far too loose
[2] user-defined types aren't guaranteed to be distinct from previous
types [3] equality doesn't work the way it does in other lisps & is 
a nasty source of problems (e.g. portability) [4] there's no multi-threading
[5] I don't like keyword parameters/optional parameters etc and dislike
paying a cost for a feature I don't use [6] empty list and false still
aren't distinct [7] the lack of temporary (transparent) hash-tables 
(c.f. Poplog CL, T) [8] the lack of destroy actions (c.f. Poplog CL)
[9] the poor choice of initial syntax features (defun, do) [10] the
dual name spaces for functions and values (bletch!) (use type hints
to avoid the usual complaints of efficiency (see Pop11)) [11] type assertions
are unchecked & are effectively promises rather than hints (I'd like
both, thanks) [12] the lack of updaters (see Pop11) for an update
model that can support abstraction (SETF is a compile-time thing).

On the other hand, CL gets my thumbs up for {1} putting lexical binding
on the map once and for all {2} its excellent number model {3} the
size of its library (implementations all too often are sloppy in doing
this, see Poplog CL for an implementation that avoids dragging in every bit
of code (even the different cases of FORMAT are separately loaded)) which
I suppose could be better thought out {4} for (mostly) successfully uniting 
the Lisp community behind a single Lisp.
From: Barry Margolin
Subject: Common Lisp gripes (was Re: In defense of call/cc (and a plug for T))
Date: 
Message-ID: <1991Feb17.000333.16922@Think.COM>
In article <·······@otter.hpl.hp.com> ···@otter.hpl.hp.com (Steve Knight) writes:
>PS. My own personal gripes with CL are [1] the type system is far too loose

We're tightening it up a bit in the ANSI version.

>[3] equality doesn't work the way it does in other lisps & is 
>a nasty source of problems (e.g. portability)

X3J13 has cleaned up EQUAL and EQUALP a little.  However, our general
feeling is that it's not really worthwhile to spend lots of time on the
built-in equality predicates.  Equality is generally a domain-specific
notion, and simplistic equality predicates are rarely applicable.  For
instance, should EQUAL recursively descend into defstruct slots?  The
answer generally depends on the semantics of the structure.

>[5] I don't like keyword parameters/optional parameter etc and dislike
>paying a cost for a feature I don't use

What cost is there for them when you don't use them?  There doesn't need to
be much cost for the fact that standard functions have keyword arguments,
because the compiler can transform calls to them into calls to non-keyword
versions.  The only exception is when the argument list is not known at
compile time (i.e. FUNCALL or APPLY is being used), but these are the
situations when the keyword argument list is most useful, so the expense of
parsing the keywords is worth it (IMHO).

> [6] empty list and false still aren't distinct

I don't think this is considered a bug in the CL community.  Even Scheme
has only very recently changed this.

>[10] the
>dual name spaces for functions and values (bletch!) (use type hints
>to avoid the usual complaints of efficiency (see Pop11)) 

What efficiency complaints are these?  I think most of us who are in favor
of multiple namespaces (there are more than two -- there are also namespaces
for classes, types, and packages) like the semantics of it.  I've never
heard an efficiency justification.

>[11] type assertions
>are unchecked & are effectively promises rather than hints (I'd like
>both, thanks) 

It's easy to write a macro that combines a type declaration and a type
check.  Something like

(defmacro with-type ((variable type) &body body)
  `(progn
     (check-type ,variable ,type)
     (locally
       (declare (type ,type ,variable))
       .,body)))
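A hypothetical use (CHECKED-INCR is an invented name, not a standard
function):

```lisp
(defun checked-incr (n)
  (with-type (n fixnum)
    (1+ n)))

;; (checked-incr 3)    => 4
;; (checked-incr "3")  signals a correctable TYPE-ERROR via CHECK-TYPE
```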

>[12] the lack of a updaters (see Pop11) for an update
>model that can support abstraction (SETF is a compile-time thing).

ANSI CL will allow you to define a function named (SETF <symbol>).  If SETF
doesn't recognize the first subform as a macro-style setf method, it turns
the SETF call into (FUNCALL #'(SETF <symbol>) ...).
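A sketch of what that looks like (CELL and MIDDLE are invented names
used only for illustration):

```lisp
(defstruct cell contents)

(defun middle (c)              ; reader
  (cell-contents c))

(defun (setf middle) (new c)   ; ANSI-style SETF function
  (setf (cell-contents c) new))

;; (setf (middle c) 5) then becomes (funcall #'(setf middle) 5 c)
```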

By the way, did you ever make your complaints known to HP's representative
to X3J13 (sorry, I don't remember who that is)?
--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Jeff Dalton
Subject: Re: Common Lisp gripes (was Re: In defense of call/cc (and a plug for T))
Date: 
Message-ID: <4156@skye.ed.ac.uk>
In article <······················@Think.COM> ······@think.com (Barry Margolin) writes:
>>[10] the
>>dual name spaces for functions and values (bletch!) (use type hints
>>to avoid the usual complaints of efficiency (see Pop11)) 
>
>What efficiency complaints are these?  I think most of us who are in favor
>of multiple namespaces (there are more than two -- there are also namespaces
>for classes, types, and packages) like the semantics of it.  I've never
>heard an efficiency justification.

Maybe you're both right?  There is an efficiency argument, but it
isn't a justification for two namespaces (or at least not an important
one).

The basic idea is that in a Lisp-1 a function call might have to check
that the value is a function.  This is a problem for global variables
and for locals where the compiler can't be sure the value will be a
function.  A solution (for globals?) is to (1) always have a separate
function value that's called, and (2) have assignment of a non-function
value change the separate function to one that signals an error.
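A toy simulation of that scheme (all names invented), using symbol
plists to stand in for the two cells:

```lisp
;; Assignment fills the value cell; the function cell gets either the
;; function itself or a closure that signals an error when called.
(defun set-global (sym val)
  (setf (get sym 'value) val)
  (setf (get sym 'fun)
        (if (functionp val)
            val
            (lambda (&rest args)
              (declare (ignore args))
              (error "~S is not a function" sym)))))

;; Calls jump straight through the function cell -- no run-time check.
(defun call-global (sym &rest args)
  (apply (get sym 'fun) args))
```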

This sort of thing is discussed in (amazingly enough) the first issue
of Lisp Pointers in the endpaper "Technical Issues of Separation in
Function Cells and Value Cells", page 95.  I think you probably have
seen this, but since the efficiency issue has been settled (ie, it's 
not a great problem), it doesn't really count as a justification.

-- jd
From: Rob MacLachlan
Subject: Types in Common Lisp (was Re: In defense of call/cc)
Date: 
Message-ID: <1991Feb18.031643.10720@cs.cmu.edu>

>PS. My own personal gripes with CL are [1] the type system is far too loose
>[2] user-defined types aren't guaranteed to be distinct from previous
>types
[11] type assertions
>are unchecked & are effectively promises rather than hints (I'd like both,
> thanks).

I think that you are to some degree confusing current implementation
practice with what is necessarily so about Common Lisp.  It is true that
Common Lisp very rarely guarantees that errors will be detected (although in
the ANSI standard, this does happen.)  But implementations are encouraged to
detect errors wherever possible.  This would be an academic distinction
were it not for the fact that there is now an implementation of CL that
is strict about types, namely CMU CL.  I think that the CL type system is
great; what is a loss is the previous implementations of it.

By default, the Python compiler for CMU CL checks all type assertions.  Type
checking (and other error checking) can be suppressed by the compilation
policy specified in OPTIMIZE declarations.  This means that you can write
programs with declarations, getting the benefit of automatic checking of
type assertions.  In code that is performance critical, and that you are
confident of, you may turn off type checking (though due to the level of type
check optimization, this may only get you 20%).

In addition to the basic type assertions given for efficiency reasons (such
as SIMPLE-STRING, FIXNUM), type assertions such as (OR NODE (MEMBER :END))
are also useful (and checked precisely.)  Python also gives compile-time
type warnings when its extensive type inference detects code that cannot
execute without an error.

> [3] equality doesn't work the way it does in other lisps & is 
>a nasty source of problems (e.g. portability) 

Meaning what?  People are definitely unsatisfied with EQUAL and EQUALP, but
fixing it right in this round seemed too difficult, so they punted.

>[4] there's no multi-threading

Yes, this definitely seems to be necessary in a viable lisp product.  Both
Lucid and Franz have it, and we are planning it for CMU CL.  This is all
within a preemptive lightweight-process model, like stack groups.  I think
that there is hope for standardization on this once people realize that you
can have lightweight processes without having to discover the solution to
expressing parallel algorithms.  Stack groups are used for multiple
concurrent interaction activities, not parallelism.

As to CALL/CC: it is elegant, and seems to be useful for some purposes, but
it doesn't address the same problems as stack groups.  You need preemptive
scheduling, and this means that you need synchronization primitives.  A
somewhat different ball of wax...

>[5] I don't like keyword parameters/optional parameter etc and dislike
>paying a cost for a feature I don't use

Well, I agree that optionals are mostly a loss, but you shouldn't be paying
any penalty other than compiler complexity.  I am convinced keyword args are
a big win for complex interfaces.  Positional arguments start to lose it
above 5 or 6 arguments.  And keywords allow you to build general functional
interfaces, as in the generic sequence functions.  When combined with good
inline expansion and constant-folding (as in Python), you get inline code
as tense as any you could get from LOOP, etc.  Python has no special magic
for the keyword-accepting list functions such as MEMBER, we just make the
inline expansions available.

>[7] the lack of temporary (transparent) hash-tables 
>(c.f. Poplog CL, T) [8] the lack of destroy actions (c.f. Poplog CL)

Weak pointers and finalization are a win, and I expect you will see them in
more CL implementations.  CMU has weak pointers.

>[12] the lack of a updaters (see Pop11) for an update
>model that can support abstraction (SETF is a compile-time thing).

I think that SETF functions address this issue.  It would have been nice if
they had gone with them from the beginning.

>
>On the other hand, CL gets my thumbs up for {1} putting lexical binding
>on the map once and for all {2} its excellent number model {3} the

Yes to lexical binding and numbers.  And with CMU CL, I think that serious
number-crunching is starting to become a real possibility in CL.

>size of its library (implementations all too often are sloppy in doing
>this, see Poplog CL for an implementation that avoids dragging in every bit
>of code (even the different cases of FORMAT are separately loaded)) which
>I suppose could be better thought out 

Part of the problem here is that most Unix'es are stuck with crude VM
technology. If you have a good VM system, having *everything* in one
huge mapped file really works pretty well.  Auto-loading is just slow
software demand-paging with no ability to page out.  It is possible that the
VM locality might be better, but with good system building tools, the
locality of a mapped system can be reasonable as well.  Running under Mach
has made us rather complacent about this, since a 24 meg program can start
up instantly.

>{4} for (mostly) successfully uniting 
>the Lisp community behind a single Lisp.

I also think this is quite important.  There is a positive role for
standardization in technological progress, even though most standards are
technically obsolescent by the time they are ready.  I think that CL will
have a longer life than many expect due to the great flexibility of Lisp.
It is now quite different from what it was, and it can become new things.
The next big catastrophe in programming languages after object-orientation
will be parallelism.  We shall see...

  Robert A. MacLachlan (···@cs.cmu.edu)
From: Chris Dollin
Subject: Re: Types in Common Lisp (was Re: In defense of call/cc)
Date: 
Message-ID: <KERS.91Feb19153205@cdollin.hpl.hp.com>
Steve Knight said:

   > [3] equality doesn't work the way it does in other lisps & is 
   >a nasty source of problems (e.g. portability) 

and got various responses on the lines of "but how would the system know to do
any better?".  Surely the answer is "it doesn't; but the programmer does, and
they can tell the system what to do"; viz, attach to datatypes a user-definable
equality procedure with a suitable default (probably structural equality).

Thus a defstruct would have an equality-predicate option (eqp?) and all the
system types would come equipped with standard-defined eqp's (which might be
indirected through symbols, so they could be changed if one felt reckless). The
standard function eqp could then just dispatch through a type-indexed table.

The only question that comes to mind is whether the indexing should be on one
argument or two, and if on one, whether we assume that two objects of different
data-type can ever be equal - sorry eqp - eg is (eqp 1.0 1) true or not?
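A minimal sketch of the one-argument version (all names invented):

```lisp
(defvar *eqp-methods* '())          ; alist of (type . predicate)

(defun def-eqp (type fn)
  (push (cons type fn) *eqp-methods*))

(defun eqp (x y)
  (let ((entry (assoc x *eqp-methods* :test #'typep)))
    (if entry
        (funcall (cdr entry) x y)
        (equalp x y))))             ; structural default

;; (def-eqp 'string #'string=)
;; (eqp "Foo" "foo")  => NIL, where the EQUALP default would say T
```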


--

Regards, Kers.      | "You're better off  not dreaming of  the things to come;
Caravan:            | Dreams  are always ending  far too soon."
From: Barry Margolin
Subject: Re: Types in Common Lisp (was Re: In defense of call/cc)
Date: 
Message-ID: <1991Feb20.040855.21375@Think.COM>
In article <··················@cdollin.hpl.hp.com> ····@hplb.hpl.hp.com (Chris Dollin) writes:
>
>Steve Knight said:
>
>   > [3] equality doesn't work the way it does in other lisps & is 
>   >a nasty source of problems (e.g. portability) 
>
>and got various responses on the lines "but how would the system know to do any
>better?". Surely the answer is "it doesn't; but the programmer does, and they
>can tell the system what to do"; viz, attatch to datatypes a user-definable
>equality procedure with a suitable default (probably structural equality).
>
>Thus a defstruct would have a equality-predicate option (eqp?) and all the
>system types would come equipped with standard-defined eqp's (which might be
>indirected through symbols, so they could be changed if one felt reckless). The
>standard function eqp could then just dispatch through a type-indexed table.

I started writing a paper for Lisp Pointers on this subject a couple of
years ago, but never got it polished enough for submission.  I covered both
equality and copying.  My proposal used CLOS to define the generic
operations.

One issue I grappled with is that equality and copying are dependent on
context, not just types.  For instance, if you pass a pair of conses to an
equality predicate, the equality algorithm depends on whether you are
treating it as a tree, an association list, or a set.

My solution was to allow particular types to define keyword arguments so
that they can be further parametrized.  It's then up to the caller of the
function to realize that the data structure contains types that need such
parametrization, and supply the options if necessary.  For instance, the
method for CONSes accepts keywords that specify a predicate for determining
whether an object is a leaf node and how leaf nodes should be compared.

>The only question that comes to mind is whether the indexing should be on one
>argument or two, and if on one, whether we assume that two objects of different
>data-type can ever be equal - sorry eqp - eg is (eqp 1.0 1) true or not?

Since CLOS allows specialization on multiple arguments, I didn't constrain
this, although I think I specified that the default primary method treats
objects of different types as unequal.



--
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Jeff Dalton
Subject: Re: Types in Common Lisp (was Re: In defense of call/cc)
Date: 
Message-ID: <4200@skye.ed.ac.uk>
In article <··················@cdollin.hpl.hp.com> ····@hplb.hpl.hp.com (Chris Dollin) writes:
>
>Steve Knight said:
>
>   > [3] equality doesn't work the way it does in other lisps & is 
>   >a nasty source of problems (e.g. portability) 
>
>and got various responses on the lines "but how would the system know
>to do any better?". 

What I was, and still am, wondering is how it "doesn't work the way
it does in other Lisps" and why it's a "nasty source of problems
(eg, portability)".  Especially since EQL works reasonably for
numbers, unlike EQ in most Lisps.  (EQ doesn't work "reasonably"
for numbers in CL either, but CL has EQL while most other Lisps
did not.)

You've answered a different question, namely "How is equality handled
better in Pop than in Lisp?"  (I agree, by the way, that Pop handles
equality in a good way and that Common Lisp would be better if it
had an extendable equality predicate that could be specialized by
type.)

>Surely the answer is "it doesn't; but the programmer does, and they
>can tell the system what to do";

But the programmer _can_ tell the system what to do, by writing a
function and calling it, by writing a generic function and lots of
methods, etc.

> viz, attatch to datatypes a user-definable
>equality procedure with a suitable default (probably structural equality).

OK, here's something fairly close:

(defgeneric nice-equality (x y)
  (:method ((r string) (s string))
     (string= r s))                      ;case-sensitive
  (:method (x y)
     (equalp x y)))
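which, assuming the definition above, behaves for instance like this:

```lisp
(nice-equality "Foo" "foo")        ; => NIL -- string method, case-sensitive
(nice-equality '(1 "a") '(1 "A"))  ; => T   -- default method, via EQUALP
```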

This is, of course, another case where CL would probably have been
a better language if an object system had been included from the
start.

-- jd
From: Jeff Dalton
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <4154@skye.ed.ac.uk>
In article <·······@otter.hpl.hp.com> ···@otter.hpl.hp.com (Steve Knight) writes:
> It's a bit of a shame CL provides no such facilities -- but it gives
> those of us who find CL the least acceptable Lisp around a really big
> stick to beat up our managers with :-)

> PS. My own personal gripes with CL are 

CL is not without its problems, but it's here now and it's usable.
Scheme is better for some things, not for others (though it is being
improved).  The idea that CL is especially bad as Lisps go seems to
me to be wrong.  It's better in many ways than quite a few other
Lisps.  Really.

In any case, most of your gripes are about things that don't
particularly bother me, and here's why.

   >[1] the type system is far too loose

In what way?  So general a complaint tells me almost nothing.

BTW, my experience has been that when two people make the same
complaint about CL they often mean different things by it 
(especially if they strongly dislike CL).

   >[2] user-defined types aren't guaranteed to be distinct from
   >previous types

They aren't?  (Of course, it's trivially true that they aren't,
because they will be a subtype of T.)

Some implementations have made defstructs subtypes of vector.
Maybe that was a bad thing.  In any case, X3J13 decided to 
make array and defstruct types disjoint.

   >[3] equality doesn't work the way it does in other lisps & is 
   >a nasty source of problems (e.g. portability) 

What?

Perhaps you mean not the same as in _some_ other Lisps.

In any case, I don't know what problems of equality you have in
mind.  If you mean EQ on numbers, that's a problem in every Lisp
I know anything about.  The big difference in CL is the addition 
of EQL.

   >[4] there's no multi-threading

True.

   >[5] I don't like keyword parameters/optional parameter etc and dislike
   >paying a cost for a feature I don't use 

What cost?  Especially if you don't use them.

There's some cost in implementation size, but for most calls to
built-in functions that use keywords the keywords can be compiled
away.

BTW, I partially disagree with ····@cs.cmu.edu (Rob MacLachlan), who
wrote:

   Well, I agree that optionals are mostly a loss, but you shouldn't
   be paying any penalty other than compiler complexity.  I am
   convinced keyword args are a big win for complex interfaces.
   Positional arguments start to lose it above 5 or 6 arguments.

One way to handle the many argument problem would be to make
keywords part of the programming environment (eg, the editor
would let you specify parameters that way) rather than having
them exist at run-time.  All parameters could then be specified
by keyword rather than positionally.  However, there would be
difficulties with APPLY (either there are keywords in the list
of arguments or else we're back to positional) and we'd be moving
towards a situation where source code wasn't readable without
the aid of some clever environment.  So in this (as in many issues),
I think CL has made a reasonable compromise.

   >[6] empty list and false still aren't distinct

True, but not as much a problem in practice as theory might indicate.

   >[7] the lack of temporary (transparent) hash-tables (c.f. Poplog CL, T) 

I agree, but there wasn't enough agreement on what the user-level
facilities should be.

   >[8] the lack of destroy actions (c.f. Poplog CL)

I think it's reasonable not to have them, although they might be
useful from time to time.  The model in Lisp is often that objects
exist forever, so they'd never be destroyed.  This model is probably
too simple, but then Lisp (even Common Lisp) is basically a simple
language.

   >[9] the poor choice of initial syntax features (defun, do) 

This is largely a matter of taste.  DEFUN and DO are in CL because
they were in MacLisp and because there wasn't a strong enough reason
to get rid of them.  Many Lisps have something like DEFUN (eg, DE)
which I don't think is a big improvement.  Some Lisps have only things
that are much worse than DEFUN.

   >[10] the dual name spaces for functions and values (bletch!)
   >(use type hints to avoid the usual complaints of efficiency 
   >(see Pop11))

One thing that I've often found strange is that some people seem
to think it's more important to have a single name space than to
have lexical scoping.  I would have thought the opposite was true.

Dual name spaces is not an efficiency issue as far as calling is
concerned, if you're willing to use up a little space and have
assignment to globals be a bit slower.  There are other reasons to
put functions in a separate name space.  I think that, on balance,
it's better to have one name space for both functions and variables,
but there is nonetheless something to be said for the other side.

But having to put in a "hint" that something was a function would be a
pain.

   >[11] type assertions are unchecked & are effectively promises
   >rather than hints (I'd like both, thanks) 

I agree that CL declarations are a problem, because in practice one
may want them for two different purposes (to check and to avoid
checking).  Some implementations (eg, CMU CL) seem to handle this
well.

But how is a "hint", if it does anything, different from a "promise"?
(It's certainly not obvious just from the name "hint", which is all
I know about them.)

   >[12] the lack of a updaters (see Pop11) for an update model that
   >can support abstraction (SETF is a compile-time thing).

I agree that something like this is a good idea.  (See previous messages.)

-- jeff
From: Andrew L. M. Shalit
Subject: Re: In defense of call/cc (and a plug for T)
Date: 
Message-ID: <ALMS.91Feb19173808@ministry.cambridge.apple.com>
In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) writes:

   BTW, I partially disagree with ····@cs.cmu.edu (Rob MacLachlan), who
   wrote:

      Well, I agree that optionals are mostly a loss, but you shouldn't
      be paying any penalty other than compiler complexity.  I am
      convinced keyword args are a big win for complex interfaces.
      Positional arguments start to lose it above 5 or 6 arguments.

   One way to handle the many argument problem would be to make
   keywords part of the programming environment (eg, the editor
   would let you specify parameters that way) rather than having
   them exist at run-time.  All parameters could then be specified
   by keyword rather than positionally.  However, there would be
   difficulties with APPLY (either there are keywords in the list
   of arguments or else we're back to positional) and we'd be moving
   towards a situation where source code wasn't readable without
   the aid of some clever environment.  So in this (as in many issues),
   I think CL has made a reasonable compromise.

In addition to the problems you mention, this would introduce
order-of-evaluation problems.

Common Lisp specifies left-to-right evaluation, so the following
two calls are different:

(frob :a foo :b (incf foo))
(frob :b (incf foo) :a foo)
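With a concrete stand-in for FROB (plain LIST here), the difference
shows up directly:

```lisp
(let ((foo 1))
  (list :a foo :b (incf foo)))   ; => (:A 1 :B 2)

(let ((foo 1))
  (list :b (incf foo) :a foo))   ; => (:B 2 :A 2)
```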

    -andrew
--