From: ken yip
Subject: &rest and &key
Date: 
Message-ID: <1992Feb11.224325.10096@cs.yale.edu>
What is the right way to define a function that can take multiple
arguments and specified keywords?  E.g., I'd like to do something like:

(defun foo (x &rest y &key (combiner #'+))
  (apply combiner x y))

and call it with:  (foo 2 3 :combiner #'*)

But the lisp evaluator will complain about this.
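A minimal sketch of what goes wrong here (my illustration, not from the thread): with both &rest and &key in the lambda list, everything after X must parse as keyword/value pairs, and the stray 3 breaks that. One simple repair is to change the calling convention so the numbers arrive as a list:

```lisp
;; With (defun foo (x &rest y &key (combiner #'+)) ...), the call
;; (foo 2 3 :combiner #'*) binds X to 2 and then tries to parse
;; (3 :COMBINER #<FUNCTION *>) as keyword/value pairs; the leading 3
;; is not a keyword name, so a conforming Lisp signals an error.
;;
;; One way out is to pass the numbers as a single list argument:
(defun foo (args &key (combiner #'+))
  (apply combiner args))

;; (foo '(2 3) :combiner #'*) => 6
;; (foo '(2 3))               => 5
```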

From: Emmanuel Baechler
Subject: Re: &rest and &key
Date: 
Message-ID: <BAECHLER.92Feb12104639@liasun6.epfl.ch>
In article <······················@cs.yale.edu> ·······@CS.YALE.EDU (ken yip) writes:

>   What is the right way to define a function that can take multiple
>   arguments and specified keywords?  E.g., I'd like to do something like:
>
>   (defun foo (x &rest y &key (combiner #'+))
>   (apply combiner x y))
>
>   and call it with:  (foo 2 3 :combiner #'*)
>
>   But the lisp evaluator will complain about this.

Using both rest and key args in the same arglist is really painful. I
don't remember all the details, but I tried it once. One of the problems
is that you can't have an odd number of arguments. It's ultimately much
easier to specify only the rest argument and to process the keys
explicitly in the function. It gives something like this:

(defun get-combiner (arglist) ....

(defun combiner-supplied-p (arglist) ....

(defun foo (x &rest arglist)
  (let* ((combiner (get-combiner arglist))
         (args (if (combiner-supplied-p arglist)
                   (remove :combiner (remove combiner arglist))
                   arglist)))
  ...

Of course, the two auxiliary functions can be placed inside foo with a
flet or a labels.
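The bodies of the two helpers are elided above; one plausible sketch (my guess, not necessarily what the author had in mind) finds the keyword tail of the &rest list with MEMBER-IF and reads it with GETF:

```lisp
;; A hypothetical implementation of the two elided helpers, assuming
;; the keyword arguments come after the plain arguments.
(defun keyword-tail (arglist)
  "Return the tail of ARGLIST starting at its first keyword, or NIL."
  (member-if #'keywordp arglist))

(defun get-combiner (arglist)
  (getf (keyword-tail arglist) :combiner #'+))

(defun combiner-supplied-p (arglist)
  (let ((missing (list nil)))   ; fresh object, can't collide with a real value
    (not (eq missing (getf (keyword-tail arglist) :combiner missing)))))
```

With these, (get-combiner (list 2 3 :combiner #'*)) returns the * function, and COMBINER-SUPPLIED-P distinguishes an explicit :combiner from the default.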

--
Emmanuel Baechler.
Laboratoire d'Intelligence Artificielle
Ecole Polytechnique Federale de Lausanne
MA-Ecublens
CH-1015 Lausanne	Switzerland
Tel.: ++41-21-693-2732
e-mail: ········@liasun6.epfl.ch
Standard Disclaimer

Ban the bomb. Save the world for conventional warfare.
From: Barry Margolin
Subject: Re: &rest and &key
Date: 
Message-ID: <kpj6vvINNge3@early-bird.think.com>
In article <······················@liasun6.epfl.ch> ········@lia.di.epfl.ch (Emmanuel Baechler) writes:
>In article <······················@cs.yale.edu> ·······@CS.YALE.EDU (ken yip) writes:
>>   What is the right way to define a function that can take multiple
>>   arguments and specified keywords?  E.g., I'd like to do something like:
>>
>>   (defun foo (x &rest y &key (combiner #'+))
>>   (apply combiner x y))
>>
>>   and call it with:  (foo 2 3 :combiner #'*)

My first recommendation is to find some other calling sequence.

What's wrong with this:

(defun foo (args &key (combiner #'+))
  (apply combiner args))

I'm not even sure what you expect this to do.  From looking at the lambda
list, I would expect Y to be bound to (3 :COMBINER #<FUNCTION *>), but from
the function body it appears that you expect the rest-list to have the
keyword/value pairs removed (since + and * don't allow keywords in their
arguments), i.e. Y would be bound to (3).

The normal reason to use both &rest and &key is when you also have
&allow-other-keys, e.g.

(defun bar (x &rest options &key option1 option2 ... &allow-other-keys)
  ...
  (apply #'some-other-function x :allow-other-keys t options)
  ...)

This permits BAR to take a combination of options that it uses directly and
other options that are used by SOME-OTHER-FUNCTION.  The &allow-other-keys
tells BAR not to complain about options that it doesn't recognize, and
:ALLOW-OTHER-KEYS T tells SOME-OTHER-FUNCTION to ignore the options that it
doesn't recognize.
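A toy version of this pattern (all names hypothetical): FANCY-LINE consumes :LABEL itself and forwards the remaining options wholesale.

```lisp
;; BAR-style forwarding: FANCY-LINE accepts any keywords thanks to
;; &ALLOW-OTHER-KEYS, uses :LABEL itself, and passes the whole option
;; list on; :ALLOW-OTHER-KEYS T makes DRAW-LINE tolerate the :LABEL
;; keyword that it doesn't recognize.
(defun draw-line (x &key (color :black) (width 1))
  (list :line x color width))

(defun fancy-line (x &rest options &key (label "") &allow-other-keys)
  (list label (apply #'draw-line x :allow-other-keys t options)))

;; (fancy-line 10 :label "axis" :color :red)
;; => ("axis" (:LINE 10 :RED 1))
```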

>Using both rest and key args in the same arglist is really painful. I
>don't remember all the details, but I tried it once. One of the problems
>is that you can't have an odd number of arguments. It's ultimately much
>easier to specify only the rest argument and to process the keys
>explicitly in the function. It gives something like this:
>
>(defun get-combiner (arglist) ....
>
>(defun combiner-supplied-p (arglist) ....
>
>(defun foo (x &rest arglist)
>  (let* ((combiner (get-combiner arglist))
>         (args (if (combiner-supplied-p arglist)
>                   (remove :combiner (remove combiner arglist))
>                   arglist)))

Those uses of REMOVE aren't very safe, as there may be multiple instances
of the combiner object in ARGLIST, e.g. if arglist is

	(2 3 :COMBINER #'- :OTHER-FUNCTION #'-)

the result will be that ARGS is set to

	(2 3 :OTHER-FUNCTION)

which is not what you intended.

Safer would be something like:


(defun partition-args (keywords arglist)
  "Partition the argument list around the first argument that is in KEYWORDS."
  (let ((first-key (position-if #'(lambda (elt) (member elt keywords))
                                arglist)))
    (values (subseq arglist 0 first-key)
            ;; (OR ... (LENGTH ...)) guards the case of no keyword at all.
            (subseq arglist (or first-key (length arglist))))))

(defun foo (x &rest arglist)
  (multiple-value-bind (args options)
      (partition-args '(:combiner) arglist)
    (apply #'foo-internal x args options)))

(defun foo-internal (x args &key (combiner #'+))
  (apply combiner x args))
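A quick check of the split this produces (PARTITION-ARGS restated here, with a small guard added for the no-keyword case, so the snippet stands alone):

```lisp
(defun partition-args (keywords arglist)
  "Split ARGLIST at the first element that is in KEYWORDS."
  (let ((first-key (position-if #'(lambda (elt) (member elt keywords))
                                arglist)))
    (values (subseq arglist 0 first-key)
            ;; guard: no keyword present => empty options list
            (subseq arglist (or first-key (length arglist))))))

;; (partition-args '(:combiner) (list 3 :combiner #'*))
;; => (3) and (:COMBINER #<FUNCTION *>)
```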

If you try to parse the keyword arguments yourself, you'll almost certainly
make mistakes.  Besides the mistake I pointed out above, you could also
have problems when the value of one option happens to be a keyword that's a
valid option name, e.g. if FOO also had a :NAME parameter it might be
called with:

	(foo 1 2 3 :name :combiner :combiner #'*)

And it's a pain to implement the rule that a keyword may appear multiple
times in an argument list but only the rightmost one has an effect and the
rest should be ignored.  (Emmanuel didn't show the implementation of
GET-COMBINER, so it might implement the "rightmost" rule, but he clearly
doesn't remove all of the :combiner/value pairs from the argument list:
he removes all the :combiner keywords, but not all the values.)
-- 
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Vincent Delacour
Subject: Re: &rest and &key
Date: 
Message-ID: <DELACOUR.92Feb12151828@waxwing.parc.xerox.com>
In article <············@early-bird.think.com> ······@think.com (Barry Margolin) writes:

[about &rest, &key and &allow-other-keys]

I've always wondered what could justify the presence of such a mess in
the language: runtime keywords are a very fine way to mix syntax with
execution, which is pretty weird, and increases the language's
entanglement of execution, load, and compile time.


> This permits BAR to take a combination of options that it uses directly and
> other options that are used by SOME-OTHER-FUNCTION.  The &allow-other-keys
> tells BAR not to complain about options that it doesn't recognize, and
> :ALLOW-OTHER-KEYS T tells SOME-OTHER-FUNCTION to ignore the options that it
> doesn't recognize.

As I understand it, the usage of this idiom is typically to "compose"
top-level commands, each of which offers the comfort of (primarily
syntactic) keyword binding. Of course in such a situation, the cost of
this weirdness is very low, because it is typically paid only a few
times, in the outer, top-level, user-interface layers of the program.

Now, considering that a typical application does not always have a
Lisp Listener for the user to type in commands (there are better user
interfaces!), what justifies having runtime keywords *in the
language*, rather than rejecting them as an environment-bound issue?
Notice that in any case (except on a Lisp machine) they don't help
much for launching an application as a shell command and passing it
options, arguments, etc. Other machinery has to be used for that.

Could anybody give figures about the actual usage of runtime (that
is, computed at runtime) keywords?  Of course, I suspect that a fair
number of user-level commands might be built using the idiom in a
Lisp-machine environment, but I want to stress that these commands
are not part of the user code (so the user doesn't want to pay if they
are written the wrong way and the impact on the language is
negative). So my question could be reformulated in one of these ways:

	. How often are runtime keywords really used, compared for
	example to the total number of lines involved?

	. How many lines of code does the aforementioned idiom
	typically save?  (Extremely few, of course, and as I pointed
	out no efficiency argument applies.)

Unless somebody proves the contrary, I'd say that the degree of
utility of runtime keywords is very low. The negative impact of the
idiom on the whole language is far too high; IMO runtime keywords
should be considered an obsolete (or at least deprecated) feature of
Common Lisp.

	V. Delacour
From: Jeff Dalton
Subject: Re: &rest and &key
Date: 
Message-ID: <6181@skye.ed.ac.uk>
In article <······················@waxwing.parc.xerox.com> ········@waxwing.parc.xerox.com (Vincent Delacour) writes:
>
>[about &rest, &key and &allow-other-keys]
>
>I've always wondered what could justify the presence of such a mess in
>the language: runtime keyword are a very fine way to mix syntax with
>execution, which is pretty weird, and increases the language's
>inter-tanglement of execution, load, and compile-time.

>Now, considering that a typical application does not always have a
>Lisp Listener for the user to type in commands (there are better user
>interfaces!), what justifies having runtime keywords *in the
>language*, and not rejected as an environment-bound issue ?

Keywords need a run-time existence so that you can use them with
APPLY, in something like the following:

  (defun f (g &rest args)
    ... (apply g args) ...)

If I want to be able to say

  (f #'g ...)

and pass keyword parameters to G, keywords have to be there at
run-time.
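A minimal illustration of this point (F and G are made-up names): F never mentions :COMBINER, yet the keyword rides through its &rest list into G at run time.

```lisp
;; G has a keyword parameter; F forwards whatever it got, keywords
;; included, without knowing or caring which keywords G accepts.
(defun g (a b &key (combiner #'+))
  (funcall combiner a b))

(defun f (fn &rest args)
  (apply fn args))

;; (f #'g 2 3 :combiner #'*) => 6
;; (f #'g 2 3)               => 5
```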

Almost anything can be seen as an environment issue.  Why try to
give a language a reasonable syntax at all?  Why not let the
environment provide the readable syntax?

But as far as I'm concerned, the syntax I use to write code and that's
used for the code I read is part of the language.  It has a run-time
existence because Lisp code can be read in as data.

If the env provided keywords, then either every env would provide
them, and they'd effectively be part of the language's syntax,
or only some environments would provide them and we'd have trouble
reading the code in other envs.

But the balance might shift against run-time kwds if not for the
APPLY issue, even though I'm not sure I'd agree with the shift.

-- jeff
From: Vincent Delacour
Subject: Re: &rest and &key
Date: 
Message-ID: <DELACOUR.92Feb13133656@waxwing.parc.xerox.com>
In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) writes:

>
>   Keywords need a run-time existence so that you can use them with
>   APPLY, in something like the following:
>
>     (defun f (g &rest args)
>       ... (apply g args) ...)
>
>   If I want to be able to say
>
>     (f #'g ...)
>
>   and pass keyword parameters to G, keywords have to be there at
>   run-time.

That's right. However, most uses of APPLY are programming errors in
the sense that they are not robust against explicitly stated
implementation-dependent limits (such as CALL-ARGUMENTS-LIMIT). Maybe
APPLY too should be considered 'guts' of the programming system (it
might be useful in certain parts of the development environment)...

 
>
>   Almost anything can be seen as an environment issue.  Why try to
>   give a langauge a reasonable syntax at all?  Why not let the
>   environment provide the readable syntax?
>
>   But as far as I'm concerned, the syntax I use to write code and that's
>   used for the code I read is part of the language.  It has a run-time
>   existence because Lisp code can be read in as data.


I think this way of putting it is a bit misleading: when programs are
read in as data, they are usually made available to macro-expansion
functions, then compiled, then loaded, then executed (and even the
eval function can proceed that way). So I would not say that programs
as lists have a runtime existence, but rather that programs can be
given as lists to the compiler.

[...]

>   But the balance might shift against run-time kwds if not for the
>   APPLY issue, even though I'm not sure I'd agree with the shift.
>

Yes, I think that the question is legitimate and should be debated
when the standard draft is out (I'd rather be for shifting runtime
keywords out, but others might convincingly justify their existence).

>   -- jeff
>


As a matter of fact, many points appear not to have been cleaned up
by the standards committee: Common Lisp still has a strong taste of
ancient MacLisp, where there was no clear distinction between
environment and language, and where some 'clever' RPLACs could do
great things. I think we should not let the language be standardized
without some further discussion of certain points.

	Vincent
From: Barry Margolin
Subject: Re: &rest and &key
Date: 
Message-ID: <kpmv1jINNeeo@early-bird.think.com>
In article <······················@waxwing.parc.xerox.com> ········@waxwing.parc.xerox.com (Vincent Delacour) writes:
>Yes, I think that the question is legitimate and should be debated
>when the standard draft is out (I'd rather be for shifting runtime
>keywords out, but others might convincingly justify their existence).

If you think such a proposal has even the slightest chance of being
adopted, you're fooling yourself.  Such a change would invalidate an
enormous amount of code, and I don't think a mechanical translator could
convert such code to whatever new style you're proposing.

>As a matter of fact, many points appear not to have been cleaned up
>by the standards committee

You seem to misunderstand the charter of X3J13.  We didn't set out to
design a new, "clean" language.  Our purpose was to clarify and
disambiguate the language defined in CLtL, and add an object-oriented
facility, condition handling, better iteration, and a window interface (we
eventually dropped this last one).  The main goal of a language standard is
portability, not design purity; that's for the academicians (that's why
Scheme exists).  We did make some incompatible changes for esthetic
reasons, but they were mostly things like changing names of things to be
consistent, and converters and compatibility packages for these kinds of
changes are relatively straightforward to implement.  We also made some
incompatible changes (e.g. to PROVIDE and REQUIRE) where the language
defined by CLtL was simply impossible to implement portably.
-- 
Barry Margolin, Thinking Machines Corp.

······@think.com
{uunet,harvard}!think!barmar
From: Vincent Delacour
Subject: Re: &rest and &key
Date: 
Message-ID: <DELACOUR.92Feb14115040@waxwing.parc.xerox.com>
In article <············@early-bird.think.com> ······@think.com (Barry Margolin) writes:

>   >Yes, I think that the question is legitimate and should be debated
>   >when the standard draft is out (I'd rather be for shifting runtime
>   >keywords out, but others might convincingly justify their existence).
>
>   If you think such a proposal has even the slightest chance of being
>   adopted, you're fooling yourself.  Such a change would invalidate an

Could you please describe the standardization process? How is one
supposed to interact with it at the draft review stage? 

>   enormous amount of code, and I don't think a mechanical translator could
>   convert such code to whatever new style you're proposing.

I think you're rather quick to invoke an argument such as "that
change would break an enormous amount of code"!  May I remind you
that the *new* specification (that is, the committee's clarification)
of the DEFSTRUCT :CONSTRUCTOR option makes *all* vendors represented
on the committee *wrong* except for Symbolics?  On that particular
point, the clarification certainly breaks a lot of existing code, and
actually goes against current practice.

You cannot invoke such an argument in some cases and not in others.
In other words, I think that somebody who writes the text (because he
has a *mandate* to do so) cannot invoke the text to justify his
positions.

>
>   >As a matter of fact, many points appear not to have been cleaned up
>   >by the standards committee
>
>   You seem to misunderstand the charter of X3J13.  We didn't set out to
>   design a new, "clean" language.  Our purpose was to clarify and
>   disambiguate the language defined in CLtL, 

In that respect, your attitude differs considerably from that of,
say, the C committee. I don't think that "clarifying and
disambiguating the language defined in CLtL" is in itself sufficient
to deserve standardization.

For comparison, although the C committee apparently had the same
purpose as X3J13 ("to provide an unambiguous and machine-independent
definition of the language C"), they did not limit themselves to
cleaning up Kernighan and Ritchie's text, but took into account some
weaknesses of the original language.

Standardization is a very important step, because once done it has to
be lived with for a very long time. If Common Lisp is standardized as
it is defined now, implementors are stuck with old-fashioned
implementation techniques and, more importantly, users (clients) are
stuck with some fundamental and *unnecessary* weirdnesses of the
language.

Remember that some advocate having the standard out as soon as
possible because "industry hesitates to use Lisp because of the lack
of a standard"; but if the standard is bogus, they will give up on it
for good.

The committee is now treating the standard exactly as described in
"The Mythical Man-Month" for software projects. It is time they
realized they have to "throw one away" to get it right. I repeat:
once the standard is out, no further move is possible. The glaring
weakness of current Common Lisp from the "team development" point of
view will prevent it for good from being used widely for development.
This weakness comes from a backward symbols/package system and must
be addressed by the standard.

>                                               and add an object-oriented
>   facility, condition handling, better iteration, and a window interface (we
>   eventually dropped this last one). 

This I do not consider "cleaning up the language as defined in CLtL".
In that respect the committee has escaped towards high-level and side
issues (despite its relatively limited workforce). They have spent
much time discussing extensions from an implementor's point of view,
fixing things here and there, without addressing the real need of a
standard: to be appealing to industry.

The *existence* of the standard is not sufficient: I bet industrial
users would like to find in it that "the language supports the notion
of team programming through a sound module system", and be convinced
by the specification that this is the case... I bet also that they
would not expect to find too many occurrences of "is undefined" or
"is implementation-dependent" in chapters specifying basic features
of the language.
 
>                                      The main goal of a language standard is
>   portability, not design purity; that's for the academicians (that's why

Academicians think that CL has many qualities and might be reworked
in a 99%-compatible way to become a good industrial standard. (Maybe
potential industrial users should also be involved, not only
implementors...)

>   Scheme exists).  We did make some incompatible changes for esthetic
>   reasons, but they were mostly things like changing names of things to be
>   consistent, and converters and compatibility packages for these kinds of
>   changes are relatively straightforward to implement.  We also made some
>   incompatible changes (e.g. to PROVIDE and REQUIRE) where the language
>   defined by CLtL was simply impossible to implement portably.
>   -- 
>   Barry Margolin, Thinking Machines Corp.
>
>   ······@think.com
>   {uunet,harvard}!think!barmar
>

	Vincent Delacour
From: Ronald Bodkin
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <RJBODKIN.92Feb24201321@lister.lcs.mit.edu>
In article <·················@taunton.crl.dec.com> ···@taunton.crl.dec.com (Bob Kerns) writes:
   EVAL is a far, far bigger source of problems for program proving than
   the package system.
	Do people really use EVAL nowadays?  I would argue that, just
as one should use the package system in a restricted way that
facilitates verification of correctness, one should use higher-order
functions instead of EVAL.
		Ron
From: Barry Margolin
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <kqjj8hINNf0e@early-bird.think.com>
In article <······················@lister.lcs.mit.edu> ········@theory.lcs.mit.edu (Ronald Bodkin) writes:
>	Do people really use EVAL nowadays?  I would argue that, just
>as one should use the package system in a restricted way that
>facilitates verification of correctness, one should use higher-order
>functions instead of EVAL.

How do you implement a read-eval-print loop without using EVAL?

There aren't always better functions than EVAL for certain facilities.
Sometimes the only portable interface to some mechanism is a macro, and
the only way to invoke a macro at runtime is by using EVAL.  The right
solution is to standardize the primitive functions that underlie the
runtime effect of macros, but since that hasn't been done, you
sometimes still need EVAL.

By the way, this is also the argument for having both DEFPACKAGE and the
package-manipulation functions.  On those occasions where runtime
manipulation of the package systems *is* necessary, the programmer
shouldn't be forced to EVAL a DEFPACKAGE form.
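For packages this functional interface already exists; a sketch of using it at run time instead of EVALing a DEFPACKAGE form (the package name here is made up):

```lisp
;; Runtime package creation via the functional interface:
;; FIND-PACKAGE, MAKE-PACKAGE, and INTERN, with no EVAL in sight.
(defun ensure-app-package (name)
  (or (find-package name)
      (make-package name :use '(:common-lisp))))

;; (intern "WIDGET" (ensure-app-package "MY-RUNTIME-PKG"))
;; interns a fresh symbol in the newly created package.
```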
-- 
Barry Margolin
System Manager, Thinking Machines Corp.

······@think.com          {uunet,harvard}!think!barmar
From: Vincent Delacour
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <DELACOUR.92Feb26154347@waxwing.parc.xerox.com>
In article <············@early-bird.think.com> ······@think.com (Barry Margolin) writes:

   In article <······················@lister.lcs.mit.edu> ········@theory.lcs.mit.edu (Ronald Bodkin) writes:
   >	Do people really use EVAL nowadays?  I would argue that, just
   >as one should use the package system in a restricted way that
   >facilitates verification of correctness, one should use higher-order
   >functions instead of EVAL.

   [...]
   There aren't always better functions than EVAL for certain facilities.
   Sometimes the only portable interface to some mechanism is a macro, and the
   only way to invoke a macro at runtime is by using EVAL.  The right solution

What the heck do you mean by "calling macros at runtime"?  This is
amazing. People are *really* more vicious than one would think!

   [...]
   -- 
   Barry Margolin
   System Manager, Thinking Machines Corp.

   ······@think.com          {uunet,harvard}!think!barmar


Vincent
From: Barry Margolin
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <kqpi35INNhj1@early-bird.think.com>
In article <······················@waxwing.parc.xerox.com> ········@waxwing.parc.xerox.com (Vincent Delacour) writes:
>What the heck do you mean by "calling macros at runtime"?  This is
>amazing. People are *really* more vicious than one would think!

There's no portable way to create a class other than by using the DEFCLASS
macro (not counting the MOP, which isn't going to be in the ANSI standard).
In order to create a class at runtime, I must use EVAL to invoke this
macro.  For example:

(defun make-a-class ()
  (format *query-io* "What should the class's name be? ")
  (let ((class-name (read)))
    (check-type class-name (and symbol (not null)))
    (eval `(defclass ,class-name () ()))))
-- 
Barry Margolin
System Manager, Thinking Machines Corp.

······@think.com          {uunet,harvard}!think!barmar
From: Jeff Dalton
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <6301@skye.ed.ac.uk>
In article <············@early-bird.think.com> ······@think.com (Barry Margolin) writes:
>In article <······················@waxwing.parc.xerox.com> ········@waxwing.parc.xerox.com (Vincent Delacour) writes:
>>What the heck do you mean by "calling macros at runtime"?  This is
>>amazing. People are *really* more vicious than one would think!
>
>There's no portable way to create a class other than by using the DEFCLASS
>macro (not counting the MOP, which isn't going to be in the ANSI standard).
>In order to create a class at runtime, I must use EVAL to invoke this
>macro.  For example:
>
>(defun make-a-class ()
>  (format *query-io* "What should the class's name be? ")
>  (let ((class-name (read)))
>    (check-type class-name (and symbol (not null)))
>    (eval `(defclass ,class-name () ()))))

1. All that's needed is the ability to construct source code and
   create functions.  Eg,

      (funcall (coerce `(lambda () (defclass ...))
                       'function))

   Of course, this sort of thing is equivalent to EVAL.  But does
   that show that it's as bad as some people think EVAL is, or that
   EVAL isn't as bad as they thought?

2. Macros are called at run time all the time.  Just type in an
   expression that contains a macro call.  You could eliminate 
   EVAL and COERCE-to-function completely, and macros would
   still be called at run time.  This is a consequence of Lisp
   being interactive.  There doesn't even have to be an
   interpreter, if the compiler is sufficiently cooperative.
From: Bob Kerns
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <RWK.92Feb28164317@taunton.crl.dec.com>
In article <············@early-bird.think.com> ······@think.com (Barry Margolin) writes:

   Date: 27 Feb 1992 11:15:49 GMT
   From: ······@think.com (Barry Margolin)

   In article <······················@waxwing.parc.xerox.com> ········@waxwing.parc.xerox.com (Vincent Delacour) writes:
   >What the heck do you mean by "calling macros at runtime"?  This is
   >amazing. People are *really* more vicious than one would think!

   There's no portable way to create a class other than by using the DEFCLASS
   macro (not counting the MOP, which isn't going to be in the ANSI standard).

By the way, Vincent, this isn't regarded as a feature.
The Meta-Object Protocol wasn't included only because we
didn't think we could standardize it in the timeframe
we needed to get things done, not because we don't think
Lisps should provide it.

It IS well on its way to becoming a de facto standard, nonetheless,
and that's good, because nobody thinks that the only way you
should be able to create a class is by EVALing DEFCLASS!

But as long as you have EVAL, the user MIGHT call a macro
at runtime.
From: Jeff Dalton
Subject: Re: CL standardization (A vendor comittee)
Date: 
Message-ID: <6300@skye.ed.ac.uk>
In article <······················@waxwing.parc.xerox.com> ········@waxwing.parc.xerox.com (Vincent Delacour) writes:

>What the heck do you mean by "calling macros at runtime"?  This is
>amazing. People are *really* more vicious than one would think!

Give me a break.  It sounds like you want to eliminate
interactive systems.
From: Mark Tillotson
Subject: EVAL necessary? [was Re: CL standardization]
Date: 
Message-ID: <MARKT.92Mar4095937@wundt.harlqn.co.uk>
······@think.com (Barry Margolin) writes:

> How do you implement a read-eval-print loop without using EVAL?

By using COMPILE, of course!  Something like:

(let*  ((form  (read)))
  (shiftf +++ ++ + - form)
  (let* ((results  (multiple-value-list
                    (funcall (compile nil `(lambda () ,form))))))
;;;                  no need for EVAL
    (shiftf /// // / results)
    (shiftf *** ** * (car results))
    (dolist (res results)
       (print res))))

But of course (FUNCALL (COMPILE ...)) is just a different
implementation of the function EVAL.  I presume by EVAL you meant "any
function with the semantics of EVAL".

> There aren't always better functions than EVAL for certain facilities.
> Sometimes the only portable interface to some mechanism is a macro, and the
> only way to invoke a macro at runtime is by using EVAL.  The right solution
> is to standardize the primitive functions that underly the runtime effect
> of macros, but since that hasn't been done you sometimes still need EVAL.
 
If a macro is part of an interface, then you have to have the macro
loaded before using that interface, but I can't see why that should
ever lead to a need to invoke the macro function at run time...  Load
time, however, is a different matter.

I would have thought that the need to use EVAL in application programs
was both rare, and a strong indication that something is wrong, or at
least hacked.  EVAL has its place as a development tool, as part of
the user interface rather than the language itself.  

... probably time to get out the asbestos newsreader  :-)

--

------------------------------------------------------
|\  /|          | ,  M. Tillotson       Harlequin Ltd. \
| \/ |  /\| |/\ |<   ·····@uk.co.harlqn  Barrington Hall,\
|    |  \_| |   | \  +44 223 872522       Barrington,      \
I came, I saw, I core-dumped...            Cambridge CB2 5RG \
My opinions, like my teeth, are all my own (and full of holes?)\
From: Barry Margolin
Subject: Re: EVAL necessary? [was Re: CL standardization]
Date: 
Message-ID: <krb4uaINNsai@early-bird.think.com>
In article <··················@wundt.harlqn.co.uk> ·····@harlqn.co.uk (Mark Tillotson) writes:
>I would have thought that the need to use EVAL in application programs
>was both rare, and a strong indication that something is wrong, or at
>least hacked.

You are correct.  Ideally, both functional and macro interfaces would be
provided for any useful operations.  The macro interface would be used when
defining things statically in source files, while the functional interface
would be used for dynamic creation by applications.  But empirically,
that's not the case.  For instance, ANSI/CLtL2 CL only provides macro
interfaces for creating new classes, structures, and type names.  EVAL is a
general hook that makes all those interfaces usable at run time.

I'm not saying this is a Good Thing.  But it's part of the flexibility and
power of Lisp.
-- 
Barry Margolin
System Manager, Thinking Machines Corp.

······@think.com          {uunet,harvard}!think!barmar
From: lou
Subject: Why not let the environment provide the readable syntax?
Date: 
Message-ID: <LOU.92Feb14100511@atanasoff.rutgers.edu>
In article <····@skye.ed.ac.uk> ····@aiai.ed.ac.uk (Jeff Dalton) writes:

   Almost anything can be seen as an environment issue.  Why try to
   give a langauge a reasonable syntax at all?  Why not let the
   environment provide the readable syntax?

Actually, one of the interesting things about Lisp is that in a way
this is already true.  The reader/prettyprinter provide (much of) the
external syntax, and can be seen as part of the "environment", taking
the environment to include the standard available programs and
libraries, just as awk, grep, et al are part of the Unix
"environment".  

The existence of this parser/unparser in the environment helps make
the use of "embedded languages" so natural in Lisp, just as the
analogous (but less convenient) facility provided by lex/yacc in Unix
encourages (though not as strongly) the use of "little languages" in Unix.

(An embedded language is a language for some very special purpose,
e.g. the commands in Interlisp for specifying what would be printed
in what order on a source-code file.  Such a language acts as a
specialized sub-language extending Lisp, and is implemented by a
simple interpreter written in Lisp.  A "little language" is the same
idea, implemented as a compiler written using lex/yacc or the
equivalent, and is an extension to the Unix environment: programs
written in the language are compiled and can be used with other Unix
programs via pipes.)

In fact, the style of AI research that involves designing a
specialized "AI" language (e.g. Planner) and using that language to
write problem solvers was probably also encouraged by the
availability of this external <-> internal syntax conversion.
--
					Lou Steinberg

uucp:   {pretty much any major site}!rutgers!aramis.rutgers.edu!lou 
internet:   ···@cs.rutgers.edu
From: David V. Wallace
Subject: "little languages"
Date: 
Message-ID: <GUMBY.92Feb14075414@Cygnus.COM>
   Date: 14 Feb 92 15:05:11 GMT
   From: ···@cs.rutgers.edu (lou)

   The existence of this parser/unparser in the environment helps make
   the use of "embedded languages" so natural in Lisp, just as the
   analogous (but less convenient) facility provided by lex/yacc in Unix
   encourages (but not as strongly) the use of "little languages" in Unix.

Any time you write a macro you're extending the syntax of the
language.  Most lisp programs end up being of the form "define a set
of primitives for describing my domain; implement my program in them."
This is so natural that even using the unix-inspired name "little
languages" seems like overkill.
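As a minimal illustration of that point (my own example, not from the
post): even a two-line macro is a genuine syntax extension, something
that in the C world would call for a preprocessor or code generator.

```lisp
;; A new control construct in two lines.
(defmacro while (test &body body)
  `(loop (unless ,test (return)) ,@body))

;; (let ((i 0)) (while (< i 3) (incf i)) i)  =>  3
```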

It's funny, and sort of sad, that the folks in the C/Unix domain have
to stand on their heads so often.  In that environment, even a "little
language" is a big deal.
From: Kevin Gallagher
Subject: Re: &rest and &key
Date: 
Message-ID: <43303@dime.cs.umass.edu>
In article <············@early-bird.think.com> ······@think.com (Barry Margolin) writes:

[while discussing doing your own keyword processing]

 >And it's a pain to implement the rule that a keyword may appear multiple
 >times in an argument list but only the rightmost one has an effect and the
 >rest should be ignored ...

Actually, it's the leftmost keyword that is used (CLtL2, p. 80).  This
rule is easy to implement using (for example) GETF.
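For example (my own illustration, not from the post): GETF scans a
property list from the left, so duplicate keywords behave exactly as
the &key rule requires.

```lisp
;; The leftmost :combiner wins, per CLtL2 p. 80.
(getf '(:combiner * :combiner +) :combiner)
;; =>  *
```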

Kevin Gallagher
·········@cs.umass.edu
From: Len Charest
Subject: Re: &rest and &key
Date: 
Message-ID: <1992Feb12.225012.20699@jpl-devvax.jpl.nasa.gov>
In article <······················@cs.yale.edu>, ·······@CS.YALE.EDU (ken yip) writes:
|> What is the right way to define a function that can take multiple
|> arguments and specified keywords?  E.g., I'd like to do something like:
|> 
|> (defun foo (x &rest y &key (combiner #'+))
|>   (apply combiner x y))
|> 
|> and call it with:  (foo 2 3 :combiner #'*)
|> 
|> But the lisp evaluator will complain about this

I'm assuming the result you want from the example is (* 2 3) = 6.
The problem is that when &rest is used together with &key, the &rest
variable is simply bound to the list of all the remaining arguments,
and &key requires that list to consist of keyword-value pairs.
Therefore the &rest argument must be even in length and may not
include arbitrary data.  If you must define a function, then you are
restricted to something like
	(defun foo (args &key (combiner #'+))
	  (declare (list args)
	           (function combiner))
	  (apply combiner args))

However, if you can get away with a macro, then you can play with the 'shape' of the arglist:
	(defmacro foo ((x &rest y) &key (combiner #'+))
	  `(apply ,combiner ,x (list ,@y)))

such that (foo (2 3) :combiner #'*) expands into (apply #'* 2 (list 3)). Furthermore (foo (a b c d)) expands into (apply #'+ a (list b c d)) so the variables a, b, c and d *are* evaluated.
-- 
*
Len Charest, Jr.                                       ·······@ai-cyclops.jpl.nasa.gov
JPL Artificial Intelligence Group
*
From: ken yip
Subject: Re: &rest and &key
Date: 
Message-ID: <1992Feb13.001320.27159@cs.yale.edu>
Thanks for the half-dozen replies and suggestions that people
sent me.  The problem that led to the original question is to
write a generalized cartesian product that can take an arbitrary
number of sets and a user-specifiable combining function for
combining elements between sets.

The reason I don't want to bundle all arguments as a list is
purely stylistic.  It is the same reason that I don't want to
write:
	(+ '( 1 2 3 ))

I don't want to treat combiner as a required parameter because
that doesn't allow a default.

I was hoping there was some simple combination of lambda-list keywords
that does what I want.  For instance, a &rest* parameter that accumulates
all the remaining arguments not matched by either required or &key parameters.

The solution I like most comes from Dan Rabin and Mark Kantrowitz. 
I am fleshing out their ideas with the following code:

;;;generalized product of n sets

(defun create-combiner (&optional (element-combiner #'list))
  #'(lambda (&rest l)
      (labels ((combine-2 (A B)
                 (mapcan #'(lambda (x)
                             (mapcar #'(lambda (y)
                                         (funcall element-combiner x y))
                                     B))
                         A)))
        (reduce #'combine-2 l))))

Then, I can do the standard cartesian product by:

	(funcall (create-combiner) A B C)

for some sets A, B, and C.  Or,

	(funcall (create-combiner #'append) A B C)	

to do a pairwise union of elements, which would be useful for
sets whose elements are sets.
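For concreteness, a quick check of the default case (my own example,
not from any of the replies):

```lisp
;; With the default #'list combiner, two sets yield ordered pairs:
(funcall (create-combiner) '(1 2) '(a b))
;; =>  ((1 A) (1 B) (2 A) (2 B))
```

One caveat worth noting: since REDUCE folds COMBINE-2 left-associatively,
combining three or more sets with the default #'list produces nested
pairs such as ((1 2) 3) rather than flat triples; #'append gives flat
tuples when the set elements are themselves lists.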