From: Andrew Catton
Subject: extensible macros
Date: 
Message-ID: <65489e51.0108201535.333e5cf2@posting.google.com>
Greetings all,

I was wondering whether anyone knows of any work on extensible macros,
allowing one to compose macro behavior in some way.  For example,
rather than having to define macros like defun-memoized, defun-traced,
defun-memoized-traced, etc., I'd like to be able to do something like:

(defun :memoized :traced foo () ())

Where the memoized and traced behavior is supplied by separate macros
(and you could add any other behavior to defun as well).
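One way this might look, as a minimal sketch (all names here are
hypothetical, not an existing library): each keyword names a transformer
registered in a table, and a DEFUN* macro hands the plain DEFUN form
through each transformer in the order the keywords are given.

```lisp
;; Hypothetical sketch of composable DEFUN modifiers.
(defvar *defun-modifiers* (make-hash-table))

(defmacro define-defun-modifier (name (form-var) &body body)
  ;; Register a transformer that rewrites a whole DEFUN form.
  `(setf (gethash ,name *defun-modifiers*)
         (lambda (,form-var) ,@body)))

(defmacro defun* (&rest args)
  ;; Collect leading keywords, then pipe the DEFUN form through
  ;; each registered transformer, left to right.
  (let* ((keys (loop while (keywordp (first args))
                     collect (pop args)))
         (form `(defun ,@args)))
    (dolist (key keys form)
      (let ((xf (or (gethash key *defun-modifiers*)
                    (error "Unknown defun modifier ~s" key))))
        (setf form (funcall xf form))))))

;; Example modifier: announce each call.
(define-defun-modifier :traced (form)
  (destructuring-bind (op name lambda-list &body body) form
    (declare (ignore op))
    `(defun ,name ,lambda-list
       (format t "~&calling ~a~%" ',name)
       ,@body)))
```

With this, (defun* :traced foo (x) (* x x)) would expand into a DEFUN
whose body prints before running; a :memoized modifier would be defined
the same way.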

Thanks,

Andrew

From: Kent M Pitman
Subject: Re: extensible macros
Date: 
Message-ID: <sfwy9oe5lhi.fsf@world.std.com>
······@cs.ubc.ca (Andrew Catton) writes:

> I was wondering whether anyone knows of any work on extensible macros,
> allowing one to compose macro behavior in some way.  For example,
> rather than having to define macros like defun-memoized, defun-traced,
> defun-memoized-traced, etc., I'd like to be able to do something like:
> 
> (defun :memoized :traced foo () ())
> 
> Where the memoized and traced behavior is supplied by separate macros
> (and you could add any other behavior to defun as well).

There are two parts to this question.

The first part is whether you could create a general-purpose syntax allowing
named entities to have some transformational effect.  The trivial answer is
yes.

The second part is what the input and output type of that
transformation is.  Most DEFUN-ish kinds of things result in a number
of expressions, not all of which are function definitions.  Some are
other auxiliary forms.  The question then becomes whether the output
of one can be piped in as the input to another, and a related
question becomes whether they are commutative.  If they are not
commutative, then the problem becomes how to order them, given that
they are defined independently.  (I suspect that you'll find that a
memoized traced function is, for example, different than a traced
memoized one.)
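The ordering issue shows up even with plain closures, before macros
enter the picture.  As a sketch (MEMOIZE and TRACED are hypothetical
helpers, not standard functions):

```lisp
;; Two independent function wrappers whose composition order matters.
(defun memoize (fn)
  (let ((cache (make-hash-table :test #'equal)))
    (lambda (&rest args)
      (multiple-value-bind (val hit) (gethash args cache)
        (if hit
            val
            (setf (gethash args cache) (apply fn args)))))))

(defun traced (fn name)
  (lambda (&rest args)
    (format t "~&calling ~a with ~a~%" name args)
    (apply fn args)))

;; (traced (memoize #'slow-f) 'f)   ; prints on every call, cache hits included
;; (memoize (traced #'slow-f 'f))   ; prints only on cache misses
```

The two compositions behave differently, which is exactly the
non-commutativity problem for independently defined macro modifiers.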

In any case, I really don't think the idea holds up if you allow it to
be fully generalized, even though you might find it fun to create a set
of particular options that you find useful to connect pragmatically.  I
did have a generalized defining form I used for Macsyma
(in the proprietary branch) which allowed one to say about 16 things I
expected people to want and it would combine all the effects in a 
known-to-work ordering because I had total knowledge of the things and
how they combined at macro define-time... but I don't think the idea
generalized.
From: Andrew Catton
Subject: Re: extensible macros
Date: 
Message-ID: <65489e51.0108202206.7bfe5b86@posting.google.com>
Kent M Pitman <······@world.std.com> wrote in message news:<···············@world.std.com>...
> The second part is what the input and output type of that
> transformation is.  Most DEFUN-ish kinds of things result in a number
> of expressions, not all of which are function definitions.  Some are
> other auxiliary forms.  The question then becomes whether the output
> of one can be piped in as the input to another, and a related
> question becomes whether they are commutative.  If they are not
> commutative, then the problem becomes how to order them, given that
> they are defined independently.  (I suspect that you'll find that a
> memoized traced function is, for example, different than a traced
> memoized one.)
>

In a research project dealing with transformations in Java (I know...
I know ;-)), we've implemented an ordering system that requires the
transformations to register the preconditions they require and the
postconditions they satisfy.  This was necessary because the effects
of these transformations could be non-local -- but that may be
overkill for local transforms like macros.  I might be willing to
settle here for "transform in the order the arguments are given",
along with some sort of error-checking framework (a little fuzzy on
this for now)... obviously this forces those using composed macros to
be careful with the order they provide, and maybe this requires them
to be thinking too much about the inner workings of a particular
macro... on the other hand, is this too much to ask?  Developers must
think about proper ordering when composing function calls and must
consider whether the pipeline makes sense.  Admittedly, people usually
have a harder time reasoning about transformations, so the demands are
probably greater here (although no one would be *forced* to use
composition if they were uncomfortable with it).
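For what it's worth, the precondition/postcondition idea could be
sketched in Lisp too (everything below is hypothetical, just to make the
scheme concrete): each transform declares what it requires and what it
provides, and a simple greedy search picks a valid order.

```lisp
;; Hypothetical ordering of transforms by declared pre/postconditions.
(defstruct transform name fn requires provides)

(defun order-transforms (transforms)
  ;; Repeatedly pick the first pending transform whose requirements
  ;; are already satisfied; error if no ordering exists.
  (let ((done '()) (satisfied '()) (pending (copy-list transforms)))
    (loop while pending do
      (let ((next (find-if (lambda (tr)
                             (subsetp (transform-requires tr) satisfied))
                           pending)))
        (unless next
          (error "No valid ordering for ~a" (mapcar #'transform-name pending)))
        (push next done)
        (setf satisfied (union satisfied (transform-provides next))
              pending (remove next pending))))
    (nreverse done)))
```

Keywords like (:requires (:plain-defun) :provides (:memoized)) would
then constrain the composition order without the user spelling it out.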

A further question:  How often do people find themselves writing
macros to compose the behavior of simpler macros (with-X-and-Y-and-Z)?
And how much duplication of code results from this?  If this is not
occurring, does that mean that macros are being used less effectively
than they could be, or that composition is simply not that useful?
 
> In any case, I really don't think the idea is remarkably general if you
> allow it to be fully generalized, even though you might find it fun to
> create a set of particular options that you find it useful to connect
> pragmatically.  I did have a generalized defining form I used for Macsyma
> (in the proprietary branch) which allowed one to say about 16 things I
> expected people to want and it would combine all the effects in a 
> known-to-work ordering because I had total knowledge of the things and
> how they combined at macro define-time... but I don't think the idea
> generalized.

I do understand your suspicions, since we'd be introducing a new
source of errors, and perhaps finding a reasonable order among a given
group of macros would be too painful... Maybe it is something you
special-case when you need it, as you did with Macsyma, but of course
a generalized framework would be nice, if it were practical...
From: Kent M Pitman
Subject: Re: extensible macros
Date: 
Message-ID: <sfwu1z1hd2q.fsf@world.std.com>
······@cs.ubc.ca (Andrew Catton) writes:

> Kent M Pitman <······@world.std.com> wrote in message news:<···············@world.std.com>...
>
> Maybe it is something you
> special-case when you need it, as you did with Macsyma, but of course
> a generalized framework would be nice, if it was practical...

Actually, if I recall correctly, Symbolics Genera did have a
general-purpose encapsulation facility.  You might start by looking at
what they did.  (I think I saw someone advertising Symbolics manuals
in a recent newsgroup post. :-)  

But there's a difference between encapsulating functions in series and
encapsulating the definition code. My remarks above were about the problem
that a typical expansion of defun may look like:

 `(progn (eval-when (:compile-toplevel)
           (inform-compiler-about-arguments 'foo '(x &rest y)))
         (setf (source-file 'foo) #P"foo.lisp")
         (setf (symbol-function 'foo) #'(lambda (x &rest y) ...)))

It's this that I'm worried about encapsulating.  The algebra of combining
definitional forms is very different from the algebra of combining
what goes into the function cell.

E.g., instead of the setf of symbol-function above, it might be that a
redefinition should look into the function cell, find the part that
corresponds to the previous defun-definition, and replace only that,
leaving traces and other things in place.
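To make the awkwardness concrete: a transformer that wants to wrap the
function itself can't treat the expansion as opaque; it has to walk the
PROGN looking for the symbol-function setf.  A rough sketch (WRAPPER is
a hypothetical function name to wrap around the lambda):

```lisp
;; Rewrite only the (setf (symbol-function ...) ...) subform of an
;; expansion like the PROGN above, leaving the auxiliary forms alone.
(defun wrap-function-cell (expansion wrapper)
  (labels ((walk (form)
             (cond ((and (consp form)
                         (eq (first form) 'setf)
                         (consp (second form))
                         (eq (first (second form)) 'symbol-function))
                    `(setf ,(second form) (,wrapper ,(third form))))
                   ((and (consp form) (eq (first form) 'progn))
                    `(progn ,@(mapcar #'walk (rest form))))
                   (t form))))
    (walk expansion)))
```

Every such transformer has to agree on what shape of expansion it
accepts and produces, which is the piping problem in miniature.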

Or it may be that that's a bad idea, since some "advice" encapsulating
the old definition may have been there to fix bugs in the definition
you are replacing.