From: Jacek Generowicz
Subject: Scheme macros
Date: 
Message-ID: <tyfad3foz03.fsf@pcepsft001.cern.ch>
[Don't want to start a flamefest, so I'm not crossposting this to
c.l.s]

In _Lambda: the ultimate "little language"_, Olin Shivers suggests
(second paragraph of section 9) that " ... Scheme has the most
sophisticated macro system of any language in this family ...", where
'this family' is explicitly stated to include Common Lisp.

Now, I don't know much about the Scheme macro system ... my impression
is that the main difference from CL's macros lies in Scheme's
so-called hygiene, which I (mis?)understand to be a mechanism for
preventing variable capture (which in CL is considered to be a
feature, not a bug, of the macro system (particularly given that
variable capture is less "problematic" in a Lisp-N, where N>1)).

My question is ... how should I interpret the claim that Scheme's
macro system is more "sophisticated" than CL's?  Does it merely reflect
a preference for Scheme's "hygiene", or is there something else behind
the statement?

From: Paul Dietz
Subject: Re: Scheme macros
Date: 
Message-ID: <4034ED6D.4B172F77@motorola.com>
Jacek Generowicz wrote:

> My question is ... how should I interpret the claim that Scheme's
> macro system is more "sophisticated" than CL's?  Does it merely reflect
> a preference for Scheme's "hygiene", or is there something else behind
> the statement?

One definition of sophisticated: 'lacking natural simplicity'.

Seems right to me.

	Paul
From: Barry Margolin
Subject: Re: Scheme macros
Date: 
Message-ID: <barmar-A81451.11541419022004@comcast.ash.giganews.com>
In article <···············@pcepsft001.cern.ch>,
 Jacek Generowicz <················@cern.ch> wrote:

> My question is ... how should I interpret the claim that Scheme's
> macro system is more "sophisticated" than CL's?  Does it merely reflect
> a preference for Scheme's "hygiene", or is there something else behind
> the statement?

That may be part of it.  He could also be referring to Scheme's 
pattern-based macros using syntax-case.

-- 
Barry Margolin, ······@alum.mit.edu
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
From: Tim Bradshaw
Subject: Re: Scheme macros
Date: 
Message-ID: <fbc0f5d1.0402200459.6a32806a@posting.google.com>
Jacek Generowicz <················@cern.ch> wrote in message news:<···············@pcepsft001.cern.ch>...
> 
> 
> My question is ... how should I interpret the claim that Scheme's
> macro system is more "sophisticated" than CL's?  Does it merely reflect
> a preference for Scheme's "hygiene", or is there something else behind
> the statement?

It requires more pages to describe it.  Indeed, isn't the macro
section of the Scheme standard bigger than the rest of the standard
put together or something?
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <%%hZb.11951$W74.6462@newsread1.news.atl.earthlink.net>
Jacek Generowicz wrote:
> [Don't want to start a flamefest, so I'm not crossposting this to
> c.l.s]
>
> In _Lambda: the ultimate "little language"_, Olin Shivers suggests
> (second paragraph of section 9) that " ... Scheme has the most
> sophisticated macro system of any language in this family ...", where
> 'this family' is explicitly stated to include Common Lisp.
>
> Now, I don't know much about the Scheme macro system ... my impression
> is that the main difference from CL's macros lies in Scheme's
> so-called hygiene, which I (mis?)understand to be a mechanism for
> preventing variable capture (which in CL is considered to be a
> feature, not a bug, of the macro system (particularly given that
> variable capture is less "problematic" in a Lisp-N, where N>1)).
>
> My question is ... how should I interpret the claim that Scheme's
> macro system is more "sophisticated" than CL's?  Does it merely reflect
> a preference for Scheme's "hygiene", or is there something else behind
> the statement?

It's not only hygiene.  Like Barry, I would assume that Olin was talking
about the syntax-case macro system:
http://wombat.doc.ic.ac.uk/foldoc/foldoc.cgi?Syntax-Case
http://www.scheme.com/syntax-case/
http://www.scheme.com/tspl/syntax.html#./syntax:h3

Syntax-case is a rich system that's hard to summarize briefly, but I'll take
a stab at it.  I've identified four key features below:

* Although syntax-case is a hygienic macro system, it allows you to
selectively break hygiene.  This allows it to do the same kinds of things
defmacro can do, but from the opposite direction: you need to take special
action to break hygiene, rather than having to take action to preserve
hygiene.  All things being equal, the syntax-case approach ought to be
preferable; but defmacro fans will tell you that all things aren't equal,
that syntax-case pays a price in terms of complexity.

* Syntax-case supports pattern-matching of syntax, but also supports use of
procedural code, as defmacro does.

* With syntax-case, uses of pattern variables or macro variables don't have
to be escaped in the same way as they do with defmacro, i.e. you get rid of
all the quasiquote, unquote, etc., along with the need to think about that
issue when writing macros.  In a 9-line example on c.l.s. the other day[*],
a hygienic pattern-matching macro eliminated "two gensyms, one quasiquote,
six unquotes, and one unquote-splicing."  The only time you really need to
think about issues related to escaping of variables is when breaking
hygiene, which is the exception rather than the rule.

* Syntax-case represents syntax using syntax objects with their own
specialized structure, rather than using ordinary lists, as defmacro does.
This is something of a tradeoff, which can be thought of as analogous to the
tradeoff that can arise in almost any Lisp program, when choosing between
implementing a data structure using only lists, or implementing it using
some kind of structure mechanism (defstruct, CLOS).  The flexibility of
lists still has some advantages in some situations, but I don't know of
anyone who uses lists exclusively in real programs today.  It can be argued
that a similar situation applies to macros, especially more complex ones:
using nothing but lists to manipulate syntax leads to exactly the same sort
of problems and limitations that you have when using nothing but lists to
manipulate an ordinary program's data; but using non-list structures also
loses some simplicity.
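
To make the first three points above a bit more concrete, here's a small
sketch (just an illustration, not taken from any particular
implementation's docs; the exact name of the hygiene-breaking operation
varies, e.g. it's datum->syntax-object in Chez-style systems): an
anaphoric 'if' that deliberately binds the identifier 'it' in the user's
code, yet needs no quasiquote, unquote or gensym anywhere:

(define-syntax aif
  (lambda (x)
    (syntax-case x ()
      ((k test then else)
       ;; rebind the symbol 'it' in the context of the macro use (the
       ;; identifier k) - this is the one deliberate break of hygiene
       (with-syntax ((it (datum->syntax-object (syntax k) 'it)))
         (syntax
          (let ((it test))
            (if it then else))))))))

(aif (assv 2 '((1 . one) (2 . two)))
     (cdr it)          ; 'it' is visible in the branches
     'nothing)         ; => two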

My own feeling is that it's useful to have access to more than one macro 
system.  They all have a different flavor and have different pros and cons.
For example, although syntax-rules is the most restricted of these systems,
it makes for cleaner and simpler macros than defmacro, in many cases where a
purely hygienic macro is wanted.  Happily, Lisp being the ultra-flexible
language that it is, anyone who wants to play with syntax-rules in CL can
take a look at:
http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

Anton

[*]
http://groups.google.com/groups?selm=f0wWb.458%24hm4.393%40newsread3.news.atl.earthlink.net
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c14o89$7vj$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

> It's not only hygiene.  Like Barry, I would assume that Olin was talking
> about the syntax-case macro system:
> http://wombat.doc.ic.ac.uk/foldoc/foldoc.cgi?Syntax-Case
> http://www.scheme.com/syntax-case/
> http://www.scheme.com/tspl/syntax.html#./syntax:h3

Thanks for these links.

> Syntax-case is a rich system that's hard to summarize briefly, but I'll take
> a stab at it.  I've identified four key features below:
> 
> * Although syntax-case is a hygienic macro system, it allows you to
> selectively break hygiene.  This allows it to do the same kinds of things
> defmacro can do, but from the opposite direction: you need to take special
> action to break hygiene, rather than having to take action to preserve
> hygiene.  All things being equal, the syntax-case approach ought to be
> preferable; but defmacro fans will tell you that all things aren't equal,
> that syntax-case pays a price in terms of complexity.

Hm, not sure. I think the main issue is perhaps somewhat different.

To me, the conceptual simplicity of CL-style macros is striking: It's 
just a transformation of s-expressions. That's it. Once understood, it's 
clear that you can do anything with this conceptual model.

Here is an analogy:

In Scheme, you have continuations and space-safe tail recursion. This 
guarantees that you can model any kind of iteration construct that you 
might need with some guaranteed properties. In Common Lisp you have the 
LOOP macro which covers the presumably important cases.

To me, things like syntax-rules and syntax-case are to macro 
programming what the LOOP macro is to iteration. Maybe they really 
cover the important cases, but they seem hard to learn. And it 
immediately makes me wonder whether it is really worth it. After all, I 
know how to make things work with DEFMACRO.

Strange thing is: I like the LOOP macro. There's something mysterious 
going on here...

BTW, what you really need to make something like DEFMACRO work is, on 
top of that, of course quasiquotation, GENSYM/MAKE-SYMBOL or 
string->uninterned-symbol and most probably a Lisp-2.

My feeling is that Common Lispniks would have an easier time to consider 
using Scheme when appropriate if Scheme implementations would more 
clearly document whether they support these features (except, of course, 
for the Lisp-2 thing). It's important that you can create uninterned 
symbols.
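
Just to illustrate what I mean, here is roughly the kind of thing I want
to be able to write - a sketch assuming a Scheme that provides
define-macro, gensym and quasiquotation (the exact names differ between
implementations):

(define-macro (my-or2 a b)
  ;; gensym must return a fresh, uninterned symbol, so this temporary
  ;; can never collide with anything in the caller's code
  (let ((tmp (gensym)))
    `(let ((,tmp ,a))
       (if ,tmp ,tmp ,b))))

(let ((tmp 'user-binding))
  (my-or2 #f tmp))                 ; => user-binding, no capture

If gensym only manufactured a cleverly named but interned symbol, that
guarantee would be gone.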

> * Syntax-case supports pattern-matching of syntax, but also supports use of
> procedural code, as defmacro does.
> 
> * With syntax-case, uses of pattern variables or macro variables don't have
> to be escaped in the same way as they do with defmacro, i.e. you get rid of
> all the quasiquote, unquote, etc., along with the need to think about that
> issue when writing macros.  In a 9-line example on c.l.s. the other day[*],
> a hygienic pattern-matching macro eliminated "two gensyms, one quasiquote,
> six unquotes, and one unquote-splicing."  The only time you really need to
> think about issues related to escaping of variables is when breaking
> hygiene, which is the exception rather than the rule.

I have looked at this example, and the funny thing is that I immediately 
start to wonder which elements of the macro definition refer to 
macro-expansion-time entities and which refer to run-time stuff. I don't 
have to think about these issues with quasiquotations because I 
immediately see it from the code.

I am not saying that this makes syntax-case worse than quasiquotation. 
Maybe I am just missing something.

> * Syntax-case represents syntax using syntax objects with their own
> specialized structure, rather than using ordinary lists, as defmacro does.
> This is something of a tradeoff, which can be thought of as analogous to the
> tradeoff that can arise in almost any Lisp program, when choosing between
> implementing a data structure using only lists, or implementing it using
> some kind of structure mechanism (defstruct, CLOS).  The flexibility of
> lists still has some advantages in some situations, but I don't know of
> anyone who uses lists exclusively in real programs today.  It can be argued
> that a similar situation applies to macros, especially more complex ones:
> using nothing but lists to manipulate syntax leads to exactly the same sort
> of problems and limitations that you have when using nothing but lists to
> manipulate an ordinary program's data; but using non-list structures also
> loses some simplicity.

Hm, I recall reading that syntax-case allows for recording line numbers 
of the original expression. Are there more advantages?

> My own feeling is that it's useful to have access to more than one macro
> system.  They all have a different flavor and have different pros and cons.
> For example, although syntax-rules is the most restricted of these systems,
> it makes for cleaner and simpler macros than defmacro, in many cases where a
> purely hygienic macro is wanted.  Happily, Lisp being the ultra-flexible
> language that it is, anyone who wants to play with syntax-rules in CL can
> take a look at:
> http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

Having more options is certainly better. (Such a statement from a fan of 
a supposedly minimal language?!? ;)

Anyway, thanks for your insightful posting. I appreciate your efforts to 
bridge the gaps between the Scheme and Common Lisp communities, 
especially because this isn't always easy.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <40369152$0$195$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> Anton van Straaten wrote:

> My feeling is that Common Lispniks would have an easier time to consider 
> using Scheme when appropriate if Scheme implementations would more 
> clearly document whether they support these features (except, of course, 
> for the Lisp-2 thing). It's important that you can create uninterned 
> symbols.

I don't know any Scheme with defmacro that doesn't support gensym.
Note en passant that define-macro in DrScheme is implemented in terms
of syntax-case (one page of code AFAIR).
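
The idea is roughly this - a sketch only, not DrScheme's actual code, and
using the pre-R6RS names datum->syntax-object / syntax-object->datum that
Chez-style systems provide: convert the use-site syntax to a plain datum,
hand it to the defmacro-style transformer, and convert the result back
with the macro keyword as context, so the expansion is unhygienic in the
way defmacro users expect.

;; e.g.  (define-macro (my-unless c . body) `(if ,c #f (begin ,@body)))
;; could expand into roughly this definition:
(define-syntax my-unless
  (lambda (x)
    (syntax-case x ()
      ((k . rest)
       (datum->syntax-object
        (syntax k)                  ; use-site context => defmacro semantics
        (apply (lambda (c . body) `(if ,c #f (begin ,@body)))
               (syntax-object->datum (syntax rest))))))))

Generalising that into a define-macro form, plus handling the alternative
(define-macro name proc) shape and error reporting, is presumably where
the rest of the page goes.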

> I have looked at this example, and the funny thing is that I immediately 
> start to wonder which elements of the macro definition refer to 
> macro-expansion-time entities and which refer to run-time stuff. I don't 
> have to think about these issues with quasiquotations because I 
> immediately see it from the code.
> 
> I am not saying that this makes syntax-case worse than quasiquotation. 
> Maybe I am just missing something.

Colors. Colors and arrows. That's what you are missing:

<http://www.scheme.dk/macros-in-color.png>

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c165ug$nb$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> Pascal Costanza wrote:
> 
>> Anton van Straaten wrote:
> 
>> My feeling is that Common Lispniks would have an easier time to 
>> consider using Scheme when appropriate if Scheme implementations would 
>> more clearly document whether they support these features (except, of 
>> course, for the Lisp-2 thing). It's important that you can create 
>> uninterned symbols.
> 
> I don't know any Scheme with defmacro that doesn't support gensym.

I have read docs of some Schemes where I had the feeling that the 
gensym functions they provide just tried hard to generate 
unique strings, but didn't actually rely on object identity / symbol 
identity to ensure uniqueness. Maybe I misunderstood those docs. But 
that's the whole point of my argument.

In fact, I have just browsed through some of the Scheme docs in order to 
double-check my impression. I have to admit it was easier this time to 
track down the relevant sections. When a Scheme implements 
string->uninterned-symbol, I am relatively happy. When it doesn't, but 
it implements gensym, then I wonder how gensym is implemented. Some docs 
say that gensym creates a unique symbol, some even state that it's 
uninterned. What do the others do?

So at least, I know what to look for by now. Still, I have the feeling 
that things could be easier. For example, the Scheme standard could just 
standardize string->uninterned-symbol, couldn't it?
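
What I care about is exactly the identity-based guarantee, not the naming
convention - something along these lines (a sketch; gensym and
string->uninterned-symbol as provided by several implementations):

(eq? (gensym) (gensym))                    ; => #f, always, by identity

(eq? (string->uninterned-symbol "temp")
     (string->symbol "temp"))              ; => #f, same name, different symbol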

> Note en passant that define-macro in DrScheme is implemented in terms
> of syntax-case (one page of code AFAIR).

...but this isn't a Turing equivalence argument, is it? ;)

>> I have looked at this example, and the funny thing is that I 
>> immediately start to wonder which elements of the macro definition 
>> refer to macro-expansion-time entities and which refer to run-time 
>> stuff. I don't have to think about these issues with quasiquotations 
>> because I immediately see it from the code.
>>
>> I am not saying that this makes syntax-case worse than quasiquotation. 
>> Maybe I am just missing something.
> 
> Colors. Colors and arrows. That's what you are missing:
> 
> <http://www.scheme.dk/macros-in-color.png>

No, I am red/green colorblind.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <40369c6a$0$224$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:

>> Colors. Colors and arrows. That's what you are missing:

>> <http://www.scheme.dk/macros-in-color.png>

> No, I am red/green colorblind.

You are in luck, nothing is colored red.

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1682n$47k$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> Pascal Costanza wrote:
> 
>>> Colors. Colors and arrows. That's what you are missing:
> 
>>> <http://www.scheme.dk/macros-in-color.png>
> 
>> No, I am red/green colorblind.
> 
> You are in luck, nothing is colored red.

Red/green colorblindness doesn't "work" like that. For example, it's 
really hard for me to parse any useful information from the coloring of 
the (define-syntax for ...) form in the example you have linked to. 
Unless my eyes are only a few inches away from the screen. It is only 
called "red/green colorblindness" because of some biological details, 
but the effect doesn't really have to do with the colors red and green 
as such, at least not as we perceive them.

In general, it's a good idea to have a colorblind person check colors in 
a program, presentation, etc., in order to ensure that they have the 
desired effect on them, if it is really important. About 10% of the male 
population are colorblind, so I don't think it's negligible.


Pascal

P.S.: Hey, this is still on topic! ;)

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <4036a8d9$0$235$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> 
> Jens Axel Søgaard wrote:
> 
>> Pascal Costanza wrote:
>>
>>>> Colors. Colors and arrows. That's what you are missing:
>>
>>
>>>> <http://www.scheme.dk/macros-in-color.png>
>>
>>
>>> No, I am red/green colorblind.
>>
>>
>> You are in luck, nothing is colored red.
> 
> 
> Red/green colorblindness doesn't "work" like that. For example, it's 
> really hard for me to parse any useful information from the coloring of 
> the (define-syntax for ...) form in the example you have linked to. 
> Unless my eyes are only a few inches away from the screen. 

I apologize. I sincerely thought you were making a joke (a good one, even).

> It is only 
> called "red/green colorblindness" because of some biological details, 
> but the effect doesn't really have to do with the colors red and green 
> as such, at least not as we perceive them.

Does it affect the perception of other colors as well?
I mean, is there a problem with distinguishing, say, a mixture of 50% red and 50% blue
from a mixture of 50% green and 50% blue?

> In general, it's a good idea to have a colorblind person check colors in 
> a program, presentation, etc., in order to ensure that they have the 
> desired effect on them, if it is really important. About 10% of the male 
> population are colorblind, so I don't think it's negligible.

Which colors would you prefer? Hm. An option
to use alternative means of indicating the various categories,
say underline, bold, italic etc would be nice.

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c16adf$81h$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

>> Red/green colorblindness doesn't "work" like that. For example, it's 
>> really hard for me to parse any useful information from the coloring 
>> of the (define-syntax for ...) form in the example you have linked to. 
>> Unless my eyes are only a few inches away from the screen. 
> 
> I apologize. I sincerely thought you were making a joke (a good one, even).

No problem. Really. ;)

>> It is only called "red/green colorblindness" because of some
>> biological details, but the effect doesn't really have to do with the 
>> colors red and green as such, at least not as we perceive them.
> 
> Does it affect the perception of other colors as well?
> I mean, is there a problem with distinguishing, say, a mixture of 50% red and 
> 50% blue from a mixture of 50% green and 50% blue?

I can't describe the effects, because I really don't know the details. 
I am only a victim. ;) Indeed, the mixed colors are the most problematic 
ones.

>> In general, it's a good idea to have a colorblind person check colors 
>> in a program, presentation, etc., in order to ensure that they have 
>> the desired effect on them, if it is really important. About 10% of 
>> the male population are colorblind, so I don't think it's negligible.
> 
> Which colors would you prefer?

Colors like yellow, orange and blue are generally ok. But the best thing 
in my experience is to sit down together with someone affected, explain 
to him what you want to achieve, and then agree on some colors.

> Hm. An option
> to use alternative means of indicating the various categories,
> say underline, bold, italic etc would be nice.

Yes, that's also a good idea. Bold and italic can have negative 
consequences for the layout, though. Shades of grey are usually also 
very good when you don't use too many. This also helps in getting usable 
printouts on b/w printers. (This is also a useful rule of thumb: If you 
can still distinguish the colors when converted to grey scale, then it's 
likely that colorblind people can also distinguish them.)


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: pete kirkham
Subject: Re: Scheme macros
Date: 
Message-ID: <40373dc3$0$6849$cc9e4d1f@news.dial.pipex.com>
Pascal Costanza wrote:
> Shades of grey are usually also 
> very good when you don't use too many. This also helps in getting usable 
> printouts on b/w printers. (This is also a useful rule of thumb: If you 
> can still distinguish the colors when converted to grey scale, then it's 
> likely that colorblind people can also distinguish them.)

Though some kinds of dyslexics find luminosity variations confusing (I 
do, though more for backgrounds than foregrounds), so it's better 
to have both colour and grey-scale default modes, and the option for 
the user to configure these.

One of the projects I'm on uses background colour to differentiate the 
status of input fields (unset, entered, inherited from parent, 
calculated from child, etc.), and one of the users is colour blind. He 
can't tell the difference between the fields with my settings, and I 
can't read the entries with his.


Pete
(who ended up studying electronics as he was the only non-colour blind 
member of his family, so had to help wire all the plugs as a kid)
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <403912d3$0$235$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> Jens Axel Søgaard wrote:

>> Does it affect the perception of other colors as well?
>> I mean, is there a problem with distinguishing, say, a mixture of 50% red 
>> and 50% blue from a mixture of 50% green and 50% blue?

> I can't describe the effects, because I really don't know the details. 
> I am only a victim. ;) Indeed, the mixed colors are the most problematic 
> ones.

OK

>>> In general, it's a good idea to have a colorblind person check colors 
>>> in a program, presentation, etc., in order to ensure that they have 
>>> the desired effect on them, if it is really important. About 10% of 
>>> the male population are colorblind, so I don't think it's negligible.

>> Which colors would you prefer?

> Colors like yellow, orange and blue are generally ok. But the best thing 
> in my experience is to sit down together with someone affected, explain 
> to him what you want to achieve, and then agree on some colors.
> 
>> Hm. An option
>> to use alternative means of indicating the various categories,
>> say underline, bold, italic etc would be nice.
> 
> 
> Yes, that's also a good idea. Bold and italic can have negative 
> consequences for the layout, though. Shades of grey are usually also 
> very good when you don't use too many. This also helps in getting usable 
> printouts on b/w printers. (This is also a useful rule of thumb: If you 
> can still distinguish the colors when converted to grey scale, then it's 
> likely that colorblind people can also distinguish them.)

I'll keep that in mind.

-- 
Jens Axel Søgaard
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <40369d7d$0$224$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> Jens Axel Søgaard wrote:
>> Pascal Costanza wrote:

>>> My feeling is that Common Lispniks would have an easier time to 
>>> consider using Scheme when appropriate if Scheme implementations 
>>> would more clearly document whether they support these features 
>>> (except, of course, for the Lisp-2 thing). It's important that you 
>>> can create uninterned symbols.

>> I don't know any Scheme with defmacro that doesn't support gensym.

> I have read docs of some Schemes where I had the feeling that the 
> gensym functions they provide just tried hard to generate 
> unique strings, but didn't actually rely on object identity / symbol 
> identity to ensure uniqueness. Maybe I misunderstood those docs. But 
> that's the whole point of my argument.

That's a valid concern. Documentation is worth nothing if it isn't
precise.

> In fact, I have just browsed through some of the Scheme docs in order to 
> double-check my impression. I have to admit it was easier this time to 
> track down the relevant sections. When a Scheme implements 
> string->uninterned-symbol, I am relatively happy. When it doesn't, but 
> it implements gensym, then I wonder how gensym is implemented. Some docs 
> say that gensym creates a unique symbol, some even state that it's 
> uninterned. What do the others do?
> 
> So at least, I know what to look for by now. Still, I have the feeling 
> that things could be easier. For example, the Scheme standard could just 
> standardize string->uninterned-symbol, couldn't it?

I suppose so.

>> Note en passant that define-macro in DrScheme is implemented in terms
>> of syntax-case (one page of code AFAIR).

> ...but this isn't a Turing equivalence argument, is it? ;)

Hence the "en passant" :-)

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1684u$47k$2@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

>>> Note en passant that define-macro in DrScheme is implemented in terms
>>> of syntax-case (one page of code AFAIR).
> 
> 
>> ...but this isn't a Turing equivalence argument, is it? ;)
> 
> 
> Hence the "en passant" :-)

:-))

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <P2WZb.14144$W74.2917@newsread1.news.atl.earthlink.net>
Pascal Costanza wrote:
> I have read docs of some Schemes where I had the feeling that the
>>gensym functions they provide just tried hard to generate
> unique strings, but didn't actually rely on object identity / symbol
> identity to ensure uniqueness. Maybe I misunderstood those docs.
> But that's the whole point of my argument.

It seems to me that you're exercising unwarranted suspicion.  Do you have
any reason to believe gensym won't work the way it should?  I suspect the
authors of whichever docs you're referring to may not have bothered to
specifically address the details you're concerned about, simply because it
wouldn't make sense to implement a gensym that doesn't work reliably in all
cases, and it seems almost too obvious to mention that.

> In fact, I have just browsed through some of the Scheme docs in order to
> double-check my impression. I have to admit it was easier this time to
> track down the relevant sections. When a Scheme implements
> string->uninterned-symbol, I am relatively happy. When it doesn't, but
> it implements gensym, then I wonder how gensym is implemented. Some docs
> say that gensym creates a unique symbol, some even state that it's
> uninterned. What do the others do?

You're worrying about internal implementation details, which only matter in
two cases: if the observed behavior is incorrect (which afaik, is not the
case); or if you have some particular interest in implementation details.
In the latter case, you shouldn't expect the documentation that tells you
how to use the language, to give all the specifics of the internals.
However, many Schemes do have papers available which go into some detail
about their internals.

> So at least, I know what to look for by now. Still, I have the feeling
> that things could be easier.

Do you mean that it could be easier for a CL user to find information that
satisfies him that something in Scheme has the same observable semantics as
it does in CL?  I could make the same claim about CL.  One way to address
that would be "CL for Scheme users" and "Scheme for CL users" (I think I've
seen at least one of those somewhere.)

> For example, the Scheme standard could just
> standardize string->uninterned-symbol, couldn't it?

You could submit a SRFI for it.  But are you suggesting standardizing
string->uninterned-symbol simply to force a particular implementation style
for gensym?  If so, why is that necessary?

> > Note en passant that define-macro in DrScheme is implemented in terms
> > of syntax-case (one page of code AFAIR).
>
> ...but this isn't a Turing equivalence argument, is it? ;)

No, but it does say something about the generality and power of syntax-case.
The PLT defmacro implementation is about 110 lines in total, much of which
is handling exceptions and two syntax variations.

Anton
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <220220041629036659%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <····················@newsread1.news.atl.earthlink.net>,
Anton van Straaten <·····@appsolutions.com> wrote:

> It seems to me that you're exercising unwarranted suspicion.  Do you have
> any reason to believe gensym won't work the way it should?  I suspect the
> authors of whichever docs you're referring to may not have bothered to
> specifically address the details you're concerned about, simply because it
> wouldn't make sense to implement a gensym that doesn't work reliably in all
> cases, and it seems almost too obvious to mention that.

Perhaps someone should tell that to Dybvig:

> (define g1 '#{g18 |%japmchQ4DMDM\\%+|})
> (define g2 (gensym))
> g2

Error in intern-gensym: unique name "%japmchQ4DMDM\\\\%+" already
interned.
Type (debug) to enter the debugger.

I did this by observing the pattern of printed gensyms and guessing the
next value in the sequence.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <PVa_b.14987$W74.13553@newsread1.news.atl.earthlink.net>
Brian Mastenbrook wrote:
> In article <····················@newsread1.news.atl.earthlink.net>,
> Anton van Straaten <·····@appsolutions.com> wrote:
>
> > It seems to me that you're exercising unwarranted suspicion.  Do you have
> > any reason to believe gensym won't work the way it should?  I suspect the
> > authors of whichever docs you're referring to may not have bothered to
> > specifically address the details you're concerned about, simply because it
> > wouldn't make sense to implement a gensym that doesn't work reliably in all
> > cases, and it seems almost too obvious to mention that.
>
> Perhaps someone should tell that to Dybvig:
>
> > (define g1 '#{g18 |%japmchQ4DMDM\\%+|})
> > (define g2 (gensym))
> > g2
>
> Error in intern-gensym: unique name "%japmchQ4DMDM\\\\%+" already
> interned.
> Type (debug) to enter the debugger.
>
> I did this by observing the pattern of printed gensyms and guessing the
> next value in the sequence.

Ouch!  Is that with Chez?  Petite Chez 6.0a doesn't do that.

Anton
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <220220042248199222%NObmastenbSPAM@cs.indiana.edu>
In article <·····················@newsread1.news.atl.earthlink.net>,
Anton van Straaten <·····@appsolutions.com> wrote:

> Ouch!  Is that with Chez?  Petite Chez 6.0a doesn't do that.
> 
> Anton

I tried that example with Chez 6.9b, which corresponds to "whatever is
on the CS computers for us to use". I think the real issue here is with
print-gensym - it's not generating a unique representation for the
gensym, so it simply collides. Of course I have issues with
print-gensym to begin with - gensyms should not be READable to the same
value. To do otherwise makes it not a gensym. Otherwise gensym could be
boiled down to something evil like

  (loop for i from 1
        unless (find-symbol (format nil "G~A" i))
          return (intern (format nil "G~A" i)))

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1b0bh$qd7$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

> Pascal Costanza wrote:
> 
>>I have read docs of some Schemes where I had the feeling that the
>>gensym functions they provide just tried hard to generate
>>unique strings, but didn't actually rely on object identity / symbol
>>identity to ensure uniqueness. Maybe I misunderstood those docs.
>>But that's the whole point of my argument.
> 
> It seems to me that you're exercising unwarranted suspicion.  Do you have
> any reason to believe gensym won't work the way it should?  I suspect the
> authors of whichever docs you're referring to may not have bothered to
> specifically address the details you're concerned about, simply because it
> wouldn't make sense to implement a gensym that doesn't work reliably in all
> cases, and it seems almost too obvious to mention that.

OK, I have to admit that this is my mistake. The last time I wanted to 
know whether a Scheme implementation suited my needs, I looked at 
SISC's documentation, among others. At that time I didn't know about 
string->uninterned-symbol, but expected something more Common Lispish.

I have found the following section which made me very skeptical: 
http://sisc.sourceforge.net/manual/html/apc.html#N15287

To my ears, this sounded a lot like they were trying to achieve 
uniqueness by means of carefully named strings. However, now that I have 
checked the docs again, I have noticed that the details of that section serve a 
different purpose.

Someone should write a highly opinionated guide to Scheme. ;) (not me)

>>In fact, I have just browsed through some of the Scheme docs in order to
>>double-check my impression. I have to admit it was easier this time to
>>track down the relevant sections. When a Scheme implements
>>string->uninterned-symbol, I am relatively happy. When it doesn't, but
>>it implements gensym, then I wonder how gensym is implemented. Some docs
>>say that gensym creates a unique symbol, some even state that it's
>>uninterned. What do the others do?
> 
> You're worrying about internal implementation details, which only matter in
> two cases: if the observed behavior is incorrect (which afaik, is not the
> case); or if you have some particular interest in implementation details.
> In the latter case, you shouldn't expect the documentation that tells you
> how to use the language, to give all the specifics of the internals.

OK, thanks for insisting.

>>So at least, I know what to look for by now. Still, I have the feeling
>>that things could be easier.
> 
> Do you mean that it could be easier for a CL user to find information that
> satisfies him that something in Scheme has the same observable semantics as
> it does in CL?  I could make the same claim about CL.  One way to address
> that would be "CL for Scheme users" and "Scheme for CL users" (I think I've
> seen at least one of those somewhere.)

...maybe a joint paper for both communities...

>>For example, the Scheme standard could just
>>standardize string->uninterned-symbol, couldn't it?
> 
> You could submit a SRFI for it.  But are you suggesting standardizing
> string->uninterned-symbol simply to force a particular implementation style
> for gensym?  If so, why is that necessary?

It's not. Just a wrong expectation on my side.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <e0UZb.13992$W74.3152@newsread1.news.atl.earthlink.net>
Pascal Costanza wrote:
> > Syntax-case is a rich system that's hard to summarize briefly, but I'll take
> > a stab at it.  I've identified four key features below:
> >
> > * Although syntax-case is a hygienic macro system, it allows you to
> > selectively break hygiene.  This allows it to do the same kinds of things
> > defmacro can do, but from the opposite direction: you need to take special
> > action to break hygiene, rather than having to take action to preserve
> > hygiene.  All things being equal, the syntax-case approach ought to be
> > preferable; but defmacro fans will tell you that all things aren't equal,
> > that syntax-case pays a price in terms of complexity.
>
> Hm, not sure. I think the main issue is perhaps somewhat different.
>
> To me, the conceptual simplicity of CL-style macros is striking: It's
> just a transformation of s-expressions. That's it.

That's what all of these macro systems are.

> Once understood, it's clear that you can do anything
> with this conceptual model.

The same is true of syntax-case.

> To me, things like syntax-rules and syntax-case are to macro
> programming what the LOOP macro is to iteration.  Maybe they really
> cover the important cases, but they seem hard to learn.

Syntax-rules is not hard to learn.  If anything, it suffers from being
almost too simple; as well as from lacking good, short introductory
material.  You specify a pattern, and specify the syntax that should replace
that pattern.  That's all there is to it.
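
For example, here's about as much ceremony as a simple macro ever needs
(a trivial sketch, nothing beyond standard syntax-rules):

;; a 'while' loop: one pattern, one template
(define-syntax while
  (syntax-rules ()
    ((_ test body ...)
     (let loop ()
       (if test
           (begin body ... (loop)))))))

(define i 0)
(while (< i 3)
  (display i)
  (set! i (+ i 1)))     ; prints 012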

Syntax-case is more complex, and I do think that's a drawback when compared
to defmacro.  It increases the temptation to conclude the following:

> And it immediately makes me wonder whether it is really worth it.
> After all, I know how to make things work with DEFMACRO.

I might wonder something similar if I were a Python programmer looking at
Lisp: Lisp seems hard to learn, and I would know how to make things work
with Python.

> BTW, what you really need to make something like DEFMACRO work is, on
> top of that, of course quasiquotation, GENSYM/MAKE-SYMBOL or
> string->uninterned-symbol and most probably a Lisp-2.

I don't see that Lisp-2 is an issue.  As Jens pointed out, all the major
Schemes implement defmacro & gensym, and it works fine.  And, of course,
quasiquotation is standardized by R5RS.

> My feeling is that Common Lispniks would have an easier time to consider
> using Scheme when appropriate if Scheme implementations would more
> clearly document whether they support these features (except, of course,
> for the Lisp-2 thing). It's important that you can create uninterned
> symbols.

I wouldn't expect many Common Lispniks to use Scheme when appropriate.
Unless they already like Scheme, why should they?  And if they already
like Scheme, they're unlikely to be put off by such vague concerns as the
above.  For the record, I'm not aware of any limitations on the defmacro
capabilities in the many Schemes which implement it.

> I have looked at this example, and the funny thing is that I immediately
> start to wonder which elements of the macro definition refer to
> macro-expansion-time entities and which refer to run-time stuff. I don't
> have to think about these issues with quasiquotations because I
> immediately see it from the code.

You don't have to think about these issues with syntax-rules - you're only
used to thinking about them because you're forced to, with defmacro.

In terms of name scoping, syntax-rules macros work exactly the same way name
scoping does with lambda.  If you're comfortable with the way name scoping
works with lambda, you should have no problems with syntax-rules.

To back that up: in a given lambda expression, all names used are defined in
one of the following places:

* the lambda formals list;
* within the lambda body;
* in a lexically enclosing scope;
* in the dynamic environment.

In a syntax-rules macro, all names used are defined in:

* the syntax-rules pattern;
* within the syntax-rules body;
* in a lexically enclosing scope;
* in the dynamic environment.

There's no more need to quote or unquote names in a syntax-rules macro, than
there is in an ordinary lambda expression.

The reason you have to worry about quoting and unquoting of names in
defmacro is because you need to control whether or not you capture names
from the invoking environment.  With pure hygienic macros like syntax-rules,
you simply can't do that, so it's not an issue.

Of course, quoting in defmacro is not just about names: it's also about
syntax.  The reason syntax doesn't have to be quoted in syntax-rules macros
is that they rely on pattern matching and template replacement, not
procedural code.  No procedural code means that except for macro variables
(addressed above), and special tokens like "...", everything in a macro
template is syntax at the same level, so there's no need to quote it.

The result is that it's actually easier to reason about syntax-rules
macros - which makes them easier to write, and easier to read.  As a result,
and also because of the enforced hygiene, they're less error-prone.
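
Here's a tiny sketch of what that looks like in practice (plain R5RS
syntax-rules, nothing more):

;; 'report' is free in the template, so it refers to this definition,
;; exactly as a free variable in a lambda body would
(define (report v) (display v) (newline) v)

(define-syntax show
  (syntax-rules ()
    ((_ e)
     (let ((result e))     ; 'result' cannot capture anything at the call site
       (report result)))))

;; neither the caller's 'report' nor the caller's 'result' interferes:
(let ((report (lambda (v) 'wrong))
      (result 'also-wrong))
  (show (+ 1 2)))          ; prints 3, returns 3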

Note that I'm not claiming syntax-rules macros are appropriate in all
cases - obviously, where you want to be able to break hygiene, they're not a
good solution (although it's possible to "break hygiene" with syntax-rules,
by redefining binding operators like lambda and let [*]).

> I am not saying that this makes syntax-case worse than quasiquotation.
> Maybe I am just missing something.

I don't think it's possible to compare without actually learning to use both
systems.  If you're interested in becoming a bit more familiar, I'd
recommend starting with syntax-rules, which is very simple and easy to get
up to speed with.  It also has the benefit that you could use it from within
CL via Dorai Sitaram's package.

> Hm, I recall reading that syntax-case allows for recording line numbers
> of the original expression. Are there more advantages?

Using domain-specific structures instead of lists to represent syntax means
that you can associate any information you want to with source code.  As
Jens pointed out, PLT uses this to good effect.  I think there's an argument
that this is an obvious way forward for Lisp-like languages - things like
refactoring tools and any other kinds of automated code processing can
benefit from it.

> Having more options is certainly better. (Such a statement from a fan of
> a supposedly minimal language?!? ;)

I think options are required here because none of these systems are perfect
in all respects.  But the core language is still minimal - these macro
systems are all surface features, for syntax transformation.  The portable
syntax-case system can be implemented in almost any standard Scheme, and
both syntax-rules and defmacro are easy to implement in terms of
syntax-case.  You can view syntax-rules and defmacro as applications of
syntax-case.  The reverse is not the case.

> Anyway, thanks for your insightful posting. I appreciate your efforts to
> bridge the gaps between the Scheme and Common Lisp communities,
> especially because this isn't always easy.

Thanks.  I'm more interested in the underlying ideas, than in language
advocacy.  Systems for transforming s-expressions are relevant to all Lisp
dialects.

Anton

[*]
http://groups.google.com/groups?selm=7eb8ac3e.0203271253.74bb0819%40posting.google.com
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1avb7$oem$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

> Pascal Costanza wrote:
> 
>>To me, the conceptual simplicity of CL-style macros is striking: It's
>>just a transformation of s-expressions. That's it.
> 
> That's what all of these macro systems are.

R5RS doesn't say so. At least, I don't see where the term "macro 
transformer" is defined. It seems to me that the standard tries hard to 
hide that fact. (But I might simply not have found the relevant sections.)

>>Once understood, it's clear that you can do anything
>>with this conceptual model.
> 
> The same is true of syntax-case.

Of course, I will take your word for that. But I still don't understand 
what syntax-case does. I have browsed through the various links that are 
usually referred to (mainly papers and a book by Dybvig), but I find it 
very hard to follow the contents. It would be good if there existed 
some kind of high-level overview of syntax-case for people who 
already know DEFMACRO.

>>To me, things like syntax-rules and syntax-case are to macro
>>programming what the LOOP macro is to iteration. Maybe they really
>>cover the important cases, but they seem hard to learn.
> 
> Syntax-rules is not hard to learn.  If anything, it suffers from being
> almost too simple; as well as from lacking good, short introductory
> material.  You specify a pattern, and specify the syntax that should replace
> that pattern.  That's all there is to it.

Examples like those given in 
http://okmij.org/ftp/Scheme/r5rs-macros-pitfalls.txt seem to indicate 
that syntax-rules just trades one set of possible pitfalls for a 
different set, and that along the way the conceptual simplicity is lost.

Here are the examples from that reference implemented with DEFMACRO:

(defun foo-f (x)
   (flet ((id (x) x))
     (id (1+ x))))

(defmacro foo-m (x)
   `(macrolet ((id (x) x))
      (id (1+ ,x))))

(defmacro bar-m2 (var &body body)
   `(macrolet ((helper (&body body)
                 `(lambda (,',var) ,@body)))
      (helper ,@body)))


I really don't see the problem. Seriously not.

> Syntax-case is more complex, and I do think that's a drawback when compared
> to defmacro.  It increases the temptation to conclude the following:
> 
>>And it immediately makes me wonder whether it is really worth it.
>>After all, I know how to make things work with DEFMACRO.
> 
> I might wonder something similar if I were a Python programmer looking at
> Lisp: Lisp seems hard to learn, and I would know how to make things work
> with Python.

Lisp and Scheme bring you metacircularity. As soon as Pythonistas write 
program generators, it's clear that their language is missing something 
important. Of course, they can write a Lisp interpreter in Python, but 
that's beside the point.

Do you really think that syntax-case is an equally important step forward?

>>BTW, what you really need to make something like DEFMACRO work is, on
>>top of that, of course quasiquotation, GENSYM/MAKE-SYMBOL or
>>string->uninterned-symbol and most probably a Lisp-2.
> 
> I don't see that Lisp-2 is an issue. 

See http://citeseer.nj.nec.com/bawden88syntactic.html

Here are comments on the examples given in that paper from a Lisp-2 
point of view:

1. Lexical variable bindings in the client code don't capture function 
definitions used in macros:

(let ((cons 5))
   (push 'foo stack))

is not an issue.

2. A client's lexical reference wouldn't conflict with a binding 
introduced by a macro because the macro would make sure to use unique 
symbols:

(defmacro or (exp-1 exp-2)
   (let ((temp (gensym)))
     `(let ((,temp ,exp-1))
        (if ,temp ,temp ,exp-2))))

The expansion of (or (memq x y) temp) would be correct. This isn't 
really a Lisp-2 issue. Note, however, that there are good examples of why 
one would want to introduce a new binding in a macro. As long as this is 
documented, there is no real problem here. This example boils down to 
the question of what a reasonable default is and how hard it is to get the 
other variant.
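
For comparison, and as far as I understand the hygienic systems, the
default there is simply reversed: the temporary is renamed automatically,
so the GENSYM disappears, and it's deliberate capture that needs the
extra machinery. A sketch:

(define-syntax or2
  (syntax-rules ()
    ((_ exp-1 exp-2)
     (let ((temp exp-1))        ; temp is renamed automatically
       (if temp temp exp-2)))))

(let ((temp 'user-value))
  (or2 #f temp))                ; => user-value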

3. The third example in that paper shows how a newly introduced lexical 
function binding may interfere with a function that is used in a macro 
expansion. However, the names of global function definitions are an 
important part of an application's ontology. You know that you are redefining 
an important element of your application when meaningful names are used. 
Different ontologies can be separated by proper namespace mechanisms.

Here it helps that a Lisp-2 separates variables and functions by 
default. Variables are usually not important parts of an application's 
ontology. If they are, the convention in Common Lisp is to use proper 
naming schemes, like asterisks for special variables. Effectively, this 
creates a new namespace. (ISLISP is a little bit more explicit than 
Common Lisp in this regard.)

4. The fourth example can be solved with a proper GENSYM for "use" in 
the "contorted" macro. Again, there are good examples why one would want 
to introduce a new binding in a macro.

> As Jens pointed out, all the major
> Schemes implement defmacro & gensym, and it works fine.  And, of course,
> quasiquotation is standardized by R5RS.

OK, I believe you.

>>My feeling is that Common Lispniks would have an easier time to consider
>>using Scheme when appropriate if Scheme implementations would more
>>clearly document whether they support these features (except, of course,
>>for the Lisp-2 thing). It's important that you can create uninterned symbols.
> 
> I wouldn't expect many Common Lispniks to use Scheme when appropriate.
> Unless they already like Scheme, why should they?  And if they already
> like Scheme, they're unlikely to be put off by such vague concerns as the
> above.  For the record, I'm not aware of any limitations on the defmacro
> capabilities in the many Schemes which implement it.

A while ago, I wanted to experiment with continuations in Scheme. 
Apart from the fact that not all Schemes seem to implement continuations 
fully and/or correctly (see 
http://sisc.sourceforge.net/r5rspitresults.html ), the fact that the 
respective documentation makes me feel uneasy about whether I have to 
relearn programming techniques for totally unrelated areas is a clear 
downside IMHO.

[...]
> The result is that it's actually easier to reason about syntax-rules
> macros - which makes them easier to write, and easier to read.  As a result,
> and also because of the enforced hygiene, they're less error-prone.

I don't mind using DEFMACRO for simple things. I don't find them hard to 
write or read, and I don't know why they would be more error-prone. 
Sounds similar to some of the claims made by advocates of static type 
systems. Maybe this boils down to just a matter of taste.

>>I am not saying that this makes syntax-case worse than quasiquotation.
>>Maybe I am just missing something.
> 
> I don't think it's possible to compare without actually learning to use both
> systems.  If you're interested in becoming a bit more familiar, I'd
> recommend starting with syntax-rules, which is very simple and easy to get
> up to speed with.  It also has the benefit that you could use it from within
> CL via Dorai Sitaram's package.

OK.

>>Hm, I recall reading that syntax-case allows for recording line numbers
>>of the original expression. Are there more advantages?
> 
> Using domain-specific structures instead of lists to represent syntax means
> that you can associate any information you want to with source code.  As
> Jens pointed out, PLT uses this to good effect.  I think there's an argument
> that this is an obvious way forward for Lisp-like languages - things like
> refactoring tools and any other kinds of automated code processing can
> benefit from it.

OK

>>Having more options is certainly better. (Such a statement from a fan of
>>a supposedly minimal language?!? ;)
> 
> I think options are required here because none of these systems are perfect
> in all respects.  But the core language is still minimal - these macro
> systems are all surface features, for syntax transformation.  The portable
> syntax-case system can be implemented in almost any standard Scheme, and
> both syntax-rules and defmacro are easy to implement in terms of
> syntax-case.  You can view syntax-rules and defmacro as applications of
> syntax-case.  The reverse is not the case.

What stands in the way of implementing syntax-case on top of DEFMACRO? 
(This is not a rhetorical question.)

>>Anyway, thanks for your insightful posting. I appreciate your efforts to
>>bridge the gaps between the Scheme and Common Lisp communities,
>>especially because this isn't always easy.
> 
> Thanks.  I'm more interested in the underlying ideas, than in language
> advocacy.  Systems for transforming s-expressions are relevant to all Lisp
> dialects.

Agreed.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <403911a8$0$235$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:

> A while ago, I wanted to experiment with continuations in Scheme. 
> Apart from the fact that not all Schemes seem to implement continuations 
> fully and/or correctly (see 
> http://sisc.sourceforge.net/r5rspitresults.html ), the fact that the 
> respective documentation makes me feel uneasy about whether I have to 
> relearn programming techniques for totally unrelated areas is a clear 
> downside IMHO.

Most of these errors are not about call/cc but letrec
(e.g. 1.1 and 1.2 [I was too lazy to check the others]), where some
implementors have chosen to deviate slightly from the standard (some
refer to it as letrec*). This deviation requires the use of call/cc to
observe, and that's why the test examples are filled with call/cc.

I wouldn't be surprised if the behaviour becomes sanctioned in R6RS.

>>> Hm, I recall reading that syntax-case allows for recording line numbers
>>> of the original expression. Are there more advantages?

Perhaps you have already read it, but Dybvig's "Writing Hygienic Macros
in Scheme with Syntax-Case", available at

<ftp://ftp.cs.indiana.edu/pub/scheme-repository/doc/pubs/iucstr356.ps.gz>

is one of the best expositions of syntax-case from a user's point of view.

> What stands in the way of implementing syntax-case on top of DEFMACRO? 
> (This is not a rhetorical question.)

I can't see any. Hm. Perhaps one could simply throw the Scheme source
of syntax-case through PseudoScheme?

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1e1d1$471$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> Perhaps you have already read it, but Dybvig's "Writing Hygienic Macros
> in Scheme with Syntax-Case", available at
> 
> <ftp://ftp.cs.indiana.edu/pub/scheme-repository/doc/pubs/iucstr356.ps.gz>
> 
> is one of the best expositions of syntax-case from a user's point of view.

Thanks for the link. At a first glance it looks quite good. (No, I 
haven't read it yet.)


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <mza_b.14958$W74.182@newsread1.news.atl.earthlink.net>
Pascal Costanza wrote:
> Anton van Straaten wrote:
>
> > Pascal Costanza wrote:
> >
> >>To me, the conceptual simplicity of CL-style macros is striking: It's
> >>just a transformation of s-expressions. That's it.
> >
> > That's what all of these macro systems are.
>
> R5RS doesn't say so. At least, I don't see where the term "macro
> transformer" is defined. It seems to me that the standard tries hard to
> hide that fact. (But I might simply not have found the relevant sections.)

If you look up "macro transformer" in the index, it points you to a page
which contains the following definition (Sec. 4.3, Macros):

"The set of rules that specifies how a use of a macro is transcribed into a
more primitive expression is called the 'transformer' of the macro."

I don't think it's hiding anything.  Do you think otherwise?

> >>Once understood, it's clear that you can do anything
> >>with this conceptual model.
> >
> > The same is true of syntax-case.
>
> Of course, I will take your word for that. But I still don't understand
> what syntax-case does. I have browsed through the various links that are
> usually referred (mainly papers and a book by Dybvig), but I find it
> very hard to follow the contents. It would be good if there would exist
> some kind of high-level overview about syntax-case for people who
> already know DEFMACRO.

I agree the docs don't make it easy to get into at first.  I learned
syntax-case (up to a point - I'm not an expert by any means) after first
learning and using syntax-rules for some time, and having previously been
familiar with defmacro.  I think syntax-rules makes a good starting point,
because it teaches the hygienic pattern matching approach in a simpler
context.  That same approach is used by syntax-case, but augmented with a
much more powerful procedural syntax manipulation capability.

But I'll ignore my own advice and take a stab at explaining syntax-case,
starting from a defmacro perspective.  Perhaps the gentlest introduction to
syntax-case is Dybvig's paper, "Writing Hygienic Macros in Scheme with
Syntax-Case":
ftp://ftp.cs.indiana.edu/pub/scheme-repository/doc/pubs/iucstr356.ps.gz ,
and I'll use some of its examples below.

Start with defmacro, and imagine that instead of quoting syntax using
quasiquote, you use a special form, 'syntax', which instead of returning a
list as quasiquote does, returns a syntax object, i.e. an instance of an
abstract data type which represents a piece of syntax.  This type has an API
which supports various code walking & manipulation capabilities.  It can
also be converted to a list (or whatever value the original syntax
represented) via 'syntax-object->datum'.
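
For example (just a sketch - the printed representation of a syntax object
itself is implementation-dependent):

(syntax-object->datum (syntax (if x y #f)))  ; => the ordinary list (if x y #f)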

An important thing to note here is that a syntax object "understands" the
syntax it represents - it's not just an undifferentiated list.  It knows
which values are identifiers, it knows things about where those identifiers
are bound, and as we've touched on, it can track things like line numbers
(which may be implementation-dependent).  If you're developing code
manipulation tools - editors, debuggers etc. - these syntax objects give you
a capability which defmacro doesn't even attempt to address.  Syntax objects
are a richer way to represent program syntax than lists, and their uses go
beyond just macros.

Within a syntax expression of the form (syntax ...), any references to macro
variables are replaced with their values, i.e. there's no need to unquote
references to macro variables.  So to steal an example from Dybvig, here's
an 'and2' macro which works like 'and', but for only two operands:

(define-syntax and2
  (lambda (x)
    (syntax-case x ()
      ((_ x y)
       (syntax (if x y #f))))))

The (lambda (x) ...) binds a syntax object to x, representing the syntax of
the expression which invoked the macro.  In theory, you can do whatever you
want with that syntax object.  Most commonly, you'll use syntax-case to do a
pattern match on it, which is what the above example does with the
expression (syntax-case x () ...).  The () is for literals, like 'else' in
cond.

Within the above syntax-case expression, there's a single pattern and a
corresponding template:

      ((_ x y)
       (syntax (if x y #f)))

The underscore in (_ x y) represents the name of the macro - you could also
write (and2 x y), it doesn't matter.  This pattern will match any
invocations of AND2 with two parameters.  After the pattern comes the
expression which will be executed when that pattern is matched.  In this
case, it's simply a syntax expression which returns the expression (if x y
#f).  The return value from syntax-case must be a syntax object, which
represents the actual syntax of the final program.

In the above, since x and y are macro variables (strictly speaking, "pattern
variables"), their values will be substituted when the syntax object is
created, so that (and2 4 5) becomes (if 4 5 #f).
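
Note that, unlike with defmacro, identifiers introduced by the template
cannot accidentally capture identifiers at the call site, so no gensym is
needed.  A sketch of a two-operand 'or', which needs a temporary binding:

(define-syntax or2
  (lambda (x)
    (syntax-case x ()
      ((_ a b)
       (syntax (let ((t a))
                 (if t t b)))))))

;; Hygiene keeps the introduced t distinct from any t in the caller's code,
;; so (let ((t 5)) (or2 #f t)) evaluates to 5, not #f.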

I think I'll stop there for now, since I have other things to do!  I've
touched on some of the more important points about syntax-case.  There's a
lot more to it than the above - particularly, breaking hygiene, and
executing procedural code rather than simply applying a template.  But all
of it fits into the above framework, and involves manipulating syntax
objects, in a way similar to what you would do in defmacro, but through the
syntax object API rather than manipulating lists.  There are plenty more
examples in the Dybvig paper.
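
One quick taste of the hygiene-breaking part, though, before I stop: you
deliberately construct an identifier in the lexical context of the macro use
(here via the keyword k), so that it *is* visible at the call site.  A
sketch of an anaphoric 'if' that binds 'it' (datum->syntax-object is the
name used in the Dybvig paper; some implementations spell it differently):

(define-syntax aif
  (lambda (x)
    (syntax-case x ()
      ((k test then else)
       (with-syntax ((it (datum->syntax-object (syntax k) 'it)))
         (syntax (let ((it test))
                   (if it then else))))))))

;; (aif (assv 2 '((1 . one) (2 . two))) (cdr it) 'nothing)  => two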

> > Syntax-rules is not hard to learn.  If anything, it suffers from being
> > almost too simple; as well as from lacking good, short introductory
> > material.  You specify a pattern, and specify the syntax that should
> > replace that pattern.  That's all there is to it.
>
> Examples like those given in
> http://okmij.org/ftp/Scheme/r5rs-macros-pitfalls.txt seem to indicate
> that syntax-rules just trade one set of possible pitfalls with a
> different set, but along that way the conceptual simplicity is lost.

I don't accept that "conceptual simplicity is lost" with syntax-rules.  It's
a different approach, which in some ways is conceptually simpler than
defmacro, since it doesn't require the user to manually keep track of the
different levels at which the macro operates.  The pitfalls you mention may
indeed be flaws in syntax-rules - I'm not familiar enough with them to
comment - but I find that syntax-rules works very well for many kinds of
macros, better than defmacro in fact.
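
To make that concrete, here's about the simplest useful example I can think
of (a sketch): a swap! macro, written with no gensym and no unquoting at all:

(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))

;; Hygiene means even (swap! tmp y) does the right thing - the tmp
;; introduced by the template can't collide with the caller's tmp.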

Of course, the latter claim is hardly ever going to be accepted by someone
only familiar with defmacro.  For the record, I learned and used defmacro
before ever using syntax-rules or syntax-case, and I still use defmacro from
time to time, so I think I have a good basis for comparison.

> Here are the examples from that reference implemented with DEFMACRO:
>
> (defun foo-f (x)
>    (flet ((id (x) x))
>      (id (1+ x))))
>
> (defmacro foo-m (x)
>    `(macrolet ((id (x) x))
>       (id (1+ ,x))))
>
> (defmacro bar-m2 (var &body body)
>    `(macrolet ((helper (&body body)
>                  `(lambda (,',var) ,@body)))
>       (helper ,@body)))
>
>
> I really don't see the problem. Seriously not.

I'm not sure what you mean about not seeing the problem.  One of the
problems mentioned in the article is that syntax-rules pattern variables
don't shadow.  I don't know if there's a justification for that, or it's
simply a bug in the design of syntax-rules.  But you usually get an error if
you make this mistake, and it's easy to fix, and easy to avoid.  It doesn't
mean that syntax-rules is not useful, and it's still better than defmacro,
which you can't dispute until you've learned syntax-rules.  ;)

> > Syntax-case is more complex, and I do think that's a drawback when compared
> > to defmacro.  It increases the temptation to conclude the following:
> >
> >>And it immediately makes me wonder whether it is really worth it.
> >>After all, I know how to make things work with DEFMACRO.
> >
> > I might wonder something similar if I were a Python programmer looking at
> > Lisp: Lisp seems hard to learn, and I would know how to make things work
> > with Python.
>
> Lisp and Scheme bring you metacircularity. As soon as Pythonistas write
> program generators, it's clear that their language is missing something
> important. Of course, they can write a Lisp interpreter in Python, but
> that's besides the point.
>
> Do you really think that syntax-case is an equally important step forward?

In some respects, yes, but that's not what I really meant.  It's easy to
look at something from the outside and find reasons not to try it, and that
was my main point.  But the points you've been picking on don't seem very
substantial to me - it seems as though you're looking for reasons to ignore
these systems, rather than looking for reasons you might want to learn them.
To conclude from the points you've raised that these systems can be ignored,
seems to me to throw a bunch of babies out with the bathwater.  (They're
much cuter babies than that warty defmacro baby, too! ;)

Of course, a CL programmer who wants to write standard CL code, obviously
has little incentive to be interested in other macro systems.  But if your
interests cross multiple languages, then there's value in the Scheme macro
systems, at the very least in the sense that learning more than one approach
to the same problem expands the horizons of your understanding of the
problem.

> >>BTW, what you really need to make something like DEFMACRO work is, on
> >>top of that, of course quasiquotation, GENSYM/MAKE-SYMBOL or
> >>string->uninterned-symbol and most probably a Lisp-2.
> >
> > I don't see that Lisp-2 is an issue.
>
> See http://citeseer.nj.nec.com/bawden88syntactic.html

I'm familiar with why people claim it's an issue, but in practice I think
it's not significantly worse than the issue of hygiene in defmacros in
general.  As I've said and defended here once before, Lisp-1 can express any
Lisp-2 program, simply by changing any conflicting names so as not to
conflict - a conceptually trivial transformation, with consequences which
are primarily subjective.  It would have an impact on porting Lisp-2 macros
to Lisp-1, but it doesn't limit what you can easily express in Lisp-1.
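
A sketch of what I mean, using the classic offender:

;; Lisp-2 code that uses one name in two namespaces:
;;   (let ((list '(1 2 3))) (list list))        ; => ((1 2 3))
;; The same thing in a Lisp-1, with the conflicting variable renamed:
(let ((lst '(1 2 3))) (list lst))               ; => ((1 2 3))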

Put another way, having the ability to accidentally compensate for hygiene
violations in some cases - where multiple namespaces happen to prevent the
problem - isn't a solution to the general problem of not having hygiene.
Since you haven't solved the general problem, you still have to address
questions of hygiene, in various low-level ways.  A single namespace doesn't
make this problem worse in any significant way.

> Here it helps that a Lisp-2 separates variables and functions by
> default. Variables are usually not important parts of an application's
> ontology. If they are, the convention in Common Lisp is to use proper
> naming schemes, like asterisks for special variables. Effectively, this
> creates a new namespace.

Mmm, asterisks.  This, to me, is why the whole Lisp-1/2 debate is moot.  The
solution is simply Lisp-N, where you can define namespaces, modules, etc.
and control how they're used.  See PLT Scheme etc.

> 4. The fourth example can be solved with a proper GENSYM for "use" in
> the "contorted" macro.

The phrase "proper GENSYM" is an oxymoron.  GENSYM operates at a strangely
low level of abstraction.  Why don't you use GENSYM when declaring normal
lexical variables in a procedure?  Rhetorical question, of course - the
point is, GENSYM is a kludge.  It's not a particularly onerous one, but it's
part of what makes defmacro worth improving on.

> Some while ago, I wanted to experiment with continuations in Scheme.
> Apart from the fact that not all Schemes seem to implement continuations
> fully and/or correctly (see
> http://sisc.sourceforge.net/r5rspitresults.html ), the fact that the
> respective documentations make me feel uneasy about whether I have to
> relearn programming techniques for totally unrelated areas is a clear
> downside IMHO.

We're straying far afield here.  ;)  But I'll give my opinion about
continuations, too.  Re the quality of implementations, once again you're
looking at edge cases.  Forget about those, they're not important, except
in, well, edge cases.  All of the major Schemes either support continuations
well, or tell you when they don't - e.g., some of the Scheme to C compilers
deliberately provide restricted continuations.

As far as relearning programming techniques goes, first and foremost,
continuations are a general conceptual model for control flow.  If you only
write single-threaded code in a language with a traditional linear
stack-based control flow, you won't have much use for continuations -
they're far more powerful and general than is needed to deal with that case.
But for systems with more complex control flow, continuations can provide a
very useful model - web servers are just one example, but really any system
which involves multiple threads, distributed processing, etc. can benefit
from modeling via continuations.
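
Even the simplest possible use - reifying "the rest of the computation" as
an escape - shows the flavour (a sketch):

(define (find-first pred lst)
  (call-with-current-continuation
   (lambda (return)
     (for-each (lambda (x)
                 (if (pred x) (return x)))
               lst)
     #f)))

;; (find-first even? '(1 3 4 5))  => 4, abandoning the rest of the traversal.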

Scheme is one of very few languages - along with SML/NJ, Stackless Python,
and the RhinoWithContinuations version of Javascript - which implement
first-class continuations.  If you're developing tools in the spaces
mentioned above, this is a useful capability.  Stackless Python uses
continuations to support microthreading; Scheme has a number of web server
solutions which use continuations to "invert control" so that the structure
of a web application's code can be decoupled from its web page structure;
and RhinoWithContinuations does something similar for web applications in
the Cocoon web framework.  For applications which need them, continuations
are very useful, and have little competition.  Their competition is mainly
OS-level threads, which really solve a different problem, and conceptually
are a stream of continuations anyway.

For ordinary programming, though, continuations are more or less
irrelevant - they should be dealt with under the hood, whether by tools like
those web server frameworks, or by language constructs like exception
handlers and microthreads.  The only reason to learn about use of
first-class continuations as a programming construct is either for the sake
of learning, to deepen your understanding of programming; or if you are
interested in developing language or system tools that use them.  If you are
interested in any of this, then yes, you're going to have to do some
learning and relearning - there's no way around that.  But for most ordinary
applications, you can safely ignore continuations.

> [...]
> > The result is that it's actually easier to reason about syntax-rules
> > macros - which makes them easier to write, and easier to read.  As a result,
> > and also because of the enforced hygiene, they're less error-prone.
>
> I don't mind using DEFMACRO for simple things. I don't find them hard to
> write or read, and I don't know why they would be more error-prone.
> Sounds similar to some of the claims made by advocates of static type
> systems. Maybe this boils down to just a matter of taste.

Maybe - let's talk once you've tried syntax-rules.  But you gave a clue to
your reading & writing process for DEFMACRO when you said that when reading
a syntax-rules macro, you were immediately worrying about which level the
various tokens were at.  You've learned to look for, and expect something
that, with syntax-rules, you can simply forget about.  You don't do these
things when writing ordinary functions - why do you put up with it when
writing macros?  What would you think of Lisp if you had to use gensym to
initialize every variable you use?  You've simply become very used to a
low-level technique, so that you don't believe there's any need for a higher
level technique.

> What stands in the way of implementing syntax-case on top of DEFMACRO?
> (This is not a rhetorical question.)

I don't think it would make much sense.  The implementation of syntax
objects has little to do with what defmacro does.  The pattern matching forms
of syntax-case might be defined via DEFMACRO at the surface level, but their
definitions deal with syntax objects, so there'd be little for defmacro to
do once the syntax objects had been constructed.  It wouldn't help much when
constructing the syntax objects, either, since the 'syntax' form doesn't use
defmacro syntax, and I can't see any point in converting it internally.

The reason that it's easy to implement DEFMACRO in syntax-case is that a
syntax object is a superset of the list representations of syntax used by
DEFMACRO.  You can translate syntax-as-lists to syntax objects and back
again, without losing anything - it's part of the standard syntax object
API, so nothing additional is needed to do that, which is partly why a
syntax-case implementation of DEFMACRO is short.  Going the other way is
more problematic, since syntax-as-lists has less information than a syntax
object.
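
A sketch of roughly how such an implementation goes (the name define-macro
and the exact API spellings are illustrative and vary by implementation):

(define-syntax define-macro
  (lambda (x)
    (syntax-case x ()
      ((_ (name . args) body ...)
       (syntax
        (define-syntax name
          (lambda (use)
            (syntax-case use ()
              ((k . rest)
               (datum->syntax-object (syntax k)
                 (apply (lambda args body ...)
                        (syntax-object->datum (syntax rest)))))))))))))

;; The macro body receives ordinary lists and returns an ordinary list,
;; which is then stamped with the use site's context - i.e. defmacro
;; semantics, in a handful of lines.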

Anton
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <220220042241350582%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <···················@newsread1.news.atl.earthlink.net>,
Anton van Straaten <·····@appsolutions.com> wrote:

> I don't accept that "conceptual simplicity is lost" with syntax-rules.  It's
> a different approach, which in some ways is conceptually simpler than
> defmacro, since it doesn't require the user to manually keep track of the
> different levels at which the macro operates.  The pitfalls you mention may
> indeed be flaws in syntax-rules - I'm not familiar enough with them to
> comment - but I find that syntax-rules works very well for many kinds of
> macros, better than defmacro in fact.
> 
> Of course, the latter claim is hardly ever going to be accepted by someone
> only familiar with defmacro.  For the record, I learned and used defmacro
> before ever using syntax-rules or syntax-case, and I still use defmacro from
> time to time, so I think I have a good basis for comparison.

For the record, I'm familiar with both defmacro and syntax-rules,
though I am considerably more familiar with the first. With helpful
tools for list destructuring and mass generation of gensyms, defmacro
macros can be pretty easy to write, whereas I often have to squint at a
syntax-rules macro to figure out what ...'s correspond to what. Perhaps
this is simply a consequence of my level of experience. Perhaps it's
just personal preference.

> > 4. The fourth example can be solved with a proper GENSYM for "use" in
> > the "contorted" macro.
> 
> The phrase "proper GENSYM" is an oxymoron.  GENSYM operates at a strangely
> low level of abstraction.  Why don't you use GENSYM when declaring normal
> lexical variables in a procedure?  Rhetorical question, of course - the
> point is, GENSYM is a kludge.  It's not a particularly onerous one, but it's
> part of what makes defmacro worth improving on.

I don't understand what makes GENSYM such a kludge, nor why it is such
a low level of abstraction. GENSYM is not doing anything different than
CONS does - it returns something which is unique to eq? with the
specified contents. In CL, a GENSYM-alike can minimally be written as
(make-symbol "").

> We're straying far afield here.  ;)  But I'll give my opinion about
> continuations, too.  Re the quality of implementations, once again you're
> looking at edge cases.  Forget about those, they're not important, except
> in, well, edge cases.  All of the major Schemes either support continuations
> well, or tell you when they don't - e.g., some of the Scheme to C compilers
> deliberately provide restricted continuations.

Fortunately, this is what Chicken is for.

> Scheme is one of very few languages - along with SML/NJ, Stackless Python,
> and the RhinoWithContinuations version of Javascript - which implements
> first-class continuations.  If you're developing tools in the spaces
> mentioned above, this is a useful capability.  Stackless Python uses
> continuations to support microthreading; Scheme has a number of web server
> solutions which use continuations to "invert control" so that the structure
> of a web application's code can be decoupled from its web page structure;
> and RhinoWithContinuations does something similar for web applications in
> the Cocoon web framework.  For applications which need them, continuations
> are very useful, and have little competition.  Their competition is mainly
> OS-level threads, which really solve a different problem, and conceptually
> are a stream of continuations anyway.

And here is where the next Lisp/Scheme debate is going to start up.
When Schemers speak of continuations, they really mean implicit
continuations - the idea that the program should be granted access to
the continuations that are flying around under the hood in the
implementation. However, the name "continuation" is considerably more
general, or else anybody using CPS needs to find a new term.

In fact I would argue that the main competition to call/cc
continuations is going to be CPS, and that CPS has a really big
advantage: after you write your program, you can identify what types of
continuations you use, and then change their representation to a list
of a name and arguments which describes the continuation of that type.
For instance, a standard CPS example:

(defun fib-cps (n k)
  (if (< n 3) (funcall k 1)
      (fib-cps (- n 2)
               (lambda (v1)
                 (fib-cps (- n 1)
                          (lambda (v2)
                            (funcall k (+ v1 v2))))))))

Becomes:

(defun fib-cps (n k)
  (if (< n 3) (do-continuation k 1)
      (fib-cps (- n 2)
               (make-val1-continuation (- n 1)
                                       k))))

(defun make-val1-continuation (n k)
  `(val1-continuation ,n ,k))

(defun make-add1-continuation (n k)
  `(add1-continuation ,n ,k))

(defun do-continuation (k arg)
  (ecase (car k)
    (return-k arg)
    (val1-continuation (fib-cps (cadr k)
                                (make-add1-continuation arg (caddr k))))
    (add1-continuation (do-continuation (caddr k)
                         (+ arg (cadr k))))))
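
;; (Assuming the definitions above: the computation is kicked off by passing
;; an explicit top-level continuation, e.g. (fib-cps 10 '(return-k)) => 55.)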

Of course this is all rather complicated to write by hand, but there is
one huge advantage: the continuations are now serializable, even across
different lisps! Provided with sufficient tools to generate code like
this, I don't see why anyone would prefer call/cc continuations (which
are inherently fragile) for this use case.

CPS also brings with it a clearer view of dynamically scoped variables,
which are useful in writing network applications. With call/cc
continuations, calling a saved continuation (eg for a web process) will
trigger wind points on the way out, meaning you can't store eg the
current connection in a simple dynamic variable.

One interesting thing to note is that continuations can be described in
terms of the standard UNIX call fork() with shared memory. If you do
use shared memory, fork() will simply copy the stack alone - so, to
exploit this, when you call/cc, fork() a new process, and have it
immediately suspend itself. When it is unsuspended, it should fork()
itself into a running process and then re-suspend itself. When you call
the associated continuation, unsuspend that process, and kill yourself.
Make sure to store return values on the heap, and you're all set! Thus
continuations are really a subset of standard UNIX multiprocessing
semantics :-)

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <9Qh_b.15739$W74.6434@newsread1.news.atl.earthlink.net>
Brian Mastenbrook wrote:
> In article <···················@newsread1.news.atl.earthlink.net>,
> Anton van Straaten <·····@appsolutions.com> wrote:
>
> > I find that syntax-rules works very well for many kinds of
> > macros, better than defmacro in fact.
> >
> > Of course, the latter claim is hardly ever going to be accepted by someone
> > only familiar with defmacro.  For the record, I learned and used defmacro
> > before ever using syntax-rules or syntax-case, and I still use defmacro from
> > time to time, so I think I have a good basis for comparison.
>
> For the record, I'm familiar with both defmacro and syntax-rules,
> though I am considerably more familiar with the first. With helpful
> tools for list destructuring and mass generation of gensyms, defmacro
> macros can be pretty easy to write, whereas I often have to squint at a
> syntax-rules macro to figure out what ...'s correspond to what. Perhaps
> this is simply a consequence of my level of experience. Perhaps it's
> just personal preference.

I'm sure there is a large subjective element.  My real point relates to the
initial reaction that defmacro users often have of not knowing what's going
on when they don't see syntax and variables being quoted or unquoted.
However, this is actually a benefit of syntax-rules that they're simply
unfamiliar with.
that you have to squint for, which isn't surprising if you're somewhat
familiar with syntax-rules.

Re the ...'s: since they appear after the expression which is being
repeated, I tend to think of them like a postfix token - along the lines of
the quote and quasiquote tokens, but appearing after the expression to which
they apply, instead of before.  The equivalent operation in defmacro is
usually procedural code, which I don't think is any clearer to absorb at a
glance.
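
A tiny sketch of the postfix reading:

(define-syntax my-when
  (syntax-rules ()
    ((_ test expr1 expr2 ...)
     (if test (begin expr1 expr2 ...) #f))))

;; The ... follows the sub-template it repeats, so
;; (my-when (> 3 2) (display "hi") (newline))
;; expands to (if (> 3 2) (begin (display "hi") (newline)) #f).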

> I don't understand what makes GENSYM such a kludge, nor why it is such
> a low level of abstraction. GENSYM is not doing anything different than
> CONS does - it returns something which is unique to eq? with the
> specified contents. In CL, a GENSYM-alike can minimally be written as
> (make-symbol "").

Yes, GENSYM is just a constructor.  But you don't normally have to
"construct" your variable names.  The only reason you do with defmacro, is
because defmacro doesn't deal with hygiene.  It's low-level because you're
implementing the sort of feature that is usually handled by the language.

You're right that, with some wrappers to generate gensyms, you can make this
easier.  But as I said, you don't have to do this with normal lexical or
other variables, and I doubt you'd be defending it if you did have to gensym
all your variables.

What makes macros different?  That's semi-rhetorical - it's an interesting
question to answer.  I think the answer ends up being that defmacro exists
at an intersection point between ease of implementation, sufficient ease of
use, and the macro equivalent of Turing-completeness.  There are other
interesting intersection points, though.

> > All of the major Schemes either support continuations
> > well, or tell you when they don't - e.g., some of the Scheme to C compilers
> > deliberately provide restricted continuations.
>
> Fortunately, this is what Chicken is for.

Yes, Chicken warms the cockles of my cool-hack-loving heart.  For those who
aren't familiar, it's a Scheme which generates C code that uses a version of
Henry Baker's Cheney-on-the-MTA mechanism for supporting tail recursion and
continuations in C:
http://home.pipeline.com/~hbaker1/CheneyMTA.html

> > For applications which need them, continuations
> > are very useful, and have little competition.  Their competition is mainly
> > OS-level threads, which really solve a different problem, and conceptually
> > are a stream of continuations anyway.
>
> And here is where the next Lisp/Scheme debate is going to start up.
> When Schemers speak of continuations, they really mean implicit
> continuations - the idea that the program should be granted access to
> the continuations that are flying around under the hood in the
> implementation. However, the name "continuation" is considerably more
> general, or else anybody using CPS needs to find a new term.

No argument about the term.  The full term for the Scheme thingies is
"first-class continuation", and you can produce them in other ways than
call/cc.

> In fact I would argue that the main competition to call/cc
> continuations is going to be CPS, and that CPS has a really big
> advantage: after you write your program, you can identify what types of
> continuations you use, and then change their representation to a list
> of a name and arguments which describes the continuation of that type.

CPS is very useful, but one of the applications for first-class
continuations is implementing language-level tools - exception systems,
unusual control flows like goal seeking, web server inversion of control,
etc.  In all of these cases, you don't want to require the end programmer to
write their code in CPS.  You raised that point:

> Of course this is all rather complicated to write by hand, but there is
> one huge advantage: the continuations are now serializable, even across
> different lisps! Provided with sufficient tools to generate code like
> this, I don't see why anyone would prefer call/cc continuations (which
> are inherently fragile) for this use case.

If you write tools to generate CPS code behind the scenes, so that the user
ends up with serializable continuations, then you've implemented a language
which offers first-class continuations.  Whether it provides them through
call/cc or not isn't particularly important - call/cc just happens to be one
of the simplest and most general ways of giving access to continuations.  If
such tools actually existed as a layer over Lisp or Scheme, I'm sure it
would be useful.

Those tools might still find it useful to offer something like call/cc, if
it's possible given the typing issues, so that users can do their own
control manipulations rather than relying on whatever the tools support.  I
suspect it'll be tricky to eliminate the relevance of call/cc - it's a bit
like saying that given function definition syntax & variable binding syntax
etc., we could eliminate the need for lambda.  It'd still be there, just
hidden, which is the way it's supposed to be anyway.

> One interesting thing to note is that continuations can be described in
> terms of the standard UNIX call fork() with shared memory. If you do
> use shared memory, fork() will simply copy the stack alone - so, to
> exploit this, when you call/cc, fork() a new process, and have it
> immediately suspend itself. When it is unsuspended, it should fork()
> itself into a running process and then re-suspend itself. When you call
> the associated continuation, unsuspend that process, and kill yourself.
> Make sure to store return values on the heap, and you're all set! Thus
> continuations are really a subset of standard UNIX multiprocessing
> semantics :-)

I presume you'd have to implement mutable variables via heap allocated ref
cells, or something, otherwise you'd end up cloning variables that should be
shared (if I've understood the model correctly).  So, since this model is
restricted, it's actually a subset of the continuation model, not the other
way around.  :)

Anton
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <230220040538300856%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <····················@newsread1.news.atl.earthlink.net>,
Anton van Straaten <·····@appsolutions.com> wrote:

> > I don't understand what makes GENSYM such a kludge, nor why it is such
> > a low level of abstraction. GENSYM is not doing anything different than
> > CONS does - it returns something which is unique to eq? with the
> > specified contents. In CL, a GENSYM-alike can minimally be written as
> > (make-symbol "").
> 
> Yes, GENSYM is just a constructor.  But you don't normally have to
> "construct" your variable names.  The only reason you do with defmacro, is
> because defmacro doesn't deal with hygiene.  It's low-level because you're
> implementing the sort of feature that is usually handled by the language.

Wait, stop. Who said anything about GENSYM being a variable name? Not
me, for sure. GENSYM constructs a symbol. The fact that the evaluation
semantics for symbols is to treat them as variables isn't necessarily
relevant to the existence of GENSYM - it has the same use if you are
writing an interpreter or compiler for a language with hygienic macros.

I guess what you are really arguing is that Lisp exposes a level of abstraction
to represent its macros which is normally hidden in a Scheme
implementation. This much is true, but I suspect that lispers do not
respond to "LOW-LEVEL" the way you do: on the contrary, I'd prefer to
work in a language that exposes its low-level technologies to me via
appropriate reflection.

> CPS is very useful, but one of the applications for first-class
> continuations is implementing language-level tools - exception systems,
> unusual control flows like goal seeking, web server inversion of control,
> etc.  In all of these cases, you don't want to require the end programmer to
> write their code in CPS.  You raised that point:

[elided]

> If you write tools to generate CPS code behind the scenes, so that the user
> ends up with serializable continuations, then you've implemented a language
> which offers first-class continuations.  Whether it provides them through
> call/cc or not isn't particularly important - call/cc just happens to be one
> of the simplest and most general ways of giving access to continuations.  If
> such tools actually existed as a layer over Lisp or Scheme, I'm sure it
> would be useful.

I was not arguing about the semantics or need for a tool like call/cc
in general. The issue I was trying to raise was whether call/cc was
necessary in the host language, especially for the specific problem of
inversion of control in a web server. call/cc is a very powerful but
very brutal tool in a sense, and a CPSer can produce code with
advantages (such as serializable continuations) that the host
continuations can't offer. It's just like shifting from use of LAMBDA
to an explicit data structure, to separate out the data being closed
over from the code in question. Both of these things are very useful
when you are actually trying to write a web application which needs
live upgrading, load balancing, serializable state, et al. Similarly
I'm sure a goal-directed reasoner might want to have serializable
state, for when you're trying to reproduce your experiments :-)

> Those tools might still find it useful to offer something like call/cc, if
> it's possible given the typing issues, so that users can do their own
> control manipulations rather than relying on whatever the tools support.  I
> suspect it'll be tricky to eliminate the relevance of call/cc - it's a bit
> like saying that given function definition syntax & variable binding syntax
> etc., we could eliminate the need for lambda.  It'd still be there, just
> hidden, which is the way it's supposed to be anyway.

Of course, but even if call/cc is offered in full, it will still not be
a language level tool to the host language. I believe this to be an
important distinction.

> > One interesting thing to note is that continuations can be described in
> > terms of the standard UNIX call fork() with shared memory. If you do
> > use shared memory, fork() will simply copy the stack alone - so, to
> > exploit this, when you call/cc, fork() a new process, and have it
> > immediately suspend itself. When it is unsuspended, it should fork()
> > itself into a running process and then re-suspend itself. When you call
> > the associated continuation, unsuspend that process, and kill yourself.
> > Make sure to store return values on the heap, and you're all set! Thus
> > continuations are really a subset of standard UNIX multiprocessing
> > semantics :-)
> 
> I presume you'd have to implement mutable variables via heap allocated ref
> cells, or something, otherwise you'd end cloning variables that should be
> shared (if I've understood the model correctly).  So, since this model is
> restricted, it's actually a subset of the continuation model, not the other
> way around.  :)

Actually, since it provides the continuation model when everything is
heap-allocated, and something slightly different when some things are
stack-allocated, doesn't that technically make continuations a subset
of this? :-)

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <UXs_b.17076$W74.1352@newsread1.news.atl.earthlink.net>
Brian Mastenbrook wrote:
> In article <····················@newsread1.news.atl.earthlink.net>,
> Anton van Straaten <·····@appsolutions.com> wrote:
>
> > > I don't understand what makes GENSYM such a kludge, nor why it
> > > is such a low level of abstraction. GENSYM is not doing anything
> > > different than CONS does - it returns something which is unique to
> > > eq? with the specified contents. In CL, a GENSYM-alike can
> > > minimally be written as (make-symbol "").
> >
> > Yes, GENSYM is just a constructor.  But you don't normally have to
> > "construct" your variable names.  The only reason you do with defmacro,
> > is because defmacro doesn't deal with hygiene.  It's low-level because
> > you're implementing the sort of feature that is usually handled by the
> > language.
>
> Wait, stop. Who said anything about GENSYM being a variable name?
> Not me, for sure.

Me neither.  The second sentence in my paragraph above is right-associative,
and beta-substitutes for the "do" in the sentence to its right: applying an
inlining transform, the third sentence would read: "The only reason you have
to "construct" your variable names with defmacro..."

> GENSYM constructs a symbol. The fact that the evaluation
> semantics for symbols is to treat them as variables isn't necessarily
> relevant to the existence of GENSYM - it has the same use if you are
> writing an interpreter or compiler for a language with hygienic macros.

Right, I wasn't objecting to GENSYM's existence, but to its use to create
variables safe for use in defmacro, and the fact that the consequences of
that can't be hidden, even if the GENSYM itself is.

> I guess what you are really arguing is that Lisp exposes a level of abstraction
> to represent its macros which is normally hidden in a Scheme
> implementation. This much is true, but I suspect that lispers do not
> respond to "LOW-LEVEL" the way you do: on the contrary, I'd prefer to
> work in a language that exposes its low-level technologies to me via
> appropriate reflection.

I have no problem with that - but it doesn't make sense to have to deal with
that in every macro that's ever written.  You might use a feature like
GENSYM to implement some language/system-level construct, but what I'm
saying is that your average, ordinary macro shouldn't have to deal with that
level of abstraction - it has nothing to do with the problem domain.
Wrapping the GENSYM in higher-level functions helps, but you're still
dealing with the consequences of the issue - the need for GENSYM is just the
symptom.

One of the strengths of Lisp family languages is that code can be written at
a level closer to the problem domain.  When programming in many other
languages, even some of the more modern languages with garbage collection
etc., programmers are more often forced to deal with issues that amount to
limitations or quirks of the language.  Lisp, because of macros,
unrestricted procedural abstraction, and the things that have been built on
top of that (like CLOS), suffers much less from this.

However, use of DEFMACRO is an exception - the programmer is required to
deal with instantiating certain variable names before using them, much like
a C programmer has to allocate memory before storing anything.  Hiding the
GENSYM use doesn't help any more than hiding the allocation in C++ (via
'new') - if you don't use the proper construct, or do the correct escaping
of references to a name, it's an error.  My analogy is actually startlingly
appropriate: escaping a name using unquote is analogous to dereferencing a
pointer in C, using '*'.  A design at the appropriate level of abstraction
would know when a dereference or an unquote needs to occur, and not require
the programmer to worry about it with every use.

The main difference between the way variables work in defmacro, and the way
memory allocation & access works in C, is that the consequences of getting
defmacro wrong are nowhere near as severe.  But that doesn't excuse the
abstraction level violation that's taking place in every macro.

Note that I'm not saying DEFMACRO should be tossed in the trash.  I don't
think any of the alternatives exceed DEFMACRO's utility & simplicity in
every dimension.  But they do improve on it in some important ways, and in
the absence of a perfect macro system which achieves a perfect score in
every dimension, I think it's useful to have more than one system.

> I was not arguing about the semantics or need for a tool like call/cc
> in general. The issue I was trying to raise was whether call/cc was
> necessary in the host language, especially for the specific problem of
> inversion of control in a web server. call/cc is a very powerful but
> very brutal tool in a sense, and a CPSer can produce code with
> advantages (such as serializable continuations) that the host
> continuations can't offer. It's just like shifting from use of LAMBDA
> to an explicit data structure, to separate out the data being closed
> over from the code in question. Both of these things are very useful
> when you are actually trying to write a web application which needs
> live upgrading, load balancing, serializable state, et al.

If you deliver a language or tools that helps me do all of these things, I'd
love to use it.  As others have pointed out, there are some issues to
address.  In the meantime, I think call/cc is a useful tool to have.  If it
later turns out to have been a step on the road to better things, there's
nothing wrong with that.

> > > Thus continuations are really a subset of standard UNIX
> > > multiprocessing semantics :-)
> >
> > I presume you'd have to implement mutable variables via heap allocated ref
> > cells, or something, otherwise you'd end up cloning variables that should be
> > shared (if I've understood the model correctly).  So, since this model is
> > restricted, it's actually a subset of the continuation model, not the other
> > way around.  :)
>
> Actually, since it provides the continuation model when everything is
> heap-allocated, and something slightly different when some things are
> stack-allocated, doesn't that technically make continuations a subset
> of this? :-)

No.  :oP

Anton
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <87fzd1dy2n.fsf@nyct.net>
"Anton van Straaten" <·····@appsolutions.com> writes:

> I have no problem with that - but it doesn't make sense to have to deal with
> that in every macro that's ever written.  You might use a feature like
> GENSYM to implement some language/system-level construct, but what I'm
> saying is that your average, ordinary macro shouldn't have to deal with that
> level of abstraction - it has nothing to do with the problem domain.
> Wrapping the GENSYM in higher-level functions helps, but you're still
> dealing with the consequences of the issue - the need for GENSYM is just the
> symptom.

First you say you have no problem with using GENSYM under the covers. 
Then you say that having abstractions that use GENSYM under the covers
is a problem. I don't get it. FWIW, there is a quasi-standard REBINDING
(a.k.a. WITH-GENSYMS) macro that provides just such an abstraction. I
should probably use it, but I'm too lazy to. :)

> However, use of DEFMACRO is an exception - the programmer is required to
> deal with instantiating certain variable names before using them, much like
> a C programmer has to allocate memory before storing anything.

The same gripe could be made about the MOP. One has to instantiate the
slot-definition before using it. It's a question of layered protocols. 
Use the layer that fits the problem. This reminds me: you still haven't
shown an implementation of LOOP in syntax-rules (or maybe I haven't
gotten that far in the thread).

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <edA_b.17980$W74.12787@newsread1.news.atl.earthlink.net>
Rahul Jain wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > I have no problem with that - but it doesn't make sense to have to deal with
> > that in every macro that's ever written.  You might use a feature like
> > GENSYM to implement some language/system-level construct, but what I'm
> > saying is that your average, ordinary macro shouldn't have to deal with that
> > level of abstraction - it has nothing to do with the problem domain.
> > Wrapping the GENSYM in higher-level functions helps, but you're still
> > dealing with the consequences of the issue - the need for GENSYM is just the
> > symptom.
>
> First you say you have no problem with using GENSYM under the covers.
> Then you say that having abstractions that use GENSYM under the covers
> is a problem. I don't get it.

Sorry for not being clear.  I'm saying that in the case of DEFMACRO, the
abstraction necessarily leaks, which reduces its quality as an abstraction.
As I said, the need for GENSYM is just a symptom of the hygiene issue.

> FWIW, there is a quasi-standard REBINDING (a.k.a. WITH-GENSYMS)
> macro that provides just such an abstraction.

Even if you use that, you still have to unquote the references to the
variables it declares.  If you forget to unquote them, it's a bug - not only
that, but it's one that isn't necessarily detected by the compiler, and can
result in erroneous variable capture.  Now, I'm not saying that's the end of
the world, but you have to admit that in other contexts - such as when
dealing with ordinary variables - we don't accept that sort of thing.  Why
the exception for DEFMACRO?

> > However, use of DEFMACRO is an exception - the programmer is required to
> > deal with instantiating certain variable names before using them, much like
> > a C programmer has to allocate memory before storing anything.
>
> The same gripe could be given about MOP. One has to instantiate the
> slot-definition before using it. It's a question of layered protocols.
> Use the layer that fits the problem.

Yes, that's what I'm saying.  With DEFMACRO, there is no higher layer, but
there should be.  That's the exact issue that syntax-rules and syntax-case
address.

> This reminds me: you still haven't shown an implementation of LOOP
> in syntax-rules (or maybe I haven't gotten that far in the thread).

You'd have to pay me to implement LOOP, in any language.  It could be
implemented straightforwardly enough with syntax-case.  It may very well be
difficult to implement in syntax-rules, but that's not particularly
relevant - I've repeatedly said that I don't consider syntax-rules a
complete replacement for DEFMACRO.  However, syntax-rules does show how a
higher layer than DEFMACRO can work well, so there are lessons to be learned
from it.  At the very least, it offers an alternative way of thinking about
macros, to help combat the Sapir-Whorf hypothesis as it applies to CL
macros.

Anton
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <87vflxcbla.fsf@nyct.net>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Sorry for not being clear.  I'm saying that in the case of DEFMACRO, the
> abstraction necessarily leaks, which reduces its quality as an abstraction.

No. DEFMACRO is an abstraction over syntax. It is not an abstraction
over semantics. Compare this to CAR/CDR/CDAR vs. 
FIRST/SECOND/THIRD/REST. Note the lack of an equivalent of CDAR as a list-operator.
Macros are like cons cells. Syntax-rules are (I guess) like lists. Would
you claim that the existence of CDAR reduces the quality of cons cells
as an abstraction?

> As I said, the need for GENSYM is just a symptom, of the hygiene issue.

Like lack of side-effects, hygiene is NOT a goal. It's just a property
of _certain_ macros.

>> FWIW, there is a quasi-standard REBINDING (a.k.a. WITH-GENSYMS)
>> macro that provides just such an abstraction.
>
> Even if you use that, you still have to unquote the references to the
> variables it declares.  If you forget to unquote them, it's a bug - not only
> that, but it's one that isn't necessarily detected by the compiler, and can
> result in erroneous variable capture.  Now, I'm not saying that's the end of
> the world, but you have to admit that in other contexts - such as when
> dealing with ordinary variables - we don't accept that sort of thing.  Why
> the exception for DEFMACRO?

(let ((x 1)
      (y 2))
  (let ((x 2)) ;; OOPS. this was supposed to be z!
     ...))

Lisp compilers accept this sort of thing, and so do scheme compilers, afaik.

> Yes, that's what I'm saying.  With DEFMACRO, there is no higher layer, but
> there should be.  That's the exact issue that syntax-rules and syntax-case
> address.

OK, but that says nothing about DEFMACRO itself being considered
harmful. In other posts, it was mentioned that syntax-rules adds information
to the expansion, such as what is used as a binding and what isn't (in order
to help identify what should be captured/shadowed, and what should be
gensymed, I assume). I'm not sure if you could then use a
DEFMACRO-defined macro in the definition of a syntax-rules-defined
macro. If that's the case, that _severely_ limits the usefulness of
syntax-rules.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1f5qu$o7v$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

>>FWIW, there is a quasi-standard REBINDING (a.k.a. WITH-GENSYMS)
>>macro that provides just such an abstraction.
> 
> Even if you use that, you still have to unquote the references to the
> variables it declares.  If you forget to unquote them, it's a bug - not only
> that, but it's one that isn't necessarily detected by the compiler, and can
> result in erroneous variable capture.  Now, I'm not saying that's the end of
> the world, but you have to admit that in other contexts - such as when
> dealing with ordinary variables - we don't accept that sort of thing.  Why
> the exception for DEFMACRO?

Just reword that section, and IMHO one can see that this is just a 
question of reasonable defaults:

"Even if you use syntax-case, you still have to take special action to 
break hygiene.  If you forget to do that, it's a bug - not only that, 
but it's one that isn't necessarily detected by the compiler, and can 
result in erroneous absence of variable capture."

I understand that syntax-rules and syntax-case offer a different style for 
expressing macros, but I don't accept the hygiene part. The purpose of 
an abstraction is to suppress irrelevant elements. Name capture is an 
important and useful concept that you have to know about anyway, so it's 
not irrelevant.

>>This reminds me: you still haven't shown an implementation of LOOP
>>in syntax-rules (or maybe I haven't gotten that far in the thread).

I don't think anyone has claimed that syntax-rules is appropriate for 
expressing LOOP. If DEFMACRO's only purpose was to be able to express 
LOOP then we wouldn't need it, because LOOP is part of the standard.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Björn Lindberg
Subject: Re: Scheme macros
Date: 
Message-ID: <hcsr7wioxn0.fsf@tjatte.nada.kth.se>
"Anton van Straaten" <·····@appsolutions.com> writes:

> > FWIW, there is a quasi-standard REBINDING (a.k.a. WITH-GENSYMS)
> > macro that provides just such an abstraction.
> 
> Even if you use that, you still have to unquote the references to the
> variables it declares.  If you forget to unquote them, it's a bug - not only
> that, but it's one that isn't necessarily detected by the compiler, and can
> result in erroneous variable capture.  Now, I'm not saying that's the end of
> the world, but you have to admit that in other contexts - such as when
> dealing with ordinary variables - we don't accept that sort of thing.  Why
> the exception for DEFMACRO?

You don't *have* to unquote. If you want the macro invocation (foo 7)
to expand into (bar a 7), you can either write it as `(bar a ,x) or
(list 'bar 'a x). The unquoting in the first case is just a
convenience. But if you want to be able to refer to both variables
local to the macro as well as variables at the place of macro
expansion, I don't see how you can avoid having some kind of
quoting/unquoting to distinguish them. How is this distinction made in
the scheme macro systems?


Björn
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <87k72ddycl.fsf@nyct.net>
Brian Mastenbrook <····················@cs.indiana.edu> writes:

> Wait, stop. Who said anything about GENSYM being a variable name? Not
> me, for sure. GENSYM constructs a symbol. The fact that the evaluation
> semantics for symbols is to treat them as variables isn't necessarily
> relevant to the existence of GENSYM - it has the same use if you are
> writing an interpreter or compiler for a language with hygienic macros.

I think this is a key issue. Earlier, it was claimed that a Lisp-1 can
emulate a Lisp-2 by just renaming variables. This doesn't do anything
for the case where the plist of a symbol is used by the macro-function
for that symbol. That macro-function can then be attached to any symbols
and the plist can be used to customize its behavior. Given a MOP, I'd
rather implement this using funcallable-instances, but I don't see how
having a Lisp-1 gives you those for free. :)

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Ray Dillinger
Subject: Re: Scheme macros
Date: 
Message-ID: <403D291D.C39A3653@sonic.net>
Rahul Jain wrote:
> 
> I think this is a key issue. Earlier, it was claimed that a Lisp-1 can
> emulate a Lisp-2 by just renaming variables. This doesn't do anything
> for the case where the plist of a symbol is used by the macro-function
> for that symbol. That macro-function can then be attached to any symbols
> and the plist can be used to customize its behavior. Given a MOP, I'd
> rather implement this using funcallable-instances, but I don't see how
> having a Lisp-1 gives you those for free. :)

It's worth a note that scheme - the only currently popular Lisp-1 - doesn't 
require plists, and most scheme implementations don't have them. 

While in common lisp it's common to have a whole bunch of values stored 
in a symbol under different names in the symbol's property list, this 
is not possible in most schemes. In scheme, a symbol is merely a value, 
and its only distinguishing feature is the spelling of its name. 

Arguably, use of a plist means we're not really talking about a lisp-2 
anymore; at that point we're talking about a lisp-n, with n unbounded. 
And while it is strictly true that straightforward lexical transformations
can turn any lisp-N code into lisp-1 code where N is known, the argument 
breaks down on plists because N is unbounded. 

				Bear
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <87ad35j1ta.fsf@nyct.net>
Ray Dillinger <····@sonic.net> writes:

> Arguably, use of a plist means we're not really talking about a lisp-2 
> anymore; at that point we're talking about a lisp-n, with n unbounded. 
> And while it is strictly true that straightforward lexical transformations
> can turn any lisp-N code into lisp-1 code where N is known, the argument 
> breaks down on plists because N is unbounded. 

I disagree. The plist is merely some property of the symbol. The only
real difference between a lisp-1 and a lisp-2 is in how the compiler
_treats_ the symbols in different locations in the form it's given.

The problem is that by renaming a symbol when it's used in a
different context, we lose the association between the bindings in those
two contexts. Take, for example, the english word `record'. The name is
the same, but in different syntactic contexts, it has slightly different
(but related) meanings. Since symbols in lisp are objects with identity,
we can use this association to good effect.

In lisp, a symbol is really treated as an object, for use by
applications. In scheme, it's just a name for a lexical binding,
specifically for the use of the compiler. Any other uses will end up
causing conflicts.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Ray Dillinger
Subject: Re: Scheme macros
Date: 
Message-ID: <403F86AE.40E23C29@sonic.net>
Rahul Jain wrote:
> 
> Ray Dillinger <····@sonic.net> writes:
> 
> > Arguably, use of a plist means we're not really talking about a lisp-2
> > anymore; at that point we're talking about a lisp-n, with n unbounded.
> > And while it is strictly true that straightforward lexical transformations
> > can turn any lisp-N code into lisp-1 code where N is known, the argument
> > breaks down on plists because N is unbounded.
> 
> I disagree. The plist is merely some property of the symbol. The only
> real difference between a lisp-1 and a lisp-2 is in how the compiler
> _treats_ the symbols in different locations in the form it's given.
> 
> The problem is that by renaming a symbol when it's used in a
> different context, we lose the association between the bindings in those
> two contexts.

There is no association between the bindings in a lisp-1 to lose.  
Different symbols are merely different values.  You can't rename 
a symbol in a pure lisp-1, any more than you can rename 23 or 
#\a or the string value "rename."  So talking about what happens 
when a symbol is renamed in a different context is like talking 
about what happens when the moon turns into cheese.

If you can rename it, meaningfully, then its name is a value that 
can be changed without changing its identity, and there is at least
one other value stored in the identified structure.  That means we 
are simply not talking about a lisp-1 any more.

> Take, for example, the english word `record'. The name is
> the same, but in different syntactic contexts, it has slightly different
> (but related) meanings. Since symbols in lisp are objects with identity,
> we can use this association to good effect.

I know what you're talking about; but the argument simply does not 
apply to lisp in general. It only applies to lisp-2's and greater. 
 
> In lisp, a symbol is really treated as an object, for use by
> applications. In scheme, it's just a name for a lexical binding,
> specifically for the use of the compiler. Any other uses will end up
> causing conflicts.

*blip.*  Sorry, I had a mental trainwreck when I read this.  Up to now, 
I've been assuming when you used "lisp" that you were specifically 
including all dialects.  

For clarity, please, If you're comparing one lisp dialect to another, 
or comparing families of dialects such as lisp-1's and lisp-n's, 
use the names of the dialects you're talking about. 

				Bear
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <878yilw5mh.fsf@nyct.net>
Ray Dillinger <····@sonic.net> writes:

> Rahul Jain wrote:
>> The problem is that by renaming a symbol when it's used in a
>> different context, we lose the association between the bindings in those
>> two contexts.
>
> There is no association between the bindings in a lisp-1 to lose.

The point is that we were trying to convert a lisp-2 program into a
lisp-1 program. The fact that there is an association in a lisp-2 and
there is none in a lisp-1 is the problem.

>> Take, for example, the english word `record'. The name is
>> the same, but in different syntactic contexts, it has slightly different
>> (but related) meanings. Since symbols in lisp are objects with identity,
>> we can use this association to good effect.
>
> I know what you're talking about; but the argument simply does not 
> apply to lisp in general. It only applies to lisp-2's and greater. 

Yes, that's my point. You aren't allowed to do this in a lisp-1. 
Therefore, symbol munging doesn't fix the problem.

>> In lisp, a symbol is really treated as an object, for use by
>> applications. In scheme, it's just a name for a lexical binding,
>> specifically for the use of the compiler. Any other uses will end up
>> causing conflicts.
>
> *blip.*  Sorry, I had a mental trainwreck when I read this.  Up to now, 
> I've been assuming when you used "lisp" that you were specifically 
> including all dialects.  

Personally, I don't consider Scheme to be a Lisp dialect any more than I
consider C to be an Algol dialect.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Ray Dillinger
Subject: Re: Scheme macros
Date: 
Message-ID: <4042DC75.B555E8D4@sonic.net>
Rahul Jain wrote:
> 
> Ray Dillinger <····@sonic.net> writes:
> 
> > Rahul Jain wrote:
> >> The problem is that by renaming a symbol when it's used in a
> >> different context, we lose the association between the bindings in those
> >> two contexts.
> >
> > There is no association between the bindings in a lisp-1 to lose.
> 
> The point is that we were trying to convert a lisp-2 program into a
> lisp-1 program. The fact that there is an association in a lisp-2 and
> there is none in a lisp-1 is the problem.

It's still a simple syntactic transformation.  You wind up using 
N variables in a lisp-1 to represent what you have in a single 
symbol in a lisp-N. 

The syntactic transformation is that all the derived symbols get 
passed around together; effectively there are now N times as many 
variables in every scope and every call.

When N is unknown this is no longer possible to do without actually
analyzing the program, i.e., it's no longer merely syntactic. 

However, a tiny scrap of analysis suffices.  All you do is read the
program looking for different properties, including syntactic sugar for 
them like accessing a 'function' property via position in a combination. 

The number of different properties you find is N, and then you convert
from lisp-N to lisp-1 syntactically. 
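
For instance (rough sketch, with hypothetical -fn/-val names):

  ;; lisp-2 original, ITEM used in both namespaces:
  ;;   (flet ((item (n) (* n 10)))
  ;;     (let ((item 3))
  ;;       (item item)))          ; => 30
  ;; lisp-1 rendering after the split; the single symbol ITEM has
  ;; become two derived names that travel together:
  (let ((item-fn (lambda (n) (* n 10))))
    (let ((item-val 3))
      (item-fn item-val)))        ; => 30
  ;; and anything that was passed ITEM in the lisp-2 gets passed both
  ;; ITEM-FN and ITEM-VAL in the lisp-1.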


> > I know what you're talking about; but the argument simply does not
> > apply to lisp in general. It only applies to lisp-2's and greater.
> 
> Yes, that's my point. You aren't allowed to do this in a lisp-1.
> Therefore, symbol munging doesn't fix the problem.

Of course it does.  You just wind up with more symbols is all, 
and instead of being bound by their name, they are "bound" by being
explicitly passed around together. 

> > *blip.*  Sorry, I had a mental trainwreck when I read this.  Up to now,
> > I've been assuming when you used "lisp" that you were specifically
> > including all dialects.
> 
> Personally, I don't consider Scheme to be a Lisp dialect any more than I
> consider C to be an Algol dialect.

Hmmm.  Fully parenthesized prefix notation, higher-order functions, 
anonymous functions, code-as-data, dynamic typing, car, cons, cdr, 
lambda, quote, eval,....  sure looks like a lisp to me.  I think 
you're wrong about that.

Interesting take on things, though.  Are there any lisp-1's which 
you do consider to be a Lisp dialect? If so, what's the distinguishing 
feature? 

				Bear
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <873c8stu9j.fsf@nyct.net>
Ray Dillinger <····@sonic.net> writes:

> However, a tiny scrap of analysis suffices.  All you do is read the
> program looking for different properties, including syntactic sugar for 
> them like accessing a 'function' property via position in a combination. 
>
> The number of different properties you find is N, and then you convert
> from lisp-N to lisp-1 syntactically. 

This sounds like an interesting theory. I don't know if it's doable in
practice. How would you figure out all semantic connotations of a symbol
throughout all the code that is being called in the presence of dynamic
redefinition and forward references? Wouldn't it be easier to just treat
symbols as objects when they're used as objects? You might as well pass
around standard-instances by exploding them into their component slots.

>> Personally, I don't consider Scheme to be a Lisp dialect any more than I
>> consider C to be an Algol dialect.
>
> Hmmm.  Fully parenthesized prefix notation, higher-order functions, 
> anonymous functions, code-as-data, dynamic typing, car, cons, cdr, 
> lambda, quote, eval,....  sure looks like a lisp to me.  I think 
> you're wrong about that.

Note that I consider dylan to be more of a lisp than scheme. Also, CAR
and CDR don't even do the same things in lisp and scheme. I can add them
to C rather trivially, too, once I choose lisp semantics or scheme
semantics.

Scheme code (in fully-parenthesized prefix notation, which, as a type of
criterion, would put perl and C into the same language family) is a
bunch of text that gets parsed into some structured intermediate form
and then has semantic analysis applied to it. All the syntax is defined
textually, not structurally. The "code data" that exists in scheme is
about as close to lisp's s-expressions as s-expressions are to python's
code data, as we've learned in the discussion about scheme macros.

Perl has lambda (anon. functions, higher-order functions) and eval. 

Creating literals (quote) is nothing special to any language. Even C has
it.

Dynamic typing is hardly unique or even universal in the family of
languages that includes lisp and scheme (take ML, for example).

> Interesting take on things, though.  Are there any lisp-1's which 
> you do consider to be a Lisp dialect? If so, what's the distinguishing 
> feature? 

As with all terms, it's not a single feature: there is a center point
and there is a range which varies based on the context of a given
communication. I consider there to be a language family that contains
lisp, scheme, dylan, ocaml, and perl, as they all share some fundamental
features.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Ray Dillinger
Subject: Re: Scheme macros
Date: 
Message-ID: <40454DC1.2930697F@sonic.net>
Rahul Jain wrote:
> 
> > Hmmm.  Fully parenthesized prefix notation, higher-order functions,
> > anonymous functions, code-as-data, dynamic typing, car, cons, cdr,
> > lambda, quote, eval,....  sure looks like a lisp to me.  I think
> > you're wrong about that.
> 
> Note that I consider dylan to be more of a lisp than scheme. Also, CAR
> and CDR don't even do the same things in lisp and scheme. I can add them
> to C rather trivially, too, once I choose lisp semantics or scheme
> semantics.
 
That's weird, to me. Dylan code isn't expressed by a sublanguage
of the language it uses for data, so it wasn't even in the running
to be a lisp. 

After a bit of reflection, I get what you're angling at with CAR and CDR
having different semantics in Scheme and Common Lisp; Scheme has them 
returning values, all the time, whereas CL has them returning values in 
some contexts and mutable locations in others.  So scheme is biased 
much more strongly in favor of functional code. Or, more precisely speaking, 
against side effecting code. 

However, I don't think either way is any more or less "lispy;" to the 
extent that it is, Scheme's CAR and CDR have the semantics described by 
McCarthy way back when and CL has an extension invented since, so this
argues for CL being less a lisp than scheme(!). 

Also, the difference in CAR/CDR has nothing to do with the fundamental 
differences between lisp-1's and lisp-n's, which we've been discussing
up to now. 


> Scheme code is a
> bunch of text that gets parsed into some structured intermediate form
> and then has semantic analysis applied to it. 

Scheme's eval takes structured list data, the same as Common Lisp's. 
I can create and evaluate scheme code on the fly by manipulating list 
data, so this is just clearly wrong.  There is *NO* reading of R5RS
that allows eval to be defined on mere text, period.  You must stop 
making this absurd claim; it can only damage your credibility.

The only difference is that the standard doesn't have to talk about 
it as much in terms of abstract forms because scheme doesn't have 
reader macros to cause the distinction between forms and text to be 
as clear.

> > Interesting take on things, though.  Are there any lisp-1's which
> > you do consider to be a Lisp dialect? If so, what's the distinguishing
> > feature?
> 
> As with all terms, it's not a single feature: there is a center point
> and there is a range which varies based on the context of a given
> communication. I consider there to be a language family that contains
> lisp, scheme, dylan, ocaml, and perl, as they all share some fundamental
> features.

So, and this is just a guess, do you consider Common Lisp to be the precise
center of what Lisp is?  Is Lisp defined in terms of distance-from-CL 
for you, or do you see CL as occupying its own particular spot of the 
universe of Lisp? As a lexically-scoped Lisp-2, it is representative of 
less than 10% of Lisp's history. 

				Bear
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <8765dmlihn.fsf@nyct.net>
Ray Dillinger <····@sonic.net> writes:

> That's weird, to me. Dylan code isn't expressed by a sublanguage
> of the language it uses for data, so it wasn't even in the running
> to be a lisp. 

Well, neither is scheme code, as I said we learned in the discussion
about scheme macros. Scheme code is a totally different kind of data
structure that includes lots of semantic information, unlike lisp code.

> After a bit of reflection, I get what you're angling at with CAR and CDR
> having different semantics in Scheme and Common Lisp; Scheme has them 
> returning values, all the time, whereas CL has them returning values in 
> some contexts and mutable locations in others.

No, that's done by the modifier macros themselves. They run the
setf-expander you defined and get the 5 values needed to manipulate that
operator's conceptual "place".
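
For instance (rough sketch):

  ;; (incf (car x)) never asks CAR for a "location"; INCF consults the
  ;; setf expander for CAR at macroexpansion time:
  (get-setf-expansion '(car x))
  ;; => five values: temporary variables, their value forms, the store
  ;;    variable, a storing form (typically built on RPLACA or some
  ;;    implementation-internal equivalent), and an accessing form,
  ;;    which INCF splices into its expansion.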

> So scheme is biased much more strongly in favor of functional code. 
> Or, more precisely speaking, against side effecting code.

No, scheme is just biased at making you _suffer_ if you want to make
side-effecting code efficient. (Not that any side-effect-happy language
is any better.)

> Also, the difference in CAR/CDR has nothing to do with the fundamental 
> differences between lisp-1's and lisp-n's, which we've been discussing
> up to now. 

Yes, I was using them as examples.

>> Scheme code is a
>> bunch of text that gets parsed into some structured intermediate form
>> and then has semantic analysis applied to it. 
>
> Scheme's eval takes structured list data, the same as Common Lisp's.

But the syntax isn't defined in terms of structured list data.

> I can create and evaluate scheme code on the fly by manipulating list 
> data, so this is just clearly wrong.  There is *NO* reading of R5RS
> that allows eval to be defined on mere text, period.  You must stop 
> making this absurd claim; it can only damage your credibility.

EVAL operates on something other than the definition of the syntax in
the standard.

> The only difference is that the standard doesn't have to talk about 
> it as much in terms of abstract forms because scheme doesn't have 
> reader macros to cause the distinction between forms and text to be 
> as clear.

It also doesn't have syntactic macros, just semantic ones. (Officially)

> So, and this is just a guess, do you consider Common Lisp to be the precise
> center of what Lisp is?  Is Lisp defined in terms of distance-from-CL 
> for you, or do you see CL as occupying its own particular spot of the 
> universe of Lisp? As a lexically-scoped Lisp-2, it is representative of 
> less than 10% of Lisp's history. 

Well, it's the only current general purpose language that calls itself
"Lisp". Including scheme in any family with CL makes for a very broad
family, as I said.

Note also that communities change over time. Is "English" defined in
terms of distance from Modern English or Old English?

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Scheme macros
Date: 
Message-ID: <pan.2004.03.04.10.08.22.321275@knm.org.pl>
On Tue, 02 Mar 2004 22:25:24 -0500, Rahul Jain wrote:

>> Scheme's eval takes structured list data, the same as Common Lisp's.
> 
> But the syntax isn't defined in terms of structured list data.

But it's a question of language description, no? It could be, while being
equivalent. Or is there something which prevents defining Scheme semantics
in terms of the list data? Is it legal to write (foo . (bar)) instead of
(foo bar) as procedure application? Guile allows that.

Ironically, Common Lisp semantics must be defined in terms of characters,
not list data. Because of readtable changes, and changes of variables
which influence the reader (e.g. the current package), you can't read the
whole file and then define the semantics in terms of the read data. You
must execute statements one by one while reading them. So either you
define the semantics of a single statement at a time, and say that all
forms in a file are just read and executed (is it true without exceptions?)
or the semantics must be a function of character data in the file.
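
For instance (untested sketch), a file like this can't simply be READ in
full up front, because how the characters "[1 2 3]" read depends on the
first form having been executed already:

  (eval-when (:compile-toplevel :load-toplevel :execute)
    (set-macro-character #\[
      (lambda (stream char)
        (declare (ignore char))
        (coerce (read-delimited-list #\] stream t) 'vector)))
    (set-syntax-from-char #\] #\)))

  (defparameter *v* [1 2 3])   ; reads as a vector only because the
                               ; reader-macro form above ran first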

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <AfK2c.691$%06.357@newsread2.news.pas.earthlink.net>
Rahul Jain wrote:
> Ray Dillinger <····@sonic.net> writes:
>
>> That's weird, to me. Dylan code isn't expressed by a sublanguage
>> of the language it uses for data, so it wasn't even in the running
>> to be a lisp.
>
> Well, neither is scheme code, as I said we learned in the discussion
> about scheme macros. Scheme code is a totally different kind of data
> structure that includes lots of semantic information, unlike lisp code.

Scheme's representation of code provides a superset of what CL offers.  I
guess that makes Scheme a superset of CL.  What's the point of this game,
again?

Anton
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <87ishgl96g.fsf@nyct.net>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Scheme's representation of code provides a superset of what CL offers.  I
> guess that makes Scheme a superset of CL.  What's the point of this game,
> again?

Um... a higher level construct isn't a "superset" of a lower-level
construct. It hides information that is determined to be irrelevant at
the higher level. Not all languages operate semantically in the same way
as scheme. CL's code representation is just a symbolic tree structure. 
You get to assign your own semantics to it. In short, Scheme's
representation of code does NOT provide a superset of what CL offers. It
operates at a DIFFERENT level, and some operators need information from
the lower-level.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <wfR2c.8851$%06.6801@newsread2.news.pas.earthlink.net>
Rahul Jain wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Scheme's representation of code provides a superset of what CL offers.  I
> > guess that makes Scheme a superset of CL.  What's the point of this game,
> > again?
>
> Um... a higher level construct isn't a "superset" of a lower-level
> construct. It hides information that is determined to be irrelevant at
> the higher level.

I misspoke.  I was referring to the syntax-case syntax objects, which are
indeed a superset of lists.  So, it's R5RS Scheme plus syntax-case which is
the superset of CL, in terms of syntax representation and manipulation.
Syntax-case doesn't hide any information relative to defmacro - in fact, it
supports inclusion of information which defmacro doesn't support.

My point is that the argument that Scheme is somehow less Lisp-like because
it doesn't represent code specifically as a list, arises from a narrow
conception of code representation in Lisps.  A rebuttal to that narrow
conception is to point out that Scheme supports a code representation which
can do everything that the standard CL representation can do, and more, i.e.
it's a superset.

As Will Clinger has pointed out, some of the modern macro systems were
designed for CL, and at least one of them - syntax-rules - is available for
CL.  An unavoidable characteristic of Lisps is that they can naturally
support multiple macro systems, even though their standards tend to specify
only one.

Anton
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <080320040205187750%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <···················@newsread2.news.pas.earthlink.net>,
Anton van Straaten <·····@appsolutions.com> wrote:

> My point is that the argument that Scheme is somehow less Lisp-like because
> it doesn't represent code specifically as a list, arises from a narrow
> conception of code representation in Lisps.

I don't think macro hygiene has any bearing on the issue of lispiness -
after all, you have to move from S-Expressions to something richer at
some point in the compilation phase. The problem right now is that
Scheme is not specified to involve lists at any point in the read /
compile cycle, and thus the following forms are not equivalent (in fact
the second is illegal):

(lambda (x) x)
(lambda (x . ()) x)

The fact that the semantics of Lisp are divorced from its textual
representation has some happy side-effects, most notably the InterLisp
structure editor (which I would like to reproduce in either Cocoa or
CLIM). So far, I have been unable to determine the rationale for the
way it is specified in R5RS or for the MzScheme extension to psyntax
which allows the macroexpansion-time differentiation of those two forms
(!). Thankfully I have not yet encountered schemes other than MzScheme
which do not use the reader to read code.

(Obligatory disclaimer: I'm a passionate S-Expression advocate.)

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <y8qbjubq.fsf@ccs.neu.edu>
Brian Mastenbrook <····················@cs.indiana.edu> writes:

> In article <···················@newsread2.news.pas.earthlink.net>,
> Anton van Straaten <·····@appsolutions.com> wrote:
>
>> My point is that the argument that Scheme is somehow less Lisp-like because
>> it doesn't represent code specifically as a list, arises from a narrow
>> conception of code representation in Lisps.
>
> I don't think macro hygiene has any bearing on the issue of lispiness -
> after all, you have to move from S-Expressions to something richer at
> some point in the compilation phase. The problem right now is that
> Scheme is not specified to involve lists at any point in the read /
> compile cycle, and thus the following forms are not equivalent (in fact
> the second is illegal):
>
> (lambda (x) x)
> (lambda (x . ()) x)

I don't know why you think this.  Only the most contorted reading of
the spec could lead to this conclusion, and as a practical matter I
think it would be impossible to implement a consistent, correct,
Scheme that behaved that way.
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <080320040954348675%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@ccs.neu.edu>, Joe Marshall <···@ccs.neu.edu>
wrote:

> > (lambda (x) x)
> > (lambda (x . ()) x)
> 
> I don't know why you think this.  Only the most contorted reading of
> the spec could lead to this conclusion, and as a practical matter I
> think it would be impossible to implement a consistent, correct,
> Scheme that behaved that way.

<lambda expression> --> (lambda <formals> <body>)
<formals> --> (<variable>*) | <variable> | (<variable>+ . <variable>)

() is not a variable. I don't see how this reading is "most contorted",
given that R5RS seems to take great pains to separate program text and
data (while still allowing that programs can be read as data). As to
whether you actually could implement a scheme like this - given that
MzScheme can already distinguish between the two in its syntax-case, I
don't see why it would be inconsistent for it to disallow (x . ()) as a
formals list.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <d67njraz.fsf@ccs.neu.edu>
Brian Mastenbrook <····················@cs.indiana.edu> writes:

> In article <············@ccs.neu.edu>, Joe Marshall <···@ccs.neu.edu>
> wrote:
>
>> > (lambda (x) x)
>> > (lambda (x . ()) x)
>> 
>> I don't know why you think this.  Only the most contorted reading of
>> the spec could lead to this conclusion, and as a practical matter I
>> think it would be impossible to implement a consistent, correct,
>> Scheme that behaved that way.
>
> <lambda expression> --> (lambda <formals> <body>)
> <formals> --> (<variable>*) | <variable> | (<variable>+ . <variable>)
>
> () is not a variable. 

Yes, but you are looking at section 7.1.3, and the right hand side of
the BNF productions are (the printed representation of) datum, not
character strings.  That is,  (<variable>*) is a list of variables,
not a parenthesis, a series of characters parsable as identifiers
separated by whitespace, followed by a close parenthesis.

(lambda (x . ()) x)  by section 3.3 is simply a different printed
representation for (lambda (x) x)
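
Concretely (a rough illustration, not a quote from the report):

  (equal? '(x . ()) '(x))        ; => #t -- the same datum once read
  ((lambda (x . ()) x) 'hello)   ; => hello in any Scheme whose front end
                                 ; goes through READ, since the reader
                                 ; yields the same list for both spellings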

> I don't see how this reading is "most contorted", given that R5RS
> seems to take great pains to separate program text and data (while
> still allowing that programs can be read as data).

I don't know why you think this:

  - Section 1.2  ``Scheme, like most dialects of Lisp, employs a fully
    parenthesized prefix notation for programs and (other) data; the
    grammar of  Scheme generates a sublanguage of the language used
    for data. An important consequence of this simple, uniform
    representation is the susceptibility of Scheme programs and data
    to uniform treatment by other Scheme programs. For example, the
    eval procedure evaluates a Scheme program expressed as data.''

  - Section 3.3 ``Scheme's syntax has the property that any sequence
    of characters that is an expression is also the external
    representation of some object.''

  - Section 6.3.2 ``Within literal expressions and representations of objects
    read by the read procedure, the forms '<datum>, `<datum>,
    ,<datum>, and ,@<datum> denote two-element lists whose first
    elements are the symbols quote, quasiquote, unquote, and
    unquote-splicing, respectively.  The second element in each case
    is <datum>. This convention is supported so that arbitrary Scheme
    programs may be represented as lists.  That is, according to
    Scheme's grammar, every <expression> is also a <datum> (see
    section 7.1.2). Among other things, this permits the use of the
    read procedure to parse Scheme programs.''

  - Section 6.5  ``Expression must be a valid Scheme expression
    represented as data.''

  - Section 7.1.2 ``Note that any string that parses as an
    <expression> will also parse as a <datum>.''

It is clear that R5RS considers expressions to be a subgrammar of data.
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <080320041149302468%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@ccs.neu.edu>, Joe Marshall <···@ccs.neu.edu>
wrote:

> Yes, but you are looking at section 7.1.3, and the right hand side of
> the BNF productions are (the printed representation of) datum, not
> character strings.  That is,  (<variable>*) is a list of variables,
> not a parenthesis, a series of characters parsable as identifiers
> separated by whitespace, followed by a close parenthesis.

Section 7.1.2 - "Note that any string that parses as an <expression>
will also parse as a <datum>." Note that this does not state or imply
the converse.

> (lambda (x . ()) x)  by section 3.3 is simply a different printed
> representation for (lambda (x) x)

Section 3.3 describes external representations, aka 7.1.2. It is
required that program text parse as <datum>, but not the other way
around.

>   - Section 1.2  ``Scheme, like most dialects of Lisp, employs a fully
>     parenthesized prefix notation for programs and (other) data; the
>     grammar of  Scheme generates a sublanguage of the language used
>     for data. An important consequence of this simple, uniform
>     representation is the susceptibility of Scheme programs and data
>     to uniform treatment by other Scheme programs. For example, the
>     eval procedure evaluates a Scheme program expressed as data.''

To eval, (lambda (x . ()) x) is the same as (lambda (x) x) because the
latter is the standard external representation. I don't see how this
implies that (lambda (x . ()) x) is allowable as <expression> in
program text.

>   - Section 3.3 ``Scheme's syntax has the property that any sequence
>     of characters that is an expression is also the external
>     representation of some object.''

Right, which implies that <expression> is <datum>, not the converse.

>   - Section 6.3.2 ``Within literal expressions and representations of objects
>     read by the read procedure, the forms '<datum>, `<datum>,
>     ,<datum>, and ,@<datum> denote two-element lists whose first
>     elements are the symbols quote, quasiquote, unquote, and
>     unquote-splicing, respectively.  The second element in each case
>     is <datum>. This convention is supported so that arbitrary Scheme
>     programs may be represented as lists.  That is, according to
>     Scheme's grammar, every <expression> is also a <datum> (see
>     section 7.1.2). Among other things, this permits the use of the
>     read procedure to parse Scheme programs.''

Once again.

>   - Section 6.5  ``Expression must be a valid Scheme expression
>     represented as data.''

And again.

>   - Section 7.1.2 ``Note that any string that parses as an
>     <expression> will also parse as a <datum>.''

And again.

Section 1.2: "The `read' procedure parses its input as data (see section
7.1.2, External representations), not as program."

> It is clear that R5RS considers expressions to be a subgrammar of data.

Which is precisely my point - a subgrammar, and a strict subgrammar.
Thus Schemes are permitted to use something other than the reader to read
program text, as indeed MzScheme does.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <u10zi1lu.fsf@ccs.neu.edu>
Brian Mastenbrook <····················@cs.indiana.edu> writes:

> In article <············@ccs.neu.edu>, Joe Marshall <···@ccs.neu.edu>
> wrote:
>
>> Yes, but you are looking at section 7.1.3, and the right hand side of
>> the BNF productions are (the printed representation of) datum, not
>> character strings.  That is,  (<variable>*) is a list of variables,
>> not a parenthesis, a series of characters parsable as identifiers
>> separated by whitespace, followed by a close parenthesis.
>
> Section 7.1.2 - "Note that any string that parses as an <expression>
> will also parse as a <datum>." Note that this does not state or imply
> the converse.

That is irrelevant to my point.  (lambda (x . ()) x) is a member of
the right-hand side of the BNF in section 7.1.3, and is therefore an
expression. 

>>   - Section 1.2  ``Scheme, like most dialects of Lisp, employs a fully
>>     parenthesized prefix notation for programs and (other) data; the
>>     grammar of  Scheme generates a sublanguage of the language used
>>     for data. An important consequence of this simple, uniform
>>     representation is the susceptibility of Scheme programs and data
>>     to uniform treatment by other Scheme programs. For example, the
>>     eval procedure evaluates a Scheme program expressed as data.''
>
> To eval, (lambda (x . ()) x) is the same as (lambda (x) x) because the
> latter is the standard external representation. I don't see how this
> implies that (lambda (x . ()) x) is allowable as <expression> in
> program text.

You are asserting that EVAL can interpret a superset of the language?!
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <k72e9iz6.fsf@comcast.net>
Brian Mastenbrook <····················@cs.indiana.edu> writes:

> And here is where the next Lisp/Scheme debate is going to start up.
> When Schemers speak of continuations, they really mean implicit
> continuations - the idea that the program should be granted access to
> the continuations that are flying around under the hood in the
> implementation. However, the name "continuation" is considerably more
> general, or else anybody using CPS needs to find a new term.
>
> In fact I would argue that the main competition to call/cc
> continuations is going to be CPS, and that CPS has a really big
> advantage: after you write your program, you can identify what types of
> continuations you use, and then change their representation to a list
> of a name and arguments which describes the continuation of that type.

The problem with this approach is that this requires a global
transformation and you may not have access to all the code.

-- 
~jrm
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <230220040843467186%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@comcast.net>, Joe Marshall
<·············@comcast.net> wrote:

> The problem with this approach is that this requires a global
> transformation and you may not have access to all the code.

Where is it? Locked away somewhere?

You only need to transform your own code; I would hope that any other
function you would be calling from a web page generator would be
simple.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <vflxlvwk.fsf@ccs.neu.edu>
Brian Mastenbrook <····················@cs.indiana.edu> writes:

> In article <············@comcast.net>, Joe Marshall
> <·············@comcast.net> wrote:
>
>> The problem with this approach is that this requires a global
>> transformation and you may not have access to all the code.
>
> Where is it? Locked away somewhere?

Essentially.  Are you going to CPS convert the entire system?  Even
the primitives?

> You only need to transform your own code; I would hope that any other
> function you would be calling from a web page generator would be
> simple.

Like mapcar?  CPS conversion is a model for
call-with-current-continuation, but it is not a practical
substitution.
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Scheme macros
Date: 
Message-ID: <pan.2004.02.23.14.22.34.812876@knm.org.pl>
On Mon, 23 Feb 2004 08:43:46 -0500, Brian Mastenbrook wrote:

>> The problem with this approach is that this requires a global
>> transformation and you may not have access to all the code.
> 
> Where is it? Locked away somewhere?
> 
> You only need to transform your own code; I would hope that any other
> function you would be calling from a web page generator would be
> simple.

Calling higher order functions like mapcar and passing them functions
which capture or invoke continuations requires a CPS transformer to
reimplement these functions (if their source is not visible).

Some builtin functions are hard to transform. For example funcall: you
should determine whether its argument is a transformed function or not,
which is in general not known until runtime.

CPS transformation by code walking is necessarily incomplete because
it's not possible to transform higher order black boxes.
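
For example (untested sketch):

  ;; a procedure CPS-converted by hand ...
  (define (add1/k n k) (k (+ n 1)))
  ;; ... cannot be handed to the ordinary MAP, which knows nothing about
  ;; the extra continuation argument, so the transformer also has to
  ;; supply a CPS version of MAP itself:
  (define (map/k f/k lst k)
    (if (null? lst)
        (k '())
        (f/k (car lst)
             (lambda (v)
               (map/k f/k (cdr lst)
                      (lambda (rest) (k (cons v rest))))))))
  ;; (map/k add1/k '(1 2 3) (lambda (r) r))   ; => (2 3 4)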

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1e71o$gbk$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

> If you look up "macro transformer" in the index, it points you to a page
> which contains the following definition (Sec. 4.3, Macros):
> 
> "The set of rules that specifies how a use of a macro is transcribed into a
> more primitive expression is called the 'transformer' of the macro."
> 
> I don't think it's hiding anything.  Do you think otherwise?

I think Dybvig's explanations in "Writing Hygenic Macros in Scheme with 
Syntax-Case" are much clearer:

"Macro transformers are procedures of one argument. The argument to a 
macro transformer is a syntax object, which contains contextual 
information about an expression in addition to its structure. [...]"

Generally, I need mental models of how language constructs are mapped to 
the lower levels in order to understand and trust them. This is true for 
almost every language feature that I use. Schemers seem to be more 
mathematically inclined and prefer other perspectives to understanding 
language features, and maybe I am just not in the right target audience.

> But I'll ignore my own advice and take a stab at explaining syntax-case,
> starting from a defmacro perspective.

Thanks for your explanations. They help a lot.

> Start with defmacro, and imagine that instead of quoting syntax using
> quasiquote, you use a special form, 'syntax', which instead of returning a
> list as quasiquote does, returns a syntax object, i.e. an instance of an
> abstract data type which represents a piece of syntax.  This type has an API
> which supports various code walking & manipulation capabilities.  It can
> also be converted to a list (or whatever value the original syntax
> represented) via 'syntax-object->datum'.
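
So, if I understand this correctly, a transformer written against that
API has roughly this shape (untested, and SWAP! is just my own toy
example, in a Scheme that provides syntax-case):

(define-syntax swap!
  (lambda (stx)
    (syntax-case stx ()
      ((_ a b)
       (syntax (let ((tmp a))
                 (set! a b)
                 (set! b tmp)))))))

...and syntax-object->datum applied to stx would give back the plain
list (swap! x y), minus the contextual information.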

[...]
> Syntax objects
> are a richer way to represent program syntax than lists, and their uses go
> beyond just macros.

That's the first glimpse I have caught that this might be something 
worthwhile to learn. Where is that API documented/specified?

> I think I'll stop there for now, since I have other things to do!

Thanks a lot for taking your time to provide these explanations. They 
help a lot.

>>Examples like those given in
>>http://okmij.org/ftp/Scheme/r5rs-macros-pitfalls.txt seem to indicate
>>that syntax-rules just trade one set of possible pitfalls with a
>>different set, but along that way the conceptual simplicity is lost.
> 
> I don't accept that "conceptual simplicity is lost" with syntax-rules.  It's
> a different approach, which in some ways is conceptually simpler than
> defmacro, since it doesn't require the user to manually keep track of the
> different levels at which the macro operates.  The pitfalls you mention may
> indeed be flaws in syntax-rules - I'm not familiar enough with them to
> comment - but I find that syntax-rules works very well for many kinds of
> macros, better than defmacro in fact.

OK, my preliminary conclusion is that it is just a different programming 
style for expressing macros.

> I'm not sure what you mean about not seeing the problem.  One of the
> problems mentioned in the article is that syntax-rules pattern variables
> don't shadow.  I don't know if there's a justification for that, or it's
> simply a bug in the design of syntax-rules.  But you usually get an error if
> you make this mistake, and it's easy to fix, and easy to avoid.  It doesn't
> mean that syntax-rules is not useful, and it's still better than defmacro,
> which you can't dispute until you've learned syntax-rules.  ;)

Until now I have found defmacro very easy to learn, and most of the 
stuff I have read so far about Scheme's macro system(s) is very 
inaccessible for me. This makes defmacro de facto more useful to me. Of 
course, this might change in the future.

>>Lisp and Scheme bring you metacircularity. As soon as Pythonistas write
>>program generators, it's clear that their laguage is missing something
>>important. Of course, they can write a Lisp interpreter in Python, but
>>that's besides the point.
>>
>>Do you really think that syntax-case is an equally important step forward?
> 
> In some respects, yes, but that's not what I really meant.  It's easy to
> look at something from the outside and find reasons not to try it, and that
> was my main point.  But the points you've been picking on don't seem very
> substantial to me - it seems as though you're looking for reasons to ignore
> these systems, rather than looking for reasons you might want to learn them.

That's not so far from the truth. See below.

> Of course, a CL programmer who wants to write standard CL code, obviously
> has little incentive to be interested in other macro systems.  But if your
> interests cross multiple languages, then there's value in the Scheme macro
> systems, at the very least in the sense that learning more than one approach
> to the same problem expands the horizons of your understanding of the
> problem.

I accept that.

>>>>BTW, what you really need to make something like DEFMACRO work is, on
>>>>top of that, of course quasiquotation, GENSYM/MAKE-SYMBOL or
>>>>string->uninterned-symbol and most probably a Lisp-2.
>>>
>>>I don't see that Lisp-2 is an issue.
>>
>>See http://citeseer.nj.nec.com/bawden88syntactic.html
> 
> I'm familiar with why people claim it's an issue, but in practice I think
> it's not significantly worse than the issue of hygiene in defmacros in
> general.  As I've said and defended here once before, Lisp-1 can express any
> Lisp-2 program, simply by changing any conflicting names so as not to
> conflict - a conceptually trivial transformation, with consequences which
> are primarily subjective.  It would have an impact on porting Lisp-2 macros
> to Lisp-1, but it doesn't limit what you can easily express in Lisp-1.

If you are talking about an automatic transformation of names here, then 
I wouldn't agree that this is a relevant argument. Programmers choose 
names because they are descriptive of the nature of the conceptual 
entities these names stand for. An automatic translation loses this 
aspect, at least to a certain degree.

If I say (let ((list ...)) ...) in Common Lisp, I have chosen the name 
"list" to say something about the variable that it denotes. If that name 
gets automatically translated to some arbitrary other name it either 
loses a certain amount of that descriptive quality ("lst"), or it is 
amended with some irrelevant information ("list-var").

It's true that this doesn't essentially limit what you can express in 
Lisp-1, but it's also true that there _is_ a fundamental difference 
between functions and values, and even though the separation of these 
two spaces was an accident in the history of Lisp, it still matches an 
important qualitative distinction.

> Put another way, having the ability to accidentally compensate for hygiene
> violations in some cases - where multiple namespaces happen to prevent the
> problem - isn't a solution to the general problem of not having hygiene.
> Since you haven't solved the general problem, you still have to address
> questions of hygiene, in various low-level ways.  A single namespace doesn't
> makes this problem worse in any significant way.

I disagree. Names for functions and values cannot accidentally clash in 
a Lisp-2, and this is an important category of potential clashes in a 
Lisp-1, even outside the domain of macro programming.

>>Here it helps that a Lisp-2 separates variables and functions by
>>default. Variables are usually not important parts of an application's
>>ontology. If they are, the convention in Common Lisp is to use proper
>>naming schemes, like asterisks for special variables. Effectively, this
>>creates a new namespace.
> 
> Mmm, asterisks.  This, to me, is why the whole Lisp-1/2 debate is moot.  The
> solution is simply Lisp-N, where you can define namespaces, modules, etc.
> and control how they're used.  See PLT Scheme etc.

Unfortunately, there don't seem to be any conventions for separating names
for functions and values that Schemers adhere to. This is an important 
aspect of Common Lisp's naming convention for special variables.

The point is that the number of cases in which hygiene is a real issue 
is considerably reduced by the combination of Lisp-2-ness and naming 
conventions in Common Lisp, so that the remaining cases aren't pressing 
anymore. For macros, you can just use a set of idioms that you can 
easily memorize and you're done with it.
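
For instance, the whole idiom for a macro that introduces a temporary is
just this (rough sketch):

(defmacro square (x)
   (let ((tmp (gensym "X")))
     `(let ((,tmp ,x))
        (* ,tmp ,tmp))))

;; (square (incf i)) evaluates (incf i) once and captures nothing.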

>>4. The fourth example can be solved with a proper GENSYM for "use" in
>>the "contorted" macro.
> 
> The phrase "proper GENSYM" is an oxymoron.  GENSYM operates at a strangely
> low level of abstraction.  Why don't you use GENSYM when declaring normal
> lexical variables in a procedure?  Rhetorical question, of course 
[...]

No, not really. Some time ago, I started some thought experiments on
how one could design a Lisp dialect that works like that, and I think 
this has the potential to add some interesting features.

>>Some while ago, I wanted to experiment with continuations in Scheme.
>>Apart from the fact that not all Schemes seem to implement continuations
>>fully and/or correctly (see
>>http://sisc.sourceforge.net/r5rspitresults.html ), the fact that the
>>respective documentations make me feel uneasy about whether I have to
>>relearn programming techniques for totally unrelated areas is a clear
>>downside IMHO.
> 
> We're straying far afield here.  ;)  But I'll give my opinion about
> continuations, too.

Maybe my posting was too ambiguous, but I really didn't want to 
talk about continuations here. I understand them (because I have a 
mental model how they are mapped to parameter passing and procedure 
invocation mechanisms!), and I have a fair understanding of what they 
can be used for.

However, the point is that in order to experiment with continuations, I 
have to put up with all the other design decisions of Scheme. For most 
of Scheme's features that's ok, but wrt macros this can be annoying.

I have made a few attempts to get an understanding of Scheme macros in 
the past, and I have always found them too hard to understand, especially
in comparison to the only minor improvements they seemed to make. The 
optimal thing would have been a Common Lisp implementation with call/cc 
built in, but such a beast doesn't seem to exist.

So yes: If you want to use Scheme for something that's not really 
related to macros, and you don't _want_ to learn syntax-rules or 
syntax-case for some reason, and you are not sure whether gensym 
actually works, because most Schemers tell you that this is evil anyway, 
then this can be annoying. I think that this is actually a disservice to 
Scheme and Lisp in general.

I have no problem agreeing with you that defmacro, syntax-rules and 
syntax-case are just different ways to implement macros, maybe with 
their own respective strengths in the latter two cases that I cannot 
judge at the moment. However, it is a fact that defmacro is generally 
described in the Scheme community as fundamentally problematic, and this 
is clearly wrong. (See for example the first paragraph of "Syntactic 
Abstraction in Scheme" by Dybvig, Hieb, and Bruggeman.)

>>I don't mind using DEFMACRO for simple things. I don't find them hard to
>>write or read, and I don't know why they would be more error-prone.
>>Sounds similar to some of the claims made by advocates of static type
>>systems. Maybe this boils down to just a matter of taste.
> 
> Maybe - let's talk once you've tried syntax-rules.  But you gave a clue to
> your reading & writing process for DEFMACRO when you said that when reading
> a syntax-rules macro, you were immediately worrying about which level the
> various tokens were at.  You've learned to look for, and expect something
> that, with syntax-rules, you can simply forget about.  You don't do these
> things when writing ordinary functions

Yes, I do.

> - why do you put up with it when
> writing macros?  What would you think of Lisp if you had to use gensym to
> initialize every variable you use?  You've simply become very used to a
> low-level technique, so that you don't believe there's any need for a higher
> level technique.

Right. At least, this was right until I heard from you that syntax 
objects can be used for things that go beyond macros.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jacek Generowicz
Subject: Re: Scheme macros
Date: 
Message-ID: <tyfsmh0opo7.fsf@pcepsft001.cern.ch>
Pascal Costanza <········@web.de> writes:

> Anton van Straaten wrote:
> 
> > I think I'll stop there for now, since I have other things to do!
> 
> Thanks a lot for taking your time to provide these explanations. They
> help a lot.

Seconded.
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1gjd9$j5p$1@newsreader2.netcologne.de>
Pascal Costanza wrote:

> I have no problem agreeing with you that defmacro, syntax-rules and 
> syntax-case are just different ways to implement macros, maybe with 
> their own respective strengths in the latter two cases that I cannot 
> judge at the moment.

Here is the result of my first experiment with syntax-rules...

(define-syntax special-let
   (syntax-rules ()
     ((_ () form ...)
      (let () form ...))
     ((_ ((var binding) more ...) form ...)
      (let ((var binding))
        (special-let (more ...) form ...)))
     ((_ (var more ...) form ...)
      (let ((var ()))
        (special-let (more ...) form ...)))))

...and for comparison purposes, here is how I would implement it in 
Common Lisp:

(defmacro special-let (bindings &body body)
   (reduce (lambda (binding body)
	    (cond ((symbolp binding)
		   `(let ((,binding nil))
		      ,body))
		  (t `(let (,binding)
			,body))))
	  bindings
	  :from-end t :initial-value `(progn ,@body)))


I am not sure what to think of this. Comments?


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <403bf95e$0$195$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> 
> Pascal Costanza wrote:
> 
>> I have no problem agreeing with you that defmacro, syntax-rules and 
>> syntax-case are just different ways to implement macros, maybe with 
>> their own respective strengths in the latter two cases that I cannot 
>> judge at the moment.
> 
> 
> Here is the result of my first experiment with syntax-rules...
> 
> (define-syntax special-let
>   (syntax-rules ()
>     ((_ () form ...)
>      (let () form ...))
>     ((_ ((var binding) more ...) form ...)
>      (let ((var binding))
>        (special-let (more ...) form ...)))
>     ((_ (var more ...) form ...)
>      (let ((var ()))
>        (special-let (more ...) form ...)))))
> 
> ...and for comparison purposes, here is how I would implement it in 
> Common Lisp:
> 
> (defmacro special-let (bindings &body body)
>   (reduce (lambda (binding body)
>         (cond ((symbolp binding)
>            `(let ((,binding nil))
>               ,body))
>           (t `(let (,binding)
>             ,body))))
>       bindings
>       :from-end t :initial-value `(progn ,@body)))
> 
> 
> I am not sure what to think of this. Comments?

I am not sure. What happens in the CL macro
if the body contains a macro that expands to
one of the variables being bound in the special
let?

Example:


   (define a 0)

   (define-syntax foo
     (syntax-rules ()
       ((_)
        (+ a 100))))

   (special-let ((a 42))	
     (foo))


Evaluates to 100.

But

   (defparameter a 0)

   (defmacro foo ()
     '(+ a 100))

   (special-let ((a 42))
     (foo))

evaluates to 142.


I tried (defvar a 0) and (defconstant a 0)
instead of (defparameter a 0). The first
also evaluates to 142, the second gives an
error.


In order to give the defmacro the same behaviour
as the syntax-rules one, you need to rename the user-given
names in bindings using gensym.  The tricky part
is then the renaming in the body.
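
For reference, this is (roughly) the expansion that produces 142 on the
CL side:

   (let ((a 42))
     (progn (foo)))
   ;; and after FOO expands:
   (let ((a 42))
     (progn (+ a 100)))   ; FOO's A now sees the binding introduced
                          ; by SPECIAL-LET, hence 142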


However, the possibility also exists that I messed up
in the translation of the example from Scheme to CL.
In that case, forget the paragraph above.
(But please tell me what the correct translation
would be.)


-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1i6o8$jt$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> Pascal Costanza wrote:
> 
>>
>> Pascal Costanza wrote:
>>
>>> I have no problem agreeing with you that defmacro, syntax-rules and 
>>> syntax-case are just different ways to implement macros, maybe with 
>>> their own respective strengths in the latter two cases that I cannot 
>>> judge at the moment.
>>
>>
>>
>> Here is the result of my first experiment with syntax-rules...
>>
>> (define-syntax special-let
>>   (syntax-rules ()
>>     ((_ () form ...)
>>      (let () form ...))
>>     ((_ ((var binding) more ...) form ...)
>>      (let ((var binding))
>>        (special-let (more ...) form ...)))
>>     ((_ (var more ...) form ...)
>>      (let ((var ()))
>>        (special-let (more ...) form ...)))))
>>
>> ...and for comparison purposes, here is how I would implement it in 
>> Common Lisp:
>>
>> (defmacro special-let (bindings &body body)
>>   (reduce (lambda (binding body)
>>         (cond ((symbolp binding)
>>            `(let ((,binding nil))
>>               ,body))
>>           (t `(let (,binding)
>>             ,body))))
>>       bindings
>>       :from-end t :initial-value `(progn ,@body)))
>>
>>
>> I am not sure what to think of this. Comments?
> 
> 
> I am not sure. What happens in the CL macro
> if the body contains a macro that expands to
> one of the variables being bound in the special
> let?
> 
> Example:
> 
> 
>   (define a 0)
> 
>   (define-syntax foo
>     (syntax-rules ()
>       ((_)
>        (+ a 100))))
> 
>   (special-let ((a 42))   
>     (foo))
> 
> 
> Evaluates to 100.
> 
> But
> 
>   (defparameter a 0)
> 
>   (defmacro foo ()
>     '(+ a 100))

If Common Lispniks write such macros, they explicitly want to capture 
the variable A defined in the surrounding code. For example, this could 
be part of the specification of the macro FOO: "captures variable A and 
adds 100"

If you don't want that, you have to write the following:

(defmacro foo ()
   (let ((a (gensym)))
     `(+ ,a 100)))

...or better:

(defmacro foo ()
   (with-unique-names (a)
     `(+ ,a 100)))

WITH-UNIQUE-NAMES is not defined in ANSI Common Lisp, but is easy to 
implement and provided with some CL implementations. Sometimes, it is 
called WITH-GENSYMS.
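
One common way to write it (a minimal sketch; real versions differ in
details):

(defmacro with-unique-names (names &body body)
   `(let ,(mapcar (lambda (name)
                    `(,name (gensym ,(symbol-name name))))
                  names)
      ,@body))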

It's possible to capture names with syntax-case via some explicit 
manipulations of syntax objects. But I don't understand the details yet. 
Contrary to what _some_ literature about hygienic macros suggests, name 
capture is a very useful concept.
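
The classic example of deliberate capture is the anaphoric IF (rough
sketch): the macro intentionally binds IT so that the body can refer to
the value of the test.

(defmacro aif (test then &optional else)
   `(let ((it ,test))
      (if it ,then ,else)))

;; (aif (find 3 '(1 2 3)) (print it) (print :not-found))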


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <eksji7ak.fsf@ccs.neu.edu>
Pascal Costanza <········@web.de> writes:

> Contrary to what _some_ literature about hygienic macros
> suggests, name capture is a very useful concept.

Name capture *is* quite useful.  Inadvertent name capture is much less
desirable.
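
For instance (rough sketch of the inadvertent kind):

  (defmacro my-or (a b)
    `(let ((val ,a))
       (if val val ,b)))

  ;; (let ((val 5)) (my-or nil val))  => NIL, not 5 -- the caller's VAL
  ;; has been captured by the expansion's VAL.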
From: Mario S. Mommer
Subject: Re: Scheme macros
Date: 
Message-ID: <fzbrnnck6b.fsf@germany.igpm.rwth-aachen.de>
Joe Marshall <···@ccs.neu.edu> writes:
> Pascal Costanza <········@web.de> writes:
> > Contrary to what _some_ literature about hygienic macros
> > suggests, name capture is a very useful concept.
> 
> Name capture *is* quite useful.  Inadvertent name capture is much less
> desirable.

I agree, but I have /never/ found a bug caused by this. Has it ever
happened to you? In CL, that is.
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <znb67wq2.fsf@comcast.net>
Mario S. Mommer <········@yahoo.com> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
>> Pascal Costanza <········@web.de> writes:
>> > Contrary to what _some_ literature about hygienic macros
>> > suggests, name capture is a very useful concept.
>> 
>> Name capture *is* quite useful.  Inadvertent name capture is much less
>> desirable.
>
> I agree, but I have /never/ found a bug caused by this. Has it ever
> happened to you? In CL, that is.

Yes, but the macros were quite hairy.  I don't have an off-hand
example, though.

-- 
~jrm
From: Christopher C. Stacy
Subject: Re: Scheme macros
Date: 
Message-ID: <ubrnngfsx.fsf@news.dtpq.com>
>>>>> On Wed, 25 Feb 2004 09:05:07 -0500, Joe Marshall ("Joe") writes:

 Joe> Pascal Costanza <········@web.de> writes:
 >> Contrary to what _some_ literature about hygienic macros
 >> suggests, name capture is a very useful concept.

 Joe> Name capture *is* quite useful.

Usually in MACROLET.
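
For instance (rough sketch, names made up): WITH-COUNTER deliberately
exposes a local TICK macro to its body while hiding the counter variable
behind a gensym:

  (defmacro with-counter (&body body)
    (let ((n (gensym "COUNT")))
      `(let ((,n 0))
         (macrolet ((tick () '(incf ,n)))
           ,@body
           ,n))))

  ;; (with-counter (tick) (tick) (tick))  => 3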
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1i7p2$3bm$1@newsreader2.netcologne.de>
Pascal Costanza wrote:

> If you don't want that, you have to write the following:
> 
> (defmacro foo ()
>   (let ((a (gensym)))
>     `(+ ,a 100)))
> 
> ...or better:
> 
> (defmacro foo ()
>   (with-unique-names (a)
>     `(+ ,a 100)))

...nonsense, these macros would produce errors because they would 
attempt to add 100 to an unbound variable.

The question is what you want to achieve with such a macro. If you want 
to capture a name, then capture it. If you want to make sure that you 
don't capture an arbitrary A, give it a more meaningful name or put it 
in a package created for this purpose.

(in-package "A-CAPTURER")

(defvar a 0) ;; this should rather be *a*, but for the sake of the 
example...

(defmacro foo ()
   '(+ a 100))


(in-package "OTHER-PACKAGE")

(special-let ((a 42))
   (foo))


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: james anderson
Subject: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <403CC5E5.F5097EAC@setf.de>
Pascal Costanza wrote:
> 
> ...
> 
> WITH-UNIQUE-NAMES is not defined in ANSI Common Lisp, but is easy to
> implement and provided with some CL implementations. Sometimes, it is
> called WITH-GENSYMS.
> 

there have been several references to the WITH-UNIQUE-NAMES / WITH-GENSYMS
operators lately. all with the intended application, to establish unique names
for use in a constructed macro expansion. is there any reason not to formulate
the operation as a compile-time rather than run-time operation? as in

(defmacro foo (x)
   (gensym-macrolet (a)
     `(+ a ,x)))

where gensym-macrolet itself is a macro?

...
From: Kaz Kylheku
Subject: Re: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <cf333042.0402251707.128fef9b@posting.google.com>
james anderson <··············@setf.de> wrote in message news:<·················@setf.de>...
> there have been several references to the WITH-UNIQUE-NAMES / WITH-GENSYMS
> operators lately. all with the intended application, to establish unique names
> for use in a constructed macro expansion. is there any reason not to formulate
> the operation as a compile-time rather than run-time operation? as in

WITH-GENSYMS and similar things work at compile time, or whenever
macro-expansion takes place.

> (defmacro foo (x)
>    (gensym-macrolet (a)
>      `(+ a ,x)))
>
> where gensym-macrolet itself is a macro?

The problem is that GENSYM-MACROLET has to implement a code-walker
which scans the unevaluated backquote template and replaces every
occurrence of A by a gensym.

That's a lot of work just for the sake of avoiding a blister on your
middle finger when you type the extra commas:

  (with-gensyms (a)
    `(+ ,a ,x))

You already have a backquote expander working for you, capable of
looking for markers in a template and substituting values, so the
obvious thing is to use it.
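
For reference, WITH-GENSYMS itself is only a few lines; a typical
definition (details vary between utility libraries) is roughly:

  (defmacro with-gensyms ((&rest names) &body body)
    `(let ,(mapcar (lambda (n) `(,n (gensym ,(string n)))) names)
       ,@body))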

Common Lisp has packages which you can use to achieve hygiene, so none
of these tricks are necessary. If you put your macro in a package, all
of the symbols in the macro body will be interned in your package (or
else be symbols that your package has imported: you have control over
that).

  (in-package :my-macro)

  (defmacro foo (x)
    `(+ a ,x))

The symbols here are MY-MACRO::FOO, MY-MACRO::X and MY-MACRO::A. No
WITH-GENSYMS needed, no code walker, no Scheme features, nothing.
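
At a use site in some other package (CLIENT here is hypothetical, and FOO
is assumed to be exported while A is not), the effect is:

  (in-package :client)   ; a hypothetical client package

  (let ((a 1))           ; this A is CLIENT::A, a different symbol
    (my-macro:foo 5))    ; expands to (+ MY-MACRO::A 5); the LET above
                         ; cannot capture the template's A (MY-MACRO::A
                         ; still needs a value for this to run, of course)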
From: james anderson
Subject: Re: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <403D9481.7FC07D96@setf.de>
Kaz Kylheku wrote:
> 
> james anderson <··············@setf.de> wrote in message news:<·················@setf.de>...
> > there have been several references to the WITH-UNIQUE-NAMES / WITH-GENSYMS
> > operators lately. all with the intended application, to establish unique names
> > for use in a constructed macro expansion. is there any reason not to formulate
> > the operation as a compile-time rather than run-time operation? as in
> 
> WITH-GENSYMS and similar things work at compile time, or whenever
> macro-expansion takes place.

which is the macro definition's run-time. the macro itself also has a compile-time.

> 
> > (defmacro foo (x)
> >    (gensym-macrolet (a)
> >      `(+ a ,x)))
> >
> > where gensym-macrolet itself is a macro?
> 
> The problem is that GENSYM-MACROLET has to implement a code-walker
> which scans the unevaluated backquote template and replaces every
> occurence of A by a gensym.

as the intended behaviour would not be entirely analogous to symbol-macrolet -
neither in the operator/value namespace distinction nor in the observance of
lexical constraints, one conceivable implementation is as a naive sublis. 

> 
> That's a lot of work just for the sake of avoiding a blister on your
> middle finger when you type the extra commas:
> 
> ...
> 
>   (in-package :my-macro)
> 
>   (defmacro foo (x)
>     `(+ a ,x))
> 
> The symbols here are MY-MACRO::FOO, MY-MACRO::X and MY-MACRO::A. No
> WITH-GENSYMS needed, no code walker, no Scheme features, nothing.

all well and true. i'm just wondering.

...
From: Tim Bradshaw
Subject: Re: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <ey3brnlhikh.fsf@cley.com>
* james anderson wrote:

> which is the macro definition's run-time. the macro itself also has a compile-time.

So does the compiler.  Why is this interesting?

--tim
From: james anderson
Subject: Re: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <403E31F5.EBAC64C@setf.de>
Tim Bradshaw wrote:
> 
> * james anderson wrote:
> 
> > which is the macro definition's run-time. the macro itself also has a compile-time.
> 
> So does the compiler.  Why is this interesting?

because i was enquiring about substitution mechanisms which act at the macro
definition's compile time and the observation, to which the above responded, was

Kaz Kylheku wrote:
> 
> james anderson <··············@setf.de> wrote in message news:<·················@setf.de>...
> > ...
> 
> WITH-GENSYMS and similar things work at compile time, or whenever
> macro-expansion takes place.

which describes an operator which substitutes at the macro's run-time rather
than at its compile-time. [at least, given the definitions of which i am
aware.] 

...
From: Jens Axel Søgaard
Subject: Re: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <40461b5e$0$216$edfadb0f@dread12.news.tele.dk>
Kaz Kylheku wrote:

> Common Lisp has packages which you can use to achieve hygiene, so none
> of these tricks are necessary. If you put your macro in a package, all
> of the symbols in the macro body will be interned in your package (or
> else be symbols that your package has imported: you have control over
> that).
> 
>   (in-package :my-macro)
> 
>   (defmacro foo (x)
>     `(+ a ,x))
> 
> The symbols here are MY-MACRO::FOO, MY-MACRO::X and MY-MACRO::A. No
> WITH-GENSYMS needed, no code walker, no Scheme features, nothing.

Thanks. That answered my question on how to refer to a variable
in the lexical scope at the place of the definition of the macro.

This also explains why Common Lispers are satisfied with defmacro
and Schemers are not. There is no standard module system in Scheme,
and thus the same solution doesn't apply to Scheme.

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: gensym-macrolet ? [ Re: Scheme macros
Date: 
Message-ID: <c1jige$srr$2@newsreader2.netcologne.de>
james anderson wrote:

> 
> Pascal Costanza wrote:
> 
>>...
>>
>>WITH-UNIQUE-NAMES is not defined in ANSI Common Lisp, but is easy to
>>implement and provided with some CL implementations. Sometimes, it is
>>called WITH-GENSYMS.
>>
> 
> 
> there have been several references to the WITH-UNIQUE-NAMES / WITH-GENSYMS
> operators lately. all with the intended application, to establish unique names
> for use in a constructed macro expansion. is there any reason not to formulate
> the operation as a compile-time rather than run-time operation? as in
> 
> (defmacro foo (x)
>    (gensym-macrolet (a)
>      `(+ a ,x)))
> 
> where gensym-macrolet itself is a macro?

This is the first time I hear about gensym-macrolet. What is this?


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <403cd102$0$218$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> Jens Axel Søgaard wrote:
>> Pascal Costanza wrote:
>>> Pascal Costanza wrote:

>>> Here is the result of my first experiment with syntax-rules...
>>>
>>> (define-syntax special-let
>>>   (syntax-rules ()
>>>     ((_ () form ...)
>>>      (let () form ...))
>>>     ((_ ((var binding) more ...) form ...)
>>>      (let ((var binding))
>>>        (special-let (more ...) form ...)))
>>>     ((_ (var more ...) form ...)
>>>      (let ((var ()))
>>>        (special-let (more ...) form ...)))))
>>>
>>> ...and for comparison purposes, here is how I would implement it in 
>>> Common Lisp:
>>>
>>> (defmacro special-let (bindings &body body)
>>>   (reduce (lambda (binding body)
>>>         (cond ((symbolp binding)
>>>            `(let ((,binding nil))
>>>               ,body))
>>>           (t `(let (,binding)
>>>             ,body))))
>>>       bindings
>>>       :from-end t :initial-value `(progn ,@body)))
>>>
>>>
>>> I am not sure what to think of this. Comments?

>> I am not sure. What happens in the CL macro
>> if the body contains a macro that expands to
>> one of the variables being bound in the special
>> let?
>>
>> Example:
>>
>>
>>   (define a 0)
>>
>>   (define-syntax foo
>>     (syntax-rules ()
>>       ((_)
>>        (+ a 100))))
>>
>>   (special-let ((a 42))       (foo))
>>
>>
>> Evaluates to 100.
>>
>> But
>>
>>   (defparameter a 0)
>>
>>   (defmacro foo ()
>>     '(+ a 100))
> 
> 
> If Common Lispniks write such macros, they explicitly want to capture 
> the variable A defined in the surrounding code. For example, this could 
> be part of the specification of the macro FOO: "captures variable A and 
> adds 100"

Ok. I'll explain the Scheme example in a little more depth, and
refrain from any attempt to write it in CL.

The variable A mentioned in the expansion of FOO has lexical scope as
seen from the definition of FOO.

Thus

   (define a 0)

   (define-syntax foo
     (syntax-rules ()
       ((_)
        (+ a 100))))

   (display (special-let ((a 42)) (foo)))
   (newline)

   (set! a 10)

   (display (special-let ((a 42)) (foo)))
   (newline)

displays

   100
   110

> If you don't want that, you have to write the following:
> 
> (defmacro foo ()
>   (let ((a (gensym)))
>     `(+ ,a 100)))
> 
> ...or better:
> 
> (defmacro foo ()
>   (with-unique-names (a)
>     `(+ ,a 100)))

But then a new variable is introduced. I want variables referred
to in macros (not bound by code introduced by the macro) to refer to
the variables in the lexical scope of the definition of the macro
as opposed to the places of use of the macro.

As far as I can tell, your definitions above don't give the same
behaviour as the elaborated Scheme example above.

The question is: Is it possible to write a defmacro without making an
alpha conversion of the code in the body of special-let?

-- 
Jens Axel Søgaard
From: Björn Lindberg
Subject: Re: Scheme macros
Date: 
Message-ID: <hcsn076owzf.fsf@tjatte.nada.kth.se>
Jens Axel Søgaard <······@jasoegaard.dk> writes:

> Ok. I'll explain the Scheme example in a little more depth, and
> refrain from any attempt to write it in CL.
> 
> The variable A mention in the expansion of FOO has lexical scope
> seen from the definition of FOO.
> 
> Thus
> 
>    (define a 0)
> 
>    (define-syntax foo
>      (syntax-rules ()
>        ((_)
>         (+ a 100))))
> 
>    (display (special-let ((a 42)) (foo)))
>    (newline)
> 
>    (set! a 10)
> 
>    (display (special-let ((a 42)) (foo)))
>    (newline)
> 
> displays
> 
>    100
>    110
> 
> > If you don't want that, you have to write the following:
> > (defmacro foo ()
> >   (let ((a (gensym)))
> >     `(+ ,a 100)))
> > ...or better:
> > (defmacro foo ()
> >   (with-unique-names (a)
> >     `(+ ,a 100)))
> 
> But then a new variable is introduced. I want variables refered
> to in macros (not bound by code introduced by the macro) to refer to
> the variables in the lexical scope of the definition of the macro
> as opposed to the places of use of the macro.

I don't think you can, but that does not have anything to do with
defmacro, but rather with the lack of global lexicals in CL. For instance,
if I write the following:

(let ((a 0)) ; a is a lexical
  (defun set-a (x)
    (setf a x))
  (defmacro foo ()
    `(+ ,a 100)))

I get:

* (special-let ((a 42)) (foo))
==> 100
* (set-a 10)
==> 10
* (special-let ((a 42)) (foo))
==> 110


Björn
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1krv1$1016$1@f1node01.rhrz.uni-bonn.de>
Björn Lindberg wrote:
> Jens Axel Søgaard <······@jasoegaard.dk> writes:
[...]

> I don't think you can, but that does not have anything to do with
> defmacro, but rather the lack in CL of global lexicals. For instance,
> if I write the following:
> 
> (let ((a 0)) ; a is a lexical
>   (defun set-a (x)
>     (setf a x))
>   (defmacro foo ()
>     `(+ ,a 100)))
> 
> I get:
> 
> * (special-let ((a 42)) (foo))
> ==> 100
> * (set-a 10)
> ==> 10
> * (special-let ((a 42)) (foo))
> ==> 110

Right, this is simpler than I thought.


Pascal

-- 
Pascal Costanza               University of Bonn
···············@web.de        Institute of Computer Science III
http://www.pascalcostanza.de  Römerstr. 164, D-53117 Bonn (Germany)
From: Jeremy Yallop
Subject: Re: Scheme macros
Date: 
Message-ID: <slrnc3pk23.fqt.jeremy@kaje.cl.cam.ac.uk>
Jens Axel Søgaard wrote:
> Pascal Costanza wrote:
>> Here is the result of my first experiment with syntax-rules...
>> 
>> (define-syntax special-let
>>   (syntax-rules ()
>>     ((_ () form ...)
>>      (let () form ...))
>>     ((_ ((var binding) more ...) form ...)
>>      (let ((var binding))
>>        (special-let (more ...) form ...)))
>>     ((_ (var more ...) form ...)
>>      (let ((var ()))
>>        (special-let (more ...) form ...)))))
>> 
>> ...and for comparison purposes, here is how I would implement it in 
>> Common Lisp:
>> 
>> (defmacro special-let (bindings &body body)
>>   (reduce (lambda (binding body)
>>         (cond ((symbolp binding)
>>            `(let ((,binding nil))
>>               ,body))
>>           (t `(let (,binding)
>>             ,body))))
>>       bindings
>>       :from-end t :initial-value `(progn ,@body)))
>> 
>> 
>> I am not sure what to think of this. Comments?
> 
> I am not sure. What happens in the CL macro
> if the body contains a macro that expands to
> one of the variables being bound in the special
> let?

Exactly the same as happens in the Scheme macro.

>    (define a 0)
> 
>    (define-syntax foo
>      (syntax-rules ()
>        ((_)
>         (+ a 100))))
> 
>    (special-let ((a 42))	
>      (foo))

  (define-macro (foo) 
    '(+ a 3))
  
  (special-let ((a 42))
     (foo))

   => 45

>    (defmacro foo ()
>      '(+ a 100))

Your "foo" is the "unhygienic" macro that causes the name capture here;
Pascal's defmacro and define-syntax versions of special-let behave
equivalently in this respect.  The "foo" macro is really a red herring
here.

> In order to give the defmacro the same behaviour
> as the syntax-rules one, you need to rename the user
> given names in bindings using gensym.  

I don't think that would work very well.

Jeremy.
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <403cd71d$0$212$edfadb0f@dread12.news.tele.dk>
Jeremy Yallop wrote:
> Jens Axel Søgaard wrote:
>>Pascal Costanza wrote:
>>
>>>Here is the result of my first experiment with syntax-rules...
>>>
>>>(define-syntax special-let
>>>  (syntax-rules ()
>>>    ((_ () form ...)
>>>     (let () form ...))
>>>    ((_ ((var binding) more ...) form ...)
>>>     (let ((var binding))
>>>       (special-let (more ...) form ...)))
>>>    ((_ (var more ...) form ...)
>>>     (let ((var ()))
>>>       (special-let (more ...) form ...)))))
>>>
>>>...and for comparison purposes, here is how I would implement it in 
>>>Common Lisp:
>>>
>>>(defmacro special-let (bindings &body body)
>>>  (reduce (lambda (binding body)
>>>        (cond ((symbolp binding)
>>>           `(let ((,binding nil))
>>>              ,body))
>>>          (t `(let (,binding)
>>>            ,body))))
>>>      bindings
>>>      :from-end t :initial-value `(progn ,@body)))
>>>
>>>
>>>I am not sure what to think of this. Comments?
>>
>>I am not sure. What happens in the CL macro
>>if the body contains a macro that expands to
>>one of the variables being bound in the special
>>let?
> 
> 
> Exactly the same as happens in the Scheme macro.
> 
> 
>>   (define a 0)
>>
>>   (define-syntax foo
>>     (syntax-rules ()
>>       ((_)
>>        (+ a 100))))
>>
>>   (special-let ((a 42))	
>>     (foo))
> 
> 
>   (define-macro (foo) 
>     '(+ a 3))

But now you are defining a new macro. The point of syntax-rules
macros is that they make it easy to ensure that you don't
inadvertently capture a variable. In other words, a variable name
in a macro definition should refer to the variable in the lexical
scope of the macro definition (not the macro use).


The question was not how to write FOO such that the variable
is captured. The question is:

   How do I write the macro FOO in CL

     (define-syntax foo
        (syntax-rules ()
          ((_) (+ a 100))))

   such that Pascal's definition of special-let doesn't
   capture the variable?

If that's not possible, then the conclusion is that
the two macros Pascal wanted us to compare do not
have the same semantics (which, by the way, I assume
was the intention).

The next question is then: What needs to be done to
the define-macro definition of special-let to get
the same semantics? One way is this:

>>In order to give the defmacro the same behaviour
>>as the syntax-rules one, you need to rename the user
>>given names in bindings using gensym.  

[Here I was talking about the definition of special-let]

> I don't think that would work very well.

I do. The algorithm behind syntax-rules performs the
renaming for you. In the defmacro case, you have to
do it yourself.

-- 
Jens Axel Søgaard
From: Jeremy Yallop
Subject: Re: Scheme macros
Date: 
Message-ID: <slrnc3pn1t.g06.jeremy@kaje.cl.cam.ac.uk>
Jens Axel Søgaard wrote:
> Jeremy Yallop wrote:
>> Jens Axel Søgaard wrote:
>>>Pascal Costanza wrote:
>>>
>>>>Here is the result of my first experiment with syntax-rules...
>>>>
>>>>(define-syntax special-let
>>>>  (syntax-rules ()
>>>>    ((_ () form ...)
>>>>     (let () form ...))
>>>>    ((_ ((var binding) more ...) form ...)
>>>>     (let ((var binding))
>>>>       (special-let (more ...) form ...)))
>>>>    ((_ (var more ...) form ...)
>>>>     (let ((var ()))
>>>>       (special-let (more ...) form ...)))))
>>>>
>>>>...and for comparison purposes, here is how I would implement it in 
>>>>Common Lisp:
>>>>
>>>>(defmacro special-let (bindings &body body)
>>>>  (reduce (lambda (binding body)
>>>>        (cond ((symbolp binding)
>>>>           `(let ((,binding nil))
>>>>              ,body))
>>>>          (t `(let (,binding)
>>>>            ,body))))
>>>>      bindings
>>>>      :from-end t :initial-value `(progn ,@body)))
>>>>
>>>>
>>>>I am not sure what to think of this. Comments?
>>>
>>>I am not sure. What happens in the CL macro
>>>if the body contains a macro that expands to
>>>one of the variables being bound in the special
>>>let?
>> 
>> 
>> Exactly the same as happens in the Scheme macro.
>> 
>> 
>>>   (define a 0)
>>>
>>>   (define-syntax foo
>>>     (syntax-rules ()
>>>       ((_)
>>>        (+ a 100))))
>>>
>>>   (special-let ((a 42))	
>>>     (foo))
>> 
>> 
>>   (define-macro (foo) 
>>     '(+ a 3))
> 
> But now you are defining a new macro. The point of syntax-rules
> macros is that they make it easy to ensure that you don't
> inadvertently capture a variable. 

Yes, but Pascal's special-let is entirely irrelevant to this: it
doesn't cause any inadvertent name capture.  Your CL "foo" macro does
cause "inadvertent" name capture, so if you don't want that to happen,
don't write the macro like that.  Your examples behave exactly the
same in (builtin) let as in special-let, which shows that special-let
is not at fault here.

> In other words, a variable name in a macro definition should refer
> to the variable in the lexical scope of the macro definition (not
> the macro use).
> 
> The question was not how to write FOO such that the variable
> is captured.

The claim was that the defmacro version of special-let captures the
variable.  I have shown that this is not the case: your "foo" macro
captures the variable, and does so in both the defmacro and the
syntax-rules versions of special-let.

> The qustion is:
> 
>    How do I write the macro FOO in CL
> 
>      (define-syntax foo
>         (syntax-rules ()
>           ((_) (+ a 100))))
> 
>    such that Pascal's definition of special-let doesn't
>    capture the variable?

Which variable?  The problem is that there is no top-level lexical
environment to refer to in CL.  CL "let" bindings are dynamic for
variables created with defparameter, so the top-level value isn't
active within the body of the "let".

> If that's not possible, then the conclusion is that
> the two macros Pascal wanted us to compare does not
> have the same semantics (which by the way, I assume
> was the intention).

Well, perhaps.  The different semantics, though, have nothing to do
with name capture.

> The next question is then: What needs to be done to
> the define-macro definition of special-let to get
> the same semantics? One way is this:
> 
>>>In order to give the defmacro the same behaviour
>>>as the syntax-rules one, you need to rename the user
>>>given names in bindings using gensym.  
> 
> [Here I was talking about the definition of special-let]
> 
>> I don't think that would work very well.
> 
> I do. The algorithm behind syntax-rules performs the
> renaming for you. In the defmacro case, you have to
> do it yourself.

I think this entirely misses the point.  There is no name capture in
special-let; bindings are established for variable names that the user
supplies and these bindings are active within the body of the macro
invocation.  No renaming is necessary (or possible).

Jeremy.
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <403cf721$0$255$edfadb0f@dread12.news.tele.dk>
Jeremy Yallop wrote:
> Jens Axel Søgaard wrote:
> 
>>Jeremy Yallop wrote:
>>
>>>Jens Axel Søgaard wrote:
>>>
>>>>Pascal Costanza wrote:
>>>>
>>>>
>>>>>Here is the result of my first experiment with syntax-rules...
>>>>>
>>>>>(define-syntax special-let
>>>>> (syntax-rules ()
>>>>>   ((_ () form ...)
>>>>>    (let () form ...))
>>>>>   ((_ ((var binding) more ...) form ...)
>>>>>    (let ((var binding))
>>>>>      (special-let (more ...) form ...)))
>>>>>   ((_ (var more ...) form ...)
>>>>>    (let ((var ()))
>>>>>      (special-let (more ...) form ...)))))
>>>>>
>>>>>...and for comparison purposes, here is how I would implement it in 
>>>>>Common Lisp:
>>>>>
>>>>>(defmacro special-let (bindings &body body)
>>>>> (reduce (lambda (binding body)
>>>>>       (cond ((symbolp binding)
>>>>>          `(let ((,binding nil))
>>>>>             ,body))
>>>>>         (t `(let (,binding)
>>>>>           ,body))))
>>>>>     bindings
>>>>>     :from-end t :initial-value `(progn ,@body)))
>>>>>
>>>>>
>>>>>I am not sure what to think of this. Comments?
>>>>
>>>>I am not sure. What happens in the CL macro
>>>>if the body contains a macro that expands to
>>>>one of the variables being bound in the special
>>>>let?
>>>
>>>
>>>Exactly the same as happens in the Scheme macro.
>>>
>>>
>>>
>>>>  (define a 0)
>>>>
>>>>  (define-syntax foo
>>>>    (syntax-rules ()
>>>>      ((_)
>>>>       (+ a 100))))
>>>>
>>>>  (special-let ((a 42))	
>>>>    (foo))
>>>
>>>
>>>  (define-macro (foo) 
>>>    '(+ a 3))
>>
>>But now you are defining a new macro. The point of syntax-rules
>>macros is that they make it easy to ensure that you don't
>>inadvertently capture a variable. 
> 
> 
> Yes, but Pascal's special-let is entirely irrelevant to this: it
> doesn't cause any inadvertent name capture.  Your CL "foo" macro does
> cause "inadvertent" name capture, so if you don't want that to happen,
> don't write the macro like that.  Your examples behave exactly the
> same in (builtin) let as in special-let, which shows that special-let
> is not at fault here.
> 
> 
>>In other words, a variable name in a macro definition should refer
>>to the variable in the lexical scope of the macro definition (not
>>the macro use).
>>
>>The question was not how to write FOO such that the variable
>>is captured.
> 
> 
> The claim was that the defmacro version of special-let captures the
> variable.  I have shown that this is not the case: your "foo" macro
> captures the variable, and does so in both the defmacro and the
> syntax-rules versions of special-let.

Forget my CL version of the FOO macro. That was just my (bad)
attempt to translate it.

>>The qustion is:
>>
>>   How do I write the macro FOO in CL
>>
>>     (define-syntax foo
>>        (syntax-rules ()
>>          ((_) (+ a 100))))
>>
>>   such that Pascal's definition of special-let doesn't
>>   capture the variable?
> 
> 
> Which variable?  

The variable A.

 > The problem is that there is no top-level lexical
> environment to refer to in CL.  

 > CL "let" bindings are dynamic for
 > variables created with defparameter, so the top-level value isn't
 > active within the body of the "let".

Ah! Thanks for pointing that out. No harm done. Whether A is in the
top level or not is fortunately not important to the example.
Here it is with a local variable.

(let ((a 0))
   (let-syntax ((foo (syntax-rules ()
                       ((_)
                        (+ a 100)))))

     (display (special-let ((a 42)) (foo)))
     (newline)

     (set! a 10)

     (display (special-let ((a 42)) (foo)))
     (newline)))

Prints:

   100
   110


>>If that's not possible, then the conclusion is that
>>the two macros Pascal wanted us to compare does not
>>have the same semantics (which by the way, I assume
>>was the intention).

> Well, perhaps.  The different semantics, though, have nothing 
 > to do with name capture.

What are you thinking about?

>>I do. The algorithm behind syntax-rules performs the
>>renaming for you. In the defmacro case, you have to
>>do it yourself.
> 
> 
> I think this entirely misses the point.  There is no name capture in
> special-let; bindings are established for variable names that the user
> supplies and these bindings are active within the body of the macro
> invocation.  No renaming is necessary (or possible).

If one never uses macros whose expansions refer to variables in the
lexical scope of the definition of the macro, then there is no difference.

How do I in a defmacro refer to a variable in the lexical scope of
the definition of the variable?

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1jib3$srr$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> How do I in a defmacro refer to a variable in the lexical scope of
> the definition of the variable?

Here is an attempt to do that.

(let ((a 42))
   (defmacro foo ()
     `(symbol-macrolet ((a (funcall ',(lambda () a))))
        (+ a 100))))

...but it seems hard to me to generalize this. On the other hand, it 
also seems unusual to me. I guess that one would either use special 
variables or factor things out in their own packages for such a purpose.

However, I would be interested in other opinions.


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0402271034.584ff993@posting.google.com>
Jens Axel Søgaard <······@jasoegaard.dk> wrote in message news:<403cf721$0$255
> How do I in a defmacro refer to a variable in the lexical scope of
> the definition of the variable?

I don't see how you can fulfill this requirement without taking away
the idea that a macro produces the source code to a form which is
substituted somewhere, or changing what lexical scope means or what an
expression is.

In Common Lisp, a form can only be in two distinct lexical scopes
simultaneously if those scopes are nested.

What you are asking for is to be able to write a form (the macro body)
which is simultaneously in the lexical scope of the macro definition,
and in the lexical scope of the substitution site. Some of the free
variable references in the body resolve to one scope, other ones to
the other.

One way to do that is to change the inputs to the macro to be higher
order functions. Even simple variable references become lambdas. This
is how it works in lazy functional languages. A simple expression like
X is really (LAMBDA () X), so that when you pass X down through a
chain of function calls, its evaluation can be delayed, yet take place
in the right lexical environment.

We can turn all of the argument forms supplied by the macro user into
lambdas, and we can write the macro body as a lambda. Ensure that the
closures are created in the appropriate environments, and it's done.

The question is, would you want the macro system to do all this
inefficient, complicated junk automatically, under the hood, just to
solve some problem that is better addressed using dynamically scoped
module variables in a package?

Example: suppose we want to implement a REPEAT macro that repeatedly
evaluates the supplied form. The repetition count comes implicitly
from a lexical variable set up around the macro:

 (let ((repeat-count 10))
   (defmacro repeat (form)
     `(funcall ,(lambda (form-closure)
                  (dotimes (i repeat-count)
                    (funcall form-closure)))
               (lambda () ,form))))

If expressions were lazy, and if functions had destructuring lambda
lists, we could do this as an inline function:

  (declaim (inline repeat))

  (let ((repeat-count 10))
    (defun repeat (lazy-value)
      (dotimes (i repeat-count)
        lazy-value)))  ;; assumption: force happens here, no caching.

Or we could write a macro-writing macro that would do all the LAMBDA's
and FUNCALL's for us: it would take all of the argument pieces, make
closures out of them, turn the body into a closure, and use a MACROLET
to rewrite the references into funcalls.

But doing so takes away power from the macro system and complicates its
output.

I don't want the inputs to a macro to be some opaque objects; I want
them to be lists that I can subject to arbitrary analysis and
synthesis. If I leave free variables in the resulting form, I want
them to connect to lexical or dynamic variables in the substitution
environment only. I don't want the object output by a macro to carry
bindings back to the factory from where it came, unless I put them in.
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <40461c70$0$273$edfadb0f@dread12.news.tele.dk>
Kaz Kylheku wrote:
> Jens Axel Søgaard <······@jasoegaard.dk> wrote in message news:<403cf721$0$255
> 
>>How do I in a defmacro refer to a variable in the lexical scope of
>>the definition of the variable?

> I don't see how you can fulfill this requirement without taking away
> the idea that a macro produces the source code to a form which is
> substituted somewhere, or changing what lexical scope means or what an
> expression is.

The package solution you gave in another post, has the effect I wanted.

> In Common Lisp, a form can only be in two distinct lexical scopes
> simultaneously if those scopes are nested.
> 
> What you are asking for is to be able to write a form (the macro body)
> which is simultaneously in the lexical scope of the macro definition,
> and in the lexical scope of the substitution site. Some of the free
> variable references in the body resolve to one scope, other ones to
> the other.
> 
> One way to do that is to change the inputs to the macro to be higher
> order functions. Even simple variable references become lambdas. This
> is how it works in lazy functional languages. A simple expression like
> X is really (LAMBDA () X), so that when you pass X down through a
> chain of function calls, its evaluation can be delayed, yet take place
> in the right lexical environment.
> 
> We can turn all of the argument forms supplied by the macro user into
> lambdas, and we can write the macro body as a lambda. Ensure that the
> closures are created in the appropriate environments, and it's done.
> 
> The question is, would you want the macro system do do all this
> inefficient, complicated junk automatically, under the hood, just to
> solve some problem that is better addressed using dynamically scoped
> module variables in a package?

I didn't know about the package solution at the time.

Thanks for a thorough explanation.

-- 
Jens Axel Søgaard
From: Anton van Straaten
Subject: Re: Scheme macros
Date: 
Message-ID: <oWL2c.908$%06.182@newsread2.news.pas.earthlink.net>
On 2004-02-27, Kaz Kylheku wrote:
> I don't want the inputs to a macro to be some opaque objects; I want
> them to be lists that I can subject to arbitrary analysis and synthesis.

But lists in Lisp are opaque objects.  They respond to an interface
consisting of operations such as car and cdr.  As long as whatever opaque
object you replace them with supports an interface that is at least
equivalent to the one that lists support, you haven't lost anything.

Anton
From: Nils Gösche
Subject: Re: Scheme macros
Date: 
Message-ID: <87d680dv74.fsf@darkstar.cartan.de>
Jens Axel Søgaard <······@jasoegaard.dk> writes:

[...]

I am not sure what you guys are talking about in this thread, but I
have a feeling you might want to play around with

(defmacro foo ()
  `(+ ,a 100))

Regards,
-- 
Nils Gösche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID #xEEFBA4AF
From: Nils Gösche
Subject: Re: Scheme macros
Date: 
Message-ID: <878yiodusr.fsf@darkstar.cartan.de>
I wrote:

> (defmacro foo ()
>   `(+ ,a 100))

Oops, somebody else posted the same thing already.  Sorry for the
redundancy.

Regards,
-- 
Nils Gösche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID #xEEFBA4AF
From: Marco Baringer
Subject: Re: Scheme macros
Date: 
Message-ID: <m2wu6b6pbz.fsf@bese.it>
Jens Axel Søgaard <······@jasoegaard.dk> writes:

> I do. The algorithm behind syntax-rules performs the
> renaming for you. In the defmacro case, you have to
> do it yourself.

doesn't syntax-rules have to do more than just renaming? Since, iirc,
scheme macros also capture their lexical context, with something like
this:

(let ((a 5))
  (define-syntax foo
     (syntax-rules ()
       ((_) (+ a 100))))
  (foo))

There is no way, in portable common lisp, to write a macro with the
same semantics as the above foo macro.

-- 
-Marco
Ring the bells that still can ring.
Forget your perfect offering.
There is a crack in everything.
That's how the light gets in.
     -Leonard Cohen
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1tjve$nsd$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> Pascal Costanza wrote:
> 
>> Here is the result of my first experiment with syntax-rules...
>>
>> (define-syntax special-let
>>   (syntax-rules ()
>>     ((_ () form ...)
>>      (let () form ...))
>>     ((_ ((var binding) more ...) form ...)
>>      (let ((var binding))
>>        (special-let (more ...) form ...)))
>>     ((_ (var more ...) form ...)
>>      (let ((var ()))
>>        (special-let (more ...) form ...)))))
>>
>> ...and for comparison purposes, here is how I would implement it in 
>> Common Lisp:
>>
>> (defmacro special-let (bindings &body body)
>>   (reduce (lambda (binding body)
>>         (cond ((symbolp binding)
>>            `(let ((,binding nil))
>>               ,body))
>>           (t `(let (,binding)
>>             ,body))))
>>       bindings
>>       :from-end t :initial-value `(progn ,@body)))
>>
>>
>> I am not sure what to think of this. Comments?
> 
> I am not sure. What happens in the CL macro
> if the body contains a macro that expands to
> one of the variables being bound in the special
> let?

It's a pity that we have got sidetracked again by the hygiene issue. I 
think the thread has shown that there are several options to take care 
of this by now. The most natural way in Common Lisp seems to be to use 
special variables and/or to use packages for namespacing issues.

In the meantime, I have experimented a little with ways to simulate 
syntax-rules/syntax-case in Common Lisp. I think today I have got the 
important insight: The Common Lisp way is to separate the concerns more 
clearly.

There are three issues, as far as I can see:

1) hygiene: The difference between CL and Scheme is that they just have 
different defaults here. The respective communities prefer their 
respective defaults, for various reasons, but there is no clear winner here.

2) referential transparency: We have discussed this in this thread.

3) pattern matching: This allows for a different programming style.

In CL, we have WITH-UNIQUE-NAMES and REBINDING for hygiene/referential 
transparency issues.

Now here comes the pattern matching stuff!

I have fiddled with DESTRUCTURING-BIND, error handling and dynamic 
scoping, and I think this works. See code below.


(use-package :lispworks)

(defvar *destructuring*)

(defmacro destructuring-case (expression &body forms)
   (with-unique-names (block)
     (rebinding (expression)
       `(block ,block
          (let ((*destructuring* t))
            (tagbody
              ,@(loop for form in forms
                      for tag = (gensym)
                      collect `(handler-bind
                                 ((error (lambda (err)
                                           (declare (ignore err))
                                           (when *destructuring*
                                             (go ,tag)))))
                                 (destructuring-bind
                                   ,(car form) ,expression
                                   (declare (ignorable _ __ ___))
                                   (let ((*destructuring* nil))
                                     (return-from ,block ,@(cdr form)))))
                      collect tag)
              (error "No match in destructuring-case.")))))))


;; test

(defmacro special-let (&whole form &rest args)
   (declare (ignore args))
   (destructuring-case form
     ((_ ((var binding) &rest more) &rest forms)
      `(let ((,var ,binding))
         (special-let (,@more) ,@forms)))
     ((_ (var &rest more) &rest forms)
      `(let ((,var nil))
         (special-let (,@more) ,@forms)))
     ((_ () &rest forms)
      `(let () ,@forms))))

For illustration purposes, here is the expansion of the 
destructuring-case form.

(LET* ((#:EXPRESSION829 FORM))
   (BLOCK #:BLOCK828
     (LET ((*DESTRUCTURING* T))
       (TAGBODY (HANDLER-BIND ((ERROR
                                (LAMBDA (ERR)
                                  (DECLARE (IGNORE ERR))
                                  (WHEN *DESTRUCTURING* (GO #:G830)))))
                  (DESTRUCTURING-BIND
                      (_ ((VAR BINDING) &REST MORE) &REST FORMS)
                      #:EXPRESSION829
                    (DECLARE (IGNORABLE _ __ ___))
                    (LET ((*DESTRUCTURING* NIL))
                      (RETURN-FROM #:BLOCK828
                        `(LET ((,VAR ,BINDING))
                           (SPECIAL-LET (,@MORE) ,@FORMS))))))
        #:G830  (HANDLER-BIND ((ERROR
                                (LAMBDA (ERR)
                                  (DECLARE (IGNORE ERR))
                                  (WHEN *DESTRUCTURING* (GO #:G831)))))
                  (DESTRUCTURING-BIND
                      (_ (VAR &REST MORE) &REST FORMS)
                      #:EXPRESSION829
                    (DECLARE (IGNORABLE _ __ ___))
                    (LET ((*DESTRUCTURING* NIL))
                      (RETURN-FROM #:BLOCK828
                        `(LET ((,VAR NIL))
                           (SPECIAL-LET (,@MORE) ,@FORMS))))))
        #:G831  (HANDLER-BIND ((ERROR
                                (LAMBDA (ERR)
                                  (DECLARE (IGNORE ERR))
                                  (WHEN *DESTRUCTURING* (GO #:G832)))))
                  (DESTRUCTURING-BIND
                      (_ NIL &REST FORMS)
                      #:EXPRESSION829
                    (DECLARE (IGNORABLE _ __ ___))
                    (LET ((*DESTRUCTURING* NIL))
                      (RETURN-FROM #:BLOCK828 `(LET () ,@FORMS)))))
        #:G832  (ERROR "No match in destructuring-case.")))))
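
For example, with the above loaded, I would expect:

* (special-let (x (y 2) z) (list x y z))
==> (NIL 2 NIL)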

Comments?


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <290220041633560321%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@newsreader2.netcologne.de>, Pascal Costanza
<········@web.de> wrote:

> I have fiddled with DESTRUCTURING-BIND, error handling and dynamic 
> scoping, and I think this works. See code below.
>
> Comments?

There's no good idea that can't be reinvented! I've been using an
essentially identical macro for quite a while now; it's part of my
common-idioms package on cliki. (Also happens to be the first google
hit for destructuring-case.)

Brian

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c1u1tb$i9b$1@newsreader2.netcologne.de>
Brian Mastenbrook wrote:

> In article <············@newsreader2.netcologne.de>, Pascal Costanza
> <········@web.de> wrote:
> 
>>I have fiddled with DESTRUCTURING-BIND, error handling and dynamic 
>>scoping, and I think this works. See code below.
>>
>>Comments?
> 
> There's no good idea that can't be reinvented! I've been using an
> essentially identical macro for quite a while now; it's part of my
> common-idioms package on cliki. (Also happens to be the first google
> hit for destructuring-case.)

I wasn't aware of that. Your version even supports literals and what 
Schemers call a "fender". Cool!


Pascal

-- 
Tyler: "How's that working out for you?"
Jack: "Great."
Tyler: "Keep it up, then."
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <404621b2$0$289$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> Jens Axel Søgaard wrote:
>> Pascal Costanza wrote:
>>
>>> Here is the result of my first experiment with syntax-rules...
>>>
>>> (define-syntax special-let
>>>   (syntax-rules ()
>>>     ((_ () form ...)
>>>      (let () form ...))
>>>     ((_ ((var binding) more ...) form ...)
>>>      (let ((var binding))
>>>        (special-let (more ...) form ...)))
>>>     ((_ (var more ...) form ...)
>>>      (let ((var ()))
>>>        (special-let (more ...) form ...)))))
>>>
>>> ...and for comparison purposes, here is how I would implement it in 
>>> Common Lisp:

>>> I am not sure what to think of this. Comments?

>> I am not sure. What happens in the CL macro
>> if the body contains a macro that expands to
>> one of the variables being bound in the special
>> let?

> It's a pity that we have got sidetracked again by the hygiene issue. I 
> think the thread has shown that there are several options to take care 
> of this by now. The most natural way in Common Lisp seems to be to use 
> special variables and/or to use packages for namespacing issues.

Yes. I think I focused on the hygiene problem, because that's the
hard problem (if you don't have packages).

> In the meantime, I have experimented a little with ways to simulate 
> syntax-rules/syntax-case in Common Lisp. I think today I have got the 
> important insight: The Common Lisp way is to separate the concerns more 
> clearly.
> 
> There are three issue, as far as I can see:
> 
> 1) hygiene: The difference between CL and Scheme is that they just have 
> different defaults here. The respective communities prefer their 
> respective defaults, for various reasons, but there is no clear winner 
> here.
> 
> 2) referential transparency: We have discussed this in this thread.
> 
> 3) pattern matching: This allows for a different programming style.

As you and Mastenbrook demonstrate, it is relatively easy to
copy the pattern-matching style of writing macros when you
have defmacro.


> (defmacro special-let (&whole form &rest args)
>   (declare (ignore args))
>   (destructuring-case form
>     ((_ ((var binding) &rest more) &rest forms)
>      `(let ((,var ,binding))
>         (special-let (,@more) ,@forms)))
>     ((_ (var &rest more) &rest forms)
>      `(let ((,var nil))
>         (special-let (,@more) ,@forms)))
>     ((_ () &rest forms)
>      `(let () ,@forms))))

I find this code much more readable than your original CL macro,
since it concentrates on the code transformation at hand.

For fun I have written a little desugarer for R5RS Scheme
(without macros) using (the PLT version of) syntax-case.

It can be found here:

     <http://www.scheme.dk/desugar.scm>

The forms quasisyntax (also written #`) and unsyntax (also written #,)
correspond to normal quasiquote and unquote, but construct syntax objects
instead of lists.

The forms quasisyntax/loc and syntax/loc (abbreviated qs/l and s/l in the code)
let the macro writer associate a source location with the syntax to be constructed.

An example (the desugarer is called d):

   (syntax-case* stx (quote if begin set! lambda define let
                      let* letrec cond else => case and or do)
                 identifier=?
    ...
    [(begin e)                      (d (s/l stx e))]
    ...)

The form (begin e) (= stx) desugars to e. The form (s/l stx e) is used
to associate the syntax location of (begin e) (i.e. the file, module, line number,
and column span) with the desugared syntax e. If an error occurs during the desugaring
of e, then it's possible to give an error message to the user in terms of the original
code.

The desugaring of let is as follows:

     [(let ([vs es] ...) e)          (let ([des (smap d #'(es ...))])
                                       (qs/l stx (let #,(smap2 list #'(vs ...) des) #,(d #'e))))]
     [(let ([vs es] ...) e e* ...)   (d (s/l stx (let ([vs es] ...) (begin e e* ...))))]
     [(let t ([vs es] ...) e e* ...) (d (s/l stx (letrec ((t (lambda (vs ...) e e* ...))))))]

where smap and smap2 are like MAP but work on syntax objects representing lists.

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c25kvh$64l$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> Pascal Costanza wrote:
> 

>> It's a pity that we have got sidetracked again by the hygiene issue. I 
>> think the thread has shown that there are several options to take care 
>> of this by now. The most natural way in Common Lisp seems to be to use 
>> special variables and/or to use packages for namespacing issues.
> 
> Yes. I think I focused on the hygiene problem, because that's the
> hard problem (if you don't have packages).

OK, I see.

>> (defmacro special-let (&whole form &rest args)
>>   (declare (ignore args))
>>   (destructuring-case form
>>     ((_ ((var binding) &rest more) &rest forms)
>>      `(let ((,var ,binding))
>>         (special-let (,@more) ,@forms)))
>>     ((_ (var &rest more) &rest forms)
>>      `(let ((,var nil))
>>         (special-let (,@more) ,@forms)))
>>     ((_ () &rest forms)
>>      `(let () ,@forms))))
> 
> 
> I find this code much more readable than you original CL macro,
> since it concentrates at the code transformation at hand.

Funny that I find my original code easier to understand, because it more 
clearly states the flow of control IMHO. With destructuring-case / 
syntax-rules/case one uses recursion instead of iteration, and in this 
particular example I find iteration more natural. But this is certainly 
just a matter of taste.

Anyway, the destructuring-case is extremely close to the syntax-rules 
variant. It uses &rest and ,@ instead of ... - it would be interesting 
to see if one could make effective use of other lambda keywords.
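
For instance (just a sketch, with a made-up WITH-THING macro on top of the
DESTRUCTURING-CASE above), &rest together with &key can both collect and
sanity-check keyword arguments inside a pattern:

(defmacro with-thing (&whole form &rest args)
   (declare (ignore args))
   (destructuring-case form
     ((_ (var pathname &rest open-args &key &allow-other-keys) &rest body)
      `(let ((,var (open ,pathname ,@open-args)))
         (unwind-protect (progn ,@body)
           (close ,var))))))

;; e.g. (with-thing (s "data.txt" :direction :input) (read s))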

> For fun I have written a little desugarer for R5RS Scheme
> (without macros) using (the PLT version of) syntax-case.
> 
> It can be found here:
> 
>     <http://www.scheme.dk/desugar.scm>

Hm, interesting. But what would you use that for? (This is not a rhetorical 
question.)

My first guess would be that Common Lispers would rather trust their 
vendors to do these things right for the ANSI stuff. It would be 
interesting to have such facilities for user-supplied macros, but I 
guess that's what environment objects are supposed to be used for in 
combination with *MACROEXPAND-HOOK*. (Environment objects are not 
supported in ANSI, but see 8.5 in CLtL2.)

But as I said, this is my first guess, and I haven't thought about the 
details. It's not unlikely that I am missing something.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <40465ccb$0$266$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> Jens Axel Søgaard wrote:
>> Pascal Costanza wrote:

>> I find this code much more readable than you original CL macro,
>> since it concentrates at the code transformation at hand.

> Funny that I find my original code easier to understand, because it more 
> clearly states the flow of control IMHO. With destructuring-case / 
> syntax-rules/case one uses recursion instead of iteration, and in this 
> particular example I find iteration more natural. But this is certainly 
> just a matter of taste.

If I need to convince myself that the transformation is correct,
then in

 >>> (defmacro special-let (&whole form &rest args)
 >>>   (declare (ignore args))
 >>>   (destructuring-case form
 >>>     ((_ ((var binding) &rest more) &rest forms)
 >>>      `(let ((,var ,binding))
 >>>         (special-let (,@more) ,@forms)))
 >>>     ((_ (var &rest more) &rest forms)
 >>>      `(let ((,var nil))
 >>>         (special-let (,@more) ,@forms)))
 >>>     ((_ () &rest forms)
 >>>      `(let () ,@forms))))

I need to focus on three small transformations (which can be
checked separately) rather than having one large transformation.

E.g. the first clause states that

(let ((<var1> <exp1>)
       (<var2> <exp2>)
       ...)
   <body-exp> ...)

can be rewritten as

(let ((<var1> <exp1>))
    (let ((<var2> <exp2>)
          ...)
      <body-exp> ...))  .

That this transformation is correct is easy to see.

> Anyway, the destructuring-case is extremely close to the syntax-rules 
> variant. It uses &rest and ,@ instead of ... - it would be interesting 
> to see if one could make effective use of other lambda keywords.

Indeed.

>> For fun I have written a little desugarer for R5RS Scheme
>> (without macros) using (the PLT version of) syntax-case.
>>
>> It can be found here:
>>
>>     <http://www.scheme.dk/desugar.scm>
> 
> 
> Hm, interesting. But what would you use that for? (This not a rhetorical 
> question.)

First, as a (hopefully) interesting example of the use of syntax-case.

Second, in a Scheme compiler as a first approximation for the desugaring
phase (since it doesn't handle macros).

> My first guess would be that Common Lispers would rather trust their 
> vendors to do these things right for the ANSI stuff. It would be 
> interesting to have such facilities for user-supplied macros, but I 
> guess that's what environment objects are supposed to be used for in 
> combination with *MACROEXPAND-HOOK*. (Environment objects are not 
> supported in ANSI, but see 8.5 in CLtL2.)
> 
> But as I said, this is my first guess, and I haven't thought about the 
> details. It's unlikely that I am missing something.

It was just a toy example. I am not going to replace my system's
normal macro expander :-)

-- 
Jens Axel Søgaard
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c271fc$nj8$1@newsreader2.netcologne.de>
Jens Axel Søgaard wrote:

> If I need to convince myself that the transformation is correct,
> then in
> 
>  >>> (defmacro special-let (&whole form &rest args)
>  >>>   (declare (ignore args))
>  >>>   (destructuring-case form
>  >>>     ((_ ((var binding) &rest more) &rest forms)
>  >>>      `(let ((,var ,binding))
>  >>>         (special-let (,@more) ,@forms)))
>  >>>     ((_ (var &rest more) &rest forms)
>  >>>      `(let ((,var nil))
>  >>>         (special-let (,@more) ,@forms)))
>  >>>     ((_ () &rest forms)
>  >>>      `(let () ,@forms))))
> 
> I need to focus on three small transformations (which can be
> checked seperately) rather than having one large transformation.

Yes, I understand that.

>>> For fun I have written a little desugarer for R5RS Scheme
>>> (without macros) using (the PLT version of) syntax-case.
>>>
>>> It can be found here:
>>>
>>>     <http://www.scheme.dk/desugar.scm>
>>
>> Hm, interesting. But what would you use that for? (This not a 
>> rhetorical question.)
> 
> First, as a (hopefully) interesting example of the use of syntax-case.
> 
> Second, in a Scheme compiler as a first approximation for the desugaring
> phase (since it doesn't handle macros).
> 
>> My first guess would be that Common Lispers would rather trust their 
>> vendors to do these things right for the ANSI stuff. It would be 
>> interesting to have such facilities for user-supplied macros, but I 
>> guess that's what environment objects are supposed to be used for in 
>> combination with *MACROEXPAND-HOOK*. (Environment objects are not 
>> supported in ANSI, but see 8.5 in CLtL2.)
>>
>> But as I said, this is my first guess, and I haven't thought about the 
>> details. It's unlikely that I am missing something.
> 
> It was just a toy example. I am not going to replace my system's
> normal macro expander :-)

:)

However, I think by now that my first guess was wrong. When we take the
very simple case of a symbol in Common Lisp source code, it is always
eq-identical to itself, no matter where it is mentioned. So there is no 
way to unambiguously map it to additional information, say, in a 
hashtable. Instead, one really needs a more sophisticated data structure 
to represent both the symbol and, say, source code locations. So this is 
what syntax objects are actually there for.
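
A one-liner makes the point:

(eq (second '(+ a 100)) (second '(* a 2)))  ; => T, the "two" A's are one object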

Thanks for your patience - I have learned something.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403032307.34680671@posting.google.com>
Pascal Costanza <········@web.de> wrote:
> It's a pity that we have got sidetracked again by the hygiene issue. I 
> think the thread has shown that there are several options to take care 
> of this by now. The most natural way in Common Lisp seems to be to use 
> special variables and/or to use packages for namespacing issues.

I don't see how special variables and/or namespaces could give you
an equivalent of referential transparency for local macros.

> There are three issue, as far as I can see:
> 
> 1) hygiene: The difference between CL and Scheme is that they just have 
> different defaults here. The respective communities prefer their 
> respective defaults, for various reasons, but there is no clear winner here.
> 
> 2) referential transparency: We have discussed this in this thread.
> 
> 3) pattern matching: This allows for a different programming style.

Those are indeed the three differences between macros in CL and
Scheme.

The importance of these differences lies not so much in whether
you can get your work done as in what kinds of languages you can
design without losing the benefits of reliable macros.  Scheme-style
macros have been implemented for rather different languages like
Modula-2.  If you look at all the features of Common Lisp that you
are using to get the same effect, it's a little hard to believe that
the CL approach would generalize as well to other languages.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c276m4$3lc$1@newsreader2.netcologne.de>
William D Clinger wrote:

> Pascal Costanza <········@web.de> wrote:
> 
>>It's a pity that we have got sidetracked again by the hygiene issue. I 
>>think the thread has shown that there are several options to take care 
>>of this by now. The most natural way in Common Lisp seems to be to use 
>>special variables and/or to use packages for namespacing issues.
> 
> I don't see how special variables and/or namespaces could give you
> an equivalent of referential transparency for local macros.

This should do the job:

(defmacro macrolet* (vars defs &body body)
   (let ((syms (mapcar (lambda (var)
                         (cons var (copy-symbol var)))
                       vars)))
     `(let ,(mapcar (lambda (var)
                      `(,(cdr (assoc var syms))
                        (lambda () ,var)))
                    vars)
        (declare (special ,@(mapcar #'cdr syms)))
        (macrolet
            ,(mapcar
              (lambda (def)
                `(,(car def)
                  ,(cadr def)
                  `(let ()
                     (declare (special ,@',(mapcar #'cdr syms)))
                     (symbol-macrolet
                         ,',(mapcar
                             (lambda (var)
                               `(,var (funcall ,(cdr (assoc var syms)))))
                             vars)
                       ,@',(cddr def)))))
              defs)
          ,@body))))

(let ((x 5))
   (macrolet* (x)
       ((test () (list x)))
     (let ((x 55))
       (test))))
=> (5)

The expansion is this:

(LET ((X 5))
   (LET ((#:X (LAMBDA () X)))
     (DECLARE (SPECIAL #:X))
     (MACROLET
	((TEST ()
            `(LET ()
               (DECLARE (SPECIAL #:X))
               (SYMBOL-MACROLET ((X (FUNCALL #:X)))
                 (LIST X)))))
       (LET ((X 55))
         (LET ()
           (DECLARE (SPECIAL #:X))
           (SYMBOL-MACROLET ((X (FUNCALL #:X)))
             (LIST (FUNCALL #:X))))))))

(This macro doesn't yet check whether variable names clash with the
parameter names of local macros.)

>>There are three issue, as far as I can see:
>>
>>1) hygiene: The difference between CL and Scheme is that they just have 
>>different defaults here. The respective communities prefer their 
>>respective defaults, for various reasons, but there is no clear winner here.
>>
>>2) referential transparency: We have discussed this in this thread.
>>
>>3) pattern matching: This allows for a different programming style.
> 
> Those are indeed the three differences between macros in CL and
> Scheme.
> 
> The importance of these differences lies not so much in whether
> you can get your work done as in what kinds of languages you can
> design without losing the benefits of reliable macros.  Scheme-style
> macros have been implemented for rather different languages like
> Modula-2.  If you look at all the features of Common Lisp that you
> are using to get the same effect, it's a little hard to believe that
> the CL approach would generalize as well to other languages.

This is a very interesting perspective that I hadn't thought of at all. 
You're right, and this resonates with the notion that Scheme is a 
testbed for language features that go beyond Lisp. (I don't recall 
exactly where I read this...)

Thanks a lot to Anton, Jens and you for a very fruitful discussion!


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403041344.720469d5@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<····························@posting.google.com>...
> Pascal Costanza <········@web.de> wrote:
> > There are three issues, as far as I can see:
> > 
> > 1) hygiene: The difference between CL and Scheme is that they just have 
> > different defaults here. The respective communities prefer their 
> > respective defaults, for various reasons, but there is no clear winner here.
> > 
> > 2) referential transparency: We have discussed this in this thread.
> > 
> > 3) pattern matching: This allows for a different programming style.
> 
> Those are indeed the three differences between macros in CL and
> Scheme.
> 
> The importance of these differences lies not so much in whether
> you can get your work done as in what kinds of languages you can
> design without losing the benefits of reliable macros.  Scheme-style
> macros have been implemented for rather different languages like
> Modula-2.  If you look at all the features of Common Lisp that you
> are using to get the same effect, it's a little hard to believe that
> the CL approach would generalize as well to other languages.

So what? Maybe it can't be easily generalized to other languages
because it hasn't been braindamaged to fit.

Lisp macros require that the source code be a programmer-visible data
structure, full stop. So to have Lisp-like macros in Modula-2, you'd
have to have a way of operating on Modula-2 source as an abstract
syntax tree.

The Scheme macro system does not rely on the program having a nested
list structure. It does not pass unevaluated source code to a macro
transformer. It implements some hodge-podge semantics in which the
inputs to the macro are not pure source, but pieces of a program
containing bindings. The pattern matcher behaves differently based on
whether symbols occurring in the macro call have bindings or not in the
calling original environment. The template itself isn't pure code
either because its free variables magically attach to bindings in the
defining environment.

I disagree with the insinuation that Modula-2 is ``rather different''
from Scheme. It's rather different from Lisp, that's certain. Add
lexical closures, macros and continuations to Modula-2 and you have
Scheme with Pascal-like syntax.
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403042245.6d694b37@posting.google.com>
···@ashi.footprints.net (Kaz Kylheku) wrote:
> Lisp macros require that the source code be a programmer-visible data
> structure, full stop. So to have Lisp-like macros in Modula-2, you'd
> have to have a way of operating on Modula-2 source as an abstract
> syntax tree.

True.

> The Scheme macro system does not rely on the program having a nested
> list structure.

Untrue.  The Scheme macro system *does* rely on this.  That's the
main problem you have to solve when implementing Scheme-like macros
for Modula-2.  The obvious approach is to figure out some way for the
macro system to operate on Modula-2 source as abstract syntax trees.

> It does not pass unevaluated source code to a macro
> transformer.

Untrue.

> It implements some hodge-podge semantics in which the
> inputs to the macro are not pure source, but pieces of a program
> containing bindings.

This doesn't sound true to me, but perhaps you understand these
things better than I.

> The pattern matcher behaves differently based on
> whether symbols occurring in the macro call have bindings or not in the
> calling original environment.

Untrue.  You may be confusing pattern matching with transcription,
and don't understand transcription, but that's just a guess.

> The template itself isn't pure code
> either because its free variables magically attach to bindings in the
> defining environment.

True.

> I disagree with the insinuation that Modula-2 is ``rather different''
> from Scheme.

That is probably true.

By the way, can you explain to me why Pascal Costanza's MACROLET*
doesn't work with this minor variation of the example he gave?
Can you explain how to fix his code so things like this will work
reliably?

(let ((x 5))
  (macrolet* (x)
    ((test () (list x))   
     (set37 () (setf x 37)))
    (let ((x 55))
      (set37)
      (test))))

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c29dns$bge$1@newsreader2.netcologne.de>
William D Clinger wrote:

> By the way, can you explain to me why Pascal Costanza's MACROLET*
> doesn't work with this minor variation of the example he gave?
> Can you explain how to fix his code so things like this will work
> reliably?
> 
> (let ((x 5))
>   (macrolet* (x)
>     ((test () (list x))   
>      (set37 () (setf x 37)))
>     (let ((x 55))
>       (set37)
>       (test))))

LispWorks tells me that there is a setf expander missing for the funcall 
of the generated special symbol. So it should probably be possible to 
add a setf expander for that, but I haven't checked.

Independent of whether this works or not, this thing starts to turn into 
a Turing-equivalence argument, and that's pretty boring. To me, it's 
clear by now that what Anton said at the beginning is right: 
syntax-rules/syntax-case have their respective advantages wrt 
expressivity. The fact that these advantages don't seem to be important 
in Common Lisp is largely irrelevant - it surely doesn't falsify that 
assessment.

To put it this way: Assume you are forced to program in Java instead of 
Common Lisp - this happens in the "real world" - then it's probably 
better to have a Scheme-style macro system than to have no Lisp feature 
at all. ;)


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <ad2vodk7.fsf@comcast.net>
Pascal Costanza <········@web.de> writes:

> William D Clinger wrote:
>
>> By the way, can you explain to me why Pascal Costanza's MACROLET*
>> doesn't work with this minor variation of the example he gave?
>> Can you explain how to fix his code so things like this will work
>> reliably?
>> (let ((x 5))
>>   (macrolet* (x)
>>     ((test () (list x))        (set37 () (setf x 37)))
>>     (let ((x 55))
>>       (set37)
>>       (test))))
>
> LispWorks tells me that there is a setf expander missing for the
> funcall of the generated special symbol. So it should probably be
> possible to add a setf expander for that, but I haven't checked.
>
> Independent of whether this works or not, this thing starts to turn
> into a Turing-equivalence argument, and that's pretty boring. 

Will's example is a bit more subtle than that...

-- 
~jrm
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2ah88$fsv$1@newsreader2.netcologne.de>
Joe Marshall wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>William D Clinger wrote:
>>
>>
>>>By the way, can you explain to me why Pascal Costanza's MACROLET*
>>>doesn't work with this minor variation of the example he gave?
>>>Can you explain how to fix his code so things like this will work
>>>reliably?
>>>(let ((x 5))
>>>  (macrolet* (x)
>>>    ((test () (list x))        (set37 () (setf x 37)))
>>>    (let ((x 55))
>>>      (set37)
>>>      (test))))
>>
>>LispWorks tells me that there is a setf expander missing for the
>>funcall of the generated special symbol. So it should probably be
>>possible to add a setf expander for that, but I haven't checked.
>>
>>Independent of whether this works or not, this thing starts to turn
>>into a Turing-equivalence argument, and that's pretty boring. 
> 
> Will's example is a bit more subtle than that...

First note that I meant my remark as an acknowledgement of 
syntax-rules/syntax-case. The code I need to write to make things work 
with DEFMACRO starts to get very hairy, just to prove that it can be 
made to work. But that's not the really interesting point. What's 
interesting is how convenient it is to express something, and 
syntax-rules/syntax-case indeed seem to do a much better job here.

Nevertheless, because of your remark I have tried to make it work, and I 
have to admit that I don't get the subtlety. I am probably missing 
something. What's the point? Is it about the reliability? The code below 
seems to work...

Here's the code.

(defmacro macrolet* (vars defs &body body)
   (let ((syms (mapcar (lambda (var)
                         (cons var (copy-symbol var)))
                       vars)))
     `(let ,(mapcar (lambda (var)
                      `(,(cdr (assoc var syms))
                        (cons (lambda () ,var)
                              (lambda (value)
                                (setf ,var value)))))
                    vars)
        (declare (special ,@(mapcar #'cdr syms)))
        (macrolet
            ,(mapcar
              (lambda (def)
                `(,(car def)
                  ,(cadr def)
                  `(let ()
                     (declare (special ,@',(mapcar #'cdr syms)))
                     (flet ,',(loop for var in vars
                                    for sym = (cdr (assoc var syms))
                                    collect `(,sym ()
                                               (funcall (car ,sym)))
                                    collect `((setf ,sym) (value)
                                               (funcall
                                                (cdr ,sym) value)))
                       (symbol-macrolet
                           ,',(mapcar
                               (lambda (var)
                                 `(,var (,(cdr (assoc var syms)))))
                               vars)
                         ,@',(cddr def))))))
              defs)
          ,@body))))

(let ((x 5))
   (macrolet* (x)
       ((test () (list x))
        (set37 () (setf x 37)))
     (let ((x 55))
       (set37)
       (test))))

==> (37)

Here's the expansion.

(LET ((X 5))
   (LET ((#:X (CONS (LAMBDA () X)
                    (LAMBDA (VALUE)
                      (LET* ((#:G919 VALUE))
                        (SETQ X #:G919))))))
     (DECLARE (SPECIAL #:X))
     (MACROLET ((TEST ()
                  `(LET ()
                     (DECLARE (SPECIAL #:X))
                     (FLET ((#:X ()
                              (FUNCALL (CAR #:X)))
                            ((SETF #:X) (VALUE)
                              (FUNCALL (CDR #:X) VALUE)))
                       (SYMBOL-MACROLET ((X (#:X)))
                         (LIST X)))))
                (SET37 ()
                  `(LET ()
                     (DECLARE (SPECIAL #:X))
                     (FLET ((#:X ()
                              (FUNCALL (CAR #:X)))
                            ((SETF #:X) (VALUE)
                              (FUNCALL (CDR #:X) VALUE)))
                       (SYMBOL-MACROLET ((X (#:X)))
                         (SETF X 37))))))
       (LET ((X 55))
         (LET ()
           (DECLARE (SPECIAL #:X))
           (FLET ((#:X ()
                    (FUNCALL (CAR #:X)))
                  ((SETF #:X) (VALUE)
                    (FUNCALL (CDR #:X) VALUE)))
             (SYMBOL-MACROLET ((X (#:X)))
               (LET* ((#:G928 37))
                 (SETF::GENSYM\ \"X\" #:G928)))))
         (LET ()
           (DECLARE (SPECIAL #:X))
           (FLET ((#:X ()
                    (FUNCALL (CAR #:X)))
                  ((SETF #:X) (VALUE)
                    (FUNCALL (CDR #:X) VALUE)))
             (SYMBOL-MACROLET ((X (#:X)))
               (LIST (#:X)))))))))


Pascal


-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <y8qfyrjz.fsf@ccs.neu.edu>
Pascal Costanza <········@web.de> writes:

> Nevertheless, because of your remark I have tried to make it work, and
> I have to admit that I don't get the subtlety. I am probably missing
> something. What's the point? Is it about the reliability? The code
> below seems to work...

Before I make a complete ass of myself, let me make sure I understand
what macrolet* is supposed to do.  Am I correct in that the VARS list
indicates which variables are to be scoped at the macro-definition
site and all other variables are scoped at the macro-use site?
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403052029.2a3f7f6b@posting.google.com>
Joe Marshall wrote:
> > Nevertheless, because of your remark I have tried to make it work, and
> > I have to admit that I don't get the subtlety. I am probably missing
> > something. What's the point? Is it about the reliability? The code
> > below seems to work...
> 
> Before I make a complete ass of myself, let me make sure I understand
> what macrolet* is supposed to do....

I too am having a hard time understanding the intended syntax/semantics
of macrolet*.  Reasoning by analogy with macrolet isn't working for me,
because

(let ((x 5))
  (macrolet ((test (y) `(list x ,y)))
    (let ((x 55))
      (test x))))

evaluates to (55 55) but

(let ((x 5))
  (macrolet* ((test (y) `(list x ,y)))
    (let ((x 55))
      (test x))))

results in an error.  I doubt whether this error is very interesting,
but it is preventing me from testing more interesting examples.

Will
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403052048.de2097b@posting.google.com>
In my previous post, the second example was wrong.  It should
have been

(let ((x 5))
  (macrolet* (x) ((test (y) `(list x ,y)))
    (let ((x 55))
      (test x))))

That example gives an error, not the expected (LIST X 55).
The reason I expected that answer is that

(let ((x 5))
  (macrolet* (x) ((test () `(list x ,x)))
    (let ((x 55))
      (test))))

evaluates to (LIST X 5).  Okay, so the body of a macro defined
by macrolet* isn't evaluated like the body of a macro defined
by macrolet.  Fair enough, but then

(let ((x 5))
  (macrolet* (x) ((test (y) (list x y)))
    (let ((x 55))
      (test 7))))

also gives an error, not the expected (5 7).

What I'm trying to do is to run some examples that would show
the macrolet* macro delivers the "equivalent of referential
transparency for local macros" that Pascal claimed for it.
Right now it seems to blow up on all interesting examples,
but that's probably because I don't understand how it should
be used.

Will
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <060320040339343622%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <···························@posting.google.com>, William D
Clinger <··········@verizon.net> wrote:

> What I'm trying to do is to run some examples that would show
> the macrolet* macro delivers the "equivalent of referential
> transparency for local macros" that Pascal claimed for it.
> Right now it seems to blow up on all interesting examples,
> but that's probably because I don't understand how it should
> be used.

Pascal's macro is not sufficient to provide referential transparency;
however, I have managed to construct a diabolical macro that does. This
macro steals heavily from an earlier unportable macro (part of
common-idioms now) which I wrote to prove to Dan Friedman that you can
play games with the environment using the CL macro system, in ways like
this:

(let ((x 1) (y 2))
  (let-env env
           (append
            (let ((x 3) (y 4))
              (setf x 5)
              (env (x)
                   (setf x 6)
                   (setf y 7))
              (list x y))
            (list x y)))) => (5 7 6 2)

The CL macro which implements this is much larger than the
corresponding syntax-case, but it works without a code walker (see
footnote). The only downside is that it relies on the order of
macroexpansion, which is not portable. However, we can do away with
that if we know at the point of environment capture which variables we
will need; then the macro can be completely portable. This is the first
piece of the puzzle.

The second piece is in understanding what exactly constitutes a
reference for the concept of referential transparency. In syntax-rules,
this is not much of an issue to define: a symbol will either be thrown
away or appear in the output; if it does appear it will either be
quoted or in a position to be evaluated. This makes it easy to
implement syntax-rules hygiene via something like explicit renaming.
However, in CL, the situation is not so simple. For as much as the
system knows, the macro in question could be EVALing its input at
macroexpansion time. As such, to achieve referential transparency, we
need some way of letting the macroexpander know that particular
variables will appear unquoted in the eventual output. To accomplish
that I have modified your example as follows:

(let ((x 5))
  (macrolet* (x)
             ((test (y) (code (y) `(list x ,y))))
             (let ((x 55))
               (test (1+ x))))) => (5 56)

The (code (y) ...) form establishes a scope where Y is bound to
something that has had its references taken care of properly. If
omitted, the example will return (5 6).

You can find it at
http://www.cs.indiana.edu/~bmastenb/misc/macroletstar.lisp . I take no
responsibility for any psychotic breakdowns caused by reading this
code. Be sure to replace the import for with-gensyms with whatever
version you prefer.

Footnote: Actually, it uses a very sophisticated code-walker, as does
this macro. It just happens to be the one that's built into your
implementation. Hurray for inter-macro communication via property lists
on uninterned symbols.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <060320040401142024%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <·······································@cs.indiana.edu>,
Brian Mastenbrook <····················@cs.indiana.edu> wrote:

> You can find it at
> http://www.cs.indiana.edu/~bmastenb/misc/macroletstar.lisp . 

Er, if you happened to fetch it in the past few minutes, you got a
version with a completely misleading (and completely false!) docstring
on let-env*. It's been removed now.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403060847.6762ada4@posting.google.com>
Brian Mastenbrook <····················@cs.indiana.edu> wrote:
> Pascal's macro is not sufficient to provide referential transparency;
> however, I have managed to construct a diabolical macro that does....
[snip]
> You can find it at
> http://www.cs.indiana.edu/~bmastenb/misc/macroletstar.lisp . I take no
> responsibility for any psychotic breakdowns caused by reading this
> code. Be sure to replace the import for with-gensyms with whatever
> version you prefer.

Thank you, Brian.  I will study your diabolical macro.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2fb2a$lhr$2@newsreader2.netcologne.de>
William D Clinger wrote:

> Brian Mastenbrook <····················@cs.indiana.edu> wrote:
> 
>>Pascal's macro is not sufficient to provide referential transparency;
>>however, I have managed to construct a diabolical macro that does....
> 
> [snip]
> 
>>You can find it at
>>http://www.cs.indiana.edu/~bmastenb/misc/macroletstar.lisp . I take no
>>responsibility for any psychotic breakdowns caused by reading this
>>code. Be sure to replace the import for with-gensyms with whatever
>>version you prefer.
> 
> Thank you, Brian.  I will study your diabolical macro.

Seconded.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Brian Mastenbrook
Subject: Re: Scheme macros
Date: 
Message-ID: <070320041003377124%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@newsreader2.netcologne.de>, Pascal Costanza
<········@web.de> wrote:

> > Thank you, Brian.  I will study your diabolical macro.
> 
> Seconded.

Thanks to Paul Dietz for pointing out that this macro didn't need to
use property lists at all. Oops! This macro was originally derived from
one which did (as I explained) and so was a bit hairy at first.

I've updated the version on the server (
http://www.cs.indiana.edu/~bmastenb/misc/macroletstar.lisp ), and it
should now be 20% less diabolical.

Brian

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2fa7v$kam$1@newsreader2.netcologne.de>
William D Clinger wrote:

> What I'm trying to do is to run some examples that would show
> the macrolet* macro delivers the "equivalent of referential
> transparency for local macros" that Pascal claimed for it.
> Right now it seems to blow up on all interesting examples,
> but that's probably because I don't understand how it should
> be used.

Right, it doesn't work. Thanks for the counter example.

Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403052106.3a9e688d@posting.google.com>
I apologize for the consecutive messages, but it occurred to me
that not everyone on this newsgroup has even seen any examples
of referentially transparent macros.  Here is one of the simplest
possible examples:

  (let ((x 5))
    (let-syntax ((test (syntax-rules ()
                        ((_ y) (list x y)))))
      (let ((x 55))
        (test x))))

In Scheme, this evaluates to (5 55).  It has been claimed that
this kind of macro behavior can be reliably duplicated in Common
Lisp using special variables and packages.  I'm trying to
understand how.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2favv$lhr$1@newsreader2.netcologne.de>
William D Clinger wrote:

> I apologize for the consecutive messages, but it occurred to me
> that not everyone on this newsgroup has even seen any examples
> of referentially transparent macros.  Here is one of the simplest
> possible examples:
> 
>   (let ((x 5))
>     (let-syntax ((test (syntax-rules ()
>                         ((_ y) (list x y)))))
>       (let ((x 55))
>         (test x))))
> 
> In Scheme, this evaluates to (5 55).  It has been claimed that
> this kind of macro behavior can be reliably duplicated in Common
> Lisp using special variables and packages.  I'm trying to
> understand how.

...and I don't understand why it shouldn't work. The idea is that at any 
stage in a program, you can pick out local bindings and save them in 
global bindings for later reuse. If you need to be able to modify the 
local bindings or see changes from other places you essentially need 
double indirection, for example by closing over the bindings you are 
interested in. Furthermore, you can secure bindings captured like that 
by associating the information with uninterned symbols, and only 
selectively providing these symbols. AFAICS, these are the essential 
building blocks to play around with environments in nearly any way you'd 
like to.
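
To make that concrete, here is a minimal, untested sketch of the kind of 
double indirection I mean; CAPTURE, CAPTURED-REF and *CAPTURED* are just 
made-up names for illustration:

(defvar *captured* (make-hash-table)
  "Maps keys to (reader . writer) closure pairs over captured bindings.")

(defmacro capture (var key)
  "Close over the current binding of VAR and stash the closures under KEY."
  `(setf (gethash ,key *captured*)
         (cons (lambda () ,var)
               (lambda (value) (setf ,var value)))))

(defmacro captured-ref (key)
  "Read the binding that was captured under KEY."
  `(funcall (car (gethash ,key *captured*))))

;; (let ((x 5))
;;   (capture x 'outer-x)
;;   (let ((x 55))
;;     (list x (captured-ref 'outer-x))))  ; => (55 5)

Using an uninterned symbol instead of 'OUTER-X as the key, and handing it 
out only selectively, gives the access control I mentioned.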

Of course, the details will be hairy, so one eventually needs macros 
that capture the important idioms. (This is what I meant with my remark 
that this starts to turn into a Turing equivalence argument.)

So why are you doubting that it would work?


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403071130.43018d43@posting.google.com>
Pascal Costanza <········@web.de> wrote:
> So why are you doubting that it would work?

At this point, the reason I doubt whether your macro works is
that both you and Brian Mastenbrook have said it doesn't work.

It was clear from the beginning that you understood the issues
better than most people who offer unsubstantiated claims of the
form "Oh, that's easy to do in Common Lisp".  That's why I took
your macro seriously enough to study it.

Your macro definition was surprisingly small, and expansions of
examples didn't seem to have enough getters and setters for it
to work.  Instead of trying to construct a proof that your macro
works correctly, I constructed tests, intending to look at the
expanded code.  Most of those tests blew up.

I haven't had time to study Brian's code yet, but it looks to
have about the right order of complexity.  I don't doubt that
it works, and I'm looking forward to studying it.

Both Kohlbecker's algorithm and the Clinger-Rees "Macros That
Work" algorithm were developed with Common Lisp in mind.  X3J13
rejected this macro technology, but there has never been any
doubt in my mind that this technology could be made to work in
Common Lisp.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2ga4r$lbj$1@newsreader2.netcologne.de>
William D Clinger wrote:

> Pascal Costanza <········@web.de> wrote:
> 
>>So why are you doubting that it would work?
> 
> At this point, the reason I doubt whether your macro works is
> that both you and Brian Mastenbrook have said it doesn't work.

;)

> It was clear from the beginning that you understood the issues
> better than most people who offer unsubstantiated claims of the
> form "Oh, that's easy to do in Common Lisp".  That's why I took
> your macro seriously enough to study it.

OK, thank you. It was more intended as an argument in an ongoing 
discussion. More often than not, I can just express my thoughts better 
as code than as text.

Before this, I hadn't understood the issues that involve local lexical 
bindings. In fact, I still think I don't completely understand them. But 
thanks to this discussion, I now have a much better picture of what 
kinds of problems hygienic macro systems are trying to solve.

Thank you.

> Both Kohlbecker's algorithm and the Clinger-Rees "Macros That
> Work" algorithm were developed with Common Lisp in mind.  X3J13
> rejected this macro technology, but there has never been any
> doubt in my mind that this technology could be made to work in
> Common Lisp.

OK, thanks for this clarification.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2at44$7b3$1@newsreader2.netcologne.de>
Joe Marshall wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>Nevertheless, because of your remark I have tried to make it work, and
>>I have to admit that I don't get the subtlety. I am probably missing
>>something. What's the point? Is it about the reliability? The code
>>below seems to work...
> 
> Before I make a complete ass of myself, let me make sure I understand
> what macrolet* is supposed to do.  Am I correct in that the VARS list
> indicates which variables are to be scoped at the macro-definition
> site and all other variables are scoped at the macro-use site?

Right.

Here is the "template" code from which I have derived the macro:

(let ((x 5))
   (let ((%unique-symbol%
          (cons (lambda () x)
                (lambda (value) (setf x value)))))
     (declare (special %unique-symbol%))
     (macrolet ((test ()
                  '(let ()
                     (declare (special %unique-symbol%))
                     (flet ((%unique-symbol% ()
                              (funcall (car %unique-symbol%)))
                            ((setf %unique-symbol%) (value)
                              (funcall (cdr %unique-symbol%)
                                       value)))
                       (symbol-macrolet ((x (%unique-symbol%)))
                         (list x)))))
                (setf37 ()
                  '(let ()
                     (declare (special %unique-symbol%))
                     (flet ((%unique-symbol% ()
                              (funcall (car %unique-symbol%)))
                            ((setf %unique-symbol%) (value)
                              (funcall (cdr %unique-symbol%)
                                       value)))
                       (symbol-macrolet ((x (%unique-symbol%)))
                         (setf x 37))))))
       (let ((x 55))
         (setf37)
         (test)))))

Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403051719.7ec82fa2@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<····························@posting.google.com>...
> > The pattern matcher behaves differently based on
> > whether symbols occurring in the macro call have bindings or not in the
> > calling original environment.
> 
> Untrue.  You may be confusing pattern matching with transcription,
> and don't understand transcription, but that's just a guess.

See the R5RS example in 4.3.2 where the cond macro is invoked using
the anaphoric => syntax that invokes a function:

  (cond (#t => func))

If there is a binding for => surrounding the macro call, this will
match a different rule. Rather than matching (cond (test => result))
it matches (cond (test result1 result2 ...)).

The pattern matcher has to grok semantic information to make a
syntactic decision. That smells. It's like in the C language, where
semantic information is consulted to determine how to parse:

   A(B);

is it a declaration or a function call? If A is a typedef name, parse
it this way, if it's a function name, parse it that way. Ugh!
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403061625.5021d20@posting.google.com>
···@ashi.footprints.net (Kaz Kylheku) wrote:
> > > The pattern matcher behaves differently based on
> > > whether symbols occurring in the macro call have bindings or not in the
> > > calling original environment.
> > 
> > Untrue.  You may be confusing pattern matching with transcription,
> > and don't understand transcription, but that's just a guess.
> 
> See the R5RS example in 4.3.2 where the cond macro is invoked using
> the anaphoric => syntax that invokes a function:
> 
>   (cond (#t => func))
> 
> If there is a binding for => surrounding the macro call, this will
> match a different rule. Rather than matching (cond (test => result))
> it matches (cond (test result1 result2 ...)).

I guessed wrong.

In Scheme's macro system, the pattern matcher doesn't care whether
symbols that occur in a use of a macro are bound in the context of
that use.  Thus both

    (let ((x 5))
      (let-syntax ((m (syntax-rules (x) ((_ x z) (+ z z))
                                        ((_ y z) (* y z)))))
        (list (m x 7) (m 9 11))))

and

    (let-syntax ((m (syntax-rules (x) ((_ x z) (+ z z))
                                      ((_ y z) (* y z)))))
      (list (m x 7) (m 9 11)))

evaluate to (14 99), even though x is bound in the first example and
not bound in the second.  On the other hand, the pattern matcher
*does* pay attention to what symbols denote.  Thus

    (let-syntax ((for-each (syntax-rules (lambda)
                            ((_ (lambda (x) bodyform1 ...) values)
                             (do ((vals values (cdr vals)))
                                 ((null? vals))
                               (let ((x (car vals))) bodyform1 ...)))
                            ((_ proc values1 ...)
                             (for-each proc values1 ...)))))
      (for-each (lambda (n) (display n)) '(1 2 3 4 5))
      (let ((lambda (lambda (ignored f) f)))
        (for-each (lambda (list) display) '(a b c d e))))

prints 12345abcde, not 12345 as Kaz would prefer.

> The pattern matcher has to grok semantic information to make a
> syntactic decision. That smells. It's like in the C language, where
> semantic information is consulted to determine how to parse:
> 
>    A(B);
> 
> is it a declaration or a function call? If A is a typedef name, parse
> it this way, if it's a function name, parse it that way. Ugh!

R5RS Scheme has no reserved words.  None.  Therefore *all*
parsing depends on context, even in a pure interpreter.

I think there's an argument to be made in favor of reserved words,
especially for things like ELSE and =>, and I tried to make that
argument myself.  That argument did not prevail.

I'm kind of glad I lost that argument.  Common Lisp has about
700 reserved words, and Scheme has none.  Scheme's extreme
position was made possible by its macro technology.  Common
Lisp's opposite near-extreme was made necessary by its macro
technology.  This serves as a nice demonstration of how macro
technology influences language design.

The absence of reserved words is not a fundamental aspect of
Scheme's macro technology.  It would be easy to allow for
reserved words, or to provide a means by which a macro can
treat things like ELSE and => as reserved words.  As things
stand now, Scheme programmers have to go outside the R5RS
macro system and use SYNTAX-CASE if they want to do that.

Will
From: Peter Seibel
Subject: Re: Scheme macros
Date: 
Message-ID: <m3hdx1ze6w.fsf@javamonkey.com>
··········@verizon.net (William D Clinger) writes:

> I'm kind of glad I lost that argument.  Common Lisp has about
> 700 reserved words, and Scheme has none.

I'm not sure it's quite fair to say Common Lisp has 700 "reserved
words". At least, if there's anyone reading along from a C/C++/Java
background who's not familiar with the Common Lisp symbol package,
they shouldn't think that what you're calling "reserved words" in
Common Lisp are the same thing.

I ran into a very specific instance of the difference today: I was
writing some code to take apart Java classfiles. I ended up trying to
write a class named RETURN to represent the JVM op of the same name.
Of course I got an error because RETURN is a symbol in the CL package.
When I had written a similar piece of software in Java I had run into
the same problem because "return" is reserved word in Java.

However the difference is that in Common Lisp the only restriction is
on the specific name COMMON-LISP:RETURN. So I was able to cleanly
solve my problem by making a package JVM which contained the names of
all the classes I wanted to create to represent JVM ops. There's no
way to do the same thing in Java because the reserved words are really
reserved.
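
In sketch form (untested, and the op names here are just placeholders, not
my actual code), the idea is simply:

(defpackage #:jvm
  (:use)                            ; no :USE of COMMON-LISP, so JVM:RETURN
  (:export #:return #:goto #:nop))  ; is a brand-new symbol, not CL:RETURN

(defclass jvm:return () ())         ; legal: only CL:RETURN is restricted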

-Peter

-- 
Peter Seibel                                      ·····@javamonkey.com

         Lisp is the red pill. -- John Fraser, comp.lang.lisp
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403071226.7b312e7e@posting.google.com>
Peter Seibel <·····@javamonkey.com> wrote:
> I'm not sure it's quite fair to say Common Lisp has 700 "reserved
> words". At least, if there's anyone reading along from a C/C++/Java
> background who's not familiar with the Common Lisp symbol package,
> they shouldn't think that what you're calling "reserved words" in
> Common Lisp are the same thing.

I'm sure it was quite unfair.

It would have been fairer to say that Common Lisp has 979 symbols in the
COMMON-LISP package that are restricted in the following ways:
None may be defined as a global function, macro, compiler macro, type
specifier, structure, or declaration.  None may be removed from the
COMMON-LISP package, nor may their home package be altered.  None may
be traced, declared or proclaimed special, defined as a method
combination type, used as the class-name argument to SETF of
FIND-CLASS, bound as a catch tag, or bound as a restart name.  None
may have a SETF expander defined for them, nor may their SETF
function names be defined, undefined, or bound.  751 of these symbols
are restricted in the following ways: None of the 751 may be bound as
a local function or macro, nor may a local SETF function be bound for
them.  This list of restrictions is incomplete.

My use of the phrase "reserved words" was a misleading abbreviation
of the truer statement above.  Next time I'll try to remember to say
"restricted words" instead.

You gave a very nice example of the difference:

> I ran into a very specific instance of the difference today: I was
> writing some code to take apart Java classfiles. I ended up trying to
> write a class named RETURN to represent the JVM op of the same name.
> Of course I got an error because RETURN is a symbol in the CL package.
> When I had written a similar piece of software in Java I had run into
> the same problem because "return" is reserved word in Java.
> 
> However the difference is that in Common Lisp the only restriction is
> on the specific name COMMON-LISP:RETURN. So I was able to cleanly
> solve my problem by making a package JVM which contained the names of
> all the classes I wanted to create to represent JVM ops. There's no
> way to do the same thing in Java because the reserved words are really
> reserved.
> 
> -Peter

Will
From: Rahul Jain
Subject: Re: Scheme macros
Date: 
Message-ID: <87ad2skxnb.fsf@nyct.net>
··········@verizon.net (William D Clinger) writes:

> It would have fairer to say that Common Lisp has 979 symbols in the
> COMMON-LISP package that are restricted in the following ways:

NO symbol in the COMMON-LISP package should have any of its CL-defined
semantic properties be modified by any code outside of the
implementation itself. This is a general rule that extends to all
libraries. If what you're modifying doesn't have _some_ essential
property in your own package and it's not a variable exported for
customization, then you're just begging to cause name clashes and
portability problems. There aren't reserved "words". There is just a
reserved _package_. You are free to play around in your own package.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403072256.481f9d9a@posting.google.com>
Rahul Jain wrote:
> NO symbol in the COMMON-LISP package should have any of its CL-defined
> semantic properties be modified by any code outside of the
> implementation itself. This is a general rule that extends to all
> libraries. If what you're modifying doesn't have _some_ essential
> property in your own package and it's not a variable exported for
> customization, then you're just begging to cause name clashes and
> portability problems. There aren't reserved "words". There is just a
> reserved _package_. You are free to play around in your own package.

If you are thanking me for my role in getting X3J13 to acknowledge
these principles, then I mirthfully accept your thanks.

Will
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403091106.3c8374ae@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<····························@posting.google.com>...
> Peter Seibel <·····@javamonkey.com> wrote:
> > I'm not sure it's quite fair to say Common Lisp has 700 "reserved
> > words". At least, if there's anyone reading along from a C/C++/Java
> > background who's not familiar with the Common Lisp symbol package,
> > they shouldn't think that what you're calling "reserved words" in
> > Common Lisp are the same thing.
> 
> I'm sure it was quite unfair.
> 
> It would have been fairer to say that Common Lisp has 979 symbols in the
> COMMON-LISP package that are restricted in the following ways:

These ways don't make them reserved keywords. A reserved keyword is
understood to be some terminal symbol that has a fixed role in the
grammar.

One feature that is like a reserved keyword in Lisp is NIL in the
COMMON-LISP package. In all kinds of syntaxes, a rule that calls for
NIL will only match NIL.

Another feature that is like reserved keywords is certain lexical
patterns for tokens. The token 123+456 is not a standard number, and
it's not a symbol; it's reserved. If you want a symbol of that name,
you must use escapes. The token ... is also a kind of reserved word.
These are true lexical reservations, beneath the package system. You
can't read ... in any package. I would say that this is the closest
thing to the idea of reserved keywords in Lisp, in the sense of
reserved lexical patterns that shunt what would otherwise be ordinary
identifiers into special lexical categories.
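
A quick, untested illustration of the difference:

(read-from-string "|123+456|")   ; => the symbol |123+456| -- fine once escaped
;; (read-from-string "123+456")  ; reserved token: an implementation may or
;;                               ; may not give it a meaning
;; (read-from-string "...")      ; a token of only dots: reader error
(read-from-string "|...|")       ; => the symbol |...| -- fine once escaped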

> None may be defined as a global function, macro, compiler macro, type

But can be locally defined!

> specifier, structure, or declaration.

If you allow functions to be redefined, then calls to them cannot be
inlined. A compiler cannot then assume that the function + (in the
"COMMON-LISP" package) refers to the standard function. Even if at the
time of compilation that is true, it may change in the future when
something is loaded into the running image.

Other languages have similar restrictions. In the C language, you
can't write your own function called "memcpy" with external linkage.
(You can write one with internal linkage, provided that any #include <string.h>
is followed by an #undef memcpy). But you can use
memcpy as a local variable, name of a struct member, goto label, etc.
No C programmer would take you seriously if you claimed that memcpy is
a reserved keyword, like auto, struct, switch, int, char, default,
etc.

>  None may be removed from the
> COMMON-LISP package, nor may their home package be altered. 

Also, for completeness, you might mention that T, NIL and symbols in
the "KEYWORD" package are restricted in that you can't bind to them,
and under evaluation, they stand for themselves.
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403092107.47c145e5@posting.google.com>
···@ashi.footprints.net (Kaz Kylheku) wrote:
> > It would have been fairer to say that Common Lisp has 979 symbols in the
> > COMMON-LISP package that are restricted in the following ways:
> 
> These ways don't make them reserved keyword. A reserved keyword is
> understdoot to be some terminal symbol that has a fixed role in the
> grammar.

I never called them reserved keywords.  I called them reserved
words.  In the message you quoted above, I admitted it would
have been fairer to have called them restricted words, and I
will try to remember to do that in the future.

> > None may be defined as a global function, macro, compiler macro, type
> 
> But can be locally defined!

Most cannot.  Only 228 of the 979 symbols in the COMMON-LISP
package may be locally defined as a lexical function, macro,
or compiler macro.  See HyperSpec 11.1.2.1.2, actions 2 and 3,
and the exceptions in 11.1.2.1.2.1.

> Other languages have similar restrictions.

Most other languages have similar restrictions.  Scheme does
not.  That was my point.

Will
From: Nils Gösche
Subject: Re: Scheme macros
Date: 
Message-ID: <87r7w1s2s4.fsf@darkstar.cartan.de>
··········@verizon.net (William D Clinger) writes:

> ···@ashi.footprints.net (Kaz Kylheku) wrote:

> > But can be locally defined!
> 
> Most cannot.  Only 228 of the 979 symbols in the COMMON-LISP package
> may be locally defined as a lexical function, macro, or compiler
> macro.  See HyperSpec 11.1.2.1.2, actions 2 and 3, and the
> exceptions in 11.1.2.1.2.1.
> 
> > Other languages have similar restrictions.
> 
> Most other languages have similar restrictions.  Scheme does not.
> That was my point.

And yet, it's the Schemers who call a list 'lst' instead of 'list'.
Isn't that funny?

Regards,
-- 
Nils Gösche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID #xEEFBA4AF
From: Ray Dillinger
Subject: Re: Scheme macros
Date: 
Message-ID: <404EC547.A1494340@sonic.net>
Nils Gösche wrote:

> ··········@verizon.net (William D Clinger) writes:

> > ···@ashi.footprints.net (Kaz Kylheku) wrote:

> > > Other languages have similar restrictions.

> > Most other languages have similar restrictions.  Scheme does not.
> > That was my point.
 
> And yet, it's the Schemers who call a list 'lst' instead of 'list'.
> Isn't that funny?

Actually, I usually call mine Lizst.  Comes from a background 
of using Franz Lisp way back in the wayback. :-)  

				Bear
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403100836.111c1e3e@posting.google.com>
···@cartan.de (Nils Gösche) wrote:
> > > Other languages have similar restrictions.
> > 
> > Most other languages have similar restrictions.  Scheme does not.
> > That was my point.
> 
> And yet, it's the Schemers who call a list 'lst' instead of 'list'.
> Isn't that funny?

What's funny is how few people understand:
    (1)  what is allowed;
    (2)  what is a good idea;
    (3)  the difference between (1) and (2).

Will
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403111628.58c6b6d8@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<····························@posting.google.com>...
> ···@cartan.de (Nils G�sche) wrote:
> > > > Other languages have similar restrictions.
> > > 
> > > Most other languages have similar restrictions.  Scheme does not.
> > > That was my point.
> > 
> > And yet, it's the Schemers who call a list 'lst' instead of 'list'.
> > Isn't that funny?
> 
> What's funny is how few people understand:
>     (1)  what is allowed;
>     (2)  what is a good idea;
>     (3)  the difference between (1) and (2).

Do you understand the difference?

For instance, you seem to believe that it's a good idea not to have
silly restrictions, like not being allowed to redefine a standard
function.

This leads me to suspect that you believe that it's a good idea to do
that.
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403120825.23a1e128@posting.google.com>
···@ashi.footprints.net (Kaz Kylheku) wrote:
> > What's funny is how few people understand:
> >     (1)  what is allowed;
> >     (2)  what is a good idea;
> >     (3)  the difference between (1) and (2).
> 
> Do you understand the difference?

Yes.

> For instance, you seem to believe that it's a good idea not to have
> silly restrictions, like not being allowed to redefine a standard
> function.

Yes.

> This leads me to suspect that you believe that it's a good idea to do
> that.

If "do that" means "redefine a standard function", then what you
suspect is untrue.  What kind of logic led you to suspect that
false conclusion?

Will
From: ozan s yigit
Subject: Re: Scheme macros
Date: 
Message-ID: <vi4znanxmx4.fsf@blue.cs.yorku.ca>
···@cartan.de (Nils Gösche) writes:

> And yet, it's the Schemers who call a list 'lst' instead of 'list'.
> Isn't that funny?

is there not an old lisp adage that went roughly like this?

   always write your code as if the next maintainer will be a
   homicidal maniac who knows where you live.

oz
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403111627.462557f8@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<····························@posting.google.com>...
> ···@ashi.footprints.net (Kaz Kylheku) wrote:
> > > It would have been fairer to say that Common Lisp has 979 symbols in the
> > > COMMON-LISP package that are restricted in the following ways:
> > 
> > These ways don't make them reserved keyword. A reserved keyword is
> > understdoot to be some terminal symbol that has a fixed role in the
> > grammar.
> 
> I never called them reserved keywords.  I called them reserved
> words.  In the message you quoted above, I admitted it would
> have been fairer to have called them restricted words, and I
> will try to remember to do that in the future.
> 
> > > None may be defined as a global function, macro, compiler macro, type
> > 
> > But can be locally defined!
> 
> Most cannot.  Only 228 of the 979 symbols in the COMMON-LISP
> package may be locally defined as a lexical function, macro,
> or compiler macro.  See HyperSpec 11.1.2.1.2, actions 2 and 3,
> and the exceptions in 11.1.2.1.2.1.
> 
> > Other languages have similar restrictions.
> 
> Most other languages have similar restrictions.  Scheme does
> not.  That was my point.

If your entire program is one piece of text that is processed from
beginning to end, starting with the initial environment provided by
the language, then the compiler can always keep track of what was
redefined and how.  Then when the last definition has been processed
it can optimize everything based on the accumulated knowledge, when
every definition is known and settled.

But when a program is processed in pieces that have to be somehow
linked together, programers have to live with some restrictions so
that the implementation can still do an efficient job of processing
their programs without requiring access to all of them at once. The
implementation efefctively says ``I can't see all your program at
once, so I will make certain assumptions that the pieces I do not see
do not do certain things which, if I had to defend against them, would
make me produce terribly inefficient translations of the parts of your
program that have been presented to me. If you violate my assumptions,
the results may be incorrect.''

What happens if I load a module into a running, compiled Scheme
program, and that module redefines some standard function that is
being called throughout that program? Will newly evaluated calls go to
the new definition or old? What about inlined calls?
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403121144.70bb96b@posting.google.com>
···@ashi.footprints.net (Kaz Kylheku) wrote:
> What happens if I load a module into a running, compiled Scheme
> program, and that module redefines some standard function that is
> being called throughout that program? Will newly evaluated calls go to
> the new definition or old? What about inlined calls?

R5RS Scheme does not specify a module system.  In most
implementations of Scheme that provide a module system, the
redefinition of a standard function F is local to the module.
All calls to F within that module go to the local F, not the
standard F.  If calls to F are inlined within the module, it
is the redefinition of F that is inlined, not the standard
definition.

If you were talking about R5RS Scheme without any module
extensions, and meant something like "file" when you wrote
"module", then there are two possible behaviors permitted by
R5RS section 5.2.1.  In some implementations of Scheme, the
redefinition of F is treated as an assignment, so that all
subsequent calls to F from anywhere in the user program (but
not from anywhere in a library procedure!) will go to the new
definition of F.

If calls to F had been inlined, then those inlined calls must
be undone, much as when a JIT compiler has to undo inlining
when the class loader loads a class definition that renders
a previous inlining illegal.  Because this is a hassle for
the implementor, most implementations of Scheme do not inline
calls to standard procedures unless the programmer declares
that none of the standard procedures will be redefined.  How
this declaration is made is implementation-dependent.

In other implementations, the redefinition of F binds a new
variable F.  The semantics of this binding is not specified
by the R5RS, but the semantics used in Standard ML is often
taken as the model.  In most such implementations, code that
was loaded prior to the redefinition continues to use the old
standard definition of F, while code loaded subsequent to the
redefinition uses the redefinition.

One of the attractions of the redefinition-as-binding semantics
is that no inlining needs to be undone.  If the implementation
supports separate compilation, however, then inlining becomes
an issue anyway, and is usually dealt with in exactly the same
ways as in implementations that regard redefinitions as
assignments.

Will
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403121934.5f76b0c6@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<···························@posting.google.com>...
> ···@ashi.footprints.net (Kaz Kylheku) wrote:
> > What happens if I load a module into a running, compiled Scheme
> > program, and that module redefines some standard function that is
> > being called throughout that program? Will newly evaluated calls go to
> > the new definition or old? What about inlined calls?
> 
> R5RS Scheme does not specify a module system.  In most
> implementations of Scheme that provide a module system, the
> redefinition of a standard function F is local to the module.
> All calls to F within that module go to the local F, not the
> standard F.  If calls to F are inlined within the module, it
> is the redefinition of F that is inlined, not the standard
> definition.
> 
> If you were talking about R5RS Scheme without any module
> extensions, and meant something like "file" when you wrote
> "module", then there are two possible behaviors permitted by
> R5RS section 5.2.1. 

Okay, so in CL, if you redefine, the behavior is undefined. No
diagnostic is required. This is considered to be a restriction against
redefining.

In Scheme if you redefine, you get implementation-defined behavior. So
this is not considered a restriction against redefining.

Got it, thanks!
From: Pascal Bourguignon
Subject: Re: Scheme macros
Date: 
Message-ID: <87k71pwc2k.fsf@thalassa.informatimago.com>
···@ashi.footprints.net (Kaz Kylheku) writes:
> Okay, so in CL, if you redefine, the behavior is undefined. No
> diagnostic is required. This is considered to be a restriction against
> redefining.
> 
> In Scheme if you redefine, you get implementation-defined behavior. So
> this is not considered a restriction against redefining.
> 
> Got it, thanks!

On the other hand, in Common Lisp, since you have packages, you can
always define symbols with the same names in a MY-COMMON-LISP package
that uses and re-exports all of COMMON-LISP except for those symbols
you want to "redefine", which you shadow and define yourself.  This
behavior is perfectly defined.  Of course, any function, macro or
special operator in COMMON-LISP would still call other COMMON-LISP
functions, macros or special operators, or some other hidden primitive.
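
A bare-bones sketch (untested; only SORT is shadowed here, and re-exporting
the rest of COMMON-LISP is left out for brevity):

(defpackage #:my-common-lisp
  (:use #:common-lisp)
  (:shadow #:sort))                  ; MY-COMMON-LISP::SORT is a fresh symbol

(in-package #:my-common-lisp)

(defun sort (sequence predicate &rest args)
  "A 'redefined' SORT that copies its argument instead of destroying it."
  (apply #'cl:sort (copy-seq sequence) predicate args))

A client package would then use MY-COMMON-LISP instead of COMMON-LISP and
pick up the shadowing SORT together with everything that is re-exported.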

-- 
__Pascal_Bourguignon__                     http://www.informatimago.com/
There is no worse tyranny than to force a man to pay for what he doesn't
want merely because you think it would be good for him.--Robert Heinlein
http://www.theadvocates.org/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403131145.73e5db93@posting.google.com>
···@ashi.footprints.net (Kaz Kylheku) wrote:
> Okay, so in CL, if you redefine, the behavior is undefined. No
> diagnostic is required. This is considered to be a restriction against
> redefining.
> 
> In Scheme if you redefine, you get implementation-defined behavior. So
> this is not considered a restriction against redefining.
> 
> Got it, thanks!

I doubt it.  In Scheme, if you redefine deliberately, and you have
read the R5RS, then you know you need to put redefinitions before
any uses of the redefined variables.  This gives you a reliable
semantics for redefinition.

In Common Lisp, redefinition is not allowed, and the behavior of
redefinition is implementation-dependent no matter what you do.
The solution is not to redefine anything in the COMMON-LISP package,
but to define different symbols with the same short name that shadow
their namesakes in the COMMON-LISP package.  As with redefinitions
in Scheme, the definitions of the shadowing symbols have to precede
all uses of them.

In other words, the practical difference between global redefinition
in Scheme and Common Lisp isn't great.  The real differences have to
do with local bindings: Scheme allows any identifier to be bound
locally, while the Common Lisp programmer has to be careful not to
bind any of 751 restricted words.  This difference between Scheme
and Common Lisp is the result of their differing macro technologies,
as I claimed.
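
Concretely, the kind of local binding the restriction rules out looks
like this (LIST here is just one of those restricted names):

;; Undefined behavior in Common Lisp: LIST is an external symbol of the
;; COMMON-LISP package, and some macro in scope may expand into a call
;; to it.
(flet ((list (&rest args) (apply #'vector args)))
  (list 1 2 3))

In Scheme, by contrast, a local (let ((list ...)) ...) is unproblematic,
because hygienic macros keep referring to the LIST they were defined
with.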

Will
From: David Fisher
Subject: Re: Scheme macros
Date: 
Message-ID: <14030ca9.0403131638.3a759c8e@posting.google.com>
··········@verizon.net (William D Clinger) wrote in message news:<····························@posting.google.com>...
> ···@ashi.footprints.net (Kaz Kylheku) wrote:
> > Okay, so in CL, if you redefine, the behavior is undefined. No
> > diagnostic is required. This is considered to be a restriction against
> > redefining.
> > 
> > In Scheme if you redefine, you get implementation-defined behavior. So
> > this is not considered a restriction against redefining.
> > 
> > Got it, thanks!
> 
> I doubt it.  In Scheme, if you redefine deliberately, and you have
> read the R5RS, then you know you need to put redefinitions before
> any uses of the redefined variables.  This gives you a reliable
> semantics for redefinition.
> 

"reliable" must be, of course, better than "unreliable". Additionally,
as schemers told me, define-macro can be very easily defined in terms
of syntax-case, but syntax-case can not be defined in terms of
define-macro very easily. So why does Common Lisp only have defmacro ?
"reliability" wasn't so important?
From: Jan Rychter
Subject: Re: Scheme macros
Date: 
Message-ID: <m2r7w3agrh.fsf@tnuctip.rychter.com>
>>>>> "Peter" == Peter Seibel <·····@javamonkey.com> writes:
 Peter> ··········@verizon.net (William D Clinger) writes:
 >> I'm kind of glad I lost that argument.  Common Lisp has about 700
 >> reserved words, and Scheme has none.

 Peter> I'm not sure it's quite fair to say Common Lisp has 700
 Peter> "reserved words". At least, if there's anyone reading along from
 Peter> a C/C++/Java background who's not familiar with the Common Lisp
 Peter> symbol package, they shouldn't think that what you're calling
 Peter> "reserved words" in Common Lisp are the same thing.

 Peter> I ran into a very specific instance of the difference today: I
 Peter> was writing some code to take apart Java classfiles. I ended up
 Peter> trying to write a class named RETURN to represent the JVM op of
 Peter> the same name.  Of course I got an error because RETURN is a
 Peter> symbol in the CL package.  When I had written a similar piece of
 Peter> software in Java I had run into the same problem because
 Peter> "return" is reserved word in Java.

 Peter> However the difference is that in Common Lisp the only
 Peter> restriction is on the specific name COMMON-LISP:RETURN. So I was
 Peter> able to cleanly solve my problem by making a package JVM which
 Peter> contained the names of all the classes I wanted to create to
 Peter> represent JVM ops. There's no way to do the same thing in Java
 Peter> because the reserved words are really reserved.

A couple of days ago while reading a publication I translated some
expressions (string transformations) into code. I defined a function K
that operated on strings s and t, and gazed for a moment at interesting
error messages that appeared before being able to make a context
shift...
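
The culprit, presumably, was T, which names a constant in Common Lisp;
in miniature (most implementations complain about the parameter list):

(defun k (s t)                     ; binding the constant T is undefined
  (concatenate 'string s t))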

--J.
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2gaoo$mb0$1@newsreader2.netcologne.de>
William D Clinger wrote:

> I'm kind of glad I lost that argument.  Common Lisp has about
> 700 reserved words, and Scheme has none.  Scheme's extreme
> position was made possible by its macro technology.  Common
> Lisp's opposite near-extreme was made necessary by its macro
> technology.  This serves as a nice demonstration of how macro
> technology influences language design.

and Joe Marshall wrote before:

> Common Lisp has some restrictions to ensure that macros can work
> correctly.  For instance, you are not permitted to shadow identifiers
> in the CL package.  Syntax-rules does not have this restriction.

I am trying to understand what you mean by these statements. I have 
skimmed the ANSI CL spec and CLtL2, and I have found two statements that 
seem to explain what the issue is. In the ANSI spec it's the 
"LISP-SYMBOL-REDEFINITION Writeup" (see 
http://www.lispworks.com/reference/HyperSpec/Issues/iss214_w.htm ), and 
in CLtL2 it's explained in 11.6. about built-in packages. The example 
given in both texts is this.

(flet ((open (filename &key direction)
          (format t "~%OPEN was called.")
          (open filename :direction direction)))
   (with-open-file (x "frob" :direction ':output)
     (format t "~%Was OPEN called?")))

ANSI CL defines that this code has undefined behavior. Is this the kind 
of restriction you are referring to?

If that's the case I wouldn't agree that such restrictions have anything 
to do with macros, at least not directly, but I first would like to know 
whether I am on the right track.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403071911.7faea3b8@posting.google.com>
Pascal Costanza <········@web.de> wrote:

> > Common Lisp has some restrictions to ensure that macros can work
> > correctly.  For instance, you are not permitted to shadow identifiers
> > in the CL package.  Syntax-rules does not have this restriction.
> 
> I am trying to understand what you mean by these statements. I have 
> skimmed the ANSI CL spec and CLtL2, and I have found two statements that 
> seem to explain what the issue is. In the ANSI spec it's the 
> "LISP-SYMBOL-REDEFINITION Writeup" (see 
> http://www.lispworks.com/reference/HyperSpec/Issues/iss214_w.htm ), and 
> in CLtL2 it's explained in 11.6. about built-in packages. The example 
> given in both texts is this.
[snip]
> ANSI CL defines that this code has undefined behavior. Is this the kind 
> of restriction you are referring to?

Yes.

> If that's the case I wouldn't agree that such restrictions have anything 
> to do with macros, at least not directly, but I first would like to know 
> whether I am on the right track.

These restrictions have everything to do with macros.  Although X3J13
rejected hygienic and referentially transparent macro technology, the
debate raised their consciousness about the reliability of macros to
the point that they listened when Gabriel and I urged that programmers
not be allowed to redefine or create local bindings for names that
macros were likely to use.  My involvement with X3J13 ended when I
left Tektronix early in 1988, so I did not have any part in the actual
writing of issue LISP-SYMBOL-REDEFINITION:MAR89-X3J13, but I was very
much involved in pushing for those restrictions.

If you read the "Rationale:" and "Current practice:" parts of that
issue, you will see that macros were central.  I agree that these
sections could have been more clearly written.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2icrq$jgu$1@newsreader2.netcologne.de>
William D Clinger wrote:
> Pascal Costanza <········@web.de> wrote:
> 
>>>Common Lisp has some restrictions to ensure that macros can work
>>>correctly.  For instance, you are not permitted to shadow identifiers
>>>in the CL package.  Syntax-rules does not have this restriction.
>>
>>I am trying to understand what you mean by these statements. I have 
>>skimmed the ANSI CL spec and CLtL2, and I have found two statements that 
>>seem to explain what the issue is. In the ANSI spec it's the 
>>"LISP-SYMBOL-REDEFINITION Writeup" (see 
>>http://www.lispworks.com/reference/HyperSpec/Issues/iss214_w.htm ), and 
>>in CLtL2 it's explained in 11.6. about built-in packages.
[...]

>>If that's the case I wouldn't agree that such restrictions have anything 
>>to do with macros, at least not directly, but I first would like to know 
>>whether I am on the right track.
> 
> These restrictions have everything to do with macros.
[...]

> If you read the "Rationale:" and "Current practice:" parts of that
> issue, you will see that macros were central.  I agree that these
> sections could have been more clearly written.

The issue that is described there is indeed an important one (so thanks 
for bringing it up at that time), and macros have obviously played an 
important role to uncover it, but the issue is not related to macros.

Let's take the example given there.

(flet ((open (filename &key direction)
          (format t "~%OPEN was called.")
          (open filename :direction direction)))
   (with-open-file (x "frob" :direction ':output)
     (format t "~%Was OPEN called?")))

The spec for WITH-OPEN-FILE says this: "with-open-file uses open to
create a file stream [...]" It would have been perfectly reasonable to
specify that WITH-OPEN-FILE _must_ use whatever function binding for
OPEN (or COMMON-LISP:OPEN, to be more specific), possibly a local one,
is current at the place where WITH-OPEN-FILE is used (expanded). In
that case, the semantics of the example given above would be clear.
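
Such a WITH-OPEN-FILE would then have to expand into a literal call to
OPEN, roughly like this (a sketch under the hypothetical name
WITH-OPEN-FILE*, ignoring the real macro's abort handling):

(defmacro with-open-file* ((var filespec &rest options) &body body)
  `(let ((,var (open ,filespec ,@options)))  ; picks up a local FLET of OPEN
     (unwind-protect (progn ,@body)
       (close ,var))))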

The rationale indicates that WITH-OPEN-FILE is possibly implemented like 
this:

(defmacro with-open-file (...)
   (if (some-opaque-condition ...)
       `(some-implementation-dependent-optimization-without-open ...)
       `(unwind-protect
           (... (open ...) ...)
          ...)))

So because of some possible implementation-dependent optimizations,
OPEN might actually not be used in WITH-OPEN-FILE, and in order not to
unnecessarily prevent such optimizations, the above restrictions have
been specified.

Now, here is the same example without any macros.

(defun with-open-file-call (filename block &rest args)
   "Like WITH-OPEN-FILE, but block is a function that takes an open
    file as a parameter."
   (let ((open-file (apply #'open filename args)))
     (funcall block open-file)
     (close open-file)))

An implementation might choose to add some optimizations, like above.

(defun with-open-file-call (filename block &rest args)
   (if (some-other-opaque-condition ...)
     (some-other-implementation-dependent-optimization-without-open ...)
     (let ((open-file (apply #'open filename args)))
       (unwind-protect
         (funcall block open-file)
         (close open-file)))))

So instead of...

(with-open-file (s filespec options ...)
    (... s ...))

...one would say...

(with-open-file-call filespec
   (lambda (s)
     (... s ...))
   options ...)

Now, in order not to prevent implementation-dependent optimizations, one 
still would have to specify that the result of changing the 
SYMBOL-FUNCTION of OPEN is undefined in the case of WITH-OPEN-FILE-CALL. 
That's because it's not clear what the semantics of the following code 
would be.

(let ((prev-open (symbol-function 'open)))
   (setf (symbol-function 'open)
         (lambda (filename &rest args)
           (format t "~%OPEN was called.")
           (apply prev-open filename args)))
   (with-open-file-call "frob"
     (lambda (x)
       (format t "~%Was OPEN called?"))
     :direction ':output)
   (setf (symbol-function 'open)
         prev-open))

So no macros involved at all, but still exactly the same issue. The real 
issue is that the _protocols_ of several  macros _and_ functions would 
be underspecified without the restrictions given in Section 11.1.2.1.2 
of the ANSI spec. 
(http://www.lispworks.com/reference/HyperSpec/Body/11_abab.htm )

(The issue is more pressing for macros probably because they are usually 
used for abstracting away protocols over functions, and functions 
usually play the role of lower-level abstractions, but that's just 
guesswork and doesn't change the argument.)


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <ptbni1if.fsf@ccs.neu.edu>
Pascal Costanza <········@web.de> writes:

> The issue that is described there is indeed an important one (so
> thanks for bringing it up at that time), and macros have obviously
> played an important role to uncover it, but the issue is not related
> to macros.
>
> Let's take the example given there.
>
> (flet ((open (filename &key direction)
>           (format t "~%OPEN was called.")
>           (open filename :direction direction)))
>    (with-open-file (x "frob" :direction ':output)
>      (format t "~%Was OPEN called?")))
>

How about this example:

(flet ((unwind-protect (&body forms) `(multiple-value-prog1 ,@forms)))
   (with-open-file (x "frob" :direction ':output)
      (format t "~%Was OPEN called?")))

The spec says nothing about unwind-protect.
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2iire$18e$1@newsreader2.netcologne.de>
Joe Marshall wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>The issue that is described there is indeed an important one (so
>>thanks for bringing it up at that time), and macros have obviously
>>played an important role to uncover it, but the issue is not related
>>to macros.
>>
>>Let's take the example given there.
>>
>>(flet ((open (filename &key direction)
>>          (format t "~%OPEN was called.")
>>          (open filename :direction direction)))
>>   (with-open-file (x "frob" :direction ':output)
>>     (format t "~%Was OPEN called?")))
> 
> How about this example:
> 
> (flet ((unwind-protect (&body forms) `(multiple-value-prog1 ,@forms)))
>    (with-open-file (x "frob" :direction ':output)
>       (format t "~%Was OPEN called?")))
> 
> The spec says nothing about unwind-protect.

Yes it does. ANSI CL says that this is undefined.

Why is this relevant? It doesn't affect my argument, does it?


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <wu5vumi9.fsf@ccs.neu.edu>
Pascal Costanza <········@web.de> writes:

>
> Why is this relevant? It doesn't affect my argument, does it?

'cause I spazzed.

---

Shadowing a built-in function with a lexical one would not be a
problem except for macros.
 
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2k25h$ko1$1@newsreader2.netcologne.de>
Joe Marshall wrote:

> Shadowing a built-in function with a lexical one would not be a
> problem except for macros.

Shadowing _any_ function with a lexical one can be a problem for macros 
if they happen to rely on it. This doesn't have anything to do with the 
built-in status of those functions.
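
For instance (all names hypothetical), a third-party macro that expands
into a call to its own helper function breaks just as easily:

(defun log-line (msg)              ; helper the macro relies on
  (format t "~&~A~%" msg))

(defmacro with-logging (&body body)
  `(progn (log-line "enter") ,@body (log-line "exit")))

;; An FLET at the use site silently captures the macro's calls, even
;; though LOG-LINE is nowhere near the COMMON-LISP package.
(flet ((log-line (msg) (declare (ignore msg))))
  (with-logging (+ 1 2)))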


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <znaqjrjl.fsf@comcast.net>
Pascal Costanza <········@web.de> writes:

> Joe Marshall wrote:
>
>> Shadowing a built-in function with a lexical one would not be a
>> problem except for macros.
>
> Shadowing _any_ function with a lexical one can be a problem for
> macros if they happen to rely on it. This doesn't have anything to do
> with the built-in status of those functions.

Well, that's true, but it is very unlikely that *your* macro package
relies on *my* functions.  (Especially if I haven't written my
functions yet.)  It is far more likely that your macro package expands
to uses of built-in common lisp functions (and functions within your
macro package).


-- 
~jrm
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403092059.6e1d8132@posting.google.com>
Joe Marshall quoting first me and then Pascal Costanza:
> >
> >> Shadowing a built-in function with a lexical one would not be a
> >> problem except for macros.
> >
> > Shadowing _any_ function with a lexical one can be a problem for
> > macros if they happen to rely on it. This doesn't have anything to do
> > with the built-in status of those functions.
> 
> Well, that's true, but it is very unlikely that *your* macro package
> relies on *my* functions.

Just to clarify:  What Pascal said is true of Common Lisp macros.
What Pascal said is *not* true of Scheme macros.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2v0fe$af7$1@newsreader2.netcologne.de>
William D Clinger wrote:

> Joe Marshall quoting first me and then Pascal Costanza:
> 
>>>>Shadowing a built-in function with a lexical one would not be a
>>>>problem except for macros.
>>>
>>>Shadowing _any_ function with a lexical one can be a problem for
>>>macros if they happen to rely on it. This doesn't have anything to do
>>>with the built-in status of those functions.
>>
>>Well, that's true, but it is very unlikely that *your* macro package
>>relies on *my* functions.
> 
> Just to clarify:  What Pascal said is true of Common Lisp macros.
> What Pascal said is *not* true of Scheme macros.

What I said is true of defmacro, no matter what language it is 
implemented in.

syntax-rules/syntax-case do not solve the referential transparency 
issue, because it still exists for defmacro in Scheme. You cannot 
prevent other people from using defmacro, so you still have the problems 
with built-in procedures in Scheme as you have with symbols in the 
COMMON-LISP package. The need to use third-party libraries is a fact of 
everyday software engineering that you cannot ignore.

You can only solve the referential transparency issue (as any other one) 
in your own code. The attempt to solve it in the macro system means to 
focus on the wrong problem.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403131209.56617b9c@posting.google.com>
Pascal Costanza <········@web.de> wrote:
> syntax-rules/syntax-case do not solve the referential transparency 
> issue, because it still exists for defmacro in Scheme.

There is no such thing as DEFMACRO in R5RS Scheme.

> You cannot prevent other people from using defmacro, so you still
> have the problems with built-in procedures in Scheme as you have
> with symbols in the COMMON-LISP package.

I cannot prevent people from using Scheme to implement programming
languages that are best left unimplemented.  This is a special case
of the fact that I cannot stop people from writing incorrect code
in Scheme.

> The need to use third-party libraries is a fact of 
> everyday software engineering that you cannot ignore.

That is one of the primary motivations for reliable macro technology.
We do not want a third-party macro to break when it is used in some
other context.

> You can only solve the referential transparency issue (as any other one) 
> in your own code.

This is false, as was demonstrated by the R5RS macro system.

> The attempt to solve it in the macro system means to 
> focus on the wrong problem.

Referential transparency just means that uses of names occurring in
the source code refer to the lexical bindings of those names within
whose scope those uses occur.  R5RS Scheme has this property.
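
A two-definition illustration in R5RS terms (the names are mine):

(define-syntax my-unless
  (syntax-rules ()
    ((_ test expr) (if test #f expr))))

;; The IF in the template refers to the IF in scope where MY-UNLESS was
;; defined, even though the use site rebinds the name IF as a variable.
(let ((if (lambda (a b c) 'captured)))
  (my-unless #f 'ok))               ; => ok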

You could argue that referential transparency is not a desirable
property, or that it is the wrong problem on which to focus, but
to deny that R5RS Scheme has that property is absurd.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c30jue$9hs$1@newsreader2.netcologne.de>
William D Clinger wrote:
> Pascal Costanza <········@web.de> wrote:
> 
>>syntax-rules/syntax-case do not solve the referential transparency 
>>issue, because it still exists for defmacro in Scheme.
> 
> There is no such thing as DEFMACRO in R5RS Scheme.

This is completely unrelated to my statement. I didn't talk about R5RS. 
I am interested in the things that are actually used. Since I don't 
regularly use Scheme, I can only guess, but for example, defmacro is 
part of SLIB, which seems to be a popular Scheme library. DrScheme, 
seemingly one of the most popular Scheme implementations, comes with 
MzLib, which in turn provides define-macro and defmacro.

It seems to me that defmacro is regularly used in the Scheme world, 
otherwise it wouldn't be so widely supported.

> Referential transparency just means that uses of names occurring in
> the source code refer to the lexical bindings of those names within
> whose scope those uses occur.  R5RS Scheme has this property.
 >
> You could argue that referential transparency is not a desirable
> property, or that it is the wrong problem on which to focus, but
> to deny that R5RS Scheme has that property is absurd.

Just to clarify: Right, R5RS Scheme has that property, and I didn't deny 
that. I am not interested in a Scheme vs. Common Lisp discussion, but in 
syntax-rules vs. defmacro. Both macro systems exist for both languages.

So, what do you think of my suggestion to add syntax for creating unique 
symbols, as I have presented in another post? I am very interested in 
your opinion.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Tom Lord
Subject: Re: Scheme macros
Date: 
Message-ID: <405487E2.474A3B04@regexps.com>
Pascal Costanza wrote:
> 
> William D Clinger wrote:
> > Pascal Costanza <········@web.de> wrote:

> I am interested in the things that are actually used. Since I don't
> regularly use Scheme, I can only guess, but for example, defmacro is
> part of SLIB, which seems to be a popular Scheme library. DrScheme,
> seemingly one of the most popular Scheme implementations, comes with
> MzLib, which in turn provides define-macro and defmacro.

> It seems to me that defmacro is regularly used in the Scheme world,
> otherwise it wouldn't be so widely supported.

Grep for and examine _uses_ in various libraries rather than definitions
of DEFMACRO.   My quick study of an old version of slib suggests that
all interesting uses of DEFMACRO there fall into one of the categories:

1) Used to provide DEFINE-SYNTAX and friends.

   SCM, the implementation most closely associated with Slib,
   provides DEFMACRO natively because it turns out to be very
   easy and natural to implement given the way that EVAL works in SCM.

   The other macro systems provided by slib and defined there in
   terms of DEFMACRO were, I believe, put there initially to
   help introduce these new systems to Schemers.   These days, they
   and similar implementations are helpful for porting code to
   SCM and Guile.

2) Referred to by the simple slib package system.

   These uses don't so much use DEFMACRO to define macros -- they notice
   its presence or absence and use that information in package-dependency
   management.

3) Handled in debugging aids.

   Mostly these don't define new macros but just handle DEFMACRO macros
   in debugging tools.

4) A very small number of actual macros.

   I didn't spot any of these that I think are typical of Scheme code.
   For example, there is a DEFMACRO definition of FLUID-LET that
   makes some sense for how-to-do-it in SCM, but I think if you stop
   100 Schemers on the street and ask them to implement FLUID-LET,
   unless Aubrey is among them, you'll get 100 definitions, none of which
   use DEFMACRO.

Slib, handy though it may be, may also not be terribly representative 
of Scheme code in general.  Poking around in PLT Scheme, SCSH, Bigloo,
reference implementations of SRFIs, etc. may give a more realistic
sample.

Slib is (relatively speaking) venerable (though actively updated).  It
predates standardization of macros in R5RS.  I believe it predates R4RS.
I recall using SCM at a point in history when DEFMACRO was the only 
practical choice available within SCM.

I also found in slib and in my personal little library of Scheme
code far fewer macro definitions of any kind than I would have 
guessed.  It might be interesting to compare the ratio of code-lines
to macro definitions in Scheme v. CL and sample around to see how
the difference might be explained.  (My style may be atypical.)

> [how about a facility for generating unique symbol names]

Well, that'd be useful and sane on first principles -- no need to
justify in terms of DEFMACRO.  It might deserve to go in R6RS, too, 
because it is such a basic omission _and_ is otherwise impossible
to implement (in a practical form) in a portable way.
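
For comparison, it is the facility that GENSYM already gives the
defmacro writer in Common Lisp; the usual idiom, sketched:

(defmacro my-swap (a b)
  (let ((tmp (gensym "TMP")))       ; a fresh, uninterned symbol per expansion
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))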



-t
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403142120.6a971d62@posting.google.com>
Pascal Costanza <········@web.de> wrote:
> > There is no such thing as DEFMACRO in R5RS Scheme.
> 
> This is completely unrelated to my statement. I didn't talk about R5RS. 
> I am interested in the things that are actually used. Since I don't 
> regularly use Scheme, I can only guess, but for example, defmacro is 
> part of SLIB, which seems to be a popular Scheme library. DrScheme, 
> seemingly one of the most popular Scheme implementations, comes with 
> MzLib, which in turn provides define-macro and defmacro.

In Scheme, DEFMACRO and DEFINE-MACRO are relics from the years
between RRRS and R4RS, when Scheme had no macro system at all.
The macro system of R4RS was in an appendix, not part of the
document proper, so some systems continued to support DEFMACRO
or similar instead of referentially transparent macros until
the R5RS came out in 1998.

The use of DEFMACRO etc was especially common in interpreted
systems like SCM and DrScheme, where the slow speed of interpreted
Scheme made it necessary to write as much of the system as possible
in some other language, usually C.  Implementing DEFMACRO in C
was a lot easier than implementing SYNTAX-RULES, so DEFMACRO hung
on longer in the interpreted systems than in the compiled ones.
Machines are so fast now that even an interpreted macro expander
runs fast enough, so I think most implementations that still provide
DEFMACRO are doing so for backward compatibility.

> Just to clarify: Right, R5RS Scheme has that property, and I didn't deny 
> that. I am not interested in a Scheme vs. Common Lisp discussion, but in 
> syntax-rules vs. defmacro. Both macro systems exist for both languages.

In the Scheme world, programmers use SYNTAX-RULES whenever possible,
and supplement it with SYNTAX-CASE when SYNTAX-RULES isn't powerful
enough.  When both SYNTAX-RULES and SYNTAX-CASE are available, there
is no reason at all for a programmer to use DEFMACRO.  SYNTAX-CASE
interacts well with SYNTAX-RULES; DEFMACRO doesn't.
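
A typical example of that supplementing, sketched with the usual
pre-R6RS names (DATUM->SYNTAX-OBJECT in Dybvig-style systems; the
details vary slightly between implementations):

(define-syntax aif                  ; anaphoric IF: deliberately unhygienic
  (lambda (x)
    (syntax-case x ()
      ((k test then else)
       (with-syntax ((it (datum->syntax-object (syntax k) 'it)))
         (syntax (let ((it test)) (if it then else))))))))

(aif (assv 2 '((1 . one) (2 . two)))
     (cdr it)                       ; IT is visible to the user's code
     'nothing)                      ; => two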

> So, what do you think of my suggestion to add syntax for creating unique 
> symbols, as I have presented in another post? I am very interested in 
> your opinion.

Sorry, I must not have seen that post.  SYNTAX-CASE already has that
capability, and it is available as a standard trick even to users of
SYNTAX-RULES.

Will
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403150528.6ceca44d@posting.google.com>
> > So, what do you think of my suggestion to add syntax for creating unique 
> > symbols, as I have presented in another post? I am very interested in 
> > your opinion.
> 
> Sorry, I must not have seen that post.  SYNTAX-CASE already has that
> capability, and it is available as a standard trick even to users of
> SYNTAX-RULES.
> 
> Will

Sorry again.

Tom Lord was right when he said there is no way to create unique
symbols in R5RS Scheme.  I was wrong when I said SYNTAX-RULES and
SYNTAX-CASE had that capability.

What SYNTAX-RULES and SYNTAX-CASE can do is to create new unique
local identifiers.  Identifiers and symbols are different things,
of course.
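
For example, each expansion of the template below gets a fresh TMP,
distinct from any TMP at the use site -- but that TMP is not an interned
symbol the program can get hold of:

(define-syntax swap!
  (syntax-rules ()
    ((_ a b) (let ((tmp a)) (set! a b) (set! b tmp)))))

(let ((tmp 1) (x 2))
  (swap! tmp x)                     ; the template's TMP does not capture this one
  (list tmp x))                     ; => (2 1)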

I apologize for my error.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c34c5a$12rc$1@f1node01.rhrz.uni-bonn.de>
William D Clinger wrote:

>>>So, what do you think of my suggestion to add syntax for creating unique 
>>>symbols, as I have presented in another post? I am very interested in 
>>>your opinion.
[...]

> What SYNTAX-RULES and SYNTAX-CASE can do is to create new unique
> local identifiers.  Identifiers and symbols are different things,
> of course.

That's not what I meant. I think I have found a way to ensure 
referential transparency even for defmacro, without changing defmacro at 
all. However, I will need some time to experiment with this.

Thanks again for a very fruitful and insightful discussion.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop - June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop - June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Jeff Dalton
Subject: Re: Scheme macros
Date: 
Message-ID: <fx465c97p0g.fsf@tarn.inf.ed.ac.uk>
··········@verizon.net (William D Clinger) writes:

> Pascal Costanza <········@web.de> wrote:
> > > There is no such thing as DEFMACRO in R5RS Scheme.
> > 
> > This is completely unrelated to my statement. I didn't talk about R5RS. 
> > I am interested in the things that are actually used. Since I don't 
> > regularly use Scheme, I can only guess, but for example, defmacro is 
> > part of SLIB, which seems to be a popular Scheme library. DrScheme, 
> > seemingly one of the most popular Scheme implementations, comes with 
> > MzLib, which in turn provides define-macro and defmacro.
> 
> In Scheme, DEFMACRO and DEFINE-MACRO are relics from the years
> between RRRS and R4RS, when Scheme had no macro system at all.
> The macro system of R4RS was in an appendix, not part of the
> document proper, so some systems continued to support DEFMACRO
> or similar instead of referentially transparent macros until
> the R5RS came out in 1998.
> 
> The use of DEFMACRO etc was especially common in interpreted
> systems like SCM and DrScheme, where the slow speed of interpreted
> Scheme made it necessary to write as much of the system as possible
> in some other language, usually C.  Implementing DEFMACRO in C
> was a lot easier than implementing SYNTAX-RULES, so DEFMACRO hung
> on longer in the interpreted systems than in the compiled ones.
> Machines are so fast now that even an interpreted macro expander
> runs fast enough, so I think most implementations that still provide
> DEFMACRO are doing so for backward compatibility.
> 
> > Just to clarify: Right, R5RS Scheme has that property, and I didn't deny 
> > that. I am not interested in a Scheme vs. Common Lisp discussion, but in 
> > syntax-rules vs. defmacro. Both macro systems exist for both languages.
> 
> In the Scheme world, programmers use SYNTAX-RULES whenever possible,
> and supplement it with SYNTAX-CASE when SYNTAX-RULES isn't powerful
> enough.  When both SYNTAX-RULES and SYNTAX-CASE are available, there
> is no reason at all for a programmer to use DEFMACRO.  SYNTAX-CASE
> interacts well with SYNTAX-RULES; DEFMACRO doesn't.
> ...

Yet later, Anton van Straaten said:

  Defmacro works very well in Scheme, and plenty of people use it.

and

  defmacro is still widely used in Scheme

Which surprised me.

-- jd
From: Sunnan
Subject: Re: Scheme macros
Date: 
Message-ID: <878yh5qudj.fsf@handgranat.org>
Jeff Dalton <····@tarn.inf.ed.ac.uk> writes:

> ··········@verizon.net (William D Clinger) writes:

>> In the Scheme world, programmers use SYNTAX-RULES whenever possible,
>> and supplement it with SYNTAX-CASE when SYNTAX-RULES isn't powerful
>> enough.  When both SYNTAX-RULES and SYNTAX-CASE are available, there
>> is no reason at all for a programmer to use DEFMACRO.  SYNTAX-CASE
>> interacts well with SYNTAX-RULES; DEFMACRO doesn't.
>> ...
>
> Yet later, Anton van Straaten said:
>
>   Defmacro works very well in Scheme, and plenty of people use it.
>
> and
>
>   defmacro is still widely used in Scheme
>
> Which surprised me.

They're both true.

syntax-case and syntax-rules both provide a transformer for
define-syntax. defmacro or define-macro (while sometimes implemented
using define-syntax + syntax-case) doesn't.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: Scheme macros
Date: 
Message-ID: <fx4isg8bloo.fsf@tarn.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@tarn.inf.ed.ac.uk> writes:
> 
> > ··········@verizon.net (William D Clinger) writes:
> 
> >> In the Scheme world, programmers use SYNTAX-RULES whenever possible,
> >> and supplement it with SYNTAX-CASE when SYNTAX-RULES isn't powerful
> >> enough.  When both SYNTAX-RULES and SYNTAX-CASE are available, there
> >> is no reason at all for a programmer to use DEFMACRO.  SYNTAX-CASE
> >> interacts well with SYNTAX-RULES; DEFMACRO doesn't.
> >> ...
> >
> > Yet later, Anton van Straaten said:
> >
> >   Defmacro works very well in Scheme, and plenty of people use it.
> >
> > and
> >
> >   defmacro is still widely used in Scheme
> >
> > Which surprised me.
> 
> They're both true.

Sorry, but I don't understand your reply.  Both what are true?
The two things I quoted from Anton?  Or what Anton said and
what Will Clinger said?

> syntax-case and syntax-rules both provide a transformer for
> define-syntax. defmacro or define-macro (while sometimes implemented
> using define-syntax + syntax-case) doesn't.

Why are you telling me this?  What does it have to do with
"They're both true"?

Maybe I wasn't clear enough in my message, because I can't
see how your reply relates to what I was trying to say.

You have to look at the message I was answering in context.
I'll give some of that here, and perhaps that will help.

Will Clinger was responding to Pascal Costanza's suggestion that
defmacro is regularly used in the Scheme world:

  [Clinger:]
  > There is no such thing as DEFMACRO in R5RS Scheme.

  [Costanza:]
  This is completely unrelated to my statement. I didn't talk about R5RS. 
  I am interested in the things that are actually used. Since I don't 
  regularly use Scheme, I can only guess, but for example, defmacro is 
  part of SLIB, which seems to be a popular Scheme library. DrScheme, 
  seemingly one of the most popular Scheme implementations, comes with 
  MzLib, which in turn provides define-macro and defmacro.

  It seems to me that defmacro is regularly used in the Scheme world, 
  otherwise it wouldn't be so widely supported.

Now, Clinger's reply to the above was not agreement.  He didn't say
"yes, it is regularly used", not even "yes, it is regularly used,
*but* ..."

He said:

  In Scheme, DEFMACRO and DEFINE-MACRO are relics from the years
  between RRRS and R4RS, when Scheme had no macro system at all.
  The macro system of R4RS was in an appendix, not part of the
  document proper, so some systems continued to support DEFMACRO
  or similar instead of referentially transparent macros until
  the R5RS came out in 1998.

  ... Machines are so fast now that even an interpreted macro expander
  runs fast enough, so I think most implementations that still provide
  DEFMACRO are doing so for backward compatibility.

That doesn't make it sound like defmacro is regularly used or
widely used.  Nor does the rest of his message.  Perhaps it
*was* widely used, but now it's a "relic".  Instead:

  In the Scheme world, programmers use SYNTAX-RULES whenever possible,
  and supplement it with SYNTAX-CASE when SYNTAX-RULES isn't powerful
  enough.

Yet Anton said defmacro "is widely used in Scheme" and "plenty of
people use it", meaning still, now.

-- jd
From: Sunnan
Subject: Re: Scheme macros
Date: 
Message-ID: <873c7cuknq.fsf@handgranat.org>
Jeff Dalton <····@tarn.inf.ed.ac.uk> writes:
> Sorry, but I don't understand your reply.  Both what are true?
> The two things I quoted from Anton?  Or what Anton said and
> what Will Clinger said?

I meant what you quoted from Will Clinger vs. what you quoted from
Anton.

>> syntax-case and syntax-rules both provide a transformer for
>> define-syntax. defmacro or define-macro (while sometimes implemented
>> using define-syntax + syntax-case) doesn't.
>
> Why are you telling me this?  What does it have to do with
> "They're both true"?

I thought that showing why "SYNTAX-CASE interacts well with
SYNTAX-RULES; DEFMACRO doesn't" would make clear that it doesn't imply
that defmacro has to be unavailable.

> Maybe I wasn't clear enough in my message, because I can't
> see how your reply relates to what I was trying to say.
<snip clarification>

Ah, thanks. Here's a clarification from me:

Plenty of people use defmacro (it's used heavily in Dorai's book, for
example) but the "canonical" way to do it (and I would guess more
widely spread, these days) is define-syntax (e.g. syntax-rules
supplemented with (or replaced by) syntax-case). Personally, I'm happy
that all three are available and, as I've said, would gladly see
syntax-case accepted into R6RS (and then implementations can provide
define-macro in terms of it. gensym would also have to be provided,
though.)

Are defmacro/define-macro relics? Sure.
Do they still work? Yes.
Are they widely used? Yes.

Are they provided for backward compatibility? Well, yeah, a lot of
code is written with them or ported from CL. I've rewritten some of my
define-macros to use syntax-case and have a few that are still on the
todo list.

Is syntax-case better or "more recommended"? According to many
(including me, but I don't have any authority), yes.

-- 
One love,
Sunnan
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403081702.423db7d9@posting.google.com>
Pascal Costanza <········@web.de> wrote:
> So no macros involved at all, but still exactly the same issue.

Global redefinitions of the various things associated with symbols
in the COMMON-LISP package would get you into trouble even without
macros, but macros are the only reason (apart from programmer
confusion!) to forbid local lexical bindings of the names in that
package.

Will
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2k3da$nbj$1@newsreader2.netcologne.de>
William D Clinger wrote:

> Pascal Costanza <········@web.de> wrote:
> 
>>So no macros involved at all, but still exactly the same issue.
> 
> Global redefinitions of the various things associated with symbols
> in the COMMON-LISP package would get you into trouble even without
> macros, but macros are the only reason (apart from programmer
> confusion!) to forbid local lexical bindings of the names in that
> package.

Right. So?

My concern is your claim that Common Lisp has "reserved"/"restricted" 
words while Scheme doesn't.

I think the real issue is this: One shouldn't arbitrarily change
definitions that other code relies on in a way that might affect that
code. In the case of macros, this can happen via shadowing. In the case
of functions, this can happen by altering their definitions via side
effects.

R5RS specifies that library procedures might or might not be implemented 
in terms of other built-in procedures. For practical purposes, this 
means that you have essentially the same restrictions for built-in 
procedures in Scheme as you have for symbols in the COMMON-LISP package. 
For example, you don't know how (define car 'beetle) affects other 
built-in procedures, so you shouldn't do this in portable code. As soon 
as you integrate third-party code that may use things like fluid-let or 
defmacro, you might get hurt by the very same problems that Common Lisp 
explicitly specifies away.

No matter how hard you try, all code ultimately depends on conventions
about how different entities interact with each other. It's a good idea to
specify the restrictions and possibilities that result from such 
interactions. I don't see how the positions Scheme and Common Lisp take 
are respectively "extreme" in a meaningful way in this regard. Since you 
can use both syntax-rules and defmacro in both Scheme and Common Lisp, 
these two languages are effectively on par here.

Haskell is extreme.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: William D Clinger
Subject: Re: Scheme macros
Date: 
Message-ID: <fb74251e.0403091211.2232f55b@posting.google.com>
Pascal Costanza <········@web.de> wrote:
> R5RS specifies that library procedures might or might not be implemented 
> in terms of other built-in procedures. For practical purposes, this 
> means that you have essentially the same restrictions for built-in 
> procedures in Scheme as you have for symbols in the COMMON-LISP package. 
> For example, you don't know how (define car 'beetle) affects other 
> built-in procedures, so you shouldn't do this in portable code.

Untrue.  R5RS section 6 says

    A program may use a top-level definition to bind any variable.
    It may subsequently alter any such binding by an assignment (see
    section 4.1.6 Assignments). These operations do not modify the
    behavior of Scheme's built-in procedures. Altering any top-level
    binding that has not been introduced by a definition has an
    unspecified effect on the behavior of the built-in procedures. 

Thus (set! car 'beetle) might cause a problem, if your implementation
allows it at all, but (define car 'beetle) is guaranteed not to cause
a problem.  This has been true in IEEE Scheme since 1991; thus the
"Language changes" section says

    The report is now a superset of the IEEE standard for Scheme
    [IEEEScheme]: implementations that conform to the report will
    also conform to the standard. This required the following changes: 
[snip]
        * Programs are allowed to redefine built-in procedures.
          Doing so will not change the behavior of other built-in
          procedures. 
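
Concretely, under the semantics just quoted:

(define car (lambda (x) 'beetle))   ; top-level redefinition

(car '(1 2 3))                      ; => beetle, for direct calls
(cadr '(1 2 3))                     ; => 2; other built-ins are unaffected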

Will
From: Jens Axel Søgaard
Subject: Re: Scheme macros
Date: 
Message-ID: <404e0d96$0$282$edfadb0f@dread12.news.tele.dk>
Pascal Costanza wrote:
> R5RS specifies that library procedures might or might not be implemented 
> in terms of other built-in procedures. For practical purposes, this 
> means that you have essentially the same restrictions for built-in 
> procedures in Scheme as you have for symbols in the COMMON-LISP package. 
> For example, you don't know how (define car 'beetle) affects other 
> built-in procedures, so you shouldn't do this in portable code. 

I think you missed the second paragraph of Chapter 6

   Standard procedures

   This chapter describes Scheme's built-in procedures. The initial (or
   ``top level'') Scheme environment starts out with a number of
   variables bound to locations containing useful values, most of which
   are primitive procedures that manipulate data. For example, the
   variable abs is bound to (a location initially containing) a procedure
    of one argument that computes the absolute value of a number, and the
    variable + is bound to a procedure that computes sums. Built-in
   procedures that can easily be written in terms of other built-in
   procedures are identified as ``library procedures''.

   A program may use a top-level definition to bind any variable. It may
   subsequently alter any such binding by an assignment (see 4.1.6).
   *These operations do not modify the behavior of Scheme's built-in
   procedures.* Altering any top-level binding that has not been
   introduced by a definition has an unspecified effect on the behavior
   of the built-in procedures.

-- 
Jens Axel Søgaard
From: Jeff Dalton
Subject: Re: Scheme macros
Date: 
Message-ID: <fx41xmx7nio.fsf@tarn.inf.ed.ac.uk>
··········@verizon.net (William D Clinger) writes:

[Re LISP-SYMBOL-REDEFINITION etc]

> These restrictions have everything to do with macros.  Although X3J13
> rejected hygienic and referentially transparent macro technology, the
> debate raised their consciousness about the reliability of macros to
> the point that they listened when Gabriel and I urged that programmers
> not be allowed to redefine or create local bindings for names that
> macros were likely to use.  My involvement with X3J13 ended when I
> left Tektronix early in 1988, so I did not have any part in the actual
> writing of issue LISP-SYMBOL-REDEFINITION:MAR89-X3J13, but I was very
> much involved in pushing for those restrictions.

After seeing the discussion in this thread, I wonder what the
motive was.

It almost sounds like it was so that Common Lisp could be
ridiculed, later on, for having the restrictions.  :)

> If you read the "Rationale:" and "Current practice:" parts of that
> issue, you will see that macros were central.  I agree that these
> sections could have been more clearly written.

It seems to be one issue among others.

Here's current practice:

  Many Lisp environments have some mechanism for warning about
  redefinition of Lisp symbols and preventing accidental redefinition
  while allowing it where necessary (e.g., to patch the Lisp system
  itself, fix a bug, add an optimization.)

  Fewer check for lexical redefinition, since such redefinition is not
  as dangerous. Certainly, there are some symbols that are never used
  in macro expansions of the standard Common Lisp macros. However,
  implementations do differ on the behavior of macro expansions.

And here's rationale:

  This proposal is the only simple resolution of the problem
  description that we can imagine that is consistent with current
  implementation techniques.

  Allowing arbitrary redefinition of symbols in the system would place
  severe restrictions on implementations not to actually use those
  symbols in macro expansions of other symbols, in function calls,
  etc. While some looser restrictions might do for any particular
  Common Lisp implementation, there seems to be no good way to
  distinguish between those symbols that are redefinable and those
  that are not.

  In general, programs can redefine functions safely by creating new
  symbols in their own package, possibly shadowing the name.

There's a long list of restrictions.  For local binding to be left
out, there would have had to be no problem with it.  There is a
problem, because of macros, but macros don't seem to be a central
issue.  They're just the reason why that's in the list along with all
the other things.  Lexical redefinition is even said to be "not as
dangerous".

-- jd
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Scheme macros
Date: 
Message-ID: <pan.2004.03.06.19.28.03.194828@knm.org.pl>
On Fri, 05 Mar 2004 17:19:34 -0800, Kaz Kylheku wrote:

>   (cond (#t => func))
> 
> If there is a binding for => surrounding the macro call, this will
> match a different rule. Rather than matching (cond (test => result))
> it matches (cond (test result1 result2 ...)).
> 
> The pattern matcher has to grok semantic information to make a
> syntactic decision. That smells. It's like in the C language, where
> semantic information is consulted to determine how to parse:

I don't see how it's worse than a Lisp macroexpander which must know the
context in order to either leave (foo bar) alone or expand the foo macro
if it's defined.

Actually Lisp has more cross-level dependencies. For Lisp you must
*execute* code in order to *parse* further statements, because of
the possibility of changing readtable, packages etc. In Scheme you
can at least build an abstract syntax tree before executing anything.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0403081132.2cbccd30@posting.google.com>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> wrote in message news:<······························@knm.org.pl>...
> On Fri, 05 Mar 2004 17:19:34 -0800, Kaz Kylheku wrote:
> 
> >   (cond (#t => func))
> > 
> > If there is a binding for => surrounding the macro call, this will
> > match a different rule. Rather than matching (cond (test => result))
> > it matches (cond (test result1 result2 ...)).
> > 
> > The pattern matcher has to grok semantic information to make a
> > syntactic decision. That smells. It's like in the C language, where
> > semantic information is consulted to determine how to parse:
> 
> I don't see how it's worse than a Lisp macroexpander which must know the
> context in order to either leave (foo bar) alone or expand the foo macro
> if it's defined.

Obviously, to have programmable syntax, there has to be some piece of
context to hook into. A purely context-free grammar is a hard-coded
grammar.

The definition of FOO in (FOO BAR) is the minimum amount of context
information. You look at the first position of the list and know how
to parse the rest.

If there are going to be superfluous, unnecessary context
dependencies beyond the definition of the main connective on the left,
don't call it hygiene.
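
The behavior in question, spelled out (this is essentially the example
from R5RS section 4.3.2):

(cond (#t => (lambda (x) (list 'receiver-got x))))   ; => (receiver-got #t)

;; With => rebound at the use site, the literal no longer matches, and
;; the clause is read as (test expr ...) instead of (test => receiver).
(let ((=> #f))
  (cond (#t => 'ok)))                                ; => ok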

> Actually Lisp has more cross-level dependencies. For Lisp you must
> *execute* code in order to *parse* further statements, because of
> the possibility of changing readtable, packages etc. In Scheme you
> can at least build an abstract syntax tree before executing anything.

Really? Syntax trees get built without anything executing? I suspect
what you mean is that no user-serviceable code gets executed, just
like in the case of C, Modula, and others.
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <ptbnjtsc.fsf@ccs.neu.edu>
···@ashi.footprints.net (Kaz Kylheku) writes:

> See the R5RS example in 4.3.2 where the cond macro is invoked using
> the anaphoric => syntax that invokes a function:
>
>   (cond (#t => func))
>
> If there is a binding for => surrounding the macro call, this will
> match a different rule. Rather than matching (cond (test => result))
> it matches (cond (test result1 result2 ...)).
>
> The pattern matcher has to grok semantic information to make a
> syntactic decision. That smells. It's like in the C language, where
> semantic information is consulted to determine how to parse:
>
>    A(B);
>
> is it a declaration or a function call? If A is a typedef name, parse
> it this way, if it's a function name, parse it that way. Ugh!

It isn't as bad as C.  In Scheme, the context is always lexical
(modulo top-level definitions, of course), so you have a hope of
success.  In C++, I've seen header files that have typedefs inside of
class definitions, so 

foo ()
{
  A(B);
}

would be a function call, but 

bar::quux()
{
  A(B);
}

would be a declaration.
From: Gareth McCaughan
Subject: Re: Scheme macros
Date: 
Message-ID: <87d67s6qwj.fsf@g.mccaughan.ntlworld.com>
Kaz Kylheku wrote:

> I disagree with the insinuation that Modula-2 is ``rather different''
> from Scheme. It's rather different from Lisp, that's certain. Add
> lexical closures, macros and continuations to Modula-2 and you have
> Scheme with Pascal-like syntax.

The mind boggles. Those are pretty major features, and so is the
difference in syntax between Scheme-like and Pascal-like. And you
forgot to mention dynamic typing, and the entirely different
standard library, and symbols, and automatic memory management,
and literal notation for linked lists, and functions with variable
numbers of arguments, and a completely different set of control
structures. But yes, apart from those trifling matters and a few
more I might have forgotten, I'm sure Scheme and Modula-2 are
exactly the same.

-- 
Gareth McCaughan
.sig under construc
From: Pascal Costanza
Subject: Re: Scheme macros
Date: 
Message-ID: <c2tdod$egc$1@newsreader2.netcologne.de>
William D Clinger wrote:

> Pascal Costanza <········@web.de> wrote:
> 
>>It's a pity that we have got sidetracked again by the hygiene issue. I 
>>think the thread has shown that there are several options to take care 
>>of this by now. The most natural way in Common Lisp seems to be to use 
>>special variables and/or to use packages for namespacing issues.
> 
> I don't see how special variables and/or namespaces could give you
> an equivalent of referential transparency for local macros.

Here is what I vaguely had in mind when I said that special variables 
can help here. It just took me several turns to clear up my thoughts.

The following is what I think a very simple solution to referential 
transparency issues with defmacro. It relies on the reader interning 
symbols in the right packages, and using dynamic scoping at read time.

The most important change is that the right binding for a name is not 
determined at the use sites, but at the definition site. This greatly 
simplifies things. However, I would like to see whether other examples 
for referentially transparent macro definitions can or cannot be made to 
work with this approach. Are there any collections of test cases / examples?

OK, here we go:

(defpackage #:unique-symbols
   (:use #:common-lisp)
   (:import-from #:lispworks #:with-unique-names))

(in-package #:unique-symbols)

;;; Enforcement of symbol uniqueness:
;;; Square brackets [] bind *package* to a fresh package
;;; that imports everything accessible in the previous *package*.
;;; The bang ! enforces a unique symbol that is present
;;; in *package* afterwards.


;; Packages must be named so that symbols can be interned.
;; The following stuff is used for creating new package names.

(defconstant +internal-package-prefix+
   "PACKAGE-FOR-REFERENTIALLY-UNIQUE-SYMBOLS-")

(defvar *internal-package-counter* 0)

(defun new-package-name ()
   (with-output-to-string (s)
     (princ +internal-package-prefix+ s)
     (princ (incf *internal-package-counter*) s)))

(defun ensure-new-package ()
   "create a new package and import all symbols accessible in *package*"
   (let ((pkg (make-package (new-package-name) :use nil)))
     (import (loop for symbol being each symbol of *package*
                   collect symbol)
             pkg)
     pkg))

;; Manipulation of the reader.

(defmacro with-new-package (&body body)
   "bind *package* to a new package as created by ensure-new-package"
   (with-unique-names (new-package)
     `(let ((,new-package (ensure-new-package)))
        (unwind-protect
          (let ((*package* ,new-package))
            ,@body)
          (delete-package ,new-package)))))

(defun open-bracket-macro-char (stream macro-char)
   "continue to read with a new package until the closing bracket"
   (declare (ignore macro-char))
   (with-new-package
     (read-delimited-list #\] stream t)))

(set-macro-character #\[ #'open-bracket-macro-char)
(set-macro-character #\] (get-macro-character #\)))

(defun bang-macro-char (stream macro-char)
   "force the following symbol to be freshly interned in *package*"
   (declare (ignore macro-char))
   (let ((symbol (read stream t nil t)))
     (check-type symbol symbol)
     (unintern symbol)
     (intern (symbol-name symbol))))

(set-macro-character #\! #'bang-macro-char)

;; test cases

(assert (equal
          (let ((x 5))
            (macrolet
                ((test () '(list x)))
              [let ((!x 55))
                (test)]))
          '(5)))

(assert (equal
          (let ((x 5))
            (macrolet
                ((test () '(list x))
                 (set37 () '(setf x 37)))
              [let ((!x 55))
                (set37)
                (test)]))
          '(37)))

(assert (equal
          (let ((x 5))
            (macrolet
                ((test (y) `(list x ,y)))
              [let ((!x 55))
                (test x)]))
          '(5 55)))

(assert (equal
          (let ((x 5))
            (macrolet
                ((test (y) `(list x ,y)))
              [let ((!x 55))
                (test 7)]))
          '(5 7)))


I am starting to believe that it would be a good idea to drop at least 
some of the restrictions placed on symbols in the COMMON-LISP package. 
The following code...

(flet ((open (filename &key direction)
          (format t "~%OPEN was called.")
          (open filename :direction direction)))
   (with-open-file (x "frob" :direction ':output)
     (format t "~%Was OPEN called?")))

...has a useful meaning IMHO - WITH-OPEN-FILE should just be documented 
accordingly so that it is clear that it uses whatever function 
definition for the _symbol_ OPEN is current. If you don't want your 
local functions inadvertantly affect the use of macros, rewrite your 
code like this.

[flet ((!open (...)
          ...))
   (with-open-file (x "frob" :direction ':output)
     (format t "~%Was OPEN called?"))]

Here, it is clear that the fresh symbol OPEN is totally unrelated to 
whatever other programmers have used in their macro definitions.

I hope that I am not missing something.

Comments?


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: dw
Subject: Re: Scheme macros
Date: 
Message-ID: <41d9ef49.0402242207.360a4366@posting.google.com>
"Anton van Straaten" <·····@appsolutions.com> wrote in message news:<···················@newsread1.news.atl.earthlink.net>...
> 
> (define-syntax and2
>   (lambda (x)
>     (syntax-case x ()
>       ((_ x y)
>        (syntax (if x y #f))))))
> 
> The (lambda (x) ...) binds a syntax object to x, representing the syntax of
> the expression which invoked the macro.  In theory, you can do whatever you
> want with that syntax object.  Most commonly, you'll use syntax-case to do a
> pattern match on it, which is what the above example does with the
> expression (syntax-case x () ...).  The () is for literals, like 'else' in
> cond.
> 
> Within the above syntax-case expression, there's a single pattern and a
> corresponding template:
> 
>       ((_ x y)
>        (syntax (if x y #f))))))
> 
> The underscore in (_ x y) represents the name of the macro - you could also
> write (and2 x y), it doesn't matter.  This pattern will match any
> invocations of AND2 with two parameters.  After the pattern, is the
> expression which will be executed when that pattern is matched.  In this
> case, it's simply a syntax expression which returns the expression (if x y
> #f).  The return value from syntax-case must be a syntax object, which
> represents the actual syntax of the final program.

From what I understand, Scheme macros are a bit like C++ templates:
instead of directly manipulating lists, they use pattern matching.
Both are Turing-complete systems.
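
If I've got that right, the CL counterpart would be something like the
following minimal, untested sketch, where the expansion is built by
ordinary list construction rather than by matching a pattern against a
template:

(defmacro and2 (x y)
  ;; build the expansion as a plain list; nothing is renamed for us
  `(if ,x ,y nil))

;; (and2 (> 3 2) 'yes)  =>  YES
;; (and2 (> 2 3) 'yes)  =>  NIL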

If you are familiar with templates, can you compare syntax-case and syntax-
rules to them?

If you were designing a new computer language "for the masses" what kind of
macros would it have?
From: Grzegorz Chrupała
Subject: Re: Scheme macros
Date: 
Message-ID: <c15ck4$9h$1@news.ya.com>
Anton van Straaten wrote:


> It's not only hygiene.  Like Barry, I would assume that Olin was talking
> about the syntax-case macro system:

Given that Olin's paper is about scsh/Scheme48, and given that AFAIK
Scheme48 does not provide syntax-case, this seems unlikely. Anyway, it's
not terribly difficult for a macro system to be more "sophisticated" (as
opposed to more general) than Common Lisp's defmacro.

-- 
Grzegorz Chrupała | http://pithekos.net | ·········@jabber.org
gosh -bE "begin (print \
(map (cut number->string <> 36) '(18 806564 1714020422))) (exit)"
From: Jeff Dalton
Subject: Re: Scheme macros
Date: 
Message-ID: <fx4ad1l7s3d.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Happily, Lisp being the ultra-flexible
> language that it is, anyone who wants to play with syntax-rules in CL can
> take a look at:
> http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

According to that page:

  This Common Lisp implementation does _not_ provide hygiene.

-- jd
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <r7wpn76y.fsf@ccs.neu.edu>
Jacek Generowicz <················@cern.ch> writes:

> My question is ... how should I interpret the claim that Scheme's
> macro system is more "sophisticated" than CL's ?  Does it merely reflect
> a preference for Scheme's "hygene", or is there something else behind
> the statement ?

You should interpret the claim based on the biases of the person
making the claim.

One way of looking at it is that the macro system performs a
source->source transformation.  To do this you need a mechanism for
destructuring the original code that preserves the meaning of the
subforms that are not being transformed, a mechanism for transforming
the fragments, a mechanism for introducing new forms with a meaning
independent of the source context, and a mechanism for combining
these.

The Common Lisp macro system provides a hook and a small amount of
destructuring.  The mechanisms for transforming the fragments and
combining the results are provided by the built-in list manipulation
facilities.

The rest of the work is a burden that the macro writer
must take on himself.  The macro writer must avoid introducing
colliding identifiers so that the original forms, when re-injected,
retain their meaning.  He must ensure that introduced gensyms are
correctly referred to in the expanded code so that the new forms are
independent of context.  Admittedly, this is not a large burden to
bear for the power of macros.
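
For illustration, a minimal sketch of that manual bookkeeping (SWAP is
just an example name): the GENSYM is what keeps a caller's own TMP
variable from being captured by the expansion.

(defmacro swap (a b)
  (let ((tmp (gensym "TMP")))   ; fresh, uninterned name for the temporary
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))

;; (let ((x 1) (tmp 2)) (swap x tmp) (list x tmp))  =>  (2 1)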

The Scheme macro system provides a much richer destructuring facility,
and automates the entire process of renaming to preserve meaning.  It
provides a richer set of base tools *for that particular problem*, so
in this way it is more sophisticated than the CL macro system.

However, the Scheme macro system does *not* provide much in the way of
transforming the program fragments or combining the fragments and new
code into a result.  This is done via pattern matching and templates
that do not give you easy access to the underlying list structure.  It
can be amazingly difficult to generate the patterns and templates
needed to do what would be simple list manipulation.

The end result is that the Scheme macro system can be a pain in the
ass to understand and use.  From what I've seen, I'm not the only one
in the Scheme community who finds the hygienic macro system
difficult. It seems that the burden of maintaining hygiene manually is
smaller than the burden introduced by maintaining it automatically.
From: Kaz Kylheku
Subject: Re: Scheme macros
Date: 
Message-ID: <cf333042.0402210100.693246a@posting.google.com>
Jacek Generowicz <················@cern.ch> wrote in message news:<···············@pcepsft001.cern.ch>...
> [Don't want to start a flamefest, so I'm not crossposting this to
> c.l.s]
> 
> In _Lambda: the ultimate "little language"_, Olin Shivers suggests
> (second paragraph of section 9) that " ... Scheme has the most
> sophisticated macro system of any language in this family ...", where
> 'this family' is explicitly stated to include Common Lisp.

This just illustrates the difference between what Schemers tend to
find important or interesting and what Lispers tend to find important or
interesting.

Scheme comes with a sophisticated macro system. Lisp comes with
sophisticated macros.

Given a Lisp system, I can show ready-made macros like LOOP to Lisp
newbies. In a five-minute demo, I can show how an expressive little
language that brings a clear benefit to everyday programming tasks is
implemented by source-to-source transformations on lists.
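
For example (the exact expansion is implementation-dependent, so I
won't paste one here):

;; LOOP's little language in action ...
(loop for i from 1 to 5 collect (* i i))    ; => (1 4 9 16 25)

;; ... and, since LOOP is an ordinary macro, the source-to-source
;; transformation behind it is one MACROEXPAND-1 away:
(pprint (macroexpand-1 '(loop for i from 1 to 5 collect (* i i))))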

A slick way to write macros is important if you want to showcase your
macro-writing system to people who already understand macro-writing
systems and their surrounding issues.  It's not that important if you
want to get a software engineering task done, because the cost of
writing a macro is amortized over all of its uses. Given that
amortization, maintaining a nested backquote stuffed with MAPCAR calls
and whatnot isn't such a big deal. It's not important what the
expander of a macro looks like, and quite often it's not even
important what the expansion looks like as long as it is correct.

The Lisp way of writing macros is powerful and general, because you
are writing code, rather than expressing your intent in a tightly
constrained pattern-matching language. If you want some ad-hoc
irregular syntax, you can just write code to handle it. If you
deliberately want non-hygiene, that is easy. If you want to break open
a macro argument that is a string literal and parse a language
embedded in that, no problem, just write the code. If you want to
treat symbols as name-equivalent so :FOR, #:FOR and MYPACKAGE::FOR
denote the same thing, just write the code. And so forth.
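
A minimal sketch of that last point (CLAUSE-WORD-P and MY-COLLECT are
made-up names here, nothing standard):

(defun clause-word-p (form name)
  ;; true if FORM is a symbol whose name matches NAME, whatever its package
  (and (symbolp form)
       (string= (symbol-name form) name)))

(defmacro my-collect (var in-word list &body body)
  ;; toy LOOP-like clause: (MY-COLLECT X IN SOME-LIST ...)
  (assert (clause-word-p in-word "IN") ()
          "Expected an IN keyword, got ~S" in-word)
  `(mapcar (lambda (,var) ,@body) ,list))

;; :IN, #:IN and MYPACKAGE::IN are all accepted, because they have the
;; same name:
;; (my-collect x :in '(1 2 3) (* x x))  =>  (1 4 9)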

Can someone show me the syntax-rules implementation of something that
closely resembles the CL LOOP macro, with all the clauses? Surely,
this should be a piece of cake using the ``most sophisticated macro
system''. What does syntax-rules bring to the table when you want to
write something like this?

How about the FORMATTER macro? What would be the syntax-rules for
that?
From: Joe Marshall
Subject: Re: Scheme macros
Date: 
Message-ID: <ad3bhv4h.fsf@comcast.net>
···@ashi.footprints.net (Kaz Kylheku) writes:

> Can someone show me the syntax-rules implementation of something that
> closely resembles the CL LOOP macro, with all the clauses? Surely,
> this should be a piece of cake using the ``most sophisticated macro
> system''. 

See http://okmij.org/ftp/Scheme/macros.html 

Oleg has written a compiler that transforms Scheme code to the
appropriate pattern-matching rules.

> What does syntax-rules bring to the table when you want to write
> something like this?

Common Lisp has some restrictions to ensure that macros can work
correctly.  For instance, you are not permitted to shadow identifiers
in the CL package.  Syntax-rules does not have this restriction.


-- 
~jrm