From: robbie carlton
Subject: scheme seems neater
Date: 
Message-ID: <32b5ef05.0404060506.7068c3d9@posting.google.com>
Hi. Probs a stupid question from someone relatively new to Lisp. In
Scheme a symbol has only one visible binding at any time, whereas in CL
a symbol can have a variable value, a function value, a property list,
a documentation string, and probs some other junk I forgot. Question
is, why? Doesn't the CL way just promote messy, unreadable code? Also,
the Scheme way makes function definitions much prettier and more
consistent, i.e. assigning a function literal to a symbol. It just
seems nicer. I understand Paul Graham's Arc is going to include some of
that Schemeyness, but it probably won't be around for a decade. Am I
just wrong, or is Scheme just more elegant than CL?

From: mikel
Subject: Re: scheme seems neater
Date: 
Message-ID: <JCycc.33444$ci3.17245@newssvr29.news.prodigy.com>
robbie carlton wrote:
> Hi. Probs a stupid question from someone relatively new to Lisp. In
> Scheme a symbol has only one visible binding at any time, whereas in CL
> a symbol can have a variable value, a function value, a property list,
> a documentation string, and probs some other junk I forgot. Question
> is, why? Doesn't the CL way just promote messy, unreadable code? Also,
> the Scheme way makes function definitions much prettier and more
> consistent, i.e. assigning a function literal to a symbol. It just
> seems nicer. I understand Paul Graham's Arc is going to include some of
> that Schemeyness, but it probably won't be around for a decade. Am I
> just wrong, or is Scheme just more elegant than CL?

I can see why you think that way, but in practice about the only real 
difference it makes is that coming up with a good macro system for 
Scheme has been harder (because of the much greater likelihood of 
accidental variable capture in a single namespace). There are other 
differences between the languages on account of the namespace issue, and 
in some respects Scheme's syntax is arguably tidier because of it, but 
these wind up being mostly peripheral issues.
From: Paul Wallich
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4uegj$5k8$1@reader1.panix.com>
mikel wrote:

> robbie carlton wrote:
> 
>> Hi. Probs a stupid question from someone relatively new to Lisp. In
>> Scheme a symbol has only one visible binding at any time, whereas in CL
>> a symbol can have a variable value, a function value, a property list,
>> a documentation string, and probs some other junk I forgot. Question
>> is, why? Doesn't the CL way just promote messy, unreadable code? Also,
>> the Scheme way makes function definitions much prettier and more
>> consistent, i.e. assigning a function literal to a symbol. It just
>> seems nicer. I understand Paul Graham's Arc is going to include some of
>> that Schemeyness, but it probably won't be around for a decade. Am I
>> just wrong, or is Scheme just more elegant than CL?
> 
> 
> I can see why you think that way, but in practice about the only real 
> difference it makes is that coming up with a good macro system for 
> Scheme has been harder (because of the much greater likelihood of 
> accidental variable capture in a single namespace). There are other 
> differences between the languages on account of the namespace issue, and 
> in some respects Scheme's syntax is arguably tidier because of it, but 
> these wind up being mostly peripheral issues.

And, as people will tell you endlessly, a lot depends on your 
perceptions of elegance.  Stereotypically in Scheme, for example, you 
pay for the increased elegance of the base language with a decrease in 
the elegance of expressing your ideas in code, e.g. having to invent 
misspellings, nonce names and circumlocutions to avoid punning.

Some people view "I perceived with my eyes a toothed wood-cutting device 
and sawed a board with it" as more elegant, and others prefer "I saw the 
saw and sawed a board with it." The language that only allows the first 
version is undoubtedly more elegant in some ways, the one that 
encourages the second in others.

paul
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ekr16si1.fsf@handgranat.org>
Paul Wallich <··@panix.com> writes:
> And, as people will tell you endlessly, a lot depends on your
> perceptions of elegance.  Stereotypically in Scheme, for example, you
> pay for the increased elegance of the base language with a decrease in
> the elegance of expressing your ideas in code, e.g. having to invent
> misspellings, nonce names and circumlocutions to avoid punning.

Some would say that it's well worth that price.

> Some people view "I perceived with my eyes a toothed wood-cutting
> device and sawed a board with it" as more elegant, and others prefer
> "I saw the saw and sawed a board with it." The language that only
> allows the first version is undoubtedly more elegant in some ways, the
> one that encourages the second in others.

But with CL it's cumbersome to say "I sawed the saw".

E.g. in Scheme you can write (list list), which evaluates to
(#<procedure>). In CL you have to write (list #'list) to get the same
effect.

And puns can be obfuscating until you're used to them. If you want to
write the function
(defun hack-the-list (list)
   (list (hackhackhack list)))

in CL, you can. Sure.

In Scheme you have to change the variable name a little (common
idioms are lis and lst):

(define (hack-the-list list-)
   (list (hackhackhack list-)))

How is that much less elegant than CL having to add #' pretty often?
(Uses of mapcar come to mind.)
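Sunnan's renaming point is easy to reproduce in any language with a single namespace per scope. A minimal Python sketch (Python, like Scheme, lets a local name shadow a function; the function names here are invented for illustration):

```python
# In a language with one namespace per scope (as in Scheme), a parameter
# named after a function shadows that function inside the body.
def hack_the_list(lst):            # renamed Scheme-style to dodge the shadowing
    return [len(lst)]              # 'len' still names the built-in here

def hack_the_list_pun(list):       # the CL-style pun does not survive:
    try:
        return list(list)          # 'list' is now the argument, not the constructor
    except TypeError:
        return "shadowed"

assert hack_the_list([1, 2, 3]) == [3]
assert hack_the_list_pun([1, 2, 3]) == "shadowed"
```

In a Lisp-2 the second definition would be unremarkable, because the call position and the argument position consult different namespaces.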

So CL-ers have to add #' and funcall in some situations, and Schemers
have to change their names in some others. The same inelegance.

However, I can't think of any drawback with a lisp-1 that a lisp-2
doesn't also have.

Variable naming is one of the things about good-looking code that's
easiest and hardest at the same time.

Here's another example that might be food for thought:

(defun hack-together-two-lists (list list)
   ...)

That's not valid CL, either.

Now, many people would possibly argue that "Oh, but Good Style would
dictate that you name the two variables differently so that you can
see what's going on in the function". A surprisingly large subset of
those very same persons would still argue for CL-style punning and
twin namespaces.

In conclusion, I like CL a lot. I just don't think that having to say
"I saw the #'saw and sawed a board with it" is something to write home
about.

-- 
One love,
Sunnan
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hr7v1dsjn.fsf@vserver.cs.uit.no>
Sunnan <······@handgranat.org> writes:

> [..] In conclusion, I like CL a lot. I just don't think that having
> to say "I saw the #'saw and sawed a board with it" is something to
> write home about.

I think you're missing the point here. You don't say "I saw the #'saw
and sawed .." in CL, you say "I saw the saw and sawed ..". The point
is that contextually, humans understand that the first saw is a verb
and the second a noun, even if they are spelled identically. What this
illustrates about CL is that in e.g. (saw saw), the first saw
references the function name-space, while the second saw references
the variable name-space.

Your sentence "I saw the #'saw and sawed a board with it" is
presumably intended to correspond to a CL form like (saw #'saw).
However, the translation of (saw #'saw) to English would be something
like "I saw the verb `saw' and sawed .." where the qualification of
the second saw is required in order for it not to be mistaken for the
metal thing with teeth. Precisely the same situation as in CL, and why
we have to say (function saw), abbreviated #'saw, to express "the
value of saw in the function name-space" when not at the operator
position in a form.
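Frode's observation, that context selects the namespace, has a loose analogue even outside Lisp: in Python, attribute (method) names are looked up separately from local variables, so the same word can serve as both. A hypothetical sketch (the Carpenter class is invented for illustration):

```python
# A rough two-namespace analogue: method names live apart from local
# variables, so 'saw' can name a function and a value at the same time.
class Carpenter:
    def saw(self, tool):                    # 'saw' in the method namespace
        return "sawed a board with " + tool

saw = "the saw"                             # 'saw' in the variable namespace
assert Carpenter().saw(saw) == "sawed a board with the saw"
```

The call position (after the dot) plays the role of CL's operator position: it alone decides which "saw" is meant.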

-- 
Frode Vatvedt Fjeld
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <8765cd6q52.fsf@handgranat.org>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:
> I think you're missing the point here. You don't say "I saw the #'saw
> and sawed .." in CL, you say "I saw the saw and sawed ..". The point
> is that contextually, humans understand that the first saw is a verb
> and the second a noun, even if they are spelled identically. What this
> illustrates about CL is that in e.g. (saw saw), the first saw
> references the function name-space, while the second saw references
> the variable name-space.

Yeah, I understand, but I warped the analogy a bit because I couldn't
think of anything cleaner. I have now, though; how about this
sentence:

"Tomorrow I am going to saw with the saw".

In CL, that would have to be 
"..going to #'saw with the saw".

As for Björn's comments: sure, I never denied that you can't do
CL-style puns in Scheme - that was what my entire post was about!

-- 
One love,
Sunnan
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcszn9p84ed.fsf@fnatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> Frode Vatvedt Fjeld <······@cs.uit.no> writes:
> > I think you're missing the point here. You don't say "I saw the #'saw
> > and sawed .." in CL, you say "I saw the saw and sawed ..". The point
> > is that contextually, humans understand that the first saw is a verb
> > and the second a noun, even if they are spelled identically. What this
> > illustrates about CL is that in e.g. (saw saw), the first saw
> > references the function name-space, while the second saw references
> > the variable name-space.
> 
> Yeah, I understand, but I warped the analogy a bit because I couldn't
> think of anything cleaner. I have now, though; how about this
> sentence:
> 
> "Tomorrow I am going to saw with the saw".
> 
> In CL, that would have to be 
> "..going to #'saw with the saw".
> 
> As for Björn's comments: sure, I never denied that you can't do
> CL-style puns in Scheme - that was what my entire post was about!

My comment was meant the other way around; I was referring to the name
mangling necessary in Scheme, e.g. "list" becomes "lst", "saw" becomes
"sw" or "saugh", and so forth.


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ptal5all.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> My comment was meant the other way around; I was referring to the name
> mangling necessary in Scheme, e.g. "list" becomes "lst", "saw" becomes
> "sw" or "saugh", and so forth.

So was I, or at least I was trying to.

Name mangling, a.k.a. choosing variable names with care, is always
necessary when you have two variables that would otherwise have the
same name, regardless of the number of namespaces involved. I guess
that's my point summed up as shortly as I can.

That's why I brought up the (defun mangle-lists (list list list) ...)
invalid example.

Maybe people who adore CL-style punning could look into having
different namespaces for other types, as well - a string and an int
having the same name in the same scope, for example. Of course(?),
that would be annoying - and that's the same argument that lisp-1
proponents present for uniting the namespace of functions with other
types.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4zn9pgfwb.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > My comment was meant the other way around; I was referring to the name
> > mangling necessary in Scheme, e.g. "list" becomes "lst", "saw" becomes
> > "sw" or "saugh", and so forth.
> 
> So was I, or at least I was trying to.
> 
> Name mangling, a.k.a. choosing variable names with care,

If you think prejudicial language should be avoided,
then the use of "punning" for what happens in CL should
also be dropped.

Not all uses of multiple meanings for a word are punning.

> is always
> necessary when you have two variables that would otherwise have the
> same name, regardless of the number of namespaces involved. I guess
> that's my point summed up as shortly as I can.

CL allows context to disambiguate, so that choosing different
names is less often needed.  That isn't changed by both languages
having some cases where names would clash.

> Maybe people who adore CL-style punning

I don't think anyone adores it.  The CL attitude is more
pragmatic, that's all.

> could look into having
> different namespaces for other types, as well - a string and an int
> having the same name in the same scope, for example.

There are a number of factors that have to be balanced.
Things don't have to be pushed to extremes.  

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <878yh956vf.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
>> Name mangling, a.k.a. choosing variable names with care,
>
> If you think prejudicial language should be avoided,

I didn't mind the term "name mangling" (and I've used it myself in
other responses in this thread), I just wanted to clarify what "name
mangling" sometimes is.

> then the use of "punning" for what happens in CL should
> also be dropped.

I'm sorry.

I had mostly heard the term in a positive light.

> Not all uses of multiple meanings for a word are punning.

I meant specifically (lambda (string) (string string)) and so on.

>> Maybe people who adore CL-style punning
>
> I don't think anyone adores it.

I glanced at the thread "Lisp puns considered good style?" (original
post message id: <······················@golden.net>) and some people
seemed to adore it. Maybe I got the wrong impression.

>> could look into having
>> different namespaces for other types, as well - a string and an int
> having the same name in the same scope, for example.
>
> There are a number of factors that have to be balanced.
> Things don't have to be pushed to extremes.  

Maybe not.
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4r7v1gdzn.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:

> > Not all uses of multiple meanings for a word are punning.
> 
> I meant specifically (lambda (string) (string string)) and so on.

Do you mean that cases like that, which look quite strange to me,
are puns, or that any definition in which "list", say, was used
as both a variable and a function name would be "punning"?

> >> Maybe people who adore CL-style punning
> >
> > I don't think anyone adores it.
> 
> I glanced at the thread "Lisp puns considered good style?" (original
> post message id: <······················@golden.net>) and some people
> seemed to adore it. Maybe I got the wrong impression.

I haven't read that one.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87isgd3pdl.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> Do you mean that cases like that, which look quite strange to me,
> are puns, or that any definition in which "list", say, was used
> as both a variable and a function name would be "punning"?

Here is the example from that thread:

(defun pr-string (s string)
   (setq string (string string)))

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4r7v0g90f.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> > Do you mean that cases like that, which look quite strange to me,
> > are puns, or that any definition in which "list", say, was used
> > as both a variable and a function name would be "punning"?
> 
> Here is the example from that thread:
> 
> (defun pr-string (s string)
>    (setq string (string string)))

Which doesn't answer the question you quote above. :(

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87n05o25gl.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:

> Sunnan <······@handgranat.org> writes:
>
>> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
>> > Do you mean that cases like that, which look quite strange to me,
>> > are puns, or that any definition in which "list", say, was used
>> > as both a variable and a function name would be "punning"?
>> 
>> Here is the example from that thread:
>> 
>> (defun pr-string (s string)
>>    (setq string (string string)))
>
> Which doesn't answer the question you quote above. :(

Sorry. I mean that cases like that are puns, not necessarily any
definition where (say) list was used as both a variable and a function
name.

-- 
One love,
Sunnan
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4uo32$sb6$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Maybe people who adore CL-style punning could look into having
> different namespaces for other types, as well - a string and an int
> having the same name in the same scope, for example. Of course(?),
> that would be annoying - and that's the same argument that lisp-1
> proponents present for uniting the namespace of functions with other
> types.

No, it's not. The one case would be annoying, the other case is not.

BTW, what's the point of this discussion? If you prefer one style, you 
know where to find it. If you prefer the other, you also know where to 
find it. What's the problem?


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87lll958kb.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
>> Maybe people who adore CL-style punning could look into having
>> different namespaces for other types, as well - a string and an int
> >> having the same name in the same scope, for example. Of course(?),
>> that would be annoying - and that's the same argument that lisp-1
>> proponents present for uniting the namespace of functions with other
>> types.
>
> No, it's not. The one case would be annoying, the other case is not.

Care to clarify?

> BTW, what's the point of this discussion?

The point of this discussion, for me, is to find out whether there are
any good technical (as opposed to historical/political) reasons why
one would, today, create a new lisp-2, and if so, to understand those
reasons.

I am very much trying to keep it from being a Scheme vs CL
"contest". I'm not trying to present either language as better than
the other, but I am trying to refute some of the arguments for a
lisp-2 that I don't think make sense.

> If you prefer one style, you know where to find it. If you prefer
> the other, you also know where to find it.

I'm experimenting with both styles.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4n05pgcyx.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> The point of this discussion, for me, is to find out whether there are
> any good technical (as opposed to historical/political) reasons why
> one would, today, create a new lisp-2, and if so, to understand
> those reasons.

I don't think there are any technical reasons that definitively
establish that Lisp-1 or Lisp-2 is better.

They're both reasonable choices in "language space", it seems
to me.

Much of the discussion of this issue in recent years (decades?)
seems to assume that Lisp-2 is obviously worse, so that only
historical or political reasons could explain why any language
has two namespaces.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ekr13p8h.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> Much of the discussion of this issue in recent years (decades?)
> seems to assume that Lisp-2 is obviously worse, so that only
> historical or political reasons could explain why any language
> has two namespaces.

Right, but I like to question my own assumptions. To me Lisp-2 does
seem obviously and intuitively worse - an assumption I'm well aware
could be wrong - and thus this thread is interesting to me.

(BTW, don't knock historical or political reasons - they explain many
things in computing.)

-- 
One love,
Sunnan
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsoeq5805g.fsf@fnatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> Pascal Costanza <········@web.de> writes:
> >> Maybe people who adore CL-style punning could look into having
> >> different namespaces for other types, as well - a string and an int
> >> having the same name in the same scope, for example. Of course(?),
> >> that would be annoying - and that's the same argument that lisp-1
> >> proponents present for uniting the namespace of functions with other
> >> types.
> >
> > No, it's not. The one case would be annoying, the other case is not.
> 
> Care to clarify?
> 
> > BTW, what's the point of this discussion?
> 
> The point of this discussion, for me, is to find out whether there are
> any good technical (as opposed to historical/political) reasons why
> one would, today, create a new lisp-2, and if so, to understand
> those reasons.

There are. Or rather, there are advantages to both ways. I think
ultimately it is an issue of personal preference which style you
prefer. The issue has been extensively discussed in the past on this
newsgroup, try Google groups. There is also a paper by Kent Pitman
comparing Lisp-1 and lisp-2.

> I am very much trying to keep it from being a Scheme vs CL
> "contest". I'm not trying to present either language as better than
> the other, but I am trying to refute some of the arguments for a
> lisp-2 that I don't think make sense.

As have countless others before you...

:-)


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87zn9p3qqv.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> There are. Or rather, there are advantages to both ways. I think
> ultimately it is an issue of personal preference which style you
> prefer. The issue has been extensively discussed in the past on this
> newsgroup, try Google groups. There is also a paper by Kent Pitman
> comparing Lisp-1 and lisp-2.

I can only find http://www.dreamsongs.com/Separation.html (also at
http://www.nhplace.com/kent/Papers/Technical-Issues.html) which I've
read thoroughly, several times.

>> I am very much trying to keep it from being a Scheme vs CL
>> "contest". I'm not trying to present either language as better than
>> the other, but I am trying to refute some of the arguments for a
>> lisp-2 that I don't think make sense.
>
> As have countless others before you...

Yeah, I guess it gets pretty tiring.

Well, I didn't start the thread, but I had to jump in. My main
reaction to people adamantly defending lisp-2 is surprise and
incomprehension.

I understand the historical/political argument, I do, but I guess I'm
blinded by my own feeling of how neat I think lisp-1s are, how
blindingly flexible and beautiful the code can be.

If someone says that they choose CL *in spite of* it being a lisp-2, I
can definitely understand them. I like CL.

If someone says that they choose a lisp-2 *because* it's a lisp-2, I
react with "really? tell me more!"

However, if someone has an argument against lisp-1 that I consider
bogus (say, "you can't do macros" or "lisp-2 solves all name-mangling
issues"), I might argue against that.

My dream lisp would have the best from both scheme and CL.

Anyway, sorry.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4zn9og9fx.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> I understand the historical/political argument, I do, but I guess I'm
> blinded by my own feeling of how neat I think lisp-1s are, how
> blindingly flexible and beautiful the code can be.

There's a *political* argument?  What is it?  In all the years
I've been seeing arguments about this, I don't think I've ever
seen anyone give a political argument for two namespaces.

> If someone says that they choose CL *in spite of* it being a lisp-2, I
> can definitely understand them. I like CL.
> 
> If someone says that they choose a lisp-2 *because* it's a lisp-2, I
> react with "really? tell me more!"

I don't choose CL only because it has two namespaces, but it's
not an "in spite of" either.  Both approaches are reasonable;
I slightly prefer the two namespaces most of the time, but
not always.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87r7v025kb.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> There's a *political* argument?  What is it?

There was a reluctance to make radical changes because of political
reasons. Or that's how I understand it from _The Evolution of Lisp_.

> I don't choose CL only because it has two namespaces, but it's
> not an "in spite of" either.

Right, sounds reasonable. It's the "because" that has (at least in the
past) left me wondering.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4n05o1q5s.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> > There's a *political* argument?  What is it?
> 
> There was a reluctance to make radical changes because of political
> reasons. Or that's how I understand it from _The Evolution of Lisp_.

Political or practical?  Implementations would have to change;
application code would have to change; all the bugs the change
introduced would have to be found and fixed.

What do you have in mind in _Evolution_?

-- jd
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcs8yh8c37y.fsf@knatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > There are. Or rather, there are advantages to both ways. I think
> > ultimately it is an issue of personal preference which style you
> > prefer. The issue has been extensively discussed in the past on this
> > newsgroup, try Google groups. There is also a paper by Kent Pitman
> > comparing Lisp-1 and lisp-2.
> 
> I can only find http://www.dreamsongs.com/Separation.html (also at
> http://www.nhplace.com/kent/Papers/Technical-Issues.html) which I've
> read thoroughly, several times.

That is the one I was thinking of.

> >> I am very much trying to keep it from being a Scheme vs CL
> >> "contest". I'm not trying to present either language as better than
> >> the other, but I am trying to refute some of the arguments for a
> >> lisp-2 that I don't think make sense.
> >
> > As have countless others before you...
> 
> Yeah, I guess it gets pretty tiring.
> 
> Well, I didn't start the thread, but I had to jump in. My main
> reaction to people adamantly defending lisp-2 is surprise and
> incomprehension.
> 
> I understand the historical/political argument, I do, but I guess I'm
> blinded by my own feeling of how neat I think lisp-1s are, how
> blindingly flexible and beautiful the code can be.
> 
> If someone says that they choose CL *in spite of* it being a lisp-2, I
> can definitely understand them. I like CL.
> 
> If someone says that they choose a lisp-2 *because* it's a lisp-2, I
> react with "really? tell me more!"

That is strange. Some of the advantages of Lisp-2 have already been
explained to you. Even if you yourself consider the advantages of
lisp-1 to bear more weight, and prefer lisp-1 yourself, surely it is
not impossible to see why others make different choices?

> However, if someone has an argument against lisp-1 that I consider
> bogus (say, "you can't do macros" or "lisp-2 solves all name-mangling
> issues"), I might argue against that.

No one said it solves /all/ name mangling issues. However, it solves
them to a significant degree (compared to lisp-1). To a person who
does not like having to mangle names, lisp-2 is likely quite
attractive.

> My dream lisp would have the best from both scheme and CL.

Many of the best things about Scheme or CL are the results of
different trade-offs, thus irreconcilable. You can't have both lisp-1
and lisp-2 at the same time, for instance.


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87hdvw3lzx.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> Sunnan <······@handgranat.org> writes:
>> If someone says that they choose a lisp-2 *because* it's a lisp-2, I
>> react with "really? tell me more!"
>
> That is strange. Some of the advantages of Lisp-2 have already been
> explained to you.

Well, let's see:

1. The macro issue. As far as I understand it, this argument is bogus.

2. The ability to use the same variable name twice within a scope,
   once for a function and once for a non-function, because the reader
   (or compiler as it may be) can disambiguate. This is sometimes a
   loss (causes nonobvious code) and sometimes a win. Aesthetics and
   taste play some part.

Did I miss something?

Advantages of lisp-1:

1. Less redundancy (no need for flet etc).
2. Code brevity.

Not counting, since tastes vary:

3. Various aesthetic issues, including the definition of eval.

> Even if you yourself consider the advantages of lisp-1 to bear more
> weight, and prefer lisp-1 yourself, surely it is not impossible to
> see why others make different choices?

If not impossible, at least sufficiently hard that I figure I might
be missing something.

>> However, if someone has an argument against lisp-1 that I consider
>> bogus (say, "you can't do macros" or "lisp-2 solves all name-mangling
>> issues"), I might argue against that.
>
> No one said it solves /all/ name mangling issues. However, it solves
> them to a significant degree (compared to lisp-1). To a person who
> does not like having to mangle names, lisp-2 is likely quite
> attractive.

It solves "string" and "list". I've never seen saw/sw in practice.

>> My dream lisp would have the best from both scheme and CL.
>
> Many of the best things about Scheme or CL are the results of
> different trade-offs, thus irreconcilable. You can't have both lisp-1
> and lisp-2 at the same time, for instance.

No, but you can have a lisp-1 with read-macros, packages, hashes,
defmacro, and (this drives me nuts, and is one of the reasons I hang
out at cll, that I may one day remember) that one CL feature that
scheme lacks that I woke up in the middle of the night longing for and
then forgot about and I can't remember what feature that was.

-- 
One love,
Sunnan
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsd66kstmi.fsf@fnatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > Sunnan <······@handgranat.org> writes:
> >> If someone says that they choose a lisp-2 *because* it's a lisp-2, I
> >> react with "really? tell me more!"
> >
> > That is strange. Some of the advantages of Lisp-2 have already been
> > explained to you.
> 
> Well, let's see:
> 
> 1. The macro issue. As far as I understand it, this argument is bogus.

I don't know enough about defmacro use in Scheme to be able to make
comparisons, but using defmacro in CL would definitely be more
inconvenient and error prone if CL were a lisp-1.

> 2. The ability to use the same variable name twice within a scope,
>    once for a function and once for a non-function, because the reader
>    (or compiler as it may be) can disambiguate. This is sometimes a
>    loss (causes nonobvious code) and sometimes a win. Aesthetics and
>    taste play some part.

I cannot see how it could ever be a /disadvantage/ not to have to
mangle variable names?!? Does increased freedom to choose one's
variable names somehow make people worse programmers?

> Did I miss something?

No, that is pretty much it, I think.

> Advantages of lisp-1:
> 
> 1. Less redundancy (no need for flet etc).

I don't understand this one. What would you use instead of flet?

> 2. Code brevity.

I don't understand this one either. In what significant way is code
brevity enhanced by having just one namespace? Are you thinking of
FUNCALL and #'? Do you really consider that significant?

> Not counting, since tastes vary:
> 
> 3. Various aesthetic issues, including the definition of eval.

Agreed. Aesthetics are subjective, and different people find both
lisp-1 and lisp-2 more aesthetically pleasing.

> > Even if you yourself consider the advantages of lisp-1 to bear more
> > weight, and prefer lisp-1 yourself, surely it is not impossible to
> > see why others make different choices?
> 
> If not impossible, at least sufficiently hard to see it that I figure
> that I might be missing something.
> 
> >> However, if someone has an argument against lisp-1 that I consider
> >> bogus (say, "you can't do macros" or "lisp-2 solves all name-mangling
> >> issues"), I might argue against that.
> >
> > No one said it solves /all/ name mangling issues. However, it solves
> > them to a significant degree (compared to lisp-1). To a person who
> > does not like having to mangle names, lisp-2 is likely quite
> > attractive.
> 
> It solves "string" and "list". I've never seen saw/sw in practice.
> 
> >> My dream lisp would have the best from both scheme and CL.
> >
> > Many of the best things about Scheme or CL are the results of
> > different trade-offs, thus irreconcilable. You can't have both lisp-1
> > and lisp-2 at the same time for instance.
> 
> No, but you can have a lisp-1 with read-macros, packages, hashes,
> defmacro, and (this drives me nuts, and is one of the reasons I hang
> out at cll,

So you basically want a Lisp which is as much as possible like CL
apart from being lisp-1? I think you should give CL a try. :-)

> that I may one day remember) that one CL feature that
> scheme lacks that I woke up in the middle of the night longing for and
> then forgot about and I can't remember what feature that was.

Special variables?


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87zn9ozst6.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> I don't know enough about defmacro use in Scheme to be able to make
> comparisons, but using defmacro in CL would definitely be more
> inconvenient and error prone if CL were a lisp-1.

All right.

>> 2. The ability to use the same variable name twice within a scope,
>>    once for a function and once for a non-function, because the reader
>>    (or compiler as it may be) can disambiguate. This is sometimes a
>>    loss (causes nonobvious code) and sometimes a win. Aesthetics and
>>    taste play some part.
>
> I cannot see how it could ever be a /disadvantage/ to be able to not
> have to mangle variable names?!?

*Making* that choice is what sometimes results in a loss, not being
 able to. I should've been clearer.

> Increased freedom to choose one's variable names somehow makes people
> worse programmers?

No, while many people actually think so (and argue against both CL and
scheme because of the "too much freedom"-issue), I don't.

>> Advantages of lisp-1:
>> 
>> 1. Less redundancy (no need for flet etc).
>
> I don't understand this one. What would you use instead of flet?

Regular let.

CL has a lot of things done twice, in one way for functions and one
way for variables. (Scheme uses define both for variables and
functions, for example, and uses the same let for variables and
functions).

Valid scheme:

(let ((hej display)
      (bjoern "Hej hej"))
 (hej bjoern))     

While I seldom see the equivalent CL:

(let ((hej #'princ)
      (bjoern "Hej hej"))
 (funcall hej bjoern))     

That's not the point (I never said lisp-1 makes things possible that
can't be done with #' or funcall); the point is that lisp-2 has a lot
of (from a lisp-1 point of view) unnecessary things like flet.
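For comparison, the flet spelling of roughly the same thing looks like
this:

```lisp
;; The flet version: the local name goes into the function namespace,
;; so the call site needs neither #' nor funcall.
(flet ((hej (x) (princ x)))
  (hej "Hej hej"))
```

Two spellings for one idea, which is exactly the redundancy I mean.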

>> 2. Code brevity.
>
> I don't understand this one either. In what significant way is code
> brevity enhanced by having just one namespace? Are you thinking of
> FUNCALL and #'? Do you really consider that that significant?

Yeah, I do. But see also the above point about redundancy.

>> No, but you can have a lisp-1 with read-macros, packages, hashes,
>> defmacro, and (this drives me nuts, and is one of the reasons I hang
>> out at cll,
>
> So you basically want a Lisp which is as much as possible like CL
> apart from being lisp-1? I think you should give CL a try. :-)

Well, those features are already available in many non-standard
schemes, but I do use CL some (mostly to remember that elusive
feature).

>> that I may one day remember) that one CL feature that
>> scheme lacks that I woke up in the middle of the night longing for and
>> then forgot about and I can't remember what feature that was.
>
> Special variables?

No, it could've been read macros, but I'm not sure.

-- 
One love,
Sunnan
From: Sunnan
Subject: I just want to add (was Re: scheme seems neater)
Date: 
Message-ID: <87vfkczrmk.fsf_-_@handgranat.org>
To all involved:

Okay, I got tricked into an old, tiresome and unnecessary argument (I
wonder if the OP was a troll), but

1. It never turned to flames.
2. I got some question marks straightened out.

Thanks for your patience today.

-- 
One love,
Sunnan
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcs8yh8spm9.fsf@fnatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > I don't know enough about defmacro use in Scheme to be able to make
> > comparisons, but using defmacro in CL would definitely be more
> > inconvenient and error prone if CL were a lisp-1.
> 
> All right.
> 
> >> 2. The ability to use the same variable name twice within a scope,
> >>    once for a function and once for a non-function, because the reader
> >>    (or compiler as it may be) can disambiguate. This is sometimes a
> >>    loss (causes nonobvious code) and sometimes a win. Aesthetics and
> >>    taste play some part.
> >
> > I cannot see how it could ever be a /disadvantage/ to be able to not
> > have to mangle variable names?!?
> 
> *Making* that choice is what sometimes results in a loss, not being
>  able to. I should've been clearer.
> 
> > Increased freedom to choose one's variable names somehow makes people
> > worse programmers?
> 
> No, while many people actually think so (and argue against both CL and
> scheme because of the "too much freedom"-issue), I don't.

Yes, that is exactly what I meant. I thought your argument sounded
like one of those "too much freedom can't be good"-kinds sometimes
advanced from bondage-&-discipline language camps.

> >> Advantages of lisp-1:
> >> 
> >> 1. Less redundancy (no need for flet etc).
> >
> > I don't understand this one. What would you use instead of flet?
> 
> Regular let.

And the disadvantage of using flet would be...? (As you no doubt know,
you can use plain old let in CL too: (let ((foo #'(lambda (...) ...)))
... (funcall foo ...)).)

> CL has a lot of things done twice, in one way for functions and one
> way for variables. (Scheme uses define both for variables and
> functions, for example, and uses the same let for variables and
> functions).

That is merely a syntactic difference. It wouldn't be hard to write a
macro DEFINE which would expand into DEFUN or DEFVAR as appropriate
(forgetting for a minute that CL does not have top level lexicals).
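A rough sketch of such a macro, dispatching Scheme-style on the shape
of its first argument (DEFPARAMETER stands in for the variable case):

```lisp
;; Not standard CL, just a sketch: (define (f x) ...) expands to
;; DEFUN, while (define x v) expands to DEFPARAMETER.
(defmacro define (spec &body body)
  (if (consp spec)
      `(defun ,(first spec) ,(rest spec) ,@body)
      `(defparameter ,spec ,(first body))))
```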


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <8765ccyqkf.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> Yes, that is exactly what I meant. I thought your argument sounded
> like one of those "too much freedom can't be good"-kinds sometimes
> advanced from bondage-&-discipline language camps.

However (and this is a non-argument because it's based on personal
aesthetics), I prefer freedom based on simplicity.

> And the disadvantage of using flet would be...?

It's a disadvantage for implementors and for learners of the
language. It's brain clutter.

> (As you no doubt know, you can use plain old let in CL too: (let
> ((foo #'(lambda (...) ...)))  ... (funcall foo ...)).)

Yeah, didn't I include such an example somewhere?

>> CL has a lot of things done twice, in one way for functions and one
>> way for variables. (Scheme uses define both for variables and
>> functions, for example, and uses the same let for variables and
>> functions).
>
> That is merely a syntactic difference. It wouldn't be hard to write a
> macro DEFINE which would expand into DEFUN or DEFVAR as appropriate
> (forgetting for a minute that CL does not have top level lexicals).

No, it's more than a syntactic difference - you still need to have
both DEFUN and DEFVAR implemented underneath.

-- 
One love,
Sunnan
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hr7v0c8x4.fsf@vserver.cs.uit.no>
·······@nada.kth.se (Björn Lindberg) writes:

>> And the disadvantage of using flet would be...?

Sunnan <······@handgranat.org> writes:

> It's a disadvantage for implementors and for learners of the
> language. It's brain clutter.

In my opinion, a much stronger case can be made for the viewpoint that
lexical variables and lexical functions are conceptually quite
different things, and that shoe-horning both concepts into the same
language construct is what really constitutes brain clutter. I'm very
happy that when I see a "let" I immediately know it's a variable
binding, without having to examine the binding's value or usage. I
find that my brain has absolutely no problems dealing with the burden
of knowing both "let" and "flet".

I'd even be inclined to consider whether Common Lisp is "cluttering"
let by using it for both lexical and dynamic variable bindings.

-- 
Frode Vatvedt Fjeld
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: scheme seems neater
Date: 
Message-ID: <pan.2004.04.07.14.50.56.418948@knm.org.pl>
On Wed, 07 Apr 2004 13:41:11 +0200, Frode Vatvedt Fjeld wrote:

> In my opinion, a much stronger case can be made for the viewpoint that
> lexical variables and lexical functions are conceptually quite
> different things,

The point is, they are not that different. Lisp treats them differently,
but a function is just a special case of an object. A lexical variable
can hold anything; a lexical function can hold only functions. You can
do everything with the more general concept only, lexical variables.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsu0zv7rwv.fsf@fnatte.nada.kth.se>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> On Wed, 07 Apr 2004 13:41:11 +0200, Frode Vatvedt Fjeld wrote:
> 
> > In my opinion, a much stronger case can be made for the viewpoint that
> > lexical variables and lexical functions are conceptually quite
> > different things,
> 
> The point is, they are not that different. Lisp treats them differently,
> but a function is just a special case of an object. A lexical variable
> can hold anything; a lexical function can hold only functions. You can
> do everything with the more general concept only, lexical variables.

But there is a conceptual difference as well. Functions *operate* on
objects, apart from being objects themselves. Typically, all other
kinds of objects are *operated upon*. So in the sense of how they are
used, functions are conceptually different from all other objects.

(This is also why Sunnan's reductio ad absurdum of having separate
namespaces for all different object types is not really relevant.)


Björn
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c51d1e$b7d$1@newsreader2.netcologne.de>
Marcin 'Qrczak' Kowalczyk wrote:

> On Wed, 07 Apr 2004 13:41:11 +0200, Frode Vatvedt Fjeld wrote:
> 
> 
>>In my opinion, a much stronger case can be made for the viewpoint that
>>lexical variables and lexical functions are conceptually quite
>>different things,
> 
> The point is, they are not that different. Lisp treats them differently,
> but a function is just a special case of an object. A lexical variable
> can hold anything; a lexical function can hold only functions. You can
> do everything with the more general concept only, lexical variables.

You're mixing up things here. You can compare values and function 
objects, or variable bindings and function bindings. Variable bindings 
are indeed more general than function bindings, but function objects and 
values are treated differently even in Scheme. That's trivial.

However, Scheme also treats the car position in an expression 
differently from the cdr positions. The car determines the meaning of an 
expression, the cdrs don't. That's the essential part. In Lisp, the car 
position is looked up in a different namespace, and _of course_ it only 
makes sense to put function objects there. How would you call a value?


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: scheme seems neater
Date: 
Message-ID: <pan.2004.04.07.17.38.08.436002@knm.org.pl>
On Wed, 07 Apr 2004 19:16:29 +0200, Pascal Costanza wrote:

>> The point is, they are not that different. Lisp treats them differently,
>> but a function is just a special case of an object. A lexical variable
>> can hold anything; a lexical function can hold only functions. You can
>> do everything with the more general concept only, lexical variables.
> 
> You're mixing up things here. You can compare values and function 
> objects, or variable bindings and function bindings.

I compare both pairs.

> Variable bindings are indeed more general than function bindings, but
> function objects and values are treated differently even in Scheme.

In Scheme function objects are a special case of values, just as function
bindings are a special case of value bindings.

In Lisp function objects are also a special case of values, but function
bindings are not a subset of value bindings - they are made disjoint.

> However, Scheme also treats the car position in an expression
> differently from the cdr positions.

Except for macros, they are evaluated using the same rules. The expression
to give as a function is a special case of an expression to give as an
argument. It can use the same set of operations for evaluation, with
the only restriction being that it must evaluate to a function object.
Every value which can be applied can also be passed as a parameter.

Macros are an exception, they make sense only when directly applied. They
are not first class objects. But functions *are* first class objects, so
it's natural to treat them as a subset of values, rather than completely
different things.

> In Lisp, the car position is looked up in a different namespace, and
> _of course_ it only makes sense to put function objects there. How would
> you call a value?

It only makes sense to put numbers as arguments of +, but Lisp doesn't
enforce it by putting variables holding numbers in a separate namespace.
So having only a subset of values legal is not enough to justify separate
namespaces.

Separate namespaces for two kinds of things don't cause trouble (other
than perhaps human confusion) in cases the usage of the kinds is disjoint.
For example C types and C values are never interchangeable. But it's
not the case for functions and other values: it does make sense to have
functions as arguments to other functions, and to call functions which
came as parameters or are computed by function calls. Separate namespaces
make it harder.

I understand how separate namespaces make non-hygienic macros a smaller
problem, and that some people like punning, but I can't agree that those
namespaces should be separated because of a fundamental difference between
functions and other objects. At least in a functional language, where
functions *are* objects.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Matthew Danish
Subject: Re: scheme seems neater
Date: 
Message-ID: <20040407185325.GA25328@mapcar.org>
On Wed, Apr 07, 2004 at 07:38:11PM +0200, Marcin 'Qrczak' Kowalczyk wrote:
> In Scheme function objects are a special case of values, just as function
> bindings are a special case of value bindings.
> 
> In Lisp function objects are also a special case of values, but function
> bindings are not a subset of value bindings - they are made disjoint.

Keep in mind that ordinary variables may also be bound to functions.

> It only makes sense to put numbers as arguments of +, but Lisp doesn't
> enforce it by putting variables holding numbers in a separate namespace.
> So having only a subset of values legal is not enough to justify separate
> namespaces.

In a statically typed language, it is conceivable to imagine a separate
namespace for every type:

(let ((a 1)
      (a "a"))
  (setf a (+ a 1))  ;; A the integer
  (setf a (string a))) ;; A the string

Of course, Lisp is not like this, but there is a distinction which can
be statically determined: syntactic position.  Neither in Scheme nor in
CL is ((not-a-fn-value) ...) a valid program; but in CL this can be
determined statically, and in Scheme it can only be found at runtime.
In addition, in a conforming CL program, by looking up the function name
in the function namespace, it is known that if the name is bound there
then it must be bound to a function object.  There is no such guarantee
in Scheme.

> Separate namespaces for two kinds of things don't cause trouble (other
> than perhaps human confusion) in cases the usage of the kinds is disjoint.
> For example C types and C values are never interchangeable. But it's
> not the case for functions and other values: it does make sense to have
> functions as arguments to other functions, and to call functions which
> came as parameters or are computed by function calls. Separate namespaces
> make it harder.

However this is partially incorrect: while it does make sense to have
higher-order functions (and Lisp-2 does not make this difficult), the
important distinction actually lies in the syntax of forms, not in the
type of objects.  Without knowing anything about the runtime values
involved, it is quite clear that the first element of a function form
must be a function.  CL takes this fact and makes it a part of the
language, along with macro and special forms which also can (and must)
be determined statically.  So my question is: why would you want the
names of functions and the names of variables to be in the same
namespace, when the primary use of function names is to appear where a
variable name would rarely want to appear?

Consider the case of class names: why should a class named FOO prevent
the use of variables named FOO?  The only time I would be referring the
class by the name FOO is in contexts which expect a class name.  If, on
the rare occasion I need to get the class object which is bound to the
class name FOO, there is FIND-CLASS.

> I understand how separate namespaces make non-hygienic macros a smaller
> problem, and that some people like punning, but I can't agree that those
> namespaces should be separated because of a fundamental difference between
> functions and other objects. At least in a functional language, where
> functions *are* objects.

Remember, it is function /names/ which are separated.  Function objects
don't have anything to do with namespaces.  The distinction is a
syntactic one.  When we write (NAME ...) we want NAME to be bound to a
function.  The separate namespace guarantees it won't be bound to any
other type of object in a conforming program.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hekqzcm87.fsf@vserver.cs.uit.no>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> In Lisp function objects are also a special case of values, but
> function bindings are not a subset of value bindings - they are made
> disjoint.

I find this statement utterly confused.

A lexical function binding, to me, means approximately: "In this
lexical environment I'll have this (function) operator available that
operates so and so". By no stretch of the imagination can this concept
be called "a subset of value bindings". Scheme overloads let with two
completely separate concepts, leaving the burden of disentangling the
resulting code's meaning to the human reader.

> Macros are an exception, they make sense only when directly
> applied. They are not first class objects. But functions *are* first
> class objects, so it's natural to treat them as a subset of values,
> rather than completely different things.

> [...]

> I understand how separate namespaces make non-hygienic macros a
> smaller problem, and that some people like punning, but I can't
> agree that those namespaces should be separated because of a
> fundamental difference between functions and other objects. At least
> in a functional language, where functions *are* objects.

The fact that functions are objects does not imply that there is not a
fundamental difference between functions (in general, not just in the
role "functions as objects") and other objects.

To me, the fundamental difference is obvious. Programming is to
instruct computers to do something, and the most important unit of
"doing something", certainly in Lisp, is the function. A large
percentage of all the forms one writes are of the sort "apply the
function named <foo> to these objects". No other class of objects
comes remotely close in terms of important fundamental difference from
the rest. In theory, I could make do without arrays, conses, or even
numbers, but not without functions. The fact that functions are also
objects is interesting, and perhaps even fascinating, but clearly does
not diminish their unique position.

-- 
Frode Vatvedt Fjeld
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: scheme seems neater
Date: 
Message-ID: <pan.2004.04.08.09.39.43.527910@knm.org.pl>
On Thu, 08 Apr 2004 03:06:00 +0200, Frode Vatvedt Fjeld wrote:

> A lexical function binding, to me, means approximately: "In this
> lexical environment I'll have this (function) operator available that
> operates so an so". By no stretch of the imagination can this concept
> be called "a subset of value bindings".

Of course it can. Instead of:
   (flet ((f ...)) ...)
   #'f
   (f ...)
you can also write:
   (let ((f #'(lambda ...))) ...)
   f
   (funcall f ...)

In many contexts they are interchangeable. In some they aren't because of
Lisp peculiarities, e.g. you must use the function version or play with
setq if you want recursion, you must use the value version if the choice
of the function to bind is made dynamically, and you must use a combination
of both if you want both.
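Local recursion shows the contrast concretely:

```lisp
;; The function version: LABELS makes the local name visible in its
;; own body, so recursion just works.
(labels ((fact (n) (if (zerop n) 1 (* n (fact (1- n))))))
  (fact 5))                                ; => 120

;; The value version: the LET binding is not yet visible inside the
;; lambda, so SETQ is needed for the closure to reach its own binding.
(let ((fact nil))
  (setq fact #'(lambda (n)
                 (if (zerop n) 1 (* n (funcall fact (1- n))))))
  (funcall fact 5))                        ; => 120
```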

Not only in Scheme, but also in SML, OCaml, Haskell, even in Python
function bindings are just value bindings which happen to bind functions.
Functions are a special case of objects, not something entirely different.

> Scheme overloads let with two completely separate concepts,

No, Lisp provides two notations for overlapping concepts.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Rob Warnock
Subject: Re: scheme seems neater
Date: 
Message-ID: <uMmdncc3R-_hjujdRVn-sQ@speakeasy.net>
Frode Vatvedt Fjeld  <······@cs.uit.no> wrote:
+---------------
| I find that my brain has absolutely no problems dealing with the
| burden of knowing both "let" and "flet".
+---------------

Me neither... except for occasional grumblings at the additional
indenting when I need both around the same block of code.  ;-}

+---------------
| I'd even be inclined to consider whether Common Lisp is "cluttering"
| let by using it for both lexical and dynamic variable bindings.
+---------------

Indeed. If we had to explicitly say FLUID-LET like some Schemes do
(or to keep it short, DYN-LET or even just DLET), it would remove
a certain amount of confusion (that the *FOO* convention doesn't
completely solve).

But... We'd have to add a new &DYNAMIC keyword to lambda lists
to force FLUID-LET bindings on some parameters, otherwise certain
well-established coding styles would break badly.
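Such a DLET would be a thin wrapper, something like this (hypothetical
name, as above):

```lisp
;; Hypothetical DLET: plain LET plus an explicit SPECIAL declaration,
;; which makes the bindings dynamic and visible as such at the
;; binding site, instead of relying on the *FOO* convention.
(defmacro dlet (bindings &body body)
  `(let ,bindings
     (declare (special ,@(mapcar (lambda (b) (if (consp b) (car b) b))
                                 bindings)))
     ,@body))
```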


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcs7jws7y4w.fsf@fnatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > Yes, that is exactly what I meant. I thought your argument sounded
> > like one of those "too much freedom can't be good"-kinds sometimes
> > advanced from bondage-&-discipline language camps.
> 
> However (and this is a non-argument because it's based on personal
> aesthetics), I prefer freedom based on simplicity.

Lisp-1 and lisp-2 are both simple, but in different ways.

> > And the disadvantage of using flet would be...?
> 
> It's a disadvantage for implementors and for learners of the
> language. 

The number of users of a language far exceeds its learners. A language
design meant for real use should be optimized with this in
consideration.

> It's brain clutter.

I disagree. It describes intent, and can help make the code clearer
(to a human reader). But if this is your view, I am not surprised you
have a hard time coming to terms with CL. FLET and LABELS, FIRST &
REST vs CAR & CDR, DEFVAR, DEFPARAMETER, DEFCONSTANT, etc.
Similar-but-slightly-different operators abound...

> > (As you no doubt know, you can use plain old let in CL too: (let
> > ((foo #'(lambda (...) ...)))  ... (funcall foo ...)).)
> 
> Yeah, didn't I include such an example somewhere?
> 
> >> CL has a lot of things done twice, in one way for functions and one
> >> way for variables. (Scheme uses define both for variables and
> >> functions, for example, and uses the same let for variables and
> >> functions).
> >
> > That is merely a syntactic difference. It wouldn't be hard to write a
> > macro DEFINE which would expand into DEFUN or DEFVAR as appropriate
> > (forgetting for a minute that CL does not have top level lexicals).
> 
> No, it's more than a syntactic difference - you still need to have
> both DEFUN and DEFVAR implemented underneath.

Ok, but a DEFINE could be implemented in CL directly, without using
DEFUN and DEFVAR at all. The only difference is which cell of the
symbol something is put in.


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <871xn0ymev.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> The number of users of a language far exceeds its learners. A language
> design meant for real use should be optimized with this in
> consideration.

Of course.

> I disagree. It describes intent, and can help make the code clearer
> (to a human reader). But if this is your view, I am not surprised you
> have a hard time coming to terms with CL. FLET and LABELS, FIRST &
> REST vs CAR & CDR, DEFVAR, DEFPARAMETER, DEFCONSTANT, etc. The number
> of similar-but-slightly-different operators abound...

Yeah, I think you understand me now. I'm thinking that Once And Only
Once can lead to code that's simultaneously short and clear.

> Ok, but a DEFINE could be implemented in a CL directly, without using
> DEFUN and DEFVAR too. The only difference is which cell of the symbol
> something is put.

All right.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <fx4k70r4dhc.fsf_-_@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> > And the disadvantage of using flet would be...?
> 
> It's a disadvantage for implementors and for learners of the
> language. It's brain clutter.

But natural languages are quite cluttered.  It's far from clear that
all "clutter" creates problems for learning.

Natural languages also have a lot of redundancy, which tends
to make it easier to get the intended meaning.

Common Lisp is more like a natural language than Scheme is.

The very things that seem bad about Common Lisp to someone
wanting a maximally rational design are, arguably, good.
Or at least harmless, low cost.  Otherwise, they'd have
been eliminated somewhere along the way.  Probably.

Languages that evolve, that are shaped by use and history,
become adapted to human needs.  They can be better adapted
than the language you get from a rational design.

From the rationalist point of view, Common Lisp's namespace
decisions may seem arbitrary.  Why have those particular ones?
Why not a namespace per type?  What is the argument that shows
Lisp-2 is best?

In fact, a namespace per type is not a completely mad idea.

In mathematics, a number of different alphabets and fonts
are used, so that it's usually clear what kind of thing
each symbol refers to.

Sometimes, the conventions change.  For example, in model
theory, strange-looking German script letters were used
for models.  But after (quite) a while, model theorists
started using ... (wait for it) ... M.  And other ordinary
capital letters.

(This is, for most people, easier to read.)

So, in effect, mathematicians use lots of namespaces.  That
they use A for something doesn't stop them from using
alpha, or lower-case a, or bold A, for another.

But programming languages haven't gone that way, for a
variety of reasons.  Probably, that's a good decision.
That doesn't mean that anyone has an argument that shows
it is absolutely the right decision.

Note that programmers often create namespaces by convention.
In Java, class names begin with a capital, method names do
not.  So you can have a class and method of the same name
(modulo case).

A naming convention, perhaps enforced by the language, is an
alternative to having separate function and value namespaces.
In Common Lisp, we have the *-name convention for specials,
which helps keep them from interfering with ordinary variables.

So there are many possibilities, and Common Lisp is at a
point in this space.  I think it's likely that the combination
of design and history that led to current Common Lisp has
placed it at a not-too-bad point, regardless of how arbitrary
that position seems from maximally rational point of view.

-- jd
From: Cameron MacKinnon
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <cemdnf4DQfywW-ndRVn-uw@golden.net>
Jeff Dalton wrote:
> But natural languages are quite cluttered.  It's far from clear that
> all "clutter" creates problems for learning.
> 
> Natural languages also have a lot of redundancy, which tends
> to make it easier to get the intended meaning.

Speaking face to face, expressions and body language help communication. 
On the telephone, voice pitch and modulation give clues. Handwriting 
offers a variety of modes of expression. Typesetting allows only font, 
size, bold, italic and underscoring. Computer code typically allows none 
of the above. And as the expressive possibilities decrease in each of 
the above, the amount of formality considered appropriate to the medium 
(indeed, required) increases.

> Common Lisp is more like a natural language than Scheme is.

oh i #'agree completely


> They very things that seem bad about Common Lisp to someone
> wanting a maximally rational design are, arguably, good.
> Or at least harmless, low cost.  Otherwise, they'd have
> been eliminated somewhere along the way.  Probably.

I can't agree with the eliminated along the way bit. Things almost never 
get eliminated from computer languages, because the "our legacy code 
would break with the new tools" whiners outvote the rationalization crowd.

> In fact, a namespace per type is not a completely mad idea.

See Perl.

> Sometimes, the [math] conventions change.  For example, in model
> theory, strange-looking German script letters were used
> for models.  But after (quite) a while, model theorists
> started using ... (wait for it) ... M.  And other ordinary
> capital letters.

I don't know about model theory, but some mathematicians just change 
their notation because they're sick of fighting with their 
computer-typesetting vendors, no?

In the Lisp world, millions of keystrokes have been wasted because 
Church was too inconsiderate to pick mu, nu, xi or pi. And other 
computer languages, APL aside, are stuck choosing their symbols from the 
completely irrational quasi-standard set inherited from typewriters.

So in the computer world, a lot of the freedom for language evolution 
has been artificially circumscribed by legacy peripheral devices and 
legacy code. I suppose the newest human languages (text messaging) 
suffer from similar considerations.

> Note that programmers often create namespaces by convention.
> In Java, class names begin with a capital, method names do
> not.  So you can have a class and method of the same name
> (modulo case).

Now there's an idea! Steal from German. Use initial caps for functions. 
Emacs could be made smart enough to autocapitalize the car of unquoted 
lists. SLIME could maybe help Emacs autocapitalize functional arguments. 
Good for typing, good for legibility, and no more #'. Of course, all 
those people still using uppercase-only terminals would be out of luck.  :-(



-- 
Cameron MacKinnon
Toronto, Canada
From: Jeff Dalton
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <fx465ca72es.fsf@todday.inf.ed.ac.uk>
Cameron MacKinnon <··········@clearspot.net> writes:

> Jeff Dalton wrote:

> > The very things that seem bad about Common Lisp to someone
> > wanting a maximally rational design are, arguably, good.
> > Or at least harmless, low cost.  Otherwise, they'd have
> > been eliminated somewhere along the way.  Probably.

> I can't agree with the eliminated along the way bit. Things almost
> never get eliminated from computer languages, because the "our legacy
> code would break with the new tools" whiners outvote the
> rationalization crowd.

Nonetheless, much rationalization happened during the design
of Common Lisp, and other things were eliminated earlier on.

Otherwise, we'd still have COMMON from Lisp 1.5 as well
as SPECIAL, and much else as well.

Before Common Lisp, people typically designed their own
dialect when they did an implementation, putting in the things
they liked and leaving out ones they especially didn't.

-- jd
From: Cameron MacKinnon
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <LdOdnRtv7pDDAejd4p2dnA@golden.net>
Jeff Dalton wrote:
> Nonetheless, much rationalization happened during the design
> of Common Lisp, and other things were eliminated earlier on.

I guess this makes Lisp the only language old and diverse enough to have 
had one of those great evolutionary purges that the biologists are 
always on about.

The Lisp family tree!
http://www.bath.ac.uk/~cs2ylh/lispweb/Familytree.GIF

Not that you'd get the sense from the diagram that Scheme is still 
thriving. And that "Lisp 2(disaster)" annotation all on its own is 
intriguing...

-- 
Cameron MacKinnon
Toronto, Canada
From: Carl Shapiro
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <ouyoeq26xqf.fsf@panix3.panix.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> The Lisp family tree!
> http://www.bath.ac.uk/~cs2ylh/lispweb/Familytree.GIF
> 
> Not that you'd get the sense that Scheme is still thriving from the
> diagram. And that "Lisp 2(disaster)" annotation all on its own is
> intriguing...

And completely false.

Lisp 2 had a number of incredibly interesting features, such as the
elimination of the interpreter: all code ran compiled, all of the
time.

The family tree you have found is full of mistakes (such as this one)
and cannot be taken seriously.
From: Dave Roberts
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <OP3dc.210372$1p.2431273@attbi_s54>
Jeff Dalton wrote:

> Sunnan <······@handgranat.org> writes:
> 
>> > And the disadvantage of using flet would be...?
>> 
>> It's a disadvantage for implementors and for learners of the
>> language. It's brain clutter.
> 
> But natural languages are quite cluttered.  It's far from clear that
> all "clutter" creates problems for learning.
> 
> Natural languages also have a lot of redundancy, which tends
> to make it easier to get the intended meaning.

Don't confuse redundancy with this issue. Yes, natural languages have lots
of redundancy. There are many ways to say the same thing. That is not the
case here. The distinction between LET and FLET is not subject to my
personal mood at the time and whether I would suddenly choose one versus
the other. They are not redundant. They are different. The language forces
me to choose between LET and FLET depending on what I'm trying to
accomplish. Thus, they are "redundant" in the same way that "black" and
"white" are redundant in English... ;-)

> Common Lisp is more like a natural language than Scheme is.

It's more "real world" than Scheme is, but that's because, like natural
languages, it's more the result of multiple variations of a common "mother
tongue" intermingling over the years. Put another way, CL evolved whereas
Scheme was designed.

> The very things that seem bad about Common Lisp to someone
> wanting a maximally rational design are, arguably, good.
> Or at least harmless, low cost.  Otherwise, they'd have
> been eliminated somewhere along the way.  Probably.

No, because CL was specifically created to bring together diverging Lisp
dialects (as I understand it; I'm a newbie and wasn't there ;-). If that
was the stated goal, then the reality is that CL actively sought to unify
(in the sense of creating a union) all the various features that many Lisp
dialects had created. This is exactly the opposite of paring things down to
the simplest possible model. Now that it's standardized, the baggage must
be maintained. Now, lest you think I'm slighting CL, I'm not. You can argue
quite well that CL's diversity is the result of real-world needs and thus
reflects a more broad set of concerns.

> Languages that evolve, that are shaped by use and history,
> become adapted to human needs.  They can be better adapted
> than the language you get from a rational design.

Yes, exactly.

[snip...]
> So there are many possibilities, and Common Lisp is at a
> point in this space.  I think it's likely that the combination
> of design and history that led to current Common Lisp has
> placed it at a not-too-bad point, regardless of how arbitrary
> that position seems from a maximally rational point of view.

Right. I agree with that. As a newbie, I went through the "which dialect of
Lisp should I learn?" question. I have a soft spot for Scheme because I got
a small dosage of that first and I truthfully love simple things. Scheme is
very simple. I find that the more I reduce the complexity of a programming
language, the more I can concentrate on the problem at hand. Scheme is nice
that way. That said, the library situation in Scheme-land seems even worse
off than the library situation in CL-land, and is likely to remain so
precisely because so little of useful Scheme is standardized. Thus, I'm
learning CL. I find that even CL is a lot more simple than any other
language in terms of basic structure. The extended set of CL functions,
which is really a standardized library of sorts, is, of course, huge, but
there is a lot of time to learn a library. I'm finding that I'm productive
in CL after just a couple of weeks of night-time learning of the core
concepts.

All that said, I don't yet know where I really come down on the Lisp-1/2
debate. I can see some advantages to both. My hunch is that like most
things, it's a balancing act. Pick your poison, so to speak. There are
advantages and disadvantages to both. Each side can trot out a pathological
case where the other side seems to do dumb things ("(list list)" vs. "(list
lst)"). The question is which one comes up more in daily practice? My hunch
is that Lisp-2 is probably better for real-world usage, but there is a very
elegant simplicity to Lisp-1.
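To make the "(list list)" case concrete, a minimal Common Lisp sketch (EXAMPLE is a made-up name):

```lisp
;; In a Lisp-2, one symbol can name a variable and a function at once:
(defun example (list)       ; LIST here is a lexical variable...
  (list list list))         ; ...while LIST in operator position is the function.

(example 3)                 ; => (3 3)
```

In a Lisp-1 the parameter would shadow the procedure, which is why Schemers reach for names like LST.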

-- 
Dave Roberts
·············@re-move.droberts.com
From: Sunnan
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <87zn9mrg0j.fsf@handgranat.org>
Dave Roberts <·············@re-move.droberts.com> writes:

> Each side can trot out a pathological case where the other side
> seems to do dumb things ("(list list)" vs. "(list lst)"). The
> question is which one comes up more in daily practice?

While I have no proof, my hunch is that the language actually changes
which one comes up more. Someone weaned on scheme wouldn't want to do
(list list) so much since they're used to the one namespace, while a
CL'er passes around functions slightly less.

It's ever so slight an effect, but a language changes the way the
programmer writes.

-- 
One love,
Sunnan
From: Dave Roberts
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <Ehedc.222877$po.1126603@attbi_s52>
Sunnan wrote:

> It's ever so slight an effect, but a language changes the way the
> programmer writes.

Very true. I guess that's my point. Obviously, the pathological cases can be
thrown around, but is it really a hardship to avoid them, or do you just do
it without thinking about it?

My gut feeling is that you can probably avoid the pathological case in CL a
bit more naturally than in Scheme, but that it really doesn't limit
anything in practice.

-- 
Dave Roberts
·············@re-move.droberts.com
From: Pascal Costanza
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <c53sm2$t4g$1@newsreader2.netcologne.de>
Dave Roberts wrote:

> My gut feeling is that you can probably avoid the pathological case in CL a
> bit more naturally than in Scheme, but that it really doesn't limit
> anything in practice.

That's indeed an important point. Guys, you should all realize that 
these things are really only minor details compared to the overall 
essential concepts in Lisp and Scheme, i.e. basically metacircularity 
and the resulting freedom to mold the language to your concrete needs 
at hand. Scheme and Common Lisp just follow different fundamental 
design principles, and if you happen to prefer one or the other, 
there's nothing wrong with sticking with just one of them. Personally, 
I prefer Common Lisp, because I generally think that its designers 
have made more "right" decisions than Scheme's, but I have no problem 
imagining using Scheme in a world in which Common Lisp didn't exist.

None of the differences between Scheme and Common Lisp restrict you 
in the fundamental ways that most other languages do.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Jeff Dalton
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <fx4ad1m731m.fsf@todday.inf.ed.ac.uk>
Dave Roberts <·············@re-move.droberts.com> writes:

> Jeff Dalton wrote:

> > Natural languages also have a lot of redundancy, which tends
> > to make it easier to get the intended meaning.

> Don't confuse redundancy with this issue. Yes, natural languages have lots
> of redundancy. There are many ways to say the same thing.

That's not what I meant by redundancy.  I meant it in the
information-theoretic sense, applied to expressions in the
language, rather than to the language as a whole.  Individual
sentences, for instance, will have some redundancy.  Even words,
which is why misspelled words can so often be understood
even on their own.

For a Lisp example, consider that programmers use parens 
and indentation.  Even though the parens already give all the
grouping information, the indentation gives some of it again.

In CL, #'(lambda ...) doesn't really need the #'.
(lambda ...) would be enough on its own to indicate
the expression was a function.  And indeed, CL changed
to allow programmers to write it that way.  But there's
less redundancy when the #' is omitted.
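The two spellings side by side (a minimal sketch; both are standard Common Lisp):

```lisp
;; #' is redundant before LAMBDA: the LAMBDA macro already expands
;; to (FUNCTION (LAMBDA ...)).
(mapcar #'(lambda (x) (* x x)) '(1 2 3))   ; => (1 4 9)
(mapcar (lambda (x) (* x x)) '(1 2 3))     ; => (1 4 9)
```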

> Put another way, CL evolved whereas Scheme was designed.

Evolution and design were factors for both; but Scheme is
further towards the design end, yes.  

> > The very things that seem bad about Common Lisp to someone
> > wanting a maximally rational design are, arguably, good.
> > Or at least harmless, low cost.  Otherwise, they'd have
> > been eliminated somewhere along the way.  Probably.
> 
> No, because CL was specifically created to bring together diverging
> Lisp dialects

That's true in a sense, but misleading.

Common Lisp was created to bring together some MacLisp successor
dialects that were already quite similar; and for the most part it
stayed in that area of language space after more of the Lisp world
became involved.

> If that
> was the stated goal, then the reality is that CL actively sought to unify
> (in the sense of creating a union) all the various features that many Lisp
> dialects had created. 

Common Lisp is not a union of all features.
Far from it.  The list of Interlisp features not in Common
Lisp is almost as long as the list of Interlisp features.  :)

Also, CL is a significant cleanup compared to most earlier
Lisps.  In some ways, it became messier during standardization,
but there was tidying and cleanup during standardization as
well.

Moreover, some of the messier parts of Lisp 1.5 were eliminated
well before CL.  For instance, in Maclisp.

> All that said, I don't yet know where I really come down on the Lisp-1/2
> debate. I can see some advantages to both. My hunch is that like most
> things, it's a balancing act. Pick your poison, so to speak.

Yes, I think that's right.

-- jd
From: Andy Freeman
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <8bbd9ac3.0404081507.76ff9c68@posting.google.com>
Jeff Dalton <····@todday.inf.ed.ac.uk> wrote in message news:<···············@todday.inf.ed.ac.uk>...
> Dave Roberts <·············@re-move.droberts.com> writes:
> > No, because CL was specifically created to bring together diverging
> > Lisp dialects
> 
> That's true in a sense, but misleading.
> 
> Common Lisp was created to bring together some MacLisp successor
> dialects that were already quite similar;

MacLisp, InterLisp, PSL, and others were close enough that one could
define a single language using a small set of language-specific
macros and functions.  Programs written in this language could
run in any of these lisps.

-andy
From: Jeff Dalton
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <fx4zn9m6itk.fsf@tarn.inf.ed.ac.uk>
······@earthlink.net (Andy Freeman) writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> wrote in message news:<···············@todday.inf.ed.ac.uk>...
> > Dave Roberts <·············@re-move.droberts.com> writes:
> > > No, because CL was specifically created to bring together diverging
> > > Lisp dialects
> > 
> > That's true in a sense, but misleading.
> > 
> > Common Lisp was created to bring together some MacLisp successor
> > dialects that were already quite similar;
> 
> MacLisp, InterLisp, PSL, and others were close enough that one could
> define a single language using a small set of language-specific
> macros and functions.  Programs written in this language could
> run in any of these lisps.

Are you disagreeing with me, or just adding a comment?

Standard Lisp was meant to be something that could easily be
implemented in various Lisps in the fashion you suggest, and
I know of other attempts to define such a language.

At one point during the time of Lisp standardization, I
defined a common subset of Common Lisp, EuLisp, and Le Lisp.

However, such languages tend to be fairly small.  If anything
in Interlisp really needed the spaghetti stack, for example,
it wouldn't be trivial to get it going in MacLisp.

-- jd
From: Andy Freeman
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <8bbd9ac3.0404092140.7b566cc7@posting.google.com>
Jeff Dalton <····@tarn.inf.ed.ac.uk> wrote in message news:<···············@tarn.inf.ed.ac.uk>...
> ······@earthlink.net (Andy Freeman) writes:
> > Jeff Dalton <····@todday.inf.ed.ac.uk> wrote in message news:<···············@todday.inf.ed.ac.uk>...
> > > Common Lisp was created to bring together some MacLisp successor
> > > dialects that were already quite similar;
> > 
> > MacLisp, InterLisp, PSL, and others were close enough that one could
> > define a single language using a small set of language-specific
> > macros and functions.  Programs written in this language could
> > run in any of these lisps.
> 
> Are you disagreeing with me, or just adding a comment?

My point is that the different lisps were actually similar enough
for some non-trivial purposes.

> > However, such languages tend to be fairly small.  If anything
> in Interlisp really needed the spaghetti stack, for example,
> it wouldn't be trivial to get it going in MacLisp.

Huh?  We didn't think about "what mechanism does a language use to
provide relevant functionality".  We worried about "what functionality
do we support and how do we support it in the various implementations".

Our language was rich enough to let people who wouldn't work in a
common/single lisp dialect share code used for a fairly large research
project.  Most of them probably felt that things would be better if
others would see the light, but that wasn't going to happen.  Instead,
they had to choose between writing and sharing code in this invented
language, or not sharing code and/or constantly porting.
From: Jeff Dalton
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <fx4y8p34ub0.fsf@tarn.inf.ed.ac.uk>
······@earthlink.net (Andy Freeman) writes:

> Jeff Dalton <····@tarn.inf.ed.ac.uk> wrote in message news:<···············@tarn.inf.ed.ac.uk>...
> > ······@earthlink.net (Andy Freeman) writes:
> > > Jeff Dalton <····@todday.inf.ed.ac.uk> wrote in message news:<···············@todday.inf.ed.ac.uk>...
> > > > Common Lisp was created to bring together some MacLisp successor
> > > > dialects that were already quite similar;
> > > 
> > > MacLisp, InterLisp, PSL, and others were close enough that one could
> > > define a single language using a small set of language-specific
> > > macros and functions.  Programs written in this language could
> > > run in any of these lisps.
> > 
> > Are you disagreeing with me, or just adding a comment?
> 
> My point is that the different lisps were actually similar enough
> for some non-trivial purposes.

Yes, but what do you think the significance of that is in this
context?  Why are you making that point here and now?

> > However, such languages tend to be fairly small.  If anything
> > in Interlisp really needed the spaghetti stack, for example,
> > it wouldn't be trivial to get it going in MacLisp.
> 
> Huh?  We didn't think about "what mechanism does a language use to
> provide relevant functionality"?  We worried about "what functionality
> do we support and how do we support it in the various implementations"?

How did you provide the functionality that the spaghetti stack
provided for Interlisp, for example?

> Our language was rich enough to let people who wouldn't work in a
> common/single lisp dialect share code used for a fairly large research
> project.  Most of them probably felt that things would be better if
> others would see the light, but that wasn't going to happen.  Instead,
> they had to choose between writing and sharing code in this invented
> language, or not sharing code and/or constantly porting.

Who is "we"?  What language is "our language"?

Anyway, what I said about Common Lisp was true.  As just one piece
of evidence, there is CLtL II, p. 1:

  Common Lisp originated in an attempt to focus the work of
  several implementation groups, each of which was constructing
  successor implementations of MacLisp for different computers.

The same introduction does mention compatibility with Interlisp,
but little in CL can be traced back to Interlisp.

When CL implementations first came along, I was most used
to Franz Lisp (a close relative of MacLisp), and I could
use Common Lisp right away.  A lot of Franz Lisp code would work
unchanged.  Attempting to use Interlisp was a very different
experience.  PSL was further from Common Lisp than Franz Lisp
was, but much closer than Interlisp.

-- jd
From: Andy Freeman
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <8bbd9ac3.0404101714.48a927d0@posting.google.com>
Jeff Dalton <····@tarn.inf.ed.ac.uk> wrote in message news:<···············@tarn.inf.ed.ac.uk>...
> ······@earthlink.net (Andy Freeman) writes:
> > > However, such languages tend to be fairly small.  If anything
> > > in Interlisp really needed the spaghetti stack, for example,
> > > it wouldn't be trivial to get it going in MacLisp.
> > 
> > Huh?  We didn't think about "what mechanism does a language use to
> > provide relevant functionality"?  We worried about "what functionality
> > do we support and how do we support it in the various implementations"?
> 
> How did you provide the functionality that the spaghetti stack
> provided for Interlisp, for example?

None of our users ever wanted upwards funargs or multi-threading, so
it never came up.

Note that these folks were extremely problem-driven - they didn't have
much interest in programming languages as such.

> > Our language was rich enough to let people who wouldn't work in a
> > common/single lisp dialect share code used for a fairly large research
> > project.  Most of them probably felt that things would be better if
> > others would see the light, but that wasn't going to happen.  Instead,
> they had to choose between writing and sharing code in this invented
> language, or not sharing code and/or constantly porting.
> 
> Who is "we"?  What language is "our language"?

We called it "·@Lisp" and it was used by a couple of AI/KSL projects
at Stanford and associated institutions in the early 80s.  I suspect
that much of it got ported to Common Lisp, as such a port was trivial,
even in early implementations.
 
> Anyway, what I said about Common Lisp was true.

No one said that it wasn't.

-andy
From: Don Geddis
Subject: Re: Why flet wins
Date: 
Message-ID: <87wu4p2cwb.fsf_-_@sidious.geddis.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> wrote on 08 Apr 2004 19:1:
> Individual sentences, for instance, will have some redundancy.  Even words,
> which is why misspelled words can so often be understood even on their own.

Which reminds me of this:

      Aoccdrnig to a rseearch at Cmabrigde Uinervtisy, it deosn't mttaer in
      waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht
      the frist and lsat ltteer be at the rghit pclae. The rset can be a
      total mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae
      the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a
      wlohe.

_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
From: Joe Marshall
Subject: Re: Why flet wins
Date: 
Message-ID: <3c7csvw3.fsf@ccs.neu.edu>
Don Geddis <···@geddis.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> wrote on 08 Apr 2004 19:1:
>> Individual sentences, for instance, will have some redundancy.  Even words,
>> which is why misspelled words can so often be understood even on their own.
>
> Which reminds me of this:
>
>       Aoccdrnig to a rseearch at Cmabrigde Uinervtisy, it deosn't mttaer in
>       waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht
>       the frist and lsat ltteer be at the rghit pclae. The rset can be a
>       total mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae
>       the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a
>       wlohe.

Debunked.

    Trhee is nhitnog mroe dfcifulit to tkae in hnad, mroe poerulis to
    cdcunot, or mroe ucatinren in its scusces, tahn to tkae the laed in
    the iordcuoitntn of a new oerdr of tngihs.

    A. A vonilet oerdr is derdisor; and
    B. A garet dresdoir is an oderr. Tshee
    Two tgnhis are one.
From: Jeff Dalton
Subject: Re: Why flet wins
Date: 
Message-ID: <fx4ekqt9ojq.fsf@tarn.inf.ed.ac.uk>
Joe Marshall <···@ccs.neu.edu> writes:

> Don Geddis <···@geddis.org> writes:
> 
> > Jeff Dalton <····@todday.inf.ed.ac.uk> wrote on 08 Apr 2004 19:1:
> >> Individual sentences, for instance, will have some redundancy.
> >> Even words, which is why misspelled words can so often be
> >> understood even on their own.
> >
> > Which reminds me of this:
> >
> >       Aoccdrnig to a rseearch at Cmabrigde Uinervtisy, it deosn't mttaer in
> >       waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht
> >       the frist and lsat ltteer be at the rghit pclae. The rset can be a
> >       total mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae
> >       the huamn mnid deos not raed ervey lteter by istlef, but the
> >       wrod as a wlohe.
> 
> Debunked.

Still, it works for surprisingly many words, especially in context.

-- jd
From: Brian Mastenbrook
Subject: Re: Why flet wins
Date: 
Message-ID: <120420041259569769%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <···············@tarn.inf.ed.ac.uk>, Jeff Dalton
<····@tarn.inf.ed.ac.uk> wrote:

> Still, it works for surprisingly many words, especially in context.

It ddnepes on how jelbmud the cxetnot is, and how mcuh the ioretinr of
wdros are slbmarecd. Taht elpmaxe tedned to mlerey ptumree the pemenohs
and not the eritne wrod. A mroe rodnam slbnimarcg aroghtilm swohs taht
in the vsat mtirojay of cesas wtih lgrae or uommocnn wdros, it tekas
smoe erofft to ddocee the rluset.

http://www.iddqd.org/~chandler/lispcode/perturb/perturb.lisp
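A rough sketch of that kind of scrambler, assuming the usual rules (first and last letters fixed, interior shuffled); SHUFFLE and SCRAMBLE-WORD are illustrative names, and this is not the linked perturb.lisp:

```lisp
(defun shuffle (seq)
  "Fisher-Yates shuffle of a fresh vector copy of SEQ."
  (let ((v (coerce seq 'vector)))
    (loop for i from (1- (length v)) downto 1
          do (rotatef (aref v i) (aref v (random (1+ i)))))
    v))

(defun scramble-word (word)
  "Shuffle WORD's interior letters, keeping the first and last fixed."
  (if (<= (length word) 3)
      word
      (concatenate 'string
                   (subseq word 0 1)
                   (coerce (shuffle (subseq word 1 (1- (length word)))) 'string)
                   (subseq word (1- (length word))))))

(scramble-word "scrambling")   ; interior order is random, e.g. "srmbalcing"
```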

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: William Bland
Subject: Re: Why flet wins
Date: 
Message-ID: <pan.2004.04.12.18.09.08.512107@abstractnonsense.com>
On Mon, 12 Apr 2004 12:59:56 -0500, Brian Mastenbrook wrote:

> In article <···············@tarn.inf.ed.ac.uk>, Jeff Dalton
> <····@tarn.inf.ed.ac.uk> wrote:
> 
>> Still, it works for surprisingly many words, especially in context.
> 
> It ddnepes on how jelbmud the cxetnot is, and how mcuh the ioretinr of
> wdros are slbmarecd. Taht elpmaxe tedned to mlerey ptumree the pemenohs
> and not the eritne wrod. A mroe rodnam slbnimarcg aroghtilm swohs taht
> in the vsat mtirojay of cesas wtih lgrae or uommocnn wdros, it tekas
> smoe erofft to ddocee the rluset.
> 
> http://www.iddqd.org/~chandler/lispcode/perturb/perturb.lisp

Now that's strange.  If I read the above slowly, trying to figure
out what each word is, I find it difficult.  Whereas if I just
scan the lines at the rate I usually would when reading, I still
find it quite readable. Any idea why?

Cheers,
	Bill.
-- 
Dr. William Bland                          www.abstractnonsense.com
Computer Programmer                           awksedgrep (Yahoo IM)
Any sufficiently advanced Emacs user is indistinguishable from magic
From: Christopher C. Stacy
Subject: Re: Why flet wins
Date: 
Message-ID: <uisg42652.fsf@news.dtpq.com>
So, is there an Emacs minor mode that does that automatically?
From: William Bland
Subject: Re: Why flet wins
Date: 
Message-ID: <pan.2004.04.12.22.28.21.57082@abstractnonsense.com>
On Mon, 12 Apr 2004 22:17:14 +0000, Christopher C. Stacy wrote:

> So, is there an Emacs minor mode that does that automatically?

There are Emacs minor modes that do *everything* automatically, you
just have to find them (note my .sig doesn't say *I'm* a
sufficiently advanced Emacs user - I've been using it daily for four
years now and I don't expect to reach enlightenment for at least
another four!).

Cheers,
	Bill.
-- 
Dr. William Bland                          www.abstractnonsense.com
Computer Programmer                           awksedgrep (Yahoo IM)
Any sufficiently advanced Emacs user is indistinguishable from magic
From: Gareth McCaughan
Subject: Re: Why flet wins
Date: 
Message-ID: <87d66ddvqd.fsf@g.mccaughan.ntlworld.com>
Joe Marshall wrote:

> Don Geddis <···@geddis.org> writes:
> 
> > Jeff Dalton <····@todday.inf.ed.ac.uk> wrote on 08 Apr 2004 19:1:
> >> Individual sentences, for instance, will have some redundancy.  Even words,
> >> which is why misspelled words can so often be understood even on their own.
> >
> > Which reminds me of this:
> >
> >       Aoccdrnig to a rseearch at Cmabrigde Uinervtisy, it deosn't mttaer in
> >       waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht
> >       the frist and lsat ltteer be at the rghit pclae. The rset can be a
> >       total mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae
> >       the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a
> >       wlohe.
> 
> Debunked.
> 
>     Trhee is nhitnog mroe dfcifulit to tkae in hnad, mroe poerulis to
>     cdcunot, or mroe ucatinren in its scusces, tahn to tkae the laed in
>     the iordcuoitntn of a new oerdr of tngihs.
> 
>     A. A vonilet oerdr is derdisor; and
>     B. A garet dresdoir is an oderr. Tshee
>     Two tgnhis are one.

I agree that it's been somewhat debunked, but I didn't find your
text hard to read. It would be interesting to experiment with
various distortions of words and see how they affect readability;
for instance, how much does preserving the pattern of ascenders
and descenders help? The pattern of vowels and consonants? The
(multi)set of letters present? Probably a lot of this has been
done already.

-- 
Gareth McCaughan
.sig under construc
From: Christophe Rhodes
Subject: Re: Why flet wins
Date: 
Message-ID: <sqlll1maa6.fsf@lambda.dyndns.org>
Gareth McCaughan <················@pobox.com> writes:

>> Don Geddis <···@geddis.org> writes:
>> 
>> > Which reminds me of this:
>> >
>> >       Aoccdrnig to a rseearch at Cmabrigde Uinervtisy, [...]
>> 
> [...] It would be interesting to experiment with
> various distortions of words and see how they affect readability;
> for instance, how much does preserving the pattern of ascenders
> and descenders help? The pattern of vowels and consonants? The
> (multi)set of letters present? Probably a lot of this has been
> done already.

... though I suspect not at Cambridge University.

Christophe
-- 
http://www-jcsu.jesus.cam.ac.uk/~csr21/       +44 1223 510 299/+44 7729 383 757
(set-pprint-dispatch 'number (lambda (s o) (declare (special b)) (format s b)))
(defvar b "~&Just another Lisp hacker~%")    (pprint #36rJesusCollegeCambridge)
From: Gareth McCaughan
Subject: Re: Why flet wins
Date: 
Message-ID: <874qrpdqrq.fsf@g.mccaughan.ntlworld.com>
Christophe Rhodes wrote:

> Gareth McCaughan <················@pobox.com> writes:
> 
> >> Don Geddis <···@geddis.org> writes:
> >> 
> >> > Which reminds me of this:
> >> >
> >> >       Aoccdrnig to a rseearch at Cmabrigde Uinervtisy, [...]
> >> 
> > [...] It would be interesting to experiment with
> > various distortions of words and see how they affect readability;
> > for instance, how much does preserving the pattern of ascenders
> > and descenders help? The pattern of vowels and consonants? The
> > (multi)set of letters present? Probably a lot of this has been
> > done already.
> 
> ... though I suspect not at Cambridge University.

Indeed, still less at Cmabrigde Uinervtisy.

-- 
Gareth McCaughan
.sig under construc
From: Thomas A. Russ
Subject: Re: Why flet wins, was Re: scheme seems neater
Date: 
Message-ID: <ymiy8onr8ie.fsf@sevak.isi.edu>
Dave Roberts <·············@re-move.droberts.com> writes:

> > Jeff Dalton:
> > Natural languages also have a lot of redundancy, which tends
> > to make it easier to get the intended meaning.
> 
> Don't confuse redundancy with this issue. Yes, natural languages have lots
> of redundancy. There are many ways to say the same thing. That is not the
> case here. The distinction between LET and FLET is not subject to my
> personal mood at the time and whether I would suddenly choose one versus
> the other. They are not redundant. They are different. The language forces
> me to choose between LET and FLET depending on what I'm trying to
> accomplish. Thus, they are "redundant" in the same way that "black" and
> "white" are redundant in English... ;-)

If I understand Jeff Dalton's point about redundancy, it isn't that LET
and FLET are redundant with each other.

Rather, it is that having to specify both a function body and use FLET
(rather than, say, a lambda expression and LET) is what is redundant.  In
other words, FLET and a function value both give the same information.
That is the redundancy that you get from the situation.

-- 
Thomas A. Russ,  USC/Information Sciences Institute
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4uu60$ih2$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>>Maybe people who adore CL-style punning could look into having
>>>different namespaces for other types, as well - a string and an int
>>>having the same name in the same variable, for example. Of course(?),
>>>that would be annoying - and that's the same argument that lisp-1
>>>proponents present for uniting the namespace of functions with other
>>>types.
>>
>>No, it's not. The one case would be annoying, the other case is not.
> 
> Care to clarify?

I don't find #' annoying.

>>BTW, what's the point of this discussion?
> 
> The point of this discussion, for me, is to find out whether there are
> any good technical (as opposed to historical/political) reasons why
> one would, today, create a new lisp-2, and if so, trying to understand
> those reasons.

Lisp-2 removes the most important name capturing issues in macros. I 
like Common Lisp's defmacro more than Scheme's syntax-rules/syntax-case. 
The Lisp-1 vs. Lisp-2 question is a purely aesthetic issue, whereas being able 
to write reasonable macros is an issue with clear practical relevance.
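
A minimal sketch of what that buys you (the macro MAKE-PAIR here is made
up for illustration):

  ;; The expansion calls LIST in operator position, so a caller's
  ;; variable named LIST can't capture it in a Lisp-2:
  (defmacro make-pair (x y) `(list ,x ,y))
  (let ((list '(1 2 3)))
    (make-pair (car list) list))  ; => (1 (1 2 3))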

(BTW, I don't think it makes sense to create a new Lisp-2. You'd only be 
reinventing the wheel.)

Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ad1o538u.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> Sunnan wrote:
>> Care to clarify?
>
> I don't find #' annoying.

The other types could be accompanied by similar read-macros.
Or wait, let's just drop this paragraph, my argument was silly in
retrospect.

> Lisp-2 removes the most important name capturing issues in macros. I
> like Common Lisp's defmacro more than Scheme's
> syntax-rules/syntax-case. The Lisp-1 vs. Lisp-2 question is a purely
> aesthetic issue, whereas being able to write reasonable macros is an
> issue with clear practical relevance.

Unless I've misunderstood something:

CL-ish defmacro can be implemented in ten lines of syntax-case and is
provided with many implementations of scheme.

The original scheme community was reluctant to add macros but for
reasons other than the lisp-1/lisp-2 issue.

> (BTW, I don't think it makes sense to create a new Lisp-2. You'd only
> be reinventing the wheel.)

Ok.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4vfkcg92i.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Unless I've misunderstood something:
> 
> CL-ish defmacro can be implemented in ten lines of syntax-case

Let's have 'em.

> and is provided with many implementations of scheme.

Yes, but it had zero chance of getting into the standard once
hygienic macros came along.

> The original scheme community was reluctant to add macros but for
> reasons other than the lisp-1/lisp-2 issue.

The name conflict issues in lisp-1 were important.  I don't
know about originally (the original scheme had macros written
in a different language (MacLisp rather than Scheme) or
outputting a different language (MacLisp again), I forget
which).

But by the time of CL standardization work (mid 80s),
hygiene was the biggest issue, and the greater problem
of capture in Lisp-1 was part of that.

Those capture issues were also one reason why Common Lisp couldn't
just become a Lisp-1.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87isgc259v.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:

> Sunnan <······@handgranat.org> writes:
>
>> Unless I've misunderstood something:
>> 
>> CL-ish defmacro can be implemented in ten lines of syntax-case
>
> Let's have 'em.

(define-syntax (define-macro x)
  (syntax-case x ()
    ((_ (name . args) . body)
     #'(define-macro name (lambda args . body)))
    ((_ name transformer)
     #'(define-syntax (name y)
	 (syntax-case y ()
	   ((k . args)
	    (datum->syntax-object
	     #'k
	     (apply transformer (syntax-object->datum #'args)))))))) )

From chicken's sources. (Don't shoot me, Felix!) Okay, so it's eleven.
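
For illustration, a hypothetical use of it, defmacro-style (MY-UNLESS
is made up):

  (define-macro (my-unless test . body)
    `(if ,test #f (begin ,@body)))

  (my-unless (> 1 2) 'yes)  ; => yes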

>> and is provided with many implementations of scheme.
>
> Yes, but had zero chance of getting into the standard once
> hygienic macros came along.

A bummer. I wish syntax-case could get into the standard, though.

<snip interesting history lesson, don't know what to answer to it>

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4r7v01r0l.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> 
> > Sunnan <······@handgranat.org> writes:
> >
> >> Unless I've misunderstood something:
> >> 
> >> CL-ish defmacro can be implemented in ten lines of syntax-case
> >
> > Let's have 'em.
> 
> (define-syntax (define-macro x)
>   (syntax-case x ()
>     ((_ (name . args) . body)
>      #'(define-macro name (lambda args . body)))
>     ((_ name transformer)
>      #'(define-syntax (name y)
> 	 (syntax-case y ()
> 	   ((k . args)
> 	    (datum->syntax-object
> 	     #'k
> 	     (apply transformer (syntax-object->datum #'args)))))))) )
> 
> From chicken's sources. (Don't shoot me, Felix!) Okay, so it's eleven.

Does it impose any hygiene?

(It's been years since I've looked at any Scheme macro sys.)

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87fzbgyr55.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:

> Sunnan <······@handgranat.org> writes:
>> (define-syntax (define-macro x)
>>   (syntax-case x ()
>>     ((_ (name . args) . body)
>>      #'(define-macro name (lambda args . body)))
>>     ((_ name transformer)
>>      #'(define-syntax (name y)
>> 	 (syntax-case y ()
>> 	   ((k . args)
>> 	    (datum->syntax-object
>> 	     #'k
>> 	     (apply transformer (syntax-object->datum #'args)))))))) )
>> 
>> From chicken's sources. (Don't shoot me, Felix!) Okay, so it's eleven.
>
> Does it impose any hygiene?

As far as I understand it, the transformations between datum and
syntax-object allow you to get around the hygiene.

It's like datums and syntax-objects have separate namespaces (!) and
you can have explicit translations between them when you actually want
to.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4smf981wg.fsf@tarn.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> 
> > Sunnan <······@handgranat.org> writes:
> >> (define-syntax (define-macro x)
> >>   (syntax-case x ()
> >>     ((_ (name . args) . body)
> >>      #'(define-macro name (lambda args . body)))
> >>     ((_ name transformer)
> >>      #'(define-syntax (name y)
> >> 	 (syntax-case y ()
> >> 	   ((k . args)
> >> 	    (datum->syntax-object
> >> 	     #'k
> >> 	     (apply transformer (syntax-object->datum #'args)))))))) )
> >> 
> >> From chicken's sources. (Don't shoot me, Felix!) Okay, so it's eleven.
> >
> > Does it impose any hygiene?
> 
> As far as I understand it, the transformations between datum and
> syntax-object allow you to get around the hygiene.
> 
> It's like datums and syntax-objects have separate namespaces (!) and
> you can have explicit translations between them when you actually want
> to.

Ok.  Thanks!

-- jd
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4v0i2$iiu$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

>>Lisp-2 removes the most important name capturing issues in macros. I
>>like Common Lisp's defmacro more than Scheme's
>>syntax-rules/syntax-case. The Lisp-1 vs. Lisp-2 question is a purely
>>aesthetic issue, whereas being able to write reasonable macros is an
>>issue with clear practical relevance.
> 
> Unless I've misunderstood something:
> 
> CL-ish defmacro can be implemented in ten lines of syntax-case and is
> provided with many implementations of scheme.

Sure, but the name capture issue bites you harder with defmacro in 
Scheme: in addition to the name captures that you sometimes might want 
to have, you also get name captures that you will never want to have. 
You don't get the latter in CL.
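
A sketch of the unwanted kind, assuming a defmacro-style define-macro is
available (MAKE-PAIR is made up):

  (define-macro (make-pair x y) `(list ,x ,y))
  (let ((list '(1 2)))
    (make-pair 3 4))  ; expands to (list 3 4), but LIST here is
                      ; a local list, not a procedure -- an error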

Furthermore, you can implement Scheme in a few lines in a number of 
languages. Does this make you want to use any of those?

> The original scheme community was reluctant to add macros but for
> reasons other than the lisp-1/lisp-2 issue.

Maybe, I can't comment on that. I am only trying to get across why I 
prefer a Lisp-2. (You can count me as one of those that even adore it. ;)


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87d66k3lpa.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> Furthermore, you can implement Scheme in a few lines in a number of
> languages. Does this make you want to use any of those?

Yes. I use The Gnu Image Manipulation Program, written in C, which has
SIOD as an extension language.

In some schemes, syntax-case is defined in terms of
defmacro+gensym. In others, it's the other way around. It's all
abstraction.

>> The original scheme community was reluctant to add macros but for
>> reasons other than the lisp-1/lisp-2 issue.
>
> Maybe, I can't comment on that. I am only trying to get across why I
> prefer a Lisp-2. (You can count me as one of those that even adore
> it. ;)

Right, which is why I'm pestering you, so that I may understand this
once and for all so I'll get as tired of the issue as all the other
jaded foxes (men menar positivt) that hang out here.

-- 
One love,
Sunnan
From: Hartmann Schaffer
Subject: Re: scheme seems neater
Date: 
Message-ID: <nQFcc.84$wE2.584@newscontent-01.sprint.ca>
In article <··············@handgranat.org>,
	Sunnan <······@handgranat.org> writes:
> ...
> Right, which is why I'm pestering you, so that I may understand this
> once and for all so I'll get as tired of the issue as all the other
> jaded foxes (men menar positivt) that hang out here.

multiple namespaces mean that you can use the same identifier in
different contexts (i doubt that that is news to you), and tastes
differ in where the inability to do so starts to be annoying.  you
might as well make the argument that, e.g. in C, each structure/union
establishing a new namespace for field names is unnecessary or
esthetically displeasing because name mangling (e.g. by choosing a
different prefix for the fields of each structure) can be applied to
resolve problems.

hs

-- 

Patriotism is the last refuge of the scoundrel
                                     Samuel Johnson

Patriotism is your conviction that this country is superior to all
others because you were born in it
                                     George Bernard Shaw
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4v43b$ih2$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Right, which is why I'm pestering you, so that I may understand this
> once and for all

I don't think I can say more about it than I have already said. The rest 
is up to you...


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcssmfh80ed.fsf@fnatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > My comment was meant the other way around; I was referring to the name
> > mangling necessary in Scheme, eg "list" becomes "lst", "saw" becomes
> > "sw" and "saugh" respectively, and so forth.
> 
> So was I, or at least I was trying to.
> 
> Name mangling, a.k.a. choosing variable names with care, is always
> necessary when you have two variables that would otherwise have the
> same name, regardless of the number of namespaces involved. I guess
> that's my point summed up as shortly as I can.

You're making it sound like there isn't a significant difference in
the need for name mangling between a Lisp-1 and a Lisp-2. This is
obviously wrong. Common Lisp code usually doesn't contain any name
mangling at all, whereas in Scheme it is the norm.

> That's why I brought up the (defun mangle-lists (list list list) ...)
> invalid example.

That is a silly example. Are you proposing different namespaces for
the first, second, etc. arguments to a function? What would that look
like?

> Maybe people who adore CL-style punning 

It is not punning.

> could look into having
> different namespaces for other types, as well - a string and an int
> having the same name in the same variable, for example. 

And what would the advantage of this be do you think?


Björn
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <874qrx564t.fsf@handgranat.org>
·······@nada.kth.se (Björn Lindberg) writes:
> You're making it sound like there isn't a significant difference in
> the need for name mangling between a Lisp-1 and a Lisp-2. This is
> obviously wrong. Common Lisp code usually doesn't contain any name
> mangling at all, whereas in Scheme it is the norm.

Of course there is a difference.

However, I think that the difference is exaggerated. In my current
scheme project, which is small, I admit (1226 pairs of parens), I
didn't have any name mangling.

Then I was reading through SRFI-1 and discovered the idiom of writing
"lis" for variables called "list", so I went through my code and
changed my "list" variables to "lis" to reflect that idiom (my ideas
on what constitutes idiomatic code and/or good style are somewhat
fluctuating as I gain experience with the language). I now have six
occurrences of "lis" - none of them needed because none of them is in a
scope where I use the function "list".

> It is not punning.

I'm sorry. (See also my reply to Jeff.)

>> could look into having
>> different namespaces for other types, as well - a string and an int
>> having the same name in the same variable, for example. 
>
> And what would the advantage of this be do you think?

I can't think of any. I just find it an interesting thought
experiment for me to think of advantages of multiple namespaces, since
I'm continually surprised over how people genuinely speak out in favor
of lisp-2.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4isgdgcel.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > You're making it sound like there isn't a significant difference in
> > the need for name mangling between a Lisp-1 and a Lisp-2. This is
> > obviously wrong. Common Lisp code usually doesn't contain any name
> > mangling at all, whereas in Scheme it is the norm.

> Of course there is a difference.

> However, I think that the difference is exaggerated. In my current
> scheme project, which is small, I admit (1226 pairs of parens), I
> didn't have any name mangling.

It's not only the cases where you do it, or have to do it;
it's something that you have to think about, or have to fix
when you get it wrong, where in a Lisp-2 the whole issue
doesn't arise.

> ... I now have six occurrences of "lis" - none of them needed
> because none of them is in a scope where I use the function "list".

If I'm doing list processing, I might end up introducing a use
of list as a function.

I might not notice that it would conflict with a variable.

(This is a general problem when writing.  Because you know what
you mean, you tend to see the intended interpretation of what
you write, even if it's not the only one, and even if the words
don't actually support that interpretation.)

Then there might be an annoying or obscure bug.

So I would tend to program defensively and use "lis"; I might
even adopt that as a convention.  I certainly would want to
look around all the time to make sure I can safely use "list"
as a variable.

In a Lisp-2, much of this is avoided.

> I'm continually surprised over how people genuinely speak out in favor
> of lisp-2.

But why is that?  Why are you surprised?  There seems to be a
presumption that lisp-1 is clearly better.

All the arguments for lisp-2 really need to do is to establish
that it is a reasonable choice.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87y8p83ntx.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> In a Lisp-2, much of this is avoided.

I guess my sentiment is as follows: in a lisp-2 you *still* have to be
careful - and even if the bug comes up a bit more seldom, it can still
bite you.

If I'm going to introduce something annoying to the language, I think
there's some kind of threshold where the benefit needs to be greater
than the annoyance.

(For CL, the radical change of going from a lisp-2 to a lisp-1 could
in itself be a big annoyance.)

>> I'm continually surprised over how people genuinely speak out in favor
>> of lisp-2.
>
> But why is that?  Why are you surprised?  There seems to be a
> presumption that lisp-1 is clearly better.
>
> All the arguments for lisp-2 really need to do is to establish
> that it is a reasonable choice.

The reason for my surprise could be that I've never seen the "choice"
of lisp-2 being consciously made as a design choice. It was a
historical "accident"/consequence of details of the lisp 1.5 reader.

Now, some of my favourite features of lisp are historical accidents, so
this in itself wouldn't have been an argument for lisp-1, I know...

-- 
One love,
Sunnan
From: Erann Gat
Subject: Required reading (was: Re: scheme seems neater)
Date: 
Message-ID: <gNOSPAMat-0604041248060001@k-137-79-50-101.jpl.nasa.gov>
I can't recall if anyone has mentioned this already or not, but the
definitive treatment of the lisp-1/lisp-2 issues is:

http://www.dreamsongs.com/Separation.html

IMHO it should be considered required reading for anyone wishing to
participate in a discussion on this topic.

E.
From: Sunnan
Subject: Re: Required reading
Date: 
Message-ID: <87vfkc25r0.fsf@handgranat.org>
·········@flownet.com (Erann Gat) writes:
> I can't recall if anyone has mentioned this already or not, but the
> definitive treatment of the lisp-1/lisp-2 issues is:
>
> http://www.dreamsongs.com/Separation.html

I linked to it in one of my first messages. No one is arguing over
technical detail (with the possible exception of the macro issue).

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4n05og88c.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> > In a Lisp-2, much of this is avoided.
> 
> I guess my sentiment is as follows: in a lisp-2 you *still* have to be
> careful - and even if the bug comes up a bit more seldom, it can still
> bite you.

You have to be careful about some things, but not about variable /
function-name conflicts, because they're in separate namespaces.

> >> I'm continually surprised over how people genuinely speak out in favor
> >> of lisp-2.
> >
> > But why is that?  Why are you surprised?  There seems to be a
> > presumption that lisp-1 is clearly better.
> >
> > All the arguments for lisp-2 really need to do is to establish
> > that it is a reasonable choice.
> 
> The reason for my surprise could be that I've never seen the "choice"
> of lisp-2 being consciously made as a design choice. It was a
> historical "accident"/consequence of details of the lisp 1.5 reader.

Lisp 1.5 *reader*?  In any case, Lisp 1.5 wasn't really a 2-namespace
Lisp.  It might count as such at the top level, but not "locally".

Besides, later implementations changed many things from Lisp 1.5.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ad1o24ji.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> You have to be careful about some things, but not about variable /
> function-name conflicts, because they're in separate namespaces.

But variables can be setq'd to lambdas, so you can have a symbol that
has one function in the value cell and one in the function cell (so to
speak). Isn't that gross? Oh, I'm getting as tired of this issue as
everybody else.
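
To make that concrete, a small sketch (CL; the name SAW is just an
example):

  (defvar saw (lambda (x) (list 'noun x)))  ; value cell
  (defun saw (x) (list 'verb x))            ; function cell
  (saw 'board)          ; => (VERB BOARD)
  (funcall saw 'board)  ; => (NOUN BOARD)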

> Lisp 1.5 *reader*?  In any case, Lisp 1.5 wasn't really a 2-namespace
> Lisp.  It might count as such at the top level, but not "locally".

I don't really know lisp 1.5 at all yet, I must admit. Here's what
http://www.dreamsongs.com/Separation.html says:

"Lisp 1.5 broke symbols into values and functions; values were stored
on an association list, and functions on the property lists of
symbols. Compiled and interpreted code worked in different ways. In
the interpreter, the association list was where all bindings were
kept. When an identifier was encountered (an 'atomic symbol' in Lisp
1.5 terminology), it was taken as a variable to be evaluated for its
value. First the APVAL part of the symbol was interrogated--an APVAL
was "A Permanent, system-defined VALue", stored in a specific place in
the symbol. Second, the association list was searched. Finally, if no
binding was found, an error was signaled."

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4vfkc1rak.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> > You have to be careful about some things, but not about variable /
> > function-name conflicts, because they're in separate namespaces.
> 
> But variables can be setq'd to lambdas, so you can have a symbol that
> has one function in the value cell and one in the function cell (so to
> speak). Isn't that gross? 

Why?  Suppose I have a data structure with several fields.
Is it gross to have functions in two of them?  I'm not sure
what you think the grossness is.

In any case, that's a different issue (ie, it's not name conflicts,
accidental capture, choosing names carefully, etc).

> > Lisp 1.5 *reader*?  In any case, Lisp 1.5 wasn't really a 2-namespace
> > Lisp.  It might count as such at the top level, but not "locally".
> 
> I don't really know lisp 1.5 at all yet, I must admit. Here's what
> http://www.dreamsongs.com/Separation.html says:

Look at the Lisp 1.5 Programmer's Manual.  It should be easy
to find.

It is like a 2-namespace Lisp in some ways, but not in others.

For example, LABEL can be used to bind a function name to a
function so that it can call itself recursively.  (It's
a bit like Common Lisp LABELS, but for only one function.)
But this binding goes on the a-list just like a variable
binding.  There's no separate a-list for function bindings,
there's nothing on the a-list that distinguishes different
kinds of bindings, and the interpreter doesn't skip names
bound to functions when it's looking up the value of a variable.

So here it's working as a Lisp 1, and these "function names"
are effectively variables.

Moreover, a variable that has a function as its value can
be written as the CAR of a form and will be called as a
function.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87brm4yqy4.fsf@handgranat.org>
Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> Why?  Suppose I have a data structure with several fields.
> Is it gross to have functions in two of them?  I'm not sure
> what you think the grossness is.

Ok, so in CL, each and every variable is in essence a data structure
with several fields, and convention has it that one of them is
generally used for functions (i.e. the slot that's assigned with
defun). I guess the grossness that I initially felt is very
subjective.

> In any case, that's a different issue (ie, it's not name conflicts,
> accidental capture, choosing names carefully, etc).

Right.

> Look at the Lisp 1.5 Programmer's Manual.  It should be easy
> to find.

Thanks, I'll check it out.

-- 
One love,
Sunnan
From: Rob Warnock
Subject: Re: scheme seems neater
Date: 
Message-ID: <D_ydnf94T58OjejdRVn_iw@speakeasy.net>
Sunnan  <······@handgranat.org> wrote:
+---------------
| Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
| > Why?  Suppose I have a data structure with several fields.
| > Is it gross to have functions in two of them?  I'm not sure
| > what you think the grossness is.
| 
| Ok, so in CL, each and every variable is in essence a data structure
| with several fields, and convention has it that one of them is
| generally used for functions (i.e. the slot that's assigned with defun).
+---------------

By the way, Perl is much like CL in this case -- every variable (symbol)
is in essence a data structure with several fields -- except that Perl
has chosen a particular reader macro character as an explicit indicator
of each slot:

	$ for the scalar value binding, e.g., $foo
	@ for the array value binding
	% for the associative-array value binding
	Capitalization for the I/O stream binding, e.g., FOO
	* for "the symbol itself" (a.k.a. the "glob"), which
	  gives access to *all* of the bindings (and which is the
	  only way, AFAIK, to pass an I/O stream as a parameter).


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx41xmy71zj.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> > Why?  Suppose I have a data structure with several fields.
> > Is it gross to have functions in two of them?  I'm not sure
> > what you think the grossness is.
> 
> Ok, so in CL, each and every variable is in essence a data structure
> with several fields, and convention has it that one of them is
> generally used for functions (i.e. the slot that's assigned with
> defun). I guess the grossness that I initially felt is very
> subjective.

Yes, I think that's a reasonable model, especially for
the top level environment.

Symbols as data objects double as variables and function names
at the top level, and each one has a "function cell" (field) and
a "value cell" (field).

Strictly speaking, they don't have to be fields, so long as
symbol-function and symbol-value can return the right value
and setf of those functions can set it.  For example, symbols
might not have either field, and the mappings might
be held in hash tables instead.
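
A toy sketch of that model (the MY- names are made up, not CL
internals):

  (defvar *values* (make-hash-table))      ; one table per namespace
  (defvar *functions* (make-hash-table))

  (defun my-symbol-value (sym)
    (gethash sym *values*))
  (defun my-symbol-function (sym)
    (gethash sym *functions*))
  (defun (setf my-symbol-value) (new sym)
    (setf (gethash sym *values*) new))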

Local variable and function names are more like that.  The mappings
might be in association lists, for example.  In compiled code,
the symbols might even be "compiled away".

-- jd
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2h1xmxce4w.fsf@vserver.cs.uit.no>
Sunnan <······@handgranat.org> writes:

> Ok, so in CL, each and every variable is in essence a data structure
> with several fields, and convention has it that one of them is
> generally used for functions (i.e. the slot that's assigned with
> defun).

I think this model only fosters confusion. A variable name is
different from a variable. In CL, "variable" means a binding in the
"ordinary" variable namespace. But in this context, let's assume a
variable is what CL calls a binding, i.e. a storage cell that can be
referenced by a name via some namespace. Then that's precisely what it
is: a single storage cell associated with a name, no data structure.

The /names/, which in most cases means symbols, might be thought
of as a data structure with one field for each namespace. However,
this model doesn't work very well in the case of lexical variables.

It's nonsensical to think that CL's function namespace is "generally
used for functions", at least if this suggests that there's a choice
of name-space being made on the basis of the value's type. The
function namespace is really used for a wholly different concept than
anything to do with values and variables. It's a concept that is
unrelated to the fact that functions are also values. Therefore,
throwing the "but functions are just objects" line into the debate, as
schemers often seem to do, just shows that they never understood the
concept of what the function namespace is. And, btw, it's much more
appropriately referred to as the "operator namespace".

> I guess the grossness that I initially felt is very subjective.

When you do grasp the concept of the operator namespace, the lisp-1
approach is likely to gross you out more.

-- 
Frode Vatvedt Fjeld
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4v1rs$sas$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Jeff Dalton <····@todday.inf.ed.ac.uk> writes:
> 
>>In a Lisp-2, much of this is avoided.
> 
> I guess my sentiment is as follows: in a lisp-2 you *still* have to be
> careful - and even if the bug comes up a bit more seldom, it can still
> bite you.

See 
http://groups.google.com/groups?selm=9310091717.AA29226%40inferno.lucid.com


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <874qrw3kpd.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> See
> http://groups.google.com/groups?selm=9310091717.AA29226%40inferno.lucid.com

Thank you very much, that was the gem of the day. I'm saving that link.

-- 
One love,
Sunnan
From: Lars Brinkhoff
Subject: Re: scheme seems neater
Date: 
Message-ID: <85ekr0l4j7.fsf@junk.nocrew.org>
Sunnan <······@handgranat.org> writes:
> The reason for my surprise could be that I've never seen the "choice"
> of lisp-2 being consciously made as a design choice. It was a
> historical "accident"/consequence of details of the lisp 1.5 reader.

It may have been a conscious design choise in ISLisp.

-- 
Lars Brinkhoff,         Services for Unix, Linux, GCC, HTTP
Brinkhoff Consulting    http://www.brinkhoff.se/
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsd66kc3in.fsf@knatte.nada.kth.se>
Sunnan <······@handgranat.org> writes:

> ·······@nada.kth.se (Björn Lindberg) writes:
> > You're making it sound like there isn't a significant difference in
> > the need for name mangling between a Lisp-1 and a Lisp-2. This is
> > obviously wrong. Common Lisp code usually doesn't contain any name
> > mangling at all, whereas in Scheme it is the norm.
> 
> Of course there is a difference.
> 
> However, I think that the difference is exaggerated. In my current
> scheme project, which is small, I admit (1226 pairs of parens), I
> didn't have any name mangling.

So what are the *huge* benefits of Lisp-1 then? There are no huge
benefits either way.


Björn
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hisgddnne.fsf@vserver.cs.uit.no>
Sunnan <······@handgranat.org> writes:

> Yeah, I understand, but I warped the analogy a bit because I
> couldn't think of anything cleaner. I have now, though, how about
> this sentence:
>
> "Tomorrow I am going to saw with the saw".
>
> In CL, that would have to be 
> "..going to #'saw with the saw".

Well it'd be "tomorrow I'm going to apply the verb `saw' to the
board", but I think in one sense you're right. In sexpr terms, this
would be

  CL:     (tomorrow #'saw board)
  Scheme: (tomorrow saw board)

and one expects a "verb" after tomorrow, hence CL's #' might be
considered superfluous. There are problems with this, however.  Humans
can expect a verb after tomorrow because humans understand the concept
of "tomorrow I'm going to" and expect an action (verb in some form) to
follow. We don't want to design the programming language after such
rules. I.e. we don't want to make tomorrow a special operator with
special evaluation rules, i.e. "the argument to tomorrow is a function
name". Such designs we leave to Perl and its ilk. So this is the
rationale for why lisp-2 accepts this minor problem.  In scheme, what
if you want to say "tomorrow do the noun `saw' and the board"?
(tomorrow sw board). It's the same old story: in scheme you have to
simulate two name-spaces by making up mangled names, and so in general
it has the same problem that CL has here, except of course the
problem is much more pervasive in scheme and there's no principled way
to deal with it.

And yes, in CL you have to "simulate namespaces by making up names"
also, when writing e.g.

  (defun append-two-lists (list1 list2)
    ...)

but I don't see this as any argument against lisp-2. It's a rare
situation that you have two variables that would naturally be given
the same name, and I don't think there's any sensible way to resolve
this "problem" in the language design.
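
To make the two-namespace point concrete, here is a minimal sketch in
Common Lisp (SAW is just the running example from this thread, not any
real library function):

```lisp
;; One symbol, two namespaces: SAW names both a function and a variable.
(defun saw (thing)
  (format nil "sawing ~a" thing))

(let ((saw "a board"))   ; SAW in the variable namespace
  (saw saw))             ; operator position looks up the function,
                         ; argument position looks up the variable
;; => "sawing a board"

;; Passing the function itself as an argument needs #' in CL:
;;   (tomorrow #'saw board)
;; where a Lisp-1 such as Scheme would write (tomorrow saw board).
```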

-- 
Frode Vatvedt Fjeld
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87d66l57bj.fsf@handgranat.org>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:
> We don't want to design the programming language after such
> rules. I.e. we don't want to make tomorrow a special operator with
> special evaluation rules, i.e. "the argument to tomorrow is a function
> name".

Sure, that's understandable (see also the recent discussion about
trace).

But isn't the function/value separation as stands in CL also a kind of
"special evaluation rules"?

Something like "The first atom in an unquoted list is to have its
function value called". (Eeep, that sentence may need some revision
but you get the gist.)

> Such designs we leave to Perl and its ilk. So this is the
> rationale for why lisp-2 accepts this minor problem.  In scheme, what
> if you want to say "tomorrow do the noun `saw' and the board"?
> (tomorrow sw board).

Yeah, I'd probably do that, but I often work within a lambda or a
function so the scope is limited enough.

(let ((saw (noun saw)))
  (tomorrow saw board))

works fine.

> It's the same old story: in scheme you have to
> simulate two name-spaces by making up mangled names, and so in general
> it has the same problem as that CL has here, except of course the
> problem is much more pervasive in scheme and there's no principled way
> to deal with it.
>
> And yes, in CL you have to "simulate namespaces by making up names"
> also, when writing e.g.
>
>   (defun append-two-lists (list1 list2)
>     ...)
>
> but I don't see this as any argument against lisp-2.

The argument is that having to "simulate namespaces by making up
names" is a problem that both lisp-1 and lisp-2 (and all other
languages with variables) have. Lisp-1 has that problem more often
(not *that* often - there's a reason why the perennial example is
"list"...) but being a lisp-2 does not completely solve it and thus
this argument for lisp-2 is at least somewhat void.

(One reason for this particular example is that list is a verb and a
noun in english as well, instead of being called e.g. listify.)

> It's a rare situation that you have two variables that would
> naturally be given the same name, and I don't think there's any
> sensible way to resolve this "problem" in the language design.

Agreed.

-- 
One love,
Sunnan
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hekr1dkox.fsf@vserver.cs.uit.no>
Sunnan <······@handgranat.org> writes:

> But isn't the function/value separation as stands in CL also a kind
> of "special evaluation rules"?

Certainly. You can't make an omelet without breaking some eggs.

> [..] but being a lisp-2 does not completely solve it and thus this
> argument for lisp-2 is at least somewhat void.

Well, this is precisely a statement I'd expect from someone choosing
lisp-1, and I disagree wholeheartedly. That some solution does not
cover every conceivable problem does not void that solution. Scheme
dismisses lisp-2 on these grounds, and continues to live with "lsp"
variables and unnecessarily complicated macro systems because,
according to some twisted way of thinking, no solution is acceptable
unless it covers every problem. Scheme tries to make the omelet
without breaking any eggs, so they put the egg very carefully in the
frying pan, cook it long and well, and swallow the thing, eggshell and
all, satisfied that the egg remained unbroken and pure.

-- 
Frode Vatvedt Fjeld
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87n05p3ph4.fsf@handgranat.org>
(Feeling kind of bad to reply with a petty aside to a good post...)

Frode Vatvedt Fjeld <······@cs.uit.no> writes:
<snip>
> variables and unnecessarily complicated macro systems because
> according to some twisted way of thinking this requires no solutions
> that don't cover every problem.

In practice, there's nothing wrong with scheme's macro systems. You
want CL-style defmacro, you've got it. (I wish R6RS would include
Dybvig's syntax-case, since that's rapidly becoming something of a
standard among schemes.)

(I groan at people who complain about CL's lack of tail-call
optimization, as well. Nothing prevents an implementation from
providing it. Same as being a lisp-1 doesn't mean "unwieldy macros".)

> Scheme tries to make the omelet without breaking any eggs, so they
> put the egg very carefully in the frying pan, cook it long and well,
> and swallow the thing, eggshell and all, satisfied that the egg
> remained unbroken and pure.

Yeah, I guess that's it. That's a pretty good explanation. And CL
would break the egg, a bit of the shell would fall in but most of it
would get cleaned out, and the omelet would turn out pretty fine. We
would both be happy.

(So it makes sense for scheme to be lisp-1 and CL to be lisp-2, but
couldn't there in theory exist languages that are lisp-1 but with some
of CL's omelettish advantages? I think there could.)

-- 
One love,
Sunnan
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4uv3t$ih6$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> But isn't the function/value separation as stands in CL also a kind of
> "special evaluation rules"?
> 
> Something like "The first atom in an unquoted list is to have its
> function value called". (Eeep, that sentence may need some revision
> but you get the gist.)

You have a similar exceptional rule in Scheme: The car of an expression 
is used to determine whether it indicates the use of a function or a 
macro before anything is evaluated. This determines the roles of the 
members of the expression's cdr.

So the car plays a special role both in Scheme and in Common Lisp. I 
don't think this can be avoided. (Unless you remove macros from the 
language.)
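
A small sketch of that special role of the car, in Common Lisp:

```lisp
;; Whether the car names a macro or a function decides how the rest
;; of the form is treated, before any evaluation happens.
(defmacro twice (form)
  `(progn ,form ,form))          ; macro: FORM arrives unevaluated

(defun call-twice (f)            ; function: arguments evaluated first
  (funcall f)
  (funcall f))

(twice (print 'hi))                   ; rewritten at macroexpansion time
(call-twice (lambda () (print 'hi))) ; ordinary evaluated call
```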

Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87u0zw3nik.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> So the car plays a special role both in Scheme and in Common Lisp. I
> don't think this can be avoided. (Unless you remove macros from the
> language.)

In my (non-existing) "dream lisp" macros can be expanded in non-car
positions. Or maybe that would suck (introduce ambiguities and so on),
but I've been seriously considering it, to see how it would work.

Or maybe some lisp-1 could have macros implemented in a way such that
"mapcar macroname list" would work.

I'm just brainstorming.

Of course, the car always plays a special role in lisp eval, that's
more or less axiomatic (some exceptions have existed). I was replying
to a particular issue of function/value separation.

-- 
One love,
Sunnan
From: Peter Seibel
Subject: Re: scheme seems neater
Date: 
Message-ID: <m38yh8am1c.fsf@javamonkey.com>
Sunnan <······@handgranat.org> writes:

> Pascal Costanza <········@web.de> writes:
>> So the car plays a special role both in Scheme and in Common Lisp. I
>> don't think this can be avoided. (Unless you remove macros from the
>> language.)
>
> In my (non-existing) "dream lisp" macros can be expanded in non-car
> positions. Or maybe that would suck (introduce ambiguities and so on),
> but I've been seriously considering it, to see how it would work.

You might want to look at DEFINE-SYMBOL-MACRO and SYMBOL-MACROLET in
Common Lisp.
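
For instance, SYMBOL-MACROLET lets a bare symbol expand in non-car
position (a rough sketch):

```lisp
(defvar *point* (list 1 2))

;; X and Y look like variable references but expand to accessor forms.
(symbol-macrolet ((x (first *point*))
                  (y (second *point*)))
  (list x y (+ x y)))
;; => (1 2 3)
```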

-Peter

-- 
Peter Seibel                                      ·····@javamonkey.com

         Lisp is the red pill. -- John Fraser, comp.lang.lisp
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4v2hm$sb0$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>So the car plays a special role both in Scheme and in Common Lisp. I
>>don't think this can be avoided. (Unless you remove macros from the
>>language.)
> 
> In my (non-existing) "dream lisp" macros can be expanded in non-car
> positions. Or maybe that would suck (introduce ambiguities and so on),
> but I've been seriously considering it, to see how it would work.

You want symbol macros. Check out the CL spec. ;)

> Or maybe some lisp-1 could have macros implemented in a way such that
> "mapcar macroname list" would work.

I also think this could be neat, but I also recall reading that this was 
available in some of the earlier Lisp dialects. However, I don't know 
the details. The problem with this is that you cannot compile away 
lexical information anymore. Whether this is a real problem - I don't know.

> Of course, the car always plays a special role in lisp eval, that's
> more or less axiomatic (some exceptions have existed). I was replying
> to a particular issue of function/value separation.

Whatever. The special role of the car position doesn't make it look too 
far-fetched to opt for a Lisp-2, though. That's my point.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87zn9o25zz.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> You want symbol macros. Check out the CL spec. ;)

Right, that's what I mean.

>> Or maybe some lisp-1 could have macros implemented in a way such that
>> "mapcar macroname list" would work.
>
> I also think this could be neat, but I also recall reading that this
> was available in some of the earlier Lisp dialects. However, I don't
> know the details. The problem with this is that you cannot compile
> away lexical information anymore. Whether this is a real problem - I
> don't know.

(mapcar #'(lambda (x) macroname x) list) works (er... does it?), which
I can't see as much less harmful.

-- 
One love,
Sunnan
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcs4qrx9k7o.fsf@fnatte.nada.kth.se>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:

> Sunnan <······@handgranat.org> writes:
> 
> > [..] In conclusion, I like CL a lot. I just don't think that having
> > to say "I saw the #'saw and sawed a board with it" is something to
> > write home about.
> 
> I think you're missing the point here. You don't say "I saw the #'saw
> and sawed .." in CL, you say "I saw the saw and sawed ..". The point
> is that contextually, humans understand that the first saw is a verb
> and the second a noun, even if they are spelled identically. What this
> illustrates about CL is that in e.g. (saw saw), the first saw
> references the function name-space, while the second saw references
> the variable name-space.
> 
> Your sentence "I saw the #'saw and sawed a board with it", is
> presumably intended to correspond to a CL form like (saw #'saw).
> However, the translation of (saw #'saw) to english would be something
> like "I saw the verb `saw' and sawed .." where the qualification of
> the second saw is required in order for it not to be mistaken for the
> metal thing with teeth. Precisely the same situation as in CL, and why
> we have to say (function saw), shortcut to #'saw, to express "the
> value of saw in the function name-space" when not at the operator
> position in a form.

And the scheme version is

  I saugh the sw and sawed a board with it.

;-)


Björn
From: Thomas A. Russ
Subject: Re: scheme seems neater
Date: 
Message-ID: <ymiad1nslf1.fsf@sevak.isi.edu>
Sunnan <······@handgranat.org> writes:
> In scheme you have to change the variable name a little.
> 
> (define (hack-the-list list-)
>    (list (hackhackhack list-)))
> 
> How is that much less elegant than CL having to add #' pretty often?
> (uses of mapcar comes to mind).

Well, in most Common Lisp code, you actually don't end up really using
#' "pretty often".  In part that is because there tends to be a bit less
use of mapping functions and more use of iteration constructs.  As a
practical matter, it then comes down to Common Lisp having a system
where more effort is needed in the uncommon situations of passing
functions as arguments rather in the vastly more common situation of
naming variables.
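
For example (illustrative only), both spellings of the same mapping:

```lisp
;; Passing a function as an argument needs #' ...
(mapcar #'1+ '(1 2 3))                   ; => (2 3 4)

;; ... while the iteration construct sidesteps it entirely.
(loop for x in '(1 2 3) collect (1+ x)) ; => (2 3 4)

;; FUNCALL is needed when calling a function held in a variable:
(let ((f #'1+))
  (funcall f 41))                        ; => 42
```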

> (Common idioms are lis and lst.)
> 
> So CL-ers have to add #' and funcalls in some situations, and schemers
> have to change their names in some others. The same inelegancy.
> 
> However, I can't think of any drawback with a lisp-1 that a lisp-2
> doesn't also have.

Well, maybe.  The real problem is that Common Lisp isn't really a lisp-2
system anymore.  It is a lisp-n where n is greater than two.  In
addition to namespaces for variables and functions, Common Lisp also has
namespaces for:

   classes
   packages

and maybe a couple of other things I've forgotten about.  That means
that more of the name interpretation is context dependent in the
language.  Now, one could mandate that all of these namespaces be
joined, but people don't seem to have trouble using the same name for
different purposes as long as it is clear from context which purpose
is intended.  It is just when one wants to use a name in a foreign
context that some special effort is needed.




-- 
Thomas A. Russ,  USC/Information Sciences Institute
From: Jeff Dalton
Subject: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <fx4oeq34fo9.fsf_-_@todday.inf.ed.ac.uk>
···@sevak.isi.edu (Thomas A. Russ) writes:

> Well, maybe.  The real problem is that Common Lisp isn't really a lisp-2
> system anymore.  It is a lisp-n where n is greater than two.  In
> addition to namespaces for variables and functions, Common Lisp also has
> namespaces for:
> 
>    classes
>    packages

I would say that neither have namespaces in the sense of "namespace"
that gives us Lisp-1 and Lisp-2.  Namespaces are about identifiers in
code, rather than data uses of symbols.  They're about different
interpretation in the interpreter or compiler, rather than data
mappings performed by (essentially) library procedures such as
find-class or find-package.

There's a namespace of go tags and another for block names,
so that CL is a Lisp-4.

There might be some added after CLTL 1, but I don't
remember any off hand.

In the early days of CL, Guy Steele wrote sample interpreters
to show how the language worked.  They used a 4-part environment
(one per namespace), like this:

;;; This evaluator splits the lexical environment into four
;;; logically distinct entities:
;;;     VENV = lexical variable environment
;;;     FENV = lexical function and macro environment
;;;     BENV = block name environment
;;;     GENV = go tag environment
;;; Each environment is an a-list.

...

(defstruct interpreted-lexical-closure function venv fenv benv genv)

...

(defun eval (exp)
  (*eval exp nil nil nil nil))

;;; *EVAL looks useless here, but does more complex things
;;; in alternative implementations of this evaluator.

(defun *eval (exp venv fenv benv genv)
  (%eval exp venv fenv benv genv))

;;; Function names beginning with "%" are intended to be internal
;;; and not defined in the Common LISP white pages.

...

-- jd
From: Rob Warnock
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <CtudncvIYJpJh-jdRVn-ig@speakeasy.net>
Jeff Dalton  <····@todday.inf.ed.ac.uk> wrote:
+---------------
| ···@sevak.isi.edu (Thomas A. Russ) writes:
| > In addition to namespaces for variables and functions,
| > Common Lisp also has namespaces for:
| >    classes
| >    packages
...
| There's a namespace of go tags and another for block names,
| so that CL is a Lisp-4.
| 
| There might be some ones added after CLTL 1, but I don't
| remember any off hand.
+---------------

Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
you count "compiler macros that are locally disabled"), for a grand
total of 8 or 9 at least. And Pekka Pirinen <·····@harlequin.co.uk>
in <·······················@gaspode.cam.harlequin.co.uk> says you
should add structures, methods, method combinations, and SETF methods.
Some might even add LOOP keywords, since they can't be shadowed.
What's that now, 12? 13?  But in any case, probably at least 11 or
12 of the following 13:

    - Functions & macros
    - lexical variables
    - special variables
    - types and classes
    - GO tags
    - block names
    - structures
    - SETF methods
    - compiler macros
    - methods
    - method combinations
    - declaration specifiers (not types, things like IGNORE, INLINE, etc.)
    - (LOOP keywords?)

Not to mention that each defined package adds a nearly-full set of
additional namespaces.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Brian Mastenbrook
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <080420040818110005%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <······················@speakeasy.net>, Rob Warnock
<····@rpw3.org> wrote:

> Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
> and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
> you count "compiler macros that are locally disabled"), for a grand
> total of 8 or 9 at least. And Pekka Pirinen <·····@harlequin.co.uk>
> in <·······················@gaspode.cam.harlequin.co.uk> says you
> should add structures, methods, method combinations, and SETF methods.
> Some might even add LOOP keywords, since they can't be shadowed.
> What's that now, 12? 13?  But in any case, probably at least 11 or
> 12 of the following 13:

Counting global namespaces has a problem: any function that keeps an
alist of bindings around adds a new one. Thus, it's really only sane to
count lexical namespaces. The declaration lexical environment simply
modifies existing bindings, instead of adding new ones, so it's not
really a namespace unto itself. Thus, CL is a Lisp-4.
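
A sketch of those four lexical namespaces coexisting, with one symbol
X playing all four roles at once:

```lisp
(block x                        ; X as a block name
  (tagbody
   x                            ; X as a go tag
     (flet ((x () 42))          ; X as a function
       (let ((x (x)))           ; X as a variable
         (return-from x x)))))  ; => 42
```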

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Rob Warnock
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <a8idnU98jaRN7-vdRVn-tw@speakeasy.net>
Brian Mastenbrook  <····················@cs.indiana.edu> wrote:
+---------------
| Rob Warnock <····@rpw3.org> wrote:
| > Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
| > and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
| > you count "compiler macros that are locally disabled"), for a grand
| > total of 8 or 9 at least. ... But in any case, probably at least 11 or
| > 12 of the following 13:
| 
| Counting global namespaces has a problem: any function that keeps an
| alist of bindings around adds a new one. Thus, it's really only sane to
| count lexical namespaces. The declaration lexical environment simply
| modifies existing bindings, instead of adding new ones, so it's not
| really a namespace to itself. Thus, CL is a Lisp-4.
+---------------

I disagree. But perhaps you misunderstood what I meant about "declarations":
I was referring to the namespace *of* declarations, not the namespace of
things they describe. That is, the existence of the INLINE declaration
does not in any way preclude using "INLINE" for the name of a function,
a variable, a block name, or a GO tag. This is true for all of the other
declaration words as well (including those the user might add with the
DECLARE DECLARATION declaration), excepting only those which are shorthand
for type declarations. Thus the *language* of declarations is in a separate
namespace from the 4 you mention. Likewise, type and class names (together)
are mostly disjoint from the other previously-mentioned 5 (with the previously-
noted exception of using some type names as declarations), so they too are
in a separate namespace. And I really think one should include LOOP keywords
as a "namespace", since they don't really conflict with variable or function
names [though see below as to whether they should be merged with all other
"marker" uses of symbols].

Keeping this more restricted notion of "conflict" in mind, I might be
willing to concede that special variables & lexical variables are in
the same namespace, since a special declaration (explicit or implicit)
trumps the lexical status of a variable of the same name in scope.
Likewise, I won't argue very hard about SETF methods, methods (since
they conflict with functions) and method combinations (which are just
"markers" much like CATCH tags [or LOOP keywords, hmmm...]). But even
so, one is still left with at least 6 classes of non-conflicting uses
of the same symbol which can appear in the same lexical scope. Hence
I would still claim that CL is at least a Lisp-6.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Duane Rettig
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <48yh5krs1.fsf@franz.com>
····@rpw3.org (Rob Warnock) writes:

> Brian Mastenbrook  <····················@cs.indiana.edu> wrote:
> +---------------
> | Rob Warnock <····@rpw3.org> wrote:
> | > Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
> | > and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
> | > you count "compiler macros that are locally disabled"), for a grand
> | > total of 8 or 9 at least. ... But in any case, probably at least 11 or
> | > 12 of the following 13:
> | 
> | Counting global namespaces has a problem: any function that keeps an
> | alist of bindings around adds a new one. Thus, it's really only sane to
> | count lexical namespaces. The declaration lexical environment simply
> | modifies existing bindings, instead of adding new ones, so it's not
> | really a namespace to itself. Thus, CL is a Lisp-4.
> +---------------
> 
> I disagree. But perhaps you misunderstood what I meant about "declarations":
> I was referring to the namespace *of* declarations, not the namespace of
> things they describe. That is, the existence of the INLINE declaration
> does not in any way preclude using "INLINE" for the name of a function,
> a variable, a block name, or a GO tag. This is true for all of the other
> declarations words as well (including those the user might add with the
> DECLARE DECLARATION declaration), excepting only those which are shorthand
> for type declarations. Thus the *language* of declarations is in a separate
> namespace from the 4 you mention.

Besides which, the X3J13 committee had temporarily defined (and CLtL2 had
captured, and I am working on reviving) the define-declaration macro, which
legitimizes the extensibility of this namespace.  Note that the declaration
declaration is primarily useful for making portable any vendor-specific
declarations, but it doesn't actually extend the namespace by giving the
"new" declaration a binding.  define-declaration does precisely that.

The environments package I'm working on has 5 shadowable namespaces:
variable, function, block, tag, and declaration.  It also has a global
component, which uses property-like lists, for the many other global
namespaces that CL allows.

> Likewise, type and class names (together)
> are mostly disjoint from other previously-mentioned 5 (with the previously-
> noted exception of using some type names as declarations), so they too are
> in a separate namespace. And I really think one should include LOOP keywords
> as a "namespace", since they don't really conflict with variable or function
> names [though see below as to whether they should be merged with all other
> "marker" uses of symbols].

I would agree that types are a namespace.  I think loop keywords are iffy;
once you start getting into the words that macros recognize as special within
certain points of the syntax, you must also allow the keys in case statements
to be considered as names in a micro-namespace, or condition names in a
handler-case or handler-bind, or any other user-defined macro that defines
such a namespace using comparisons.  And these namespaces are definitely
not as extensible as others; one can't easily add a loop keyword, and one
can't recognize new keywords in a case statement without actually changing
the case statement.

The real explosion in namespaces comes with property lists and hash
tables.  It is easy to create new namespaces with properties, and new
names within those properties.  And hash-tables can implement anything
that a property list can implement (though shadowing semantics are
a little harder to simulate in a hash table).

> Keeping this more restricted notion of "conflict" in mind, I might be
> willing to concede that special variables & lexical variables are in
> the same namespace, since a special declaration (explicit or implicit)
> trumps the lexical status of a variable of the same name in scope.

Agreed.

> Likewise, I won't argue very hard about SETF methods, methods (since
> they conflict with functions) and method combinations (which are just
> "markers" much like CATCH tags [or LOOP keywords, hmmm...]).

Hmm, indeed.

> But even
> so, one is still left with at least 6 classes of non-conflicting uses
> of the same symbol which can appear in the same lexical scope. Hence
> I would still claim that CL is at least a Lisp-6.

I think that it is useless for CL users to argue amongst ourselves
as to how many namespaces CL has.  To any question "is CL a Lisp-n" where
n is greater than 1, the answer can easily be "yes".  The real issue
lies in emphasis and basic structure, and the crossover points for
CL are at least:

 lisp-2: because CL is defined to distinguish between variable-value
         and function-value

 lisp-5: because the lexical hierarchy of CL contains variables,
         functions, blocks, tags, and declarations

 lisp-n: because CL is extensible and can thus simulate any number
         of namespaces.

I believe that this last one is true for Scheme and Forth as well.

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Kalle Olavi Niemitalo
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <87smfc6fzl.fsf@Astalo.kon.iki.fi>
Duane Rettig <·····@franz.com> writes:

> Besides which, the X3J13 committee had temporarily defined (and CLtL2 had
> caputured, and I am working on reviving) the define-declaration macro, which
> legitimizes the extensibility of this namespace.

The CVS version of GNU Emacs has a "macro-declaration-function"
variable that gets funcalled for (declare ...) forms in defmacro.
AFAICT, this is the only way it uses "declare".
From: Jeff Dalton
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <fx4ad1kbjz4.fsf@tarn.inf.ed.ac.uk>
Duane Rettig <·····@franz.com> writes:

> I think that it is useless for CL users to argue amongst ourselves
> as to how many namespaces CL has.  To any question "is CL a Lisp-n" where
> n is greater than 1, the answer can easily be "yes".  The real issue
> lies in emphasis and basic structure, and the crossover points for
> CL are at least:
> 
>  lisp-2: because CL is defined to distinguish between variable-value
>          and function-value
> 
>  lisp-5: because the lexical hierarchy of CL contains variables,
>          functions, blocks, tags, and declarations
> 
>  lisp-n: because CL is extensible and can thus simulate any number
>          of namespaces.

I think that's the right way to look at it (modulo a few details).

-- jd
From: Jeff Dalton
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <fx465c8bh0t.fsf@tarn.inf.ed.ac.uk>
····@rpw3.org (Rob Warnock) writes:

> ... That is, the existence of the INLINE declaration
> does not in any way preclude using "INLINE" for the name of a function,
> a variable, a block name, or a GO tag.

Yes, but not every different use of symbols is a namespace.

To have a namespace, the names (symbols) have to name something.
A namespace is a mapping from names to objects of some sort,
not necessarily 1st-class objects.  [Some, e.g. below, have
it as name -> location -> object instead.]

For example, you say later:

  I really think one should include LOOP keywords as a "namespace",
  since they don't really conflict with variable or function names

The criterion there seems to be: nonconflicting uses of symbols.
And later you have:

  one is still left with at least 6 classes of non-conflicting uses
  of the same symbol which can appear in the same lexical scope.

But the loop keywords aren't the names of anything.  (It would
be stretching too far to say they're the names of clauses or of
syntactic elements.)

So loop keywords aren't a namespace.

Here are some relevant definitions from the Gabriel-Pitman
paper, http://www.dreamsongs.com/Separation.html

  A _binding_ is a pairing of an identifier with a location in which a
  Lisp object may be placed.

  An _environment_ is the set of all bindings in existence at
  some given time. We shall call a subset of an environment a
  _subenvironment_.

  A _namespace_ is a subenvironment. Often the location parts of the
  bindings in a namespace contain objects used for some purpose, and the
  identifiers of the namespace are said to "name objects" used for that
  purpose; if the types of objects that can be stored in the location
  parts of bindings in a namespace are used for a purpose such that only
  objects of a certain type can be used for that purpose, then we say
  that "the objects in the namespace are restricted" to that type of
  object. The objects that are stored in the location parts of bindings
  in a namespace are not necessarily restricted to first-class Lisp
  objects. In this paper, there are two namespaces of concern, which we
  shall term the "value namespace" and the "function namespace." Other
  namespaces include tag names (used by TAGBODY and GO) and block names
  (used by BLOCK and RETURN-FROM), but the objects in the location parts
  of their bindings are not first-class Lisp objects. 

(That's not to say that I agree with everything in that paper.)

I also think that what I've called "data uses" of symbols don't count.
Namespaces should be about identifiers in code.  So the existence of a
function, find-class, that maps from symbols to classes, is not enough
to give us a namespace.  Namespaces should also be things the compiler
or interpreter has to know about.  So mappings handled by what are
essentially library functions don't count.  Is every symbol a
namespace, because it has a p-list that maps from symbols (usually
symbols) to objects?  I don't think so.
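
To make that concrete: the p-list is just a library-level table
hanging off the symbol; nothing in the evaluator or compiler ever
consults it (ROSE and COLOR here are made-up names, of course):

  (setf (get 'rose 'color) 'red)  ; library function SETF of GET
  (get 'rose 'color)              ; => RED

If that made ROSE a "namespace", then so would any hash table keyed
on symbols, which is the road I don't want to go down.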

There's a legitimate sense of the word "namespace" in which
every package would be a namespace; but that "cuts across"
the other namespaces we're considering.  They're mappings
from symbols (regarded as identifiers), while packages are
mappings to symbols.  So if packages are namespaces, they
are namespaces in a different sense of the word.

This gives us a fairly small number: variables, functions,
block names, go tags, and types, possibly also declaration
names: so Lisp-5 or Lisp-6.
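
As a sketch of what that means in practice (any conforming CL should
accept this; DEMO is a made-up name), one symbol can be used in four
of those namespaces within a single lexical scope:

  (defun demo ()
    (flet ((x (n) (* n 10)))      ; X as a function name
      (let ((x 4))                ; X as a variable
        (block x                  ; X as a block name
          (tagbody
           x                      ; X as a GO tag
             (when (< x 5)
               (incf x)
               (go x))
             (return-from x (x x)))))))

  (demo) ; => 50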

I don't think it matters exactly how many namespaces Common
Lisp has.

I do think it's useful to try to counter the impression that
CL has namespaces coming out of its ears in some kind of
arbitrary, unprincipled mess.

-- jd
From: Jeff Dalton
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <fx4ekqy752l.fsf@todday.inf.ed.ac.uk>
····@rpw3.org (Rob Warnock) writes:

> Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
> and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
> you count "compiler macros that are locally disabled"),

There should be at least 4 (not counting "compiler macros
that are locally disabled") because of the 4 I mentioned:
variables, function names, block names, and go tags.

> for a grand total of 8 or 9 at least.
> And Pekka Pirinen <·····@harlequin.co.uk>
> in <·······················@gaspode.cam.harlequin.co.uk> says you
> should add structures, methods, method combinations, and SETF methods.
> Some might even add LOOP keywords, since they can't be shadowed.
> What's that now, 12? 13?  But in any case, probably at least 11 or
> 12 of the following 13: ...

I don't think all of those should be counted.  If you go down
that road, users can add namespaces all over the place, whenever
they have a mapping from symbols to objects, and even Scheme
programs could have any number of namespaces.

>     - Functions & macros
>     - lexical variables
>     - special variables

How do specials get a separate namespace?  Consider, in CMUCL:

* (let ((x 1))                 ; special
    (declare (special x))      ; because of this declare
    (let ((x 2))               ; lexical
      x))
2

The lexical X shadows the special one, so they're not in
separate namespaces.  Of course, you can do

* (let ((x 1)) 
    (declare (special x))
    (let ((x 2))
      (locally (declare (special x)) x)))
1

But that just shows that the situation is complex, not that
specials are a separate namespace.

I won't go through the rest.  We obviously disagree on what
counts as a namespace.

Historically, the aim of saying Common Lisp has lots of
namespaces has been to say Common Lisp is a mess.  So
everything that's anything like a namespace is thrown
in, because the aim is to spin to make CL sound as bad
as possible, and of course to complicate things for anyone
trying to defend separate function and variable namespaces.

Presumably that's not *your* aim; but the effect can be
the same.

-- jd
From: Rob Warnock
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <pPqdnUGlwtWa6uvd4p2dnA@speakeasy.net>
Jeff Dalton  <····@todday.inf.ed.ac.uk> wrote:
+---------------
| The lexical X shadows the special one, so they're not in
| separate namespaces.
+---------------

And vice-versa, which led me to concede that particular point in
my parallel reply to Brian Mastenbrook.

+---------------
| I won't go through the rest.  We obviously disagree on what
| counts as a namespace.
+---------------

Possibly, though perhaps not as much as it seems. I was quoting a bunch
of people from previous similar threads, and not arguing that hard myself
for all of their viewpoints.

My own view is more that something is a separate namespace if within
the same lexical scope -- and *especially* within the same expression --
you can use the same symbol to mean different things. As I replied to
Brian, that cuts the number of namespaces to (I think) about 6, and
even that only if you consider the uses of a symbol for its symbol-name
rather than for some binding. [Which I feel one should, since LOOP keywords,
normal keywords in function calls and macros, method combination names,
CATCH tags -- all of these can be lumped into one "marker" namespace,
which *is* disjoint from function, variable, block tag, & GO tag bindings,
as well as declaration identifiers (which you omitted).]

+---------------
| Historically, the aim of saying Common Lisp has lots of namespaces
| has been to say Common Lisp is a mess.
+---------------

That is/was certainly not my intent. It was only to show that it is a
"Lisp-N" for some (small) N > 2, just as C has more than two namespaces.

I happen to think the number is at least 5, probably 6 [rationale: CL
really *does* have programmer-visible symbols, *not* just variable
bindings], and *maybe* one or two more, but I'm not going to argue
strongly for more than 6.

+---------------
| So everything that's anything like a namespace is thrown in,
| because the aim is to spin to make CL sound as bad as possible...
+---------------

Again, not my intention at all. Rather, I simply think that it's good
from time to time to clarify just what CL's rules are, so that we all
can use it reliably & safely & without undue surprises.

+---------------
| Presumably that's not *your* aim; but the effect can be the same.
+---------------

(See above.)


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Jeff Dalton
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <fx41xmwbfut.fsf@tarn.inf.ed.ac.uk>
····@rpw3.org (Rob Warnock) writes:

> Jeff Dalton  <····@todday.inf.ed.ac.uk> wrote:
> +---------------
> | I won't go through the rest.  We obviously disagree on what
> | counts as a namespace.
> +---------------
> 
> Possibly, though perhaps not as much as it seems. I was quoting a bunch
> of people from previous similar threads, and not arguing that hard myself
> for all of their viewpoints.

Ok.  Fair enough.

> My own view is more that something is a separate namespace if within
> the same lexical scope -- and *especially* within the same expression --
> you can use the same symbol to mean different things.

I don't think different meaning is enough.  I think the most
relevant sense of "namespace" is about reference: mappings
from identifiers to objects, not necessarily to 1st-class
objects.  I said more about this in a message posted just
before I posted this one.

> +---------------
> | So everything that's anything like a namespace is thrown in,
> | because the aim is to spin to make CL sound as bad as possible...
> +---------------
> 
> Again, not my intention at all. Rather, I simply think that it's good
> from time to time to clarify just what CL's rules are, so that we all
> can use it reliably & safely & without undue surprises.

Ok, but I think that can be done without using the term
"namespace".  The semantics can be explained using the
various mappings that have symbols as their domain without
ever saying, or even considering, which ones are "namespaces".

Moreover, namespaces have become politicized, and there are two
competing images of Common Lisp out there.  They feature in
messages on a variety of topics.

I think the one in which Common Lisp is a union of all features of
earlier Lisps, is full of random historical artifacts that have been
retained for no good reason, has a confusing jumble of namespaces
(no one can even say how many), and so on, is false and should lose.

-- jd
From: Peter Seibel
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <m3y8p6qq4f.fsf@javamonkey.com>
····@rpw3.org (Rob Warnock) writes:

> Jeff Dalton  <····@todday.inf.ed.ac.uk> wrote:
> +---------------
> | ···@sevak.isi.edu (Thomas A. Russ) writes:
> | > In addition to namespaces for variables and functions,
> | > Common Lisp also has namespaces for:
> | >    classes
> | >    packages
> ...
> | There's a namespace of go tags and another for block names,
> | so that CL is a Lisp-4.
> | 
> | There might be some ones added after CLTL 1, but I don't
> | remember any off hand.
> +---------------
>
> Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
> and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
> you count "compiler macros that are locally disabled"), for a grand
> total of 8 or 9 at least. And Pekka Pirinen <·····@harlequin.co.uk>
> in <·······················@gaspode.cam.harlequin.co.uk> says you
> should add structures, methods, method combinations, and SETF
> methods. Some might even add LOOP keywords, since they can't be
> shadowed. What's that now, 12? 13? But in any case, probably at
> least 11 or 12 of the following 13:

Hmmm. I think there's a useful distinction to be drawn between "things
that have names" and "namespaces".

For instance, it's not clear to me that method names should be thought
of as being in their own namespace--the names of methods are used to
connect them to the names of generic functions which are in the
function-name namespace. It seems to me that if we want to say method
names are in their own namespace then we should also say that symbols
used to call functions are in a different namespace than symbols used
to name functions (i.e. (foo ...) vs (defun foo ...)). Which is
clearly insane, right?

Similarly, the names of compiler macros are really just the names of
functions--the compiler macro is associated with the function through
the name. Likewise for SETF expanders--they are associated with
functions via the function name.

The names of structures and classes are not in distinct
namespaces because, as I'm sure you know, structures *are* classes.
This is analogous to the reason you put functions and macros in the
same namespace even though they are obviously different kinds of
things--it isn't possible to have both a structure-class and a
standard-class with the same symbol for their name.

Whether types and classes are in the same namespace is a sort of
interesting (in an angels dancing on pinheads kind of way) question.
On the one hand, since every standard-class name is also automatically
made the name of a type, you could argue that the type and the class
are connected through the name the same way a method is connected to
are connected through the name the same a way a method is connected to
its GF. On the other hand, type names can exist independent of class
names so you could also argue that there are distinct namespaces that
happen to overlap.
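
To illustrate the overlap concretely (a sketch; SMALL-INT is just a
made-up name), a type name can exist with no class behind it:

  (deftype small-int () '(integer 0 9))
  (typep 5 'small-int)        ; => T
  (find-class 'small-int nil) ; => NIL, no class of that name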

Another namespace that I don't think anyone has mentioned is restarts.

Anyway, based on those comments I'd reorganize your list like this:

  - Functions, macros, compiler macros, setf expanders
  - Variables (lexical and special), symbol macros
  - Classes (including structure classes)
  - Types
  - Restarts
  - Method combinations
  - Declaration specifiers
  - GO tags
  - Block names
  - LOOP keywords

It may be worth thinking about which of these namespaces are
understood/implemented by the compiler vs which are "library"
namespaces, no different in principle than something that could be
written in user land. I'd include names that are used by special
operators in the first category. I'm not sure this is necessarily a
principled distinction but here's how I might divide up the list above:

Compiler/special-op namespaces:

  - Functions, macros, compiler macros, setf expanders
  - Variables (lexical and special), symbol macros
  - GO tags
  - Block names
  - Declaration specifiers

"Library" name spaces

  - Classes (including structure classes)
  - Types
  - Restarts
  - Method combinations
  - LOOP keywords

If you further split up the Compiler/special-op list into those
namespaces that are implied by the basic form evaluation rules vs the
rest you then get down to the essential Lisp-2 nature of Common Lisp
that the evaluation rules make the functions, macros, etc. namespace
distinct from the variables namespace. For whatever that's worth.

-Peter

-- 
Peter Seibel                                      ·····@javamonkey.com

         Lisp is the red pill. -- John Fraser, comp.lang.lisp
From: Pekka P. Pirinen
Subject: Re: Counting namespaces, was Re: scheme seems neater
Date: 
Message-ID: <ixllkx17qa.fsf@ocoee.cam.harlequin.co.uk>
I agree with Duane Rettig that the number of global namespaces in CL
is not an interesting issue, but I'd like to point out that my list
was correct, and even complete in some sense.  I counted the angels on
this pinhead when I was working on the LispWorks facility for
recording the location of definitions.

Peter Seibel <·····@javamonkey.com> writes:
> ····@rpw3.org (Rob Warnock) writes:
> 
> > Jeff Dalton  <····@todday.inf.ed.ac.uk> wrote:
> > [...]
> > Looking at CLHS "3.1.1.1 The Global Environment" we see at least 5,
> > and adding "3.1.1.3 Lexical Environments" we see 3 or 4 more (4 if
> > you count "compiler macros that are locally disabled"), for a grand
> > total of 8 or 9 at least. And Pekka Pirinen <·····@harlequin.co.uk>
> > in <·······················@gaspode.cam.harlequin.co.uk> says you
> > should add structures, methods, method combinations, and SETF
> > methods. [...]
> 
> Hmmm. I think there's a useful distinction to be drawn between "things
> that have names" and "namespaces".

Yes, but all those things are namespaces: separate mappings from names
to bindings.  The mechanics of these namespaces are perhaps not
obvious, because the standard doesn't discuss the naming from this
point of view.

> For instance, it's not clear to me that method names should be thought
> of as being in their own namespace--the names of methods are used to
> connect them to the names of generic functions which are in the
> function-name namespace.  [...]

Except that a method is named by the gf name AND the argument
signature AND the set of qualifiers.  E.g., LispWorks represents
method names like this: (FROBNICATE :BEFORE (BAR NUMBER)).

> Similarly, the names of compiler macros are really just the names of
> functions--the compiler macro is associated with the function through
> the name.

The association is purely pragmatic.  The function (or macro)
definition and compiler macro definition of the names are separate and
independent, so there are two mappings.  There's no formal requirement
for there to even be a function (or macro) of the same name.

> Likewise for SETF expanders--they are associated with
> functions via the function name.

Likewise, indeed.

> The names of structures and classes are not in distinct namespaces
> because, as I'm sure you know, structures *are* classes.  [...]

Except the ones that aren't, obscure as they are: the ones with an
explicit :TYPE option.

> Whether types and classes are in the same namespace is a sort of
> interesting (in an angels dancing on pinheads kind of way) question.
> [...]

The best way of describing that is that the class namespace is a
subnamespace of the type namespace.  Subnamespaces are not uncommon:
say, functions and generic functions.

> Another namespace that I don't think anyone has mentoned are restarts.

Hmm, they work rather like CATCH tags, with dynamic bindings only.
I'm not sure if any useful parallel can be drawn with the lexical and
global namespaces.

> [list of namespaces omitted]
> It may be worth thinking about which of these namespaces are
> understood/implemented by the compiler vs which are "library"
> namespaces, no different in principle than something that could be
> written in user land.  [...]

Since CL is in essence meta-circular, I can't see any firm dividing
line here.  Nor is there a strict division in Lisp implementations
between the compiler and library: If a form expands into (EVAL-WHEN
(:COMPILE-TOPLEVEL) ...) or does a code transformation by walking its
argument, is it compiler or library?
-- 
Pekka P. Pirinen
Alas, Usenet is where one will often see crowds of people jumping up
and down on the greasy smear on the pavement that used to be a dead
horse.  - Nyrath the nearly wise
From: André Thieme
Subject: Re: scheme seems neater
Date: 
Message-ID: <c594ra$ql2$1@ulric.tng.de>
Paul Wallich wrote:

> Some people view "I perceived with my eyes a toothed wood-cutting device 
> and sawed a board with it" as more elegant, and others prefer "I saw the 
> saw and sawed a board with it." 

Or:

"Steele was stealing some steel".  ;)


André
--
From: Steven M. Haflich
Subject: Re: scheme seems neater
Date: 
Message-ID: <57Jfc.23172$pD1.11255@newssvr27.news.prodigy.com>
Paul Wallich wrote:

> Some people view "I perceived with my eyes a toothed wood-cutting device 
> and sawed a board with it" as more elegant, and others prefer "I saw the 
> saw and sawed a board with it." The language that only allows the first 
> version is undoubtedly more elegant in some ways, the one that 
> encourages the second in others.

When I was a child I played on a C saw.
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87isgd6uev.fsf@handgranat.org>
mikel <·····@evins.net> writes:
> I can see why you think that way, but in practice about the only real
> difference it makes is that coming up with a good macro system for
> Scheme has been harder (because of the much greater likelihood of
> accidental variable capture in a single namespace).

But lisp-2:s are also vulnerable to variable capture. Use gensyms.

Scheme now has three widespread macrosystems, one very similar to the
one available in Common Lisp.

-- 
One love,
Sunnan
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4uhqa$g18$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> mikel <·····@evins.net> writes:
> 
>>I can see why you think that way, but in practice about the only real
>>difference it makes is that coming up with a good macro system for
>>Scheme has been harder (because of the much greater likelihood of
>>accidental variable capture in a single namespace).
> 
> But lisp-2:s are also vulnerable to variable capture. Use gensyms.

Take a look at the paper "Syntactic Closures" by Bawden and Rees. They 
present four examples, one of which is admitted to be a pathological case.

The first example in that paper cannot happen in a Lisp-2. The other two 
examples present cases of name capture that you might actually want, and 
therefore have to understand anyway. (Additionally, the third one can be 
solved with the Common Lisp package system.)


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <871xn16puq.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> The first example in that paper cannot happen in a Lisp-2. The other
> two examples present cases of name capture that you might actually
> want, and therefore have to understand anyway.

Are you saying that you never, never, ever, under the bluest of moons
would want the first example, so you should use a lisp-2 so you don't
have to understand it?

> (Additionally, the
> third one can be solved with the Common Lisp package system.)

In theory Lisp-1:s can have a standardized package system. Scheme's
lack of one isn't because it's a lisp-1.

-- 
One love,
Sunnan
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4unns$saq$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>The first example in that paper cannot happen in a Lisp-2. The other
>>two examples present cases of name capture that you might actually
>>want, and therefore have to understand anyway.
> 
> Are you saying that you never, never, ever, under the bluest of moons
> would want the first example, so you should use a lisp-2 so you don't
> have to understand it?

The first example shows how a global function definition is replaced 
with a local value definition. If you have a good example in which this 
makes sense, I'd be interested to hear about it.

>>(Additionally, the
>>third one can be solved with the Common Lisp package system.)
> 
> In theory Lisp-1:s can have a standardized package system. Scheme's
> lack of one isn't because it's a lisp-1.

Right. So?


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87hdvx580x.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> The first example shows how a global function definition is replaced
> with a local value definition. If you have a good example in which
> this makes sense, I'd be interested to hear about it.

Not really, but how about where a global function definition is
replaced by a local function definition? Bad style, maybe, but for the
purposes of example:

(let ((cons monitored-cons))
  (push 'foo stack))

(or am I looking at the wrong example in the paper?)

>>>(Additionally, the
>>>third one can be solved with the Common Lisp package system.)
>> In theory Lisp-1:s can have a standardized package system. Scheme's
>> lack of one isn't because it's a lisp-1.
>
> Right. So?

Just saying that it isn't necesserily the lisp-2:iness off CL that
solves the third example.

(Btw, please see the other post on why I am arguing these perhaps
petty points. It is not for the purpose of hostility.)

-- 
One love,
Sunnan
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4uuhf$ih4$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>The first example shows how a global function definition is replaced
>>with a local value definition. If you have a good example in which
>>this makes sense, I'd be interested to hear about it.
> 
> Not really, but how about where a global function definition is
> replaced by a local function definition? Bad style, maybe, but for the
> purposes of example:
> 
> (let ((cons monitored-cons))
>   (push 'foo stack))

That's a case you might want to have and need to understand. The 
annoying name capture issues are those between values and functions, not 
those between values and values, or functions and functions. (Note that 
you have, consciously or subconsciously, chosen an example in which cons 
is replaced by a modified version thereof that you may want to see used 
in the enclosed code.)

>>>>(Additionally, the
>>>>third one can be solved with the Common Lisp package system.)
>>>
>>>In theory Lisp-1:s can have a standardized package system. Scheme's
>>>lack of one isn't because it's a lisp-1.
>>
>>Right. So?
> 
> Just saying that it isn't necesserily the lisp-2:iness off CL that
> solves the third example.

Huh?!? I didn't say that packages "necessarily" solve this issue. [1] I 
have only said that you _can_ use them for solving this issue.


Pascal

[1] This doesn't really make sense, does it? Maybe you mean something else.

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <8765cc52yf.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> That's a case you might want to have and need to understand. The
> annoying name capture issues are those between values and functions,
> not those between values and values, or functions and functions. (Note
> that you have, consciously or subconsciously, chosen an example in
> which cons is replaced by a modified version thereof that you may want
> to see used in the enclosed code.)

I had trouble finding the example in the paper that you referred to. I
thought you referred to a specific example and I thought you meant the
one I chose.

IMVHO, all accidental variable capture is annoying and possibly fatal.

>>>>>(Additionally, the
>>>>>third one can be solved with the Common Lisp package system.)
>>>>
>>>>In theory Lisp-1:s can have a standardized package system. Scheme's
>>>>lack of one isn't because it's a lisp-1.
>>>
>>>Right. So?
>> Just saying that it isn't necesserily the lisp-2:iness off CL that
>> solves the third example.
>
> Huh?!? I didn't say that packages "necessarily" solve this issue. [1]

(As an aside, words I tend to misspell include "adress", "necessarily"
and "occasion". Thanks for keeping me in line.)

> I have only said that you _can_ use them for solving this issue.

Which I don't disagree with. All I'm saying is that that's not an
argument in favour of lisp-2, just an argument in favour of a good
package system.

-- 
One love,
Sunnan
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx44qrwho6i.fsf@todday.inf.ed.ac.uk>
Sunnan <······@handgranat.org> writes:

> >> Just saying that it isn't necesserily the lisp-2:iness off CL that
> >> solves the third example.

> > Huh?!? I didn't say that packages "necessarily" solve this issue. [1]

> > I have only said that you _can_ use them for solving this issue.

> Which I don't disagree with. All I'm saying is that that's not an
> argument in favour of lisp-2, just an argument in favour of a good
> package system.

The argument is that when you combine 2 namespaces with packages
and certain conventions for writing macros, you get a very usable
system that is not plagued by the sorts of conflicts hygienic
macros are designed to avoid, so that having 2 namespaces is
not nearly as bad as the enemies of 2 namespaces would have us
believe.
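
For example, the usual convention looks something like this sketch
(REPEAT is a made-up macro; the gensym protects the expansion's own
binding, and exported vs. internal symbols handle the rest):

  (defmacro repeat ((n) &body body)
    (let ((counter (gensym "COUNTER")))   ; fresh, uncapturable name
      `(dotimes (,counter ,n)
         (declare (ignorable ,counter))
         ,@body)))

  ;; A user binding named COUNTER is untouched by the expansion:
  (let ((counter 0))
    (repeat (3) (incf counter))
    counter) ; => 3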

-- jd
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c4v1fc$sao$1@f1node01.rhrz.uni-bonn.de>
Sunnan wrote:

> Pascal Costanza <········@web.de> writes:
> 
>>That's a case you might want to have and need to understand. The
>>annoying name capture issues are those between values and functions,
>>not those between values and values, or functions and functions. (Note
>>that you have, consciously or subconsciously, chosen an example in
>>which cons is replaced by a modified version thereof that you may want
>>to see used in the enclosed code.)
> 
> I had trouble finding the example in the paper that you referred to. I
> thought you referred to a specific example and I thought you meant the
> one I chose.

Yep, I did. The important point is that it presents an example of name 
capture between a function and a value. You have changed it to present a 
name capture between two functions. That's an essential change.

> IMVHO, all accidental variable capture is annoying and possibly fatal.

This sounds like the arguments for the presumably fatal errors of 
non-static type systems. The important question is whether these 
"possibly fatal" consequences do regularly occur in practice. Lisp-2 
makes an important class of accidental name capture completely go away. 
The naming conventions for global variables cover another important 
class of name captures. The rest can be dealt with with packages and 
gensym. These are simple and straightforward solutions that work well in 
concert. They are not available in Scheme, seemingly because they are 
regarded as hacks, and therefore rejected.

> (As an aside, words I tend to misspell include "adress", "necessarily"
> and "occasion". Thanks for keeping me in line.)

Ha! You need a hygienic spell checker for editing programs. ;-))

>>I have only said that you _can_ use them for solving this issue.
> 
> Which I don't disagree with. All I'm saying is that that's not an
> argument in favour of lisp-2, just an argument in favour of a good
> package system.

It's an argument for a synergistic combination of language features.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ekr024wz.fsf@handgranat.org>
Pascal Costanza <········@web.de> writes:
> Yep, I did. The important point is that it presents an example of name
> capture between a function and a value. You have changed it to present
> a name capture between two functions. That's an essential change.

Right, and I just realized it could be done in CL as well, with flet,
so I broke the "can only happen in lisp-1"-requirement with my
change. I give up on that.

Still, aren't you at least a teensy bit impressed? I figure it'd be
fine because I didn't change the macro. All I did was to change the 5
to a monitored-cons and I got something useful out of it.

>> IMVHO, all accidental variable capture is annoying and possibly fatal.
>
> This sounds like the arguments for the presumably fatal errors of
> non-static type systems. The important question is whether these
> "possibly fatal" consequences do regularly occur in practice. Lisp-2 
> makes an important class of accidental name capture completely go
> away. The naming conventions for global variables cover another
> important class of name captures. The rest can be dealt with with
> packages and gensym. These are simple and straightforward solutions
> that work well in concert. They are not available in Scheme, seemingly
> because they are regarded as hacks, and therefore rejected.

gensym is available in many implementations, including the one I use,
and they work for function names as well.

>> (As an aside, words I tend to misspell include "adress", "necessarily"
>> and "occasion". Thanks for keeping me in line.)
>
> Ha! You need a hygienic spell checker for editing programs. ;-))

Naw, I'd just need to grep for "address", "necessarily", and
"occasion". Hmm, maybe introduce that to gnus... not tonight, though.

-- 
One love,
Sunnan
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <LHAcc.13439$yN6.3440@newsread2.news.atl.earthlink.net>
"mikel" <·····@evins.net> wrote:
> robbie carlton wrote:
> > Hi. Probs stupid question from someone relatively new to Lisp. In
> > scheme a symbol has only one visible binding at anytime, whereas in CL
> > a symbol can have a variable value a function value, a property list,
> > a documentation string, and probs some other junk I forgot. Question
> > is, why? Doesn't the CL way just promote messy unreadable code. Also,
> > the Scheme way means function definitions are much more pretty, and
> > consistent, ie assigning a symbol to a function literal. It just seems
> > nicer. I understand Paul Grahams arc is going to include some of that
> > schemeyness, but probably won't be around for a decade. Am I just
> > wrong, or is Scheme just more elegant than CL?
>
> I can see why you think that way, but in practice about the only real
> difference it makes is that coming up with a good macro system for
> Scheme has been harder (because of the much greater likelihood of
> accidental variable capture in a single namespace).

This is a myth.

Most Schemes provide a defmacro just like CL's that works just fine, as far
as defmacro goes.  Scheme macro systems are addressing other issues - in
particular, providing a macro system that integrates with the lexical
structure of the underlying language, rather than being entirely unaware of
that structure.
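
To make the capture hazard concrete, here's the classic scenario with a
defmacro-style macro (I'm using the non-standard define-macro spelling
found in several Schemes, e.g. Guile and Gambit; the names are
illustrative):

```scheme
;; A defmacro-style macro: the template is built with quasiquote,
;; and the introduced binding of tmp is visible to the expansion.
(define-macro (swap! a b)
  `(let ((tmp ,a))
     (set! ,a ,b)
     (set! ,b tmp)))

;; Fine in most cases:
(define x 1)
(define y 2)
(swap! x y)          ; x = 2, y = 1

;; But if the caller happens to use the name tmp, the macro's
;; let-binding shadows it and the swap silently goes wrong:
(define tmp 3)
(swap! x tmp)        ; expands to (let ((tmp x)) (set! x tmp) (set! tmp tmp))
```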

Anton
From: Erann Gat
Subject: Re: scheme seems neater
Date: 
Message-ID: <gNOSPAMat-0604041004140001@k-137-79-50-101.jpl.nasa.gov>
In article <····················@newsread2.news.atl.earthlink.net>, "Anton
van Straaten" <·····@appsolutions.com> wrote:

> "mikel" <·····@evins.net> wrote:
> > robbie carlton wrote:
> > > Hi. Probs stupid question from someone relatively new to Lisp. In
> > > scheme a symbol has only one visible binding at anytime, whereas in CL
> > > a symbol can have a variable value a function value, a property list,
> > > a documentation string, and probs some other junk I forgot. Question
> > > is, why? Doesn't the CL way just promote messy unreadable code. Also,
> > > the Scheme way means function definitions are much more pretty, and
> > > consistent, ie assigning a symbol to a function literal. It just seems
> > > nicer. I understand Paul Grahams arc is going to include some of that
> > > schemeyness, but probably won't be around for a decade. Am I just
> > > wrong, or is Scheme just more elegant than CL?
> >
> > I can see why you think that way, but in practice about the only real
> > difference it makes is that coming up with a good macro system for
> > Scheme has been harder (because of the much greater likelihood of
> > accidental variable capture in a single namespace).
> 
> This is a myth.
> 
> Most Schemes provide a defmacro just like CL's that works just fine, as far
> as defmacro goes.  Scheme macro systems are addressing other issues - in
> particular, providing a macro system that integrates with the lexical
> structure of the underlying language, rather than being entirely unaware of
> that structure.

This is a myth.  ;-)

Common Lisp macros have visibility into the lexical structure of the
language through the &environment mechanism.  This mechanism may not be
specified in enough detail to fulfill all the desires of a Scheme
programmer, but to say that CL macros are "entirely unaware" of the
lexical structure of the language is not correct.
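
For the curious, a minimal sketch of what &environment buys you in
standard Common Lisp: passing the environment to MACROEXPAND lets a
macro see local (MACROLET) definitions at its expansion site, not just
global ones.

```lisp
;; &ENVIRONMENT receives the lexical environment at the point of
;; expansion; MACROEXPAND consults it when expanding FORM.
(defmacro expand-here (form &environment env)
  `(quote ,(macroexpand form env)))

(macrolet ((double (x) `(* 2 ,x)))
  (expand-here (double 3)))
;; => (* 2 3) -- DOUBLE is only visible through ENV
```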

E.
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <EzDcc.11995$NL4.6131@newsread3.news.atl.earthlink.net>
Erann Gat wrote:
> In article <····················@newsread2.news.atl.earthlink.net>, "Anton
> van Straaten" <·····@appsolutions.com> wrote:
...
> > > I can see why you think that way, but in practice about the only real
> > > difference it makes is that coming up with a good macro system for
> > > Scheme has been harder (because of the much greater likelihood of
> > > accidental variable capture in a single namespace).
> >
> > This is a myth.
> >
> > Most Schemes provide a defmacro just like CL's that works just fine, as far
> > as defmacro goes.  Scheme macro systems are addressing other issues - in
> > particular, providing a macro system that integrates with the lexical
> > structure of the underlying language, rather than being entirely unaware of
> > that structure.
>
> This is a myth.  ;-)

Your "this" is much narrower than my "this", though.  :)  I take back the
"entirely unaware" part, and will clarify below.

> Common Lisp macros have visibility into the lexical structure of the
> language through the &environment mechanism.  This mechanism may not be
> specified in enough detail to fulfill all the desires of a Scheme
> programmer, but to say that CL macros are "entirely unaware" of the
> lexical structure of the language is not correct.

OK, granted.  However, it's not just that the CL mechanism isn't specified
in enough detail, but that it's relatively uncontrolled.  The connection to
lexical structure that I made is important here: the Scheme core language is
entirely predicated on lambda and its lexical scope.  What's wanted for
Scheme is a macro system which respects that structure and integrates with
it, and even (e.g. with syntax-case) requires the programmer to be explicit
about how a macro is expected to interact with its lexical context.  The
hygienic Scheme macro systems do this.  "Hygiene" is a basic requirement for
such systems, but in itself it's not the end goal.
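
To illustrate what "explicit about lexical context" means in practice,
here is the textbook anaphoric-if in syntax-case style (using the
datum->syntax-object name from the Chez/portable implementation; other
systems spell it datum->syntax):

```scheme
;; Capture must be requested explicitly: `it` is constructed in
;; the lexical context of the macro call (via the keyword k), so
;; the caller's code can refer to it.
(define-syntax aif
  (lambda (x)
    (syntax-case x ()
      ((k test then else)
       (with-syntax ((it (datum->syntax-object (syntax k) 'it)))
         (syntax
           (let ((it test))
             (if it then else))))))))

(aif (assv 2 '((1 . a) (2 . b)))
     (cdr it)       ; it = (2 . b) here, by deliberate capture
     'not-found)    ; => b
```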

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx48yh8hof5.fsf@todday.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> What's wanted for
> Scheme is a macro system which respects that structure and integrates with
> it, and even (e.g. with syntax-case) requires the programmer to be explicit
> about how a macro is expected to interact with its lexical context.  The
> hygienic Scheme macro systems do this.  "Hygiene" is a basic requirement for
> such systems, but in itself it's not the end goal.

It's more than a basic requirement.  I think it's fair to say
hygiene is a goal.

Many Scheme implementations had non-hygienic macro systems
before hygienic macros came along.  Hygiene wasn't enough
of a basic requirement to stop those earlier macro systems
from being implemented and used.  And once hygiene came along,
anyone devising a macro system for Scheme had to make it
hygienic if they wanted it to have any hope of being adopted.

There was a debate about whether macros had to be defined
using rules or whether they could be done at a lower level,
and there was some difficulty in combining the two, but
hygiene was the most important issue.

Moreover, the rules don't fit all that well with Scheme.
They're a different language.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <AVVcc.171$l75.49@newsread2.news.atl.earthlink.net>
Jeff Dalton wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > What's wanted for
> > Scheme is a macro system which respects that structure and integrates with
> > it, and even (e.g. with syntax-case) requires the programmer to be explicit
> > about how a macro is expected to interact with its lexical context.  The
> > hygienic Scheme macro systems do this.  "Hygiene" is a basic requirement for
> > such systems, but in itself it's not the end goal.
>
> It's more than a basic requirement.  I think it's fair to say
> hygiene is a goal.

I don't mean to quibble about that.  I'm trying to express that hygiene is
part of a bigger picture.

For example, in Bawden & Rees' 1988 paper "Syntactic Closures", after
describing the various scenarios which violate hygiene and referential
transparency, the authors summarize it like this:

"All of these problems are consequences of the fact that macros are
oblivious to the lexical scoping of the program text that they are
constructing.  Any macro facility that proposes to address this shortcoming
also has to take into account that sometimes the macro writer needs explicit
control over scoping."

It's this desire to integrate with lexical scoping, which also implies
referential transparency, that in Scheme at least, drives the design of
these macro systems.  Hygiene is just a really basic issue you have to take
care of in order to achieve that.

To some extent, the term "hygienic macro" focuses on a particular technical
requirement of the implementation of these macro systems, and may cause the
larger context to be missed, as in "geez, you designed a whole complicated
new macro system just to avoid unintended capture??", when there's a lot
more to it than that.  It would be a bit like assuming that LISP is only
good for LISt Processing (which someone just did in the last couple of
days).

In addition, assuming that the perceived complexity and difference of these
macro systems is all because of hygiene, or even the broader lexical scoping
problem, would be a mistake.  They address a number of other issues which
aren't strictly required to deal with hygiene, for example pattern matching
of syntax, a template substitution system that supports pattern variables
that don't need to be quoted, templates that can specify the expansion of
repetitive forms, and an abstract data type for syntax that supports richer
representations of programs.
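
As a small concrete instance of several of those features at once
(pattern matching, unquoted pattern variables, `...` templates for
repetitive forms), the standard short-circuiting `or` in syntax-rules:

```scheme
;; Patterns and templates need no quoting; the ellipsis handles an
;; arbitrary number of subforms; and the introduced binding t
;; cannot collide with a caller's t, because expansion is hygienic.
(define-syntax my-or
  (syntax-rules ()
    ((_) #f)
    ((_ e) e)
    ((_ e1 e2 ...)
     (let ((t e1))
       (if t t (my-or e2 ...))))))

(define t 'callers-t)
(my-or #f t)    ; => callers-t, not the macro's internal t
```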

> Many Scheme implementations had non-hygienic macro systems
> before hygienic macros came along.  Hygiene wasn't enough
> of a basic requirement to stop those earlier macro systems
> from being implemented and used.

Right, I'm saying hygiene was a basic requirement for macro systems which
attempted to be more sophisticated than the non-hygienic systems.  It was
certainly a goal in the sense that it had to be achieved in order to achieve
the primary broader goal of integration with lexical scoping.

> And once hygiene came along,
> anyone devising a macro system for Scheme had to make it
> hygienic if they wanted it to have any hope of being adopted.

Yeah - it was a basic requirement.  Now, I'm quibbling.  :)

> There was a debate about whether macros had to be defined
> using rules or whether they could be done at a lower level,
> and there was some difficulty in combining the two, but
> hygiene was the most important issue.

I think this somewhat misses the point.  Hygiene was very important - you
couldn't integrate macros with lexical scope without it.  But it's
nevertheless a kind of implementation detail.

> Moreover, the rules don't fit all that well with Scheme.
> They're a different language.

It sounds like you're talking about syntax-rules.  In that case, since
standard Scheme is not allowed within syntax-rules macros, you do indeed
have a different language, one which is guaranteed to interact correctly
with the lexical structure of the target language.  However, systems like
syntax-case, which allow ordinary Scheme code, are very much the same
language as Scheme - it's ordinary Scheme augmented by some macros and
abstract data types, like much other Scheme code.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4smfe5ji4.fsf@todday.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:

> > It's more than a basic requirement.  I think it's fair to say
> > hygiene is a goal.

> I don't mean to quibble about that.  I'm trying to express that hygiene is
> part of a bigger picture.

I agree.

> For example, in Bawden & Rees' 1988 paper "Syntactic Closures", after
> describing the various scenarios which violate hygiene and referential
> transparency, the authors summarize it like this:
> 
> "All of these problems are consequences of the fact that macros are
> oblivious to the lexical scoping of the program text that they are
> constructing.  Any macro facility that proposes to address this shortcoming
> also has to take into account that sometimes the macro writer needs explicit
> control over scoping."

Ok, but that's "explicit control over scoping", which syntactic
closures provides.  Some other Scheme macro systems don't, though.
They do what's thought to be the right thing, rather than giving
control.

> It's this desire to integrate with lexical scoping, which also implies
> referential transparency, that in Scheme at least, drives the design of
> these macro systems.  Hygiene is just a really basic issue you have to take
> care of in order to achieve that.

But the advent of hygienic macros was what made this a big deal.
Before that, people were happy to implement and use ordinary Lisp
macro systems in Scheme.

> To some extent, the term "hygienic macro" focuses on a particular technical
> requirement of the implementation of these macro systems, and may cause the
> larger context to be missed, as in "geez, you designed a whole complicated
> new macro system just to avoid unintended capture??", when there's a lot
> more to it than that. 

What's the lot more?

> In addition, assuming that the perceived complexity and difference of these
> macro systems is all because of hygiene, or even the broader lexical scoping
> problem, would be a mistake.  They address a number of other issues which
> aren't strictly required to deal with hygiene, for example pattern matching
> of syntax, a template substitution system that supports pattern variables
> that don't need to be quoted, templates that can specify the expansion of
> repetitive forms, and an abstract data type for syntax that supports richer
> representations of programs.

For me, and I think for quite a few other people as well, the
perceived complexity is almost all about hygiene and avoiding
all the different cases of unintentional capture.  The pattern-
matching, templates, etc, are not perceived as complex, but as
quite straightforward.

With Dylan, it's different, because the pattern language is more
complex.

> > There was a debate about whether macros had to be defined
> > using rules or whether they could be done at a lower level,
> > and there was some difficulty in combining the two, but
> > hygiene was the most important issue.

> ...

> > Moreover, the rules don't fit all that well with Scheme.
> > They're a different language.
> 
> It sounds like you're talking about syntax-rules.  In that case, since
> standard Scheme is not allowed within syntax-rules macros, you do indeed
> have a different language, one which is guaranteed to interact correctly
> with the lexical structure of the target language.

Right, and that's the sort of thing I had in mind above when saying
some systems didn't give you control, they just did what was thought
to be the right thing (though if I remember correctly, syntax-rules
had some kind of clause that gave you some control).

> However, systems like
> syntax-case, which allow ordinary Scheme code, are very much the same
> language as Scheme - it's ordinary Scheme augmented by some macros and
> abstract data types, like much other Scheme code.

Syntax case was later than the "debate" I mention above.

Also, syntax-case allows ordinary Scheme, but it has some other
stuff as well.  Patterns, template-filling.  Do they become
available for ordinary list processing?  It doesn't look like
they do.  And macros are written using pattern-matching and
templates (syntax-case and syntax), not ordinary Scheme,
though ordinary Scheme can be mixed in.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <c6jdc.1101$l75.197@newsread2.news.atl.earthlink.net>
Jeff Dalton wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
...
> > For example, in Bawden & Rees' 1988 paper "Syntactic Closures", after
> > describing the various scenarios which violate hygiene and referential
> > transparency, the authors summarize it like this:
> >
> > "All of these problems are consequences of the fact that macros are
> > oblivious to the lexical scoping of the program text that they are
> > constructing.  Any macro facility that proposes to address this shortcoming
> > also has to take into account that sometimes the macro writer needs explicit
> > control over scoping."
>
> Ok, but that's "explicit control over scoping", which syntactic
> closures provides.  Some other Scheme macro systems don't, though.
> They do what's thought to be the right thing, rather than giving
> control.

All of the so-called hygienic systems (afaik) support referential
transparency by default, meaning (in this context) that you can write a
macro without taking special steps, expand it anywhere, and it will behave
the same way, depending only on its explicitly provided inputs.  To achieve
this, you need "hygiene", and this is the sense in which I consider hygiene
a basic requirement.  The broader goal is to achieve the sorts of semantic
properties from macros that can be relied on in the rest of Scheme.
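
A small illustration of that default, in plain R5RS syntax-rules (the
point being that the macro's references mean what they meant at the
definition site):

```scheme
;; my-unless uses `if`; hygiene guarantees this `if` is the one
;; visible where the macro was defined, even if the caller
;; shadows the name.
(define-syntax my-unless
  (syntax-rules ()
    ((_ test body) (if test #f body))))

(let ((if 'shadowed))       ; rebind if as an ordinary variable
  (my-unless #f 'ran))      ; => ran; the macro still sees real if
```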

> > It's this desire to integrate with lexical scoping, which also implies
> > referential transparency, that in Scheme at least, drives the design of
> > these macro systems.  Hygiene is just a really basic issue you have to take
> > care of in order to achieve that.
>
> But the advent of hygienic macros was what made this a big deal.
> Before that, people were happy to implement and use ordinary Lisp
> macro systems in Scheme.

Part of the problem with this discussion is that the term "hygienic macros"
begs the question.  I think it puts a misleading focus on what amounts to an
implementation detail, which becomes apparent when you examine what the
hygienic systems actually do.

> > To some extent, the term "hygienic macro" focuses on a particular technical
> > requirement of the implementation of these macro systems, and may cause the
> > larger context to be missed, as in "geez, you designed a whole complicated
> > new macro system just to avoid unintended capture??", when there's a lot
> > more to it than that.
>
> What's the lot more?

The fact that the macros are semantically sound with respect to the target
language.  I don't think you can claim that hygiene means this, but it's
fair to say that you need hygiene to implement this.

Then, in systems more flexible than syntax-rules, like syntax-case, the "lot
more" is the ability to capture variables and introduce capturing variables
but in a way in which explicit control of lexical scope is provided.  This
explicit scope control is not required merely to be "hygienic", but it is
required to provide a macro system of at least equal power to defmacro which
integrates with the target language's semantics.

> For me, and I think for quite a few other people as well, the
> perceived complexity is almost all about hygiene and avoiding
> all the different cases of unintentional capture.

I don't deny that perception exists, but nevertheless that perception is
focused around what is essentially an implementation detail, and misses the
bigger picture.

To attempt a hopefully helpful analogy, it would be like calling values in
Lisp or objects in Java "pointer-free values", because they don't allow
C-style pointer errors.  It's an accurate enough description, in context,
but it doesn't actually give much insight into the properties of such
values.  Focusing on "hygiene" seems to be similarly misleading.

> > It sounds like you're talking about syntax-rules.  In that case, since
> > standard Scheme is not allowed within syntax-rules macros, you do indeed
> > have a different language, one which is guaranteed to interact correctly
> > with the lexical structure of the target language.
>
> Right, and that's the sort of thing I had in mind above when saying
> some systems didn't give you control, they just did what was thought
> to be the right thing

Syntax-rules doesn't have to guess what the right thing is, since it never
allows variable capture under any circumstances, short of cheating and using
it to redefine binding operators so you can "simulate" capture.  As a
result, syntax-rules always does the right thing, but has limitations in
what it can do.

> (though if I remember correctly, syntax-rules
> had some kind of clause that gave you some control).

I can't think of anything like that.

> Also, syntax-case allows ordinary Scheme, but it has some other
> stuff as well.  Patterns, template-filling.  Do they become
> available for ordinary list processing?  It doesn't look like
> they do.

Those constructs operate on a specific data type, the syntax object.  If you
want to process lists with them, you'd have to convert them to syntax
objects and back, e.g. here are some ordinary function definitions of car
and cdr using syntax-case and associated functions to destructure the
provided list:

(define (kar x)
  (syntax-object->datum
    (syntax-case (datum->syntax-object #f x) ()
      ((xar xdr ...) (syntax xar)))))

(define (kdr x)
  (syntax-object->datum
    (syntax-case (datum->syntax-object #f x) ()
      ((xar xdr ...) (syntax (xdr ...))))))

(kar '(3 2 1))  => 3
(kdr '(3 2 1))  => (2 1)
(kar (list 3 2 1))  => 3
(kdr (list 3 2 1))  => (2 1)

(And yes, those conversion functions are way too verbose!)

The point is, they're essentially ordinary Scheme constructs which you can
use in ordinary code.  Within the syntax-case macro itself, you also use
ordinary code except in the "pattern" position.  The only constraint on the
"template" position is that it must return a syntax object; other than that,
it's ordinary Scheme code.

> And macros are written using pattern-matching and
> templates (syntax-case and syntax), not ordinary Scheme,
> though ordinary Scheme can be mixed in.

That's a choice.  You only use the syntax-case form if you want to match
patterns.  However, the syntax-case system as a whole is bigger than that.
You can write macros completely procedurally without using the syntax-case
form, still making use of syntax objects.  You do at some point need to use
a syntax object constructor, like 'syntax', quasisyntax, or
datum->syntax-object, to create syntax objects, but you can use ordinary
Scheme and convert back and forth and do whatever you like.  You could write
such macros in defmacro style if you wanted to, using quasisyntax and
related operations that correspond directly to quasiquote, but for syntax
objects.

Perhaps the biggest problem with the accessibility of syntax-case is that
it's usually presented in much more abstract terms.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4vfk97vne.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Part of the problem with this discussion is that the term "hygienic
> macros" begs the question.  I think it puts a misleading focus on
> what amounts to an implementation detail, which becomes apparent
> when you examine what the hygienic systems actually do.

It's not an implementation detail; it's a property that a macro
system might, or might not, have.

> > What's the lot more?
> 
> The fact that the macros are semantically sound with respect to the target
> language.  I don't think you can claim that hygiene means this, but it's
> fair to say that you need hygiene to implement this.

What do you mean by "semantically sound"?  I take "hygiene" to mean
that ids have the referent they would in the context in which the
macro was defined -- ie to do for macros what full lexical scoping
does for functions.

> Then, in systems more flexible than syntax-rules, like syntax-case,
> the "lot more" is the ability to capture variables and introduce
> capturing variables but in a way in which explicit control of
> lexical scope is provided.

That's just a relaxation of hygiene in some cases.

> This explicit scope control is not required merely to be "hygienic",
> but it is required to provide a macro system of at least equal power
> to defmacro which integrates with the target language's semantics.

Ok, but that seemed to be relatively unimportant.  Once hygienic
macros came along, that was the big deal.  Scheme macros had to
be hygienic.  If some flexibility was possible, that might be
ok, but only so long as hygiene was normally in effect.

> > For me, and I think for quite a few other people as well, the
> > perceived complexity is almost all about hygiene and avoiding
> > all the different cases of unintentional capture.
> 
> I don't deny that perception exists,

But there we were talking about the perceived complexity.  You
wrote:

  In addition, assuming that the perceived complexity and difference
  of these macro systems is all because of hygiene, or even the
  broader lexical scoping problem, would be a mistake.

I don't think it *is* a mistake.  The perception might be wrong,
but it's the perception nonetheless.  And hygiene, and things
related to it, are behind that perception.  The perceived
complexity is therefore largely because of hygiene.

> but nevertheless that perception is
> focused around what is essentially an implementation detail, and
> misses the bigger picture.

It's not a detail.  A large part of the implementation, and the
part that's hard to understand, is about hygiene.

> > (though if I remember correctly, syntax-rules
> > had some kind of clause that gave you some control).
> 
> I can't think of anything like that.

I now think I was remembering something else.  I find it hard
to remember the names of all the different systems that were
developed for Scheme.

> > Also, syntax-case allows ordinary Scheme, but it has some other
> > stuff as well.  Patterns, template-filling.  Do they become
> > available for ordinary list processing?  It doesn't look like
> > they do.
> 
> Those constructs operate on a specific data type, the syntax object.
> ...

Thanks for going into that.

> Perhaps the biggest problem with the accessibility of syntax-case is
> that it's usually presented in much more abstract terms.

The biggest problem I have with it is that it seems very complicated,
about as complicated as the rest of Scheme put together, but harder to
understand.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <9_qec.4919$zj3.372@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Part of the problem with this discussion is that the term "hygienic
> > macros" begs the question.  I think it puts a misleading focus on
> > what amounts to an implementation detail, which becomes apparent
> > when you examine what the hygienic systems actually do.
>
> It's not an implementation detail; it's a property that a macro
> system might, or might not, have.

I agree it's also a property of a macro system.  Perhaps I should have said
that the name puts a misleading focus on a single property.  But it's an
implementation detail in the sense that it's something that the
implementation takes care of, and relieves the user of dealing with.  The
user of a hygienic macro system doesn't have to worry about hygiene.

> > > What's the lot more?
> >
> > The fact that the macros are semantically sound with respect to the target
> > language.  I don't think you can claim that hygiene means this, but it's
> > fair to say that you need hygiene to implement this.
>
> What do you mean by "semantically sound"?

I mean that expanding a macro doesn't produce unexpected results due to
unintended name collisions, in whichever direction, e.g. capturing of
user-defined names by the macro, or capturing of macro-defined names by user
code.  You should be able to rely on the semantics of the macro without
consideration of the context in which it is expanded, other than the
arguments with which it is provided at expansion time; and even then, the
names used in those arguments should not be subject to unintended
interaction with names in the macro.  Finally, all of this should be done in
such a way that the lexical scope of the target language is preserved
appropriately, which means the alpha conversion rules of the lambda calculus
need to be followed.

> I take "hygiene" to mean that ids have the referent they
> would in the context in which the macro was defined
> -- ie to do for macros what full lexical scoping does for functions.

I would add that it's also necessary to ensure that ids in user code, i.e.
the macro arguments, have the referent they would in the context in which
the macro is invoked.  Traditionally, afaict, the scenario you describe is
referred to in the macro context as referential transparency.

> > Then, in systems more flexible than syntax-rules, like syntax-case,
> > the "lot more" is the ability to capture variables and introduce
> > capturing variables but in a way in which explicit control of
> > lexical scope is provided.
>
> That's just a relaxation of hygiene in some cases.

Right, but it's a controlled relaxation of hygiene, in which a macro needs
to specify how it will interact with the lexical scope of the invoking
context.  Again, the goal is achieving integration with lexical scope, and
in doing this, hygiene is merely a necessary means to an end.

> > This explicit scope control is not required merely to be "hygienic",
> > but it is required to provide a macro system of at least equal power
> > to defmacro which integrates with the target language's semantics.
>
> Ok, but that seemed to be relatively unimportant.  Once hygienic
> macros came along, that was the big deal.  Scheme macros had to
> be hygienic.  If some flexibility was possible, that might be
> ok, but only so long as hygiene was normally in effect.

Please see the reply which I've just posted to Pascal Costanza for my
response to this.  I provide a more historically-grounded explanation of my
perspective.

> > > For me, and I think for quite a few other people as well, the
> > > perceived complexity is almost all about hygiene and avoiding
> > > all the different cases of unintentional capture.
> >
> > I don't deny that perception exists,
>
> But there we were talking about the perceived complexity.  You
> wrote:
>
>   In addition, assuming that the perceived complexity and difference
>   of these macro systems is all because of hygiene, or even the
>   broader lexical scoping problem, would be a mistake.
>
> I don't think it *is* a mistake.  The perception might be wrong,
> but it's the perception nonetheless.  And hygiene, and things
> related to it, are behind that perception.  The perceived
> complexity is therefore largely because of hygiene.

As I said, I don't deny that the perception exists.  The main point I'm
trying to make in this respect is that in a system like syntax-case, there
is very little *actual* complexity, from the user's perspective, related to
hygiene.  With syntax-rules, there's absolutely none.  The hygiene is all
taken care of under the hood.

The reason syntax-case can look impenetrable to the uninitiated has to do
with other things: pattern matching, a new abstract data type for syntax, a
syntactic template expansion mechanism, various macros to support all this,
including binding constructs for pattern variables.  Some parts of some of
these pieces support hygiene, but not all that much.

You could develop a version of defmacro which introduced hygiene by default,
but added no other new features, and I don't think you'd find it to be a
particularly complex system.  Quite the opposite, in fact - it would be
easily recognizable and usable by a defmacro user.

Such a system would still be unsatisfactory to the "average Scheme macro
designer", because it would rely on a purely procedural macro expansion
process.  Part of the point of these macro systems was also to constrain
what happens at macro expansion time, to support efficient compilation and
separate compilation, and to allow sensible interaction with module systems
supporting these goals.  None of which had to do with Lisp-1, or with
hygiene, btw.

> > but nevertheless that perception is
> > focused around what is essentially an implementation detail, and
> > misses the bigger picture.
>
> It's not a detail.  A large part of the implementation, and the
> part that's hard to understand, is about hygiene.

Call it a huge implementation chunk, if you like.  The point is it's part of
the implementation.  If people are complaining because the implementation is
hard to understand, that's an argument in favor of using these systems!
Otherwise, they'll just have to deal with those same hard to understand
details when writing reliable defmacros.

Hygiene and referential transparency do increase the complexity of the
*implementation* of these macro systems.  But unless the systems become too
inefficient as a result, this should have no impact on users of the systems.
And from a usability point of view, I think that is in fact the case,
regardless of perception.

> > > (though if I remember correctly, syntax-rules
> > > had some kind of clause that gave you some control).
> >
> > I can't think of anything like that.
>
> I now think I was remembering something else.  I find it hard
> to remember the names of all the different systems that were
> developed for Scheme.

Following the active research in a field is a challenging task.  I think
it's perfectly reasonable to sit back and wait to see what emerges, if
you're not actively researching the subject yourself.  But now, there are
really only two widely used and implemented systems (no offense to some of
the other systems which have portable implementations).  The R5RS standard
system, syntax-rules, is a simplified and restricted subset of the other
widely-used system, syntax-case.

> > Perhaps the biggest problem with the accessibility of syntax-case is
> > that it's usually presented in much more abstract terms.
>
> The biggest problem I have with it is that it seems very complicated,
> about as complicated as the rest of Scheme put together, but harder to
> understand.

I really think "seems" is the operative word.  The basic operation of
syntax-case is identical to that of defmacro - it takes syntax in, and puts
syntax out.  The differences in between are mainly the use of
pattern-matching, of an abstract data type for syntax, and a different
template expansion mechanism.

Of those, I think it's the abstract data type for syntax that has the most
impact on making syntax-case macros look very unfamiliar to a defmacro user.
Defmacro is an obvious system for a Lisp user because it's based on lists,
which are such an integral part of Lisp.  Lists let you use quasiquote and
all the usual list manipulation functions with it - there's very little new
to learn.

As soon as a new ADT is introduced, this changes, and it becomes necessary
to convert between the ADT and other representations.  Although syntax-case
has a quasiquote equivalent, with the tokens #` #, #' etc., using them
extensively isn't very idiomatic in syntax-case macros.  The end result here
isn't really any more difficult than dealing with any new data type, but you
do have to think of syntax in terms of that data type rather than in terms
of lists.  I think that's the real stumbling block for many people.  Plus,
as I said, a lack of introductory and tutorial material for these systems
that's not geared towards other researchers and macro system designers.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx465c59k9z.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:

> > I take "hygiene" to mean that ids have the referent they
> > would in the context in which the macro was defined
> > -- ie to do for macros what full lexical scoping does for functions.
> 
> I would add that it's also necessary to ensure that ids in user code, i.e.
> the macro arguments, have the referent they would in the context in which
> the macro is invoked.

Yes, I should have included that.
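Both directions can be seen in a small sketch (the macro and variable names here are illustrative, not from the thread):

```scheme
;; Direction 1: identifiers the macro introduces (here `tmp`) must not
;; capture identifiers in the user's code.  With hygienic syntax-rules,
;; the introduced `tmp` is automatically kept distinct:
(define-syntax my-or
  (syntax-rules ()
    ((_ a b) (let ((tmp a)) (if tmp tmp b)))))

(let ((tmp 5))
  (my-or #f tmp))   ; => 5; a capturing expansion would return #f

;; Direction 2: identifiers the macro's template refers to (here `let`
;; and `if`) keep the meaning they had where the macro was *defined*,
;; even if the user rebinds those names at the call site.
```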

> > > > For me, and I think for quite a few other people as well, the
> > > > perceived complexity is almost all about hygiene and avoiding
> > > > all the different cases of unintentional capture.
> > >
> > > I don't deny that perception exists,
> >
> > But there we were talking about the perceived complexity.  You
> > wrote:
> >
> >   In addition, assuming that the perceived complexity and difference
> >   of these macro systems is all because of hygiene, or even the
> >   broader lexical scoping problem, would be a mistake.
> >
> > I don't think it *is* a mistake.  The perception might be wrong,
> > but it's the perception nonetheless.  And hygiene, and things
> > related to it, are behind that perception.  The perceived
> > complexity is therefore largely because of hygiene.
> 
> As I said, I don't deny that the perception exists.  The main point I'm
> trying to make in this respect is that in a system like syntax-case, there
> is very little *actual* complexity, from the user's perspective, related to
> hygiene.  With syntax-rules, there's absolutely none.  The hygiene is all
> taken care of under the hood.

But that is part of the perceived complexity.  Indeed, I think it is
the greatest factor.  It's clear how traditional macros work.  It is
very much harder to understand how hygienic macros work.

The rule language does not seem very complex, though syntax-case
makes it a bit harder to understand than some of the alternatives
(at least in some of the cases that I see posted).

It's the largely hidden mechanisms that ensure hygiene that
make the whole thing seem so complex.

People have provided the rule part for Common Lisp a couple
of times now, but never (so far as I know) the hygiene.  So
it looks like the hygiene is the hard part.

> You could develop a version of defmacro which introduced hygiene by default,
> but added no other new features, and I don't think you'd find it to be a
> particularly complex system.  Quite the opposite, in fact - it would be
> easily recognizable and usable by a defmacro user.
> 
> Such a system would still be unsatisfactory to the "average Scheme macro
> designer", because it would rely on a purely procedural macro expansion
> process.  Part of the point of these macro systems was also to constrain
> what happens at macro expansion time, to support efficient compilation and
> separate compilation, and to allow sensible interaction with module systems
> supporting these goals.  None of which had to do with Lisp-1, or with
> hygiene, btw.

Do you mean unsatisfactory to the people who design macro systems,
or to the ones who design (write) macros?

Because it sounded in other messages like you were saying Scheme
programmers still continued to happily use defmacro.

> > It's not a detail.  A large part of the implementation, and the
> > part that's hard to understand, is about hygiene.
> 
> Call it a huge implementation chunk, if you like.  The point is it's part of
> the implementation.  If people are complaining because the implementation is
> hard to understand, that's an argument in favor of using these systems!

Perhaps for some people; but many would see it as an argument
against using them.

> Otherwise, they'll just have to deal with those same hard to understand
> details when writing reliable defmacros.

No.  They can use other techniques and avoid the cases those
techniques don't handle.

> Hygiene and referential transparency do increase the complexity of the
> *implementation* of these macro systems.  But unless the systems become too
> inefficient as a result, this should have no impact on users of the systems.

Programmers like to understand what is happening in the
implementation.

> And from a usability point of view, I think that is in fact the case,
> regardless of perception.

If there are bugs, and you don't understand how things work, the
bug can be harder to find.

> > > > (though if I remember correctly, syntax-rules
> > > > had some kind of clause that gave you some control).
> > >
> > > I can't think of anything like that.
> >
> > I now think I was remembering something else.  I find it hard
> > to remember the names of all the different systems that were
> > developed for Scheme.
> 
> Following the active research in a field is a challenging task.  I think
> it's perfectly reasonable to sit back and wait to see what emerges, if
> you're not actively researching the subject yourself.  But now, there are
> really only two widely used and implemented systems (no offense to some of
> the other systems which have portable implementations).  The R5RS standard
> system, syntax-rules, is a simplified and restricted subset of the other
> widely-used system, syntax-case.

I used to keep up with it.  For instance, I grabbed a copy of
syntax-case years ago (1992), intending to put it into a Scheme
implementation I had.  But I haven't closely followed what's
happened since then.

I've never seen anything that made it easy to understand how
these systems worked, but perhaps someone has written something
since then.

> > > Perhaps the biggest problem with the accessibility of syntax-case is
> > > that it's usually presented in much more abstract terms.
> >
> > The biggest problem I have with it is that it seems very complicated,
> > about as complicated as the rest of Scheme put together, but harder to
> > understand.
> 
> I really think "seems" is the operative word.  The basic operation of
> syntax-case is identical to that of defmacro - it takes syntax in,
> and puts syntax out.  The differences in between are mainly the use of
> pattern-matching, of an abstract data type for syntax, and a different
> template expansion mechanism.
> 
> Of those, I think it's the abstract data type for syntax that has the most
> impact on making syntax-case macros look very unfamiliar to a defmacro user.

For example, given some Scheme source code as lists, symbols, etc,
how is it transformed into instances of the abstract type?  It
sounds like you need something like the initial phases of a compiler.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <FMCec.5418$zj3.3411@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:
> But that is part of the perceived complexity.  Indeed, I think it is
> the greatest factor.  It's clear how traditional macros work.  It is
> very much harder to understand how hygienic macros work.

For a beginner, syntax-rules is at least as easy to understand as defmacro,
and possibly easier to use reliably.  Hygiene is not an issue, and doesn't
have to be understood.  As I've pointed out, syntax-case adds complexity in
areas other than hygiene; the actual complexity related to hygiene, when
using the system, is minimal.
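For instance, a beginner can write a pattern-template macro like the following without ever thinking about hygiene (a standard R5RS-style sketch; the macro name is illustrative):

```scheme
(define-syntax swap!
  (syntax-rules ()
    ((_ x y)
     (let ((tmp x))    ; introduced binding, renamed behind the scenes
       (set! x y)
       (set! y tmp)))))

;; Even (swap! tmp other) works: the macro's `tmp` and the user's `tmp`
;; are kept distinct automatically, with no gensym in sight.
```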

By your quote above, you're only underscoring my point that the focus on
"hygiene" in these macro systems is terribly misleading.  I'm not sure if
you would say that "it is very much harder to understand how [syntax-rules]
works" - I'm hoping not.  But I think you mean that it's much harder to
understand how systems like syntax-case work.  On that point, as I've
explained, I think the real issue lies elsewhere, not with hygiene.  I think
that if you actually used syntax-case, you would see that.

> It's the largely hidden mechanisms that ensure hygiene that
> make the whole thing seem so complex.

Why, or how?  And how do you know, or why do you think, that hygiene is the
cause?

> > You could develop a version of defmacro which introduced hygiene by
> > default, but added no other new features, and I don't think you'd find
> > it to be a particularly complex system.  Quite the opposite, in fact -
> > it would be easily recognizable and usable by a defmacro user.
> >
> > Such a system would still be unsatisfactory to the "average Scheme macro
> > designer", because it would rely on a purely procedural macro expansion
> > process.  Part of the point of these macro systems was also to constrain
> > what happens at macro expansion time, to support efficient compilation
> > and separate compilation, and to allow sensible interaction with module
> > systems supporting these goals.  None of which had to do with Lisp-1, or
> > with hygiene, btw.
>
> Do you mean unsatisfactory to the people who design macro systems,
> or to the ones who design (write) macros?

The latter.

> Because it sounded in other messages like you were saying Scheme
> programmers still continued to happily use defmacro.

Some do, yes.  As I've tried to clarify in another post, the fact that
Scheme programmers use defmacro doesn't mean they want it as part of the
core definition of Scheme, or that they don't want alternatives which they
perceive as better.

> > > It's not a detail.  A large part of the implementation, and the
> > > part that's hard to understand, is about hygiene.
> >
> > Call it a huge implementation chunk, if you like.  The point is it's
> > part of the implementation.  If people are complaining because the
> > implementation is hard to understand, that's an argument in favor of
> > using these systems!
>
> Perhaps for some people; but many would see it as an argument
> against using them.
>
> > Otherwise, they'll just have to deal with those same hard to understand
> > details when writing reliable defmacros.
>
> No.  They can use other techniques and avoid the cases those
> techniques don't handle.

So in other words, continually code around the limitations of defmacro to
avoid learning a system which solves those limitations automatically?

> > Hygiene and referential transparency do increase the complexity of the
> > *implementation* of these macro systems.  But unless the systems become
too
> > inefficient as a result, this should have no impact on users of the
systems.
>
> Programmers like to understand what is happening in the
> implementation.

Good point.  But usually, before you can do that, you have to understand how
to use the system in question.  I think you'll find that if you take the
trouble to learn how to use syntax-case, the nature of its implementation
will become much clearer.

> > And from a usability point of view, I think that is in fact the case,
> > regardless of perception.
>
> If there are bugs, and you don't understand how things work, the
> bug can be harder to find.

The conceptual model for syntax-case is strong enough and consistent enough
that this should not be a problem.  I haven't found it to be the case.

Believe it or not, when you are able to see past the extras I've described,
the basic underlying model is very much like defmacro.  The syntax data type
is a model of Lisp/Scheme syntax, and manipulations of it are mostly
isomorphic to manipulations of lists.  A syntax-case guide for defmacro
users could show direct parallels for just about any operation defmacro
allows; the result wouldn't necessarily be idiomatic, but that's
only because of differences in preferred style.

> I've never seen anything that made it easy to understand how
> these systems worked, but perhaps someone has written something
> since then.

I agree, there isn't much at the introductory level.

> For example, given some Scheme source code as lists, symbols, etc,
> how is it transformed into instances of the abstract type?  It
> sounds like you need something like the initial phases of a compiler.

Yes, that's exactly what syntax-case is.  Some Scheme implementations
consist, at their core, of nothing more than syntax-case, macro definitions
of the core constructs, and a small core language consisting mainly of
lambda, IF, and datatypes.

You can also convert to and from the syntax datatype using
syntax-object->datum and datum->syntax-object, or instantiate new syntax
objects using the SYNTAX or QUASISYNTAX constructors, or their
abbreviations, #' and #`.  But of course, that happens at a different level
of the process.
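A minimal sketch of those pieces, using the Chez/psyntax-era names mentioned above (an illustration only; details vary by implementation):

```scheme
;; A procedural macro: syntax object in, syntax object out, with
;; syntax-case dispatching on a pattern and #' building the template.
(define-syntax while
  (lambda (stx)
    (syntax-case stx ()
      ((_ test body ...)
       #'(let loop ()
           (if test
               (begin body ... (loop))))))))

;; Converting between syntax objects and ordinary data:
;; (syntax-object->datum #'(+ 1 2))   => (+ 1 2)
;; (datum->syntax-object #'here 'x)   => a syntax object for x, given
;;                                       the lexical context of #'here
```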

> People have provided the rule part for Common Lisp a couple
> of times now, but never (so far as I know) the hygiene.  So
> it looks like the hygiene is the hard part.

I'm sure it's because you can't just plug it in as an add-on, the way a
defmacro system can be implemented.

You're attributing a lot of problems to hygiene.  None of them really make
sense to me.  I think you're arguing with me precisely because the point I'm
making is valid, and this demonizing of hygiene has acted as a barrier to
further understanding.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx47jwktuhv.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:
> > But that is part of the perceived complexity.  Indeed, I think it is
> > the greatest factor.  It's clear how traditional macros work.  It is
> > very much harder to understand how hygienic macros work.
> 
> For a beginner, syntax-rules is at least as easy to understand as
> defmacro, and possibly easier to use reliably.

It is not as easy to understand how it works.  Maybe it's some kind of
complicated renaming, maybe it's something else involving a new ADT,
but it's nothing like as simple and easily understood as what defmacro
does, nor is it as easy to implement.

> Hygiene is not an issue, and doesn't have to be understood.

Nothing *has* to be understood.  People can write programs without
understanding how a language works.  But that's hardly a desirable
state.

> As I've pointed out, syntax-case adds complexity in
> areas other than hygiene; the actual complexity related to hygiene,
> when using the system, is minimal.

And I keep telling you that the perceived complexity is not just
about using it.  It's also, and for many primarily, about
the difficulty of understanding how it works internally.

That's just a fact.  You can either accept it or not.

> > It's the largely hidden mechanisms that ensure hygiene that
> > make the whole thing seem so complex.
> 
> Why, or how?  And how do you know, or why do you think, that hygiene
> is the cause?

For one thing, the non-hygienic versions of the rule language that
have become available for Common Lisp -- such as Dorai Sitaram's
syntax-rules and the one I ported earlier -- are not difficult
to understand.  Implementations that are hygienic are much
harder to understand.

If the hygienic part were really not a significant complication,
people would have implemented it years ago for Common Lisp,
and hygienic macros would have been an available alternative to
defmacro for CL programmers.

Instead, we had people providing only the rule language
and no hygiene.

(There are occasional small experiments in hygiene in CL,
but no generally available implementation of a full system of
the sort used in Scheme has appeared.  At least not that I've
heard of.  If anyone *does* have such a system, I think
quite a few people would like to use it.)

> > Programmers like to understand what is happening in the
> > implementation.
> 
> Good point.  But usually, before you can do that, you have to understand how
> to use the system in question.  I think you'll find that if you take the
> trouble to learn how to use syntax-case, the nature of its implementation
> will become much clearer.

I took the trouble to learn how to use it about 10 years ago
(going by some file dates).

> Believe it or not, when you are able to see past the extras I've described,
> the basic underlying model is very much like defmacro.  The syntax data type
> is a model of Lisp/Scheme syntax, and manipulations of it are mostly
> isomorphic to manipulations of lists.  A syntax-case guide for defmacro
> users could show direct parallels for just about any operation defmacro
> allows; the result wouldn't necessarily be idiomatic, but that's
> only because of differences in preferred style.

That guide sounds like it's still at the "use" level.

> > People have provided the rule part for Common Lisp a couple
> > of times now, but never (so far as I know) the hygiene.  So
> > it looks like the hygiene is the hard part.
> 
> I'm sure it's because you can't just plug it in as an add-on, the way a
> defmacro system can be implemented.

But people have written fairly complicated things in CL that
needed code-walkers.  It could even be for a subset of CL.
Yet no such implementation has appeared, not even back when
hygienic macros were a hot topic.

> You're attributing a lot of problems to hygiene.  None of them really make
> sense to me.  I think you're arguing with me precisely because the point I'm
> making is valid, and this demonizing of hygiene has acted as a barrier to
> further understanding.

Which point?  You've made quite a few.

What demonizing?

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <m%6fc.6820$zj3.6017@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Jeff Dalton wrote:
> > > But that is part of the perceived complexity.  Indeed, I think it is
> > > the greatest factor.  It's clear how traditional macros work.  It is
> > > very much harder to understand how hygienic macros work.
> >
> > For a beginner, syntax-rules is at least as easy to understand as
> > defmacro, and possibly easier to use reliably.
>
> It is not as easy to understand how it works.  Maybe it's some kind of
> complicated renaming, maybe it's something else involving a new ADT,
> but it's nothing like as simple and easily understood as what defmacro
> does, nor is it as easy to implement.

It's not that easy to understand how most compilers work.  Syntax-case is a
compiler.

> And I keep telling you that the perceived complexity is not just
> about using it.  It's also, and for many primarily, about
> the difficulty of understanding how it works internally.

How well do "many" understand the internal workings of their compiler?  I
think many people have a reasonable mental model of it, but don't know all
the details.  The same goes for syntax-case.

> That's just a fact.  You can either accept it or not.

It's a fact, but what does it mean?  It may mean that people are erroneously
mapping their expectations based on defmacro, to a rather different sort of
animal, syntax-case, when they wouldn't have those expectations if they
understood what sort of animal syntax-case really was.

> For one thing, the non-hygienic versions of the rule language that
> have become available for Common Lisp -- such as Dorai Sitaram's
> syntax-rules and the one I ported earlier -- are not difficult
> to understand.  Implementations that are hygienic are much
> harder to understand.

The implementation of any system which implements lexical scoping is much
harder to understand than the implementation of defmacro.  I don't see that
stopping people from using LAMBDA.

> If the hygienic part were really not a significant complication,
> people would have implemented it years ago for Common Lisp,
> and hygienic macros would have been an available alternative to
> defmacro for CL programmers.

Dorai Sitaram stated that he was surprised that CL people would be
interested in hygiene.  You reap what you sow.

> Instead, we had people providing only the rule language
> and no hygiene.

That's because the implementations tend to use defmacro, which fundamentally
isn't capable of implementing all the features of hygienic, referentially
transparent macros.  CL's built-in macro facility isn't expressive enough
for the job; someone would have to work harder to provide it at a lower
level.

> (There are occasional small experiments in hygiene in CL,
> but no generally available implementation of a full system of
> the sort used in Scheme has appeared.  At least not that I've
> heard of.  If anyone *does* have such a system, I think
> quite a few people would like to use it.)

That's interesting to hear.

> > > Programmers like to understand what is happening in the
> > > implementation.
> >
> > Good point.  But usually, before you can do that, you have to
> > understand how to use the system in question.  I think you'll find that
> > if you take the trouble to learn how to use syntax-case, the nature of
> > its implementation will become much clearer.
>
> I took the trouble to learn how to use it about 10 years ago
> (going by some file dates).

And you didn't gain enough insight into how it worked?  Do you recall what
kinds of things you found unclear or unsatisfactory?

> But people have written fairly complicated things in CL that
> needed code-walkers.  It could even be for a subset of CL.
> Yet no such implementation has appeared, not even back when
> hygienic macros were a hot topic.

I'm not sure what you're suggesting.

> > You're attributing a lot of problems to hygiene.  None of them really
> > make sense to me.  I think you're arguing with me precisely because the
> > point I'm making is valid, and this demonizing of hygiene has acted as
> > a barrier to further understanding.
>
> Which point?  You've made quite a few.

The point that the focus on hygiene as the end goal is misleading.

> What demonizing?

The claims that hygiene and the implementation thereof make the system so
difficult to understand.  Have you read Dybvig's "Writing Hygienic Macros
in Scheme with Syntax-Case", for example?
ftp://ftp.cs.indiana.edu/pub/scheme-repository/doc/pubs/iucstr356.ps.gz

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4r7ux7vhy.fsf@tarn.inf.ed.ac.uk>
BTW, I ran across an interesting quote from Drew McDermott
while looking up something else (his remark about feeling
"the bits between your toes"):

  Anyway, the point is that it's unpredictable how a change to a
  language is going to break it or enhance it. A major change like ARC
  has practically zero chance of being an improvement.

  Then again, one might have predicted that Scheme would have zero
  chance of surviving, and been dead wrong. It is still the case,
  however, that one could not have predicted the differences in the
  Scheme community and the CL community given a list of differences
  between the two languages.  Why would merging function and variable
  name spaces cause people to demand hygienic macros? One of life's
  little mysteries.

http://coding.derkeiler.com/Archive/Lisp/comp.lang.lisp/2003-10/2338.html

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <VUndc.1830$zj3.769@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:
> BTW, I ran across an interesting quote from Drew McDermott
> while looking up something else (his remark about feeling
> "the bits between your toes"):
>
>   Anyway, the point is that it's unpredictable how a change to a
>   language is going to break it or enhance it. A major change like ARC
>   has practically zero chance of being an improvement.
>
>   Then again, one might have predicted that Scheme would have zero
>   chance of surviving, and been dead wrong. It is still the case,
>   however, that one could not have predicted the differences in the
>   Scheme community and the CL community given a list of differences
>   between the two languages.  Why would merging function and variable
>   name spaces cause people to demand hygienic macros? One of life's
>   little mysteries.
>
> http://coding.derkeiler.com/Archive/Lisp/comp.lang.lisp/2003-10/2338.html

Yes, Drew's quote simply reflects the same myth.  I see it was made a few
months ago here in c.l.l., where the myth is fondly held as a kind of
convenient (non-)explanation used to claim a benefit for Lisp-2.  That's
precisely why I'm posting now, in order to correct misconceptions.

I'm encouraged to see that Drew is smart enough to recognize the disconnect
in the cause and effect here: he wrote "Why would merging function and
variable name spaces cause people to demand hygienic macros?"

Why indeed?  The answer is, there's no causal connection whatsoever.

Anton
From: Thomas A. Russ
Subject: Re: scheme seems neater
Date: 
Message-ID: <ymi7jwprtha.fsf@sevak.isi.edu>
"Anton van Straaten" <·····@appsolutions.com> writes:

> I'm encouraged to see that Drew is smart enough to recognize the disconnect
> in the cause and effect here: he wrote "Why would merging function and
> variable name spaces cause people to demand hygienic macros?"
> 
> Why indeed?  The answer is, there's no causal connection whatsoever.

I'm not entirely convinced.  While there may not be a technical
connection, there could possibly be a psychological one.

I suspect that the same people to whom the merged namespaces appeal (on
the basis of "cleanliness" or "simplicity") would also find a similar
appeal in having macros hygienic as well.  Especially given the name
chosen for the macro idea -- as opposed to "safe macros"...

-- 
Thomas A. Russ,  USC/Information Sciences Institute
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <1XHdc.2735$zj3.1458@newsread3.news.atl.earthlink.net>
Thomas A. Russ wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > I'm encouraged to see that Drew is smart enough to recognize the disconnect
> > in the cause and effect here: he wrote "Why would merging function and
> > variable name spaces cause people to demand hygienic macros?"
> >
> > Why indeed?  The answer is, there's no causal connection whatsoever.
>
> I'm not entirely convinced.  While there may not be a technical
> connection, there could possibly be a psychological one.
>
> I suspect that the same people to whom the merged namespaces appeal
> (on the basis of "cleanliness" or "simplicity") would also find a similar
> appeal in having macros hygienic as well.

Oh, I wouldn't argue against that.  In fact, your conjecture could be
restated with a stronger technical foundation: merged namespaces appeal to
people interested in functional programming, and people interested in
functional programming are strongly interested in referential transparency,
which is a driving factor underlying the more sophisticated macro systems.

However, in that case, it wouldn't be the merging of the namespaces which
leads to the demand for hygienic macros.  Any causal connection is not from
Lisp-1 to hygienic macros.  There's a deeper underlying principle, and
that's the basis for my entire point.

> Especially given the name chosen for the macro
> idea -- as opposed to "safe macros"...

The name choice is unfortunate - it's both too narrow, and pejorative - and
I think that's very much part of the problem here.  People often have a hard
time seeing past names (Sapir-Whorf in action).

Anton
From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fzfzbd5uu9.fsf@germany.igpm.rwth-aachen.de>
"Anton van Straaten" <·····@appsolutions.com> writes:
> "Why would merging function and variable name spaces cause people to
> demand hygienic macros?"
> 
> Why indeed?  The answer is, there's no causal connection whatsoever.

So this 'hygienic' business is just the neurotic disorder it looks
like?
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <_QHdc.2731$zj3.1174@newsread3.news.atl.earthlink.net>
Mario S. Mommer wrote:
>
> "Anton van Straaten" <·····@appsolutions.com> writes:
> > "Why would merging function and variable name spaces cause people to
> > demand hygienic macros?"
> >
> > Why indeed?  The answer is, there's no causal connection whatsoever.
>
> So this 'hygienic' business is just the neurotic disorder it looks
> like?

No, it just has no significant connection to Lisp-1 vs. Lisp-2.

Anton
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c56bc1$2r8$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

> I see it was made a few
> months ago here in c.l.l., where the myth is fondly held as a kind of
> convenient (non-)explanation used to claim a benefit for Lisp-2.  That's
> precisely why I'm posting now, in order to correct misconceptions.

It doesn't become a myth just because you repeatedly state that it is one.

Here is the first example from the paper "Syntactic Closures" by Bawden 
and Rees. First in Scheme:

#;> (defmacro push (obj-exp list-var)
       `(set! ,list-var (cons ,obj-exp ,list-var)))
#;> (define stack nil)
#;> (let ((cons 5))
       (push 'foo stack))
(warning: compiler detected application of non-procedure '5'.)
Error: attempt to apply non-procedure '5'.

Now exactly the same example in Common Lisp:

(defmacro push* (obj-exp list-var)
   `(setf ,list-var (cons ,obj-exp ,list-var)))

CL-USER 1 > (defvar *stack* nil)
*STACK*

CL-USER 2 > (let ((cons 5))
               (push 'foo *stack*))
(FOO)

The example _clearly_ works in Common Lisp, but _clearly_ doesn't in 
Scheme. This is only because Scheme is a Lisp-1 whereas Common Lisp is a 
Lisp-2.

The expansion in Common Lisp is this:

(LET ((CONS 5))
   (LET ((#:|new-value-761| 'FOO))
     (LET* ((#:G760 (CONS #:|new-value-761| *STACK*)))
       (SETQ *STACK* #:G760))))

You have one value binding for CONS that isn't used anywhere in this 
expansion, and one function binding for CONS that is called in the third 
line.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <3MHdc.2725$zj3.1841@newsread3.news.atl.earthlink.net>
Pascal Costanza wrote:
> Anton van Straaten wrote:
>
> > I see it was made a few
> > months ago here in c.l.l., where the myth is fondly held as a kind of
> > convenient (non-)explanation used to claim a benefit for Lisp-2.  That's
> > precisely why I'm posting now, in order to correct misconceptions.
>
> It doesn't become a myth just because you repeatedly state that it is one.

I'm providing specific and relevant arguments to debunk the myth.  The only
reason I'm repeating "it's a myth" is to point out exactly where it's being
used, and which claims specifically relate to it.  I could switch to saying
something like "the belief, which I consider false".  It can't be considered
anything more than a belief, because no-one has produced anything that could
be considered strong evidence for it.

> The example _clearly_ works in Common Lisp, but _clearly_ doesn't in
> Scheme. This is only because Scheme is a Lisp-1 whereas Common Lisp is a
> Lisp-2.

That's true, but has no bearing on the validity of the myth.  It would only
have a bearing if such macros were common and difficult to avoid in
practice.  They're not.  As such, an example like this doesn't come close to
providing a reason to pursue fully hygienic macro systems.

Note that the Bawden and Rees example you quoted fails under Lisp-2 if you
use an FLET to redefine CONS to a procedure.  Such an example would have
been at least as relevant, if not more relevant in the Scheme context -
since in Scheme, it's more common to reassign or shadow a procedure variable
to refer to a replacement procedure, than it is to change it to some other
type of value.  The real issue in this example is not limited to Lisp-2 vs.
Lisp-1, it's an ordinary and general referential transparency issue - the
macro behaves differently depending on the context in which it is expanded,
which violates referential transparency.
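
To make the FLET variant concrete, here is a hedged sketch (PUSH2, MY-CONS and
*STACK2* are illustrative names, not from the paper; a user-defined stand-in is
used because the standard makes locally rebinding CL:CONS itself undefined):

```lisp
;; A user-defined stand-in for CONS:
(defun my-cons (a b) (cons a b))

(defmacro push2 (obj-exp list-var)
  `(setf ,list-var (my-cons ,obj-exp ,list-var)))

(defvar *stack2* nil)

;; An FLET shadows MY-CONS in the *function* namespace, so the
;; macro's expansion calls the local function -- a capture that a
;; second namespace does nothing to prevent:
(flet ((my-cons (a b) (declare (ignore a b)) :captured))
  (push2 'foo *stack2*))

*stack2*  ; => :CAPTURED, not (FOO)
```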

More importantly, Scheme's single namespace applies in ordinary non-macro
code also, and the only thing Scheme programmers do to deal with that is use
some care to avoid name clashes.  Ordinary Scheme naming practices are
sufficient to avoid function vs. variable name clashes, for the most part.
And of course, you use gensym to avoid name clashes for introduced variables
in defmacros.
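
For instance, a minimal sketch of the gensym idiom (SWAP! is a hypothetical
macro, not one from the thread):

```lisp
;; GENSYM yields a fresh, uninterned symbol, so the macro's
;; temporary can never clash with a variable at the use site:
(defmacro swap! (a b)
  (let ((tmp (gensym "TMP")))
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))

;; Works even if the caller has a variable named TMP:
(let ((tmp 1) (x 2))
  (swap! tmp x)
  (list tmp x))   ; => (2 1)
```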

In the quoted example, CONS is redefined as a lexical variable.  In general,
with or without defmacro, Scheme programmers avoid punning in this way as a
matter of course, since reassigning or shadowing procedure names to have
other values can cause problems in areas other than macros.  The most
notorious example, that of LIST, is one Scheme newbies occasionally run
into: they define a variable called LIST and then try to use the LIST
function in the same scope.  They very soon learn not to do that - to quote
John Lennon, "it's easy if you try".
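
The LIST clash looks roughly like this (a sketch, not code from the thread):

```scheme
(let ((list '(a b c)))   ; the newbie binds the name LIST to data
  (list 1 2 3))          ; error: attempt to apply a non-procedure,
                         ; since LIST here is the data, not the
                         ; standard procedure
```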

This issue is not specific to macros, and the normal steps that Scheme
programmers take to address it cover most reasonable macro cases.  If you
want to propose an example of idiomatic and reasonable Scheme code with
defmacro in which a Lisp-1 issue arises, I'll be glad to address it.

Finally, as I've pointed out, people still use defmacro in Scheme today.
Someone posted a fairly sophisticated 100-line defmacro on c.l.scheme a few
weeks ago, for example.  There are even one or two Scheme implementation
authors who use defmacro in preference to the hygienic systems.  People
writing portable Scheme often use defmacro because it's more powerful than
syntax-rules, but also sufficiently portable between implementations.  If
Lisp-1 were really such a problem for defmacro, it would be difficult to
explain this continued usage.

To summarize, looking at that Bawden and Rees example and concluding "oh,
the problem here is that Scheme is a Lisp-1, and that's why hygienic macro
systems had to be developed" is simply flawed logic.  If that's not obvious,
the briefest proof of that is simply to substitute the LET with an FLET,
which doesn't affect the case which the paper is making, but fails under
Lisp-2 also.

Anton
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c59nj1$2ru$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

>>The example _clearly_ works in Common Lisp, but _clearly_ doesn't in
>>Scheme. This is only because Scheme is a Lisp-1 whereas Common Lisp is a
>>Lisp-2.
> 
> That's true, but has no bearing on the validity of the myth.  It would only
> have a bearing if such macros were common and difficult to avoid in
> practice.  They're not.  As such, an example like this doesn't come close to
> providing a reason to pursue fully hygienic macro systems.

The Bawden and Rees paper comments its fourth example as follows:

"An example similar to case 2, but involving keywords, is possible, but 
never seems to happen in practice."

This indicates to me that the authors consider the other cases to happen 
at least occasionally.

> To summarize, looking at that Bawden and Rees example and concluding "oh,
> the problem here is that Scheme is a Lisp-1, and that's why hygienic macro
> systems had to be developed" is simply flawed logic.

I am not yet convinced. Here is another data point: 
http://groups.google.com/groups?selm=10Oct1993.020007.Alan%40LCS.MIT.EDU


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <Y2qec.4904$zj3.1518@newsread3.news.atl.earthlink.net>
Pascal Costanza wrote:
>
> Anton van Straaten wrote:
>
> >>The example _clearly_ works in Common Lisp, but _clearly_ doesn't in
> >>Scheme. This is only because Scheme is a Lisp-1 whereas Common Lisp is a
> >>Lisp-2.
> >
> > That's true, but has no bearing on the validity of the myth.  It would
> > only have a bearing if such macros were common and difficult to avoid in
> > practice.  They're not.  As such, an example like this doesn't come
> > close to providing a reason to pursue fully hygienic macro systems.
>
> The Bawden and Rees paper comments its fourth example as follows:
>
> "An example similar to case 2, but involving keywords, is possible, but
> never seems to happen in practice."
>
> This indicates to me that the authors consider the other cases to happen
> at least occasionally.

I'm not suggesting they don't happen occasionally.  I'm saying that these
cases are neither the reason that defmacro wasn't good enough for Scheme,
nor why a better system had to be developed.  In addition, the Lisp-1 issue
is not what made it difficult to develop more advanced macro systems.  All
three of these claims have been made in this thread, and I've been
responding to them, because I consider them all to be misleading at best.

> > To summarize, looking at that Bawden and Rees example and concluding
> > "oh, the problem here is that Scheme is a Lisp-1, and that's why
> > hygienic macro systems had to be developed" is simply flawed logic.
>
> I am not yet convinced. Here is another data point:
> http://groups.google.com/groups?selm=10Oct1993.020007.Alan%40LCS.MIT.EDU

That's a nice reference.  :)

I think this points directly at the origins of these misunderstandings.  In
the second post in that thread, Alan Bawden, who wrote the above post, says:

"Good grief.  Do you -really- mean 'Common-Lisp style' in the sense of
'blind to issues of lexical scoping'?"

As Alan said, lexical scoping is the important issue here.  The problem is
that when arguing that issue on more concrete grounds than the seemingly
abstract goal of "respecting lexical scoping", examples are the most obvious
way to do that.  The Lisp-2 vs. Lisp-1 case provides an easy target for such
examples.  But it should be recognized that the Lisp-1 issue is just a
subset of a subset of the issues which referentially transparent & hygienic
macro systems are addressing.

To explain what I mean by "subset of a subset": the broad issue is
preservation of lexical scoping; hygiene is a subset of that issue, or a
means to that end; and Lisp-2 vs. Lisp-1 cases are a subset of that subset.
The idea that these cases provide a rationale for such macro systems would
be an extreme case of the tail wagging the dog.  It would be far more
accurate and meaningful to say that resolution of any Lisp-1 issues with
macros are a happy side effect of the hygienic macro systems.

You could eliminate all Lisp-1 problems with defmacro, and you'd still be
left with an important issue for Scheme, which is that defmacro doesn't
respect lexical scoping, whereas lexical scoping provides the very
foundation for the rest of Scheme.

* * *

I'd like to present some historical context.  As Jeff Dalton has pointed
out, this is based on published papers, so obviously doesn't represent all
perspectives and positions.  However, it might help to explain my
perspective.

I've previously quoted R5RS as saying "All macros defined using the pattern
language are 'hygienic' and 'referentially transparent' and thus preserve
Scheme's lexical scoping".  The point here is the preservation of lexical
scoping.  This goal can be traced back to the first published paper on
hygienic macros, Kohlbecker et al's 1986 paper, "Hygienic Macro Expansion".
The first sentence of the abstract from that paper reads:

    "Macro expansion in current Lisp systems is naive with respect to block
structure."

The Kohlbecker paper draws a comparison to the lambda calculus: "The
capturing problem of the naive expansion algorithm is analogous to the
substitution problem in the lambda calculus."  The entire problem is
discussed in this context, and the term "hygiene" is credited to
Barendregt's 1984 paper, "Introduction to the lambda calculus".

The 1994 revision of the Barendregt paper says: "For reasons of hygiene it
will always be assumed that the bound variables that occur in a certain
expression are different from the free ones. This can be fulfilled by
renaming bound variables." In discussions of lambda calculus, you don't see
much focus on this question of hygiene (I didn't find the word anywhere else
in Barendregt's paper, although my copy isn't searchable).  It's taken for
granted that to achieve a reasonable system, you just have to have hygiene.
It's a basic requirement for referential transparency, fulfilled in the
lambda calculus by the alpha renaming rule.

Kohlbecker writes: "From the lambda calculus, one knows that if the hygiene
condition does not hold, it can be established by an appropriate number of
alpha conversions.  That is also the basis of our solution."
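
A minimal sketch of that renaming in action (MY-OR is an illustrative macro,
not one from the paper):

```scheme
;; The macro introduces a binding for T; hygienic expansion
;; alpha-renames it, so a user's T at the call site is untouched:
(define-syntax my-or
  (syntax-rules ()
    ((_ a b) (let ((t a)) (if t t b)))))

(let ((t 5))
  (my-or #f t))   ; => 5; a naive expansion would capture T and
                  ; return #f instead
```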

I think it's easy to see the appeal of such an approach to users and
designers of a language originally billed as "An Interpreter for Extended
Lambda Calculus" (the title of the first Scheme paper).  Defmacro disregards
the basic rules of lambda calculus and referential transparency, whereas the
"hygienic" systems respect those rules.

With this background, it should be clear that "hygiene" wasn't interesting
because of its ability to address problems with macros in a Lisp-1, nor was
it some kind of arbitrary feature that became fashionable or required
because competing macro systems were offering it.  It was an essential and
major step towards a macro system that could reliably and safely manipulate
a target language that itself was based on lambda calculus.

I can't deny that because of the examples that tend to get used to
illustrate the benefits of a hygienic and referentially transparent macro
system, and because some people may have overstated that issue in order to
"sell" hygienic macro systems, and because it apparently scores a point for
Lisp-2 against Lisp-1, and because people may have been looking for
convenient excuses to keep some piece of scary new technology off their
mental radar, that there have been discussions in which Lisp-1 has been
invoked and accepted as an explanation of the need for hygienic macro
systems.  Those discussions don't make that position any more valid, though.

Even if some of the designers of these macro systems did believe that they
were primarily solving a problem related to Lisp-1-specific name capture
(something for which I would require evidence), in hindsight, they would
appear to have been somewhat misguided, because the resulting macro systems
achieve something far more significant.

Unfortunately, to explain that significance to the average CL programmer, if
he doesn't immediately get it when I say "these macro systems respect
lexical scoping in the target language", very soon I'll have to get into
examples, and soon after that the Lisp-1 issue will come up, and then the CL
programmer will say, "oh, so it's because Scheme is a Lisp-1!"  At that
point, I am happy to now be able to say, I will refer him to my messages in
this thread.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4wu4l828z.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> I'm not suggesting they don't happen occasionally.  I'm saying that these
> cases are neither the reason that defmacro wasn't good enough for Scheme,
> nor why a better system had to be developed.

Has anyone said it was *the* reason?

I don't think I have, anyway; and I've explicitly said several
times that it wasn't.  It was one factor among others, not *the*
reason.

> In addition, the Lisp-1 issue
> is not what made it difficult to develop more advanced macro
> systems.

Has anyone said that?  Anyway, that wasn't my point about difficulty.

> All three of these claims have been made in this thread

Have they?

> As Alan said, lexical scoping is the important issue here.  The problem is
> that when arguing that issue on more concrete grounds than the seemingly
> abstract goal of "respecting lexical scoping", examples are the most obvious
> way to do that.  The Lisp-2 vs. Lisp-1 case provides an easy target for such
> examples.  But it should be recognized that the Lisp-1 issue is just a
> subset of a subset of the issues which referentially transparent & hygienic
> macro systems are addressing.
> 
> To explain what I mean by "subset of a subset": the broad issue is
> preservation of lexical scoping; hygiene is a subset of that issue, or a
> means to that end;

What do you include in preservation of lexical scoping that you
don't include in hygiene?  In your discussion of lambda calculus,
for instance, it doesn't sound like there's any difference.

> You could eliminate all Lisp-1 problems with defmacro, and you'd still be
> left with an important issue for Scheme, which is that defmacro doesn't
> respect lexical scoping, whereas lexical scoping provides the very
> foundation for the rest of Scheme.

I think it's clear that lexical scoping is an important issue for
Scheme, although in a sense lexical scoping is just a means to an end
such as referential transparency.

But you've done as much as anyone to cast doubt on the importance of
lexical scoping, by saying that people continue to use defmacro.  If
continued use is supposed to count against the claim that Lisp-1
creates a significant problem for traditional macros, continued use
should also count against the claim that respecting lexical scoping is
an important issue, since people continue to use the sort of macros
that has Alan Bawden saying

  "Good grief.  Do you -really- mean 'Common-Lisp style' in the sense
  of 'blind to issues of lexical scoping'?"

There seems to be significant difference of opinion in the Scheme
community.

> ...
> 
> I think it's easy to see the appeal of such an approach to users and
> designers of a language originally billed as "An Interpreter for Extended
> Lambda Calculus" (the title of the first Scheme paper).  Defmacro disregards
> the basic rules of lambda calculus and referential transparency, whereas the
> "hygienic" systems respect those rules.

Yes, it is easy to see the appeal.  So why the wide continued use
of defmacro in Scheme that you've talked about?

> With this background, it should be clear that "hygiene" wasn't interesting
> because of its ability to address problems with macros in a Lisp-1, nor was
> it some kind of arbitrary feature that became fashionable or required
> because competing macro systems were offering it.  It was an essential and
> major step towards a macro system that could reliably and safely manipulate
> a target language that itself was based on lambda calculus.

There again you seem to be replying to claims that haven't been made.

The point about Lisp-1 has always been that it makes certain problems
worse, and thus makes defmacro less suitable.  That's all.  Not that
the Lisp-1 / Lisp-2 difference at that point is why hygiene was
interesting.

Nor was it claimed that hygiene was an arbitrary feature etc.

> I can't deny that because of the examples that tend to get used to
> illustrate the benefits of a hygienic and referentially transparent macro
> system, and because some people may have overstated that issue in order to
> "sell" hygienic macro systems, and because it apparently scores a point for
> Lisp-2 against Lisp-1, and because people may have been looking for
> convenient excuses to keep some piece of scary new technology off their
> mental radar, that there have been discussions in which Lisp-1 has been
> invoked and accepted as an explanation of the need for hygienic macro
> systems.

They need excuses to continue using technology that is flexible,
powerful, easy to understand, and works well?  Instead of technology
that is complex, harder to understand, and doesn't seem to have
ever been available in Common Lisp?

In any case, Lisp-1 is not an explanation; it's part of an
explanation.  Perhaps we could even say part of a part.

> Those discussions don't make that position any more valid, though.
> 
> Even if some of the designers of these macro systems did believe that they
> were primarily solving a problem related to Lisp-1-specific name capture
> (something for which I would require evidence), in hindsight, they would
> appear to have been somewhat misguided, because the resulting macro systems
> achieve something far more significant.

Has anyone said "that they were primarily solving a problem related
to Lisp-1-specific name capture"?  That's not what I've said, and I
don't think it's what anyone else has said either.

The closest thing I can find is the McDermott quote, and even
it doesn't say that.  Indeed, his point was that the language
differences between Scheme and CL would not lead one to
predict the different attitude towards macros in the Scheme
and CL communities -- in other words, that there isn't a
clear connection between Lisp-1 and hygiene.

> Unfortunately, to explain that significance to the average CL
> programmer, if he doesn't immediately get it when I say "these macro
> systems respect lexical scoping in the target language", very soon
> I'll have to get into examples, and soon after that the Lisp-1 issue
> will come up, and then the CL programmer will say, "oh, so it's
> because Scheme is a Lisp-1!"

They will?  It should be clear to the CL programmer that there are
cases that are just as much a problem in a Lisp-2.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <67Dec.5508$zj3.94@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > I'm not suggesting they don't happen occasionally.  I'm saying that
> > these cases are neither the reason that defmacro wasn't good enough
> > for Scheme, nor why a better system had to be developed.
>
> Has anyone said it was *the* reason?

Yes.  See my other recent reply to you.

> > All three of these claims have been made in this thread
>
> Have they?

Yes.

> What do you include in preservation of lexical scoping that you
> don't include in hygiene?  In your discussion of lambda calculus,
> for instance, it doesn't sound like there's any difference.

I'll respond to this later.  My time budget for this is blown and this
requires a thought-through response.

> I think it's clear that lexical scoping is an important issue for
> Scheme, although in a sense lexical scoping is just a means to an end
> such as referential transparency.

I agree.  In another recent post, I wrote: "hygiene is a condition which
needs to be achieved for proper functioning of the system, it's achieved via
alpha renaming, in the service of achieving referential transparency for
lexically scoped functions."  To some extent, perhaps this answers the
question above.

> But you've done as much as anyone to cast doubt on the importance of
> lexical scoping, by saying that people continue to use defmacro.  If
> continued use is supposed to count against the claim that Lisp-1
> creates a significant problem for traditional macros, continued use
> should also count against the claim that respecting lexical scoping is
> an important issue, since people continue to use the sort of macros
> that has Alan Bawden saying
>
>   "Good grief.  Do you -really- mean 'Common-Lisp style' in the sense
>   of 'blind to issues of lexical scoping'?"
>
> There seems to be significant difference of opinion in the Scheme
> community.

Not at all.  I've explained that in another post, but in summary, the fact
that defmacro continues to be useful in certain circumstances and to certain
people, doesn't affect the fact that no-one I'm aware of in the Scheme
community wants it to be Scheme's standard, or only, macro system.

> > I think it's easy to see the appeal of such an approach to users and
> > designers of a language originally billed as "An Interpreter for
> > Extended Lambda Calculus" (the title of the first Scheme paper).
> > Defmacro disregards the basic rules of lambda calculus and referential
> > transparency, whereas the "hygienic" systems respect those rules.
>
> Yes, it is easy to see the appeal.  So why the wide continued use
> of defmacro in Scheme that you've talked about?

Explained in another post.

> > With this background, it should be clear that "hygiene" wasn't
> > interesting because of its ability to address problems with macros in
> > a Lisp-1, nor was it some kind of arbitrary feature that became
> > fashionable or required because competing macro systems were offering
> > it.  It was an essential and major step towards a macro system that
> > could reliably and safely manipulate a target language that itself was
> > based on lambda calculus.
>
> There again you seem to be replying to claims that haven't been made.

Both claims I refer to were made by you: the first originally by mikel and
then implicitly by you in his defense, and the second I based on your
statement "once hygienic macros came along, the old-style,
accidental-capture-allowing ones were doomed, in Scheme at least."  No doubt
you weren't making the exact claim I'm addressing, but I'm covering all the
bases here, which seems to have become necessary.

> The point about Lisp-1 has always been that it makes certain
> problems worse, and thus makes defmacro less suitable.
> That's all.  Not that the Lisp-1 / Lisp-2 difference at that
> point is why hygiene was interesting.

I'm glad to hear it.  So we're agreed that what I originally claimed is a
myth, is in fact a myth (or simply incorrect).

> > I can't deny that because of the examples that tend to get used to
> > illustrate the benefits of a hygienic and referentially transparent
> > macro system, and because some people may have overstated that issue
> > in order to "sell" hygienic macro systems, and because it apparently
> > scores a point for Lisp-2 against Lisp-1, and because people may have
> > been looking for convenient excuses to keep some piece of scary new
> > technology off their mental radar, that there have been discussions in
> > which Lisp-1 has been invoked and accepted as an explanation of the
> > need for hygienic macro systems.
>
> They need excuses to continue using technology that is flexible,
> powerful, easy to understand, and works well?

I was talking about the CL community there, not the Scheme community.  It's
mainly in the CL community that these myths are repeatedly and consistently
used as an excuse to avoid learning.  You're doing it yourself, with your
claim that hygiene is what makes syntax-case so impenetrable.

I only hope this can be taken in the spirit in which it is meant, which is
to point out an area where lack of knowledge of something, combined with
competitiveness/defensiveness with the Scheme world, is leading to
misunderstandings and wrong conclusions.  Please also note that I'm not
suggesting all CL programmers should learn Scheme or syntax-case, but rather
that for those who do have an interest in macro systems, that some of the
fondly-held beliefs about them that are often seen in this forum are,
simply, incorrect.

> > Unfortunately, to explain that significance to the average CL
> > programmer, if he doesn't immediately get it when I say "these macro
> > systems respect lexical scoping in the target language", very soon
> > I'll have to get into examples, and soon after that the Lisp-1 issue
> > will come up, and then the CL programmer will say, "oh, so it's
> > because Scheme is a Lisp-1!"
>
> They will?  It should be clear to the CL programmer that there are
> cases that are just as much a problem in a Lisp-2.

Thanks, I'm glad we're in agreement about that at least.

Anton
From: Duane Rettig
Subject: Re: scheme seems neater
Date: 
Message-ID: <4brlwj332.fsf@franz.com>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> > > I can't deny that because of the examples that tend to get used to
> > > illustrate the benefits of a hygienic and referentially transparent macro
> > > system, and because some people may have overstated that issue in order to
> > > "sell" hygienic macro systems, and because it apparently scores a point for
> > > Lisp-2 against Lisp-1, and because people may have been looking for
> > > convenient excuses to keep some piece of scary new technology off their
> > > mental radar, that there have been discussions in which Lisp-1 has been
> > > invoked and accepted as an explanation of the need for hygienic macro
> > > systems.
> >
> > They need excuses to continue using technology that is flexible,
> > powerful, easy to understand, and works well?
> 
> I was talking about the CL community there, not the Scheme community.  It's
> mainly in the CL community that these myths are repeatedly and consistently
> used as an excuse to avoid learning.  You're doing it yourself, with your
> claim that hygiene is what makes syntax-case so impenetrable.

I think that Jeff is not the only one who is in the position of trying
to avoid learning something.  You have been writing long and detailed
responses to this issue to a community that doesn't know the subtleties
of the differences, and you even respond below that you aren't suggesting
that CL programmers learn these subtleties, and yet you continue to be
surprised by our lack of understanding.  If your teaching techniques are
not working, then you must learn new ones if you hope to break through.
If you have no hope of breaking through, and yet still try, well ...

> I only hope this can be taken in the spirit in which it is meant, which is
> to point out an area where lack of knowledge of something, combined with
> competitiveness/defensiveness with the Scheme world, is leading to
> misunderstandings and wrong conclusions.  Please also note that I'm not
> suggesting all CL programmers should learn Scheme or syntax-case, but rather
> that for those who do have an interest in macro systems, that some of the
> fondly-held beliefs about them that are often seen in this forum are,
> simply, incorrect.

> > > Unfortunately, to explain that significance to the average CL
> > > programmer, if he doesn't immediately get it when I say "these macro
> > > systems respect lexical scoping in the target language", very soon
> > > I'll have to get into examples, and soon after that the Lisp-1 issue
> > > will come up, and then the CL programmer will say, "oh, so it's
> > > because Scheme is a Lisp-1!"
> >
> > They will?  It should be clear to the CL programmer that there are
> > cases that are just as much a problem in a Lisp-2.
> 
> Thanks, I'm glad we're in agreement about that at least.

This agreement unfortunately doesn't do very much good, since all it is
really saying is that your examples are bad.  If you want to break
through to us, construct examples where namespace-count issues can't
possibly come up.  This way, the conversation can't get derailed
by defocussing from the original intent of your examples.

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4k70ktzgx.fsf@tarn.inf.ed.ac.uk>
Duane Rettig <·····@franz.com> writes:

> I think that Jeff is not the only one who is in the position of trying
> to avoid learning something.

Ok, I'll bite.  What am I trying to avoid learning?

-- jd
From: Duane Rettig
Subject: Re: scheme seems neater
Date: 
Message-ID: <41xmsmqtn.fsf@franz.com>
Jeff Dalton <····@tarn.inf.ed.ac.uk> writes:

> Duane Rettig <·····@franz.com> writes:
> 
> > I think that Jeff is not the only one who is in the position of trying
> > to avoid learning something.
> 
> Ok, I'll bite.  What am I trying to avoid learning?

You'd have to ask Anton to know for sure, but you elided the paragraph
from Anton that I took as precisely that kind of statement:

| I was talking about the CL community there, not the Scheme community.  It's
| mainly in the CL community that these myths are repeatedly and consistently
| used as an excuse to avoid learning.  You're doing it yourself, with your
| claim that hygiene is what makes syntax-case so impenetrable.

I assume that your avoidance-of-learning is simply in your disagreement
with Anton's point of view.

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx465c33gkz.fsf@tarn.inf.ed.ac.uk>
Duane Rettig <·····@franz.com> writes:

> Jeff Dalton <····@tarn.inf.ed.ac.uk> writes:
> 
> > Duane Rettig <·····@franz.com> writes:
> > 
> > > I think that Jeff is not the only one who is in the position of trying
> > > to avoid learning something.
> > 
> > Ok, I'll bite.  What am I trying to avoid learning?
> 
> You'd have to ask Anton to know for sure, but, you elided the paragraph
> from Anton that I took as precisely that kind of statement:

I may have elided it in one reply, but I quoted it and answered it in
another.  (There are too many messages.)

> | I was talking about the CL community there, not the Scheme community.  It's
> | mainly in the CL community that these myths are repeatedly and consistently
> | used as an excuse to avoid learning.  You're doing it yourself, with your
> | claim that hygiene is what makes syntax-case so impenetrable.
> 
> I assume that your avoidance-of-learning is simply in your disagreement
> with Anton's point of view.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <dfWec.6287$zj3.1184@newsread3.news.atl.earthlink.net>
Duane Rettig wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
> > I was talking about the CL community there, not the Scheme community.
> > It's mainly in the CL community that these myths are repeatedly and
> > consistently used as an excuse to avoid learning.  You're doing it
> > yourself, with your claim that hygiene is what makes syntax-case
> > so impenetrable.
>
> I think that Jeff is not the only one who is in the position of trying
> to avoid learning something.  You have been writing long and detailed
> responses to this issue to a community that doesn't know the subtleties
> of the differences, and you even respond below that you aren't suggesting
> that CL programmers learn these subtleties, and yet you continue to be
> surprised by our lack of understanding.

Thanks, Duane.  I take your point.  My last couple of responses to Jeff were
somewhat reactive.

I wouldn't say I'm "surprised by lack of understanding", because I'm not
talking about any lack of understanding as such.  What I was referring to
above was that there are some common ideas that one sees repeated here over
and over - although I don't post here regularly, I lurk pretty often - and
some of those ideas are more like a "party line" that has at best a tenuous
basis in fact, or has a factual basis that isn't particularly relevant.
Some of the issues around Lisp-1 vs. Lisp-2 are like that, which is not too
surprising given the history & nature of this difference.  We could insert a
long thread here about how language choice etc. is political,
community-driven, blah blah, but we all know that by now.

But politics at the knee-jerk level which is often seen here is basically
for people who aren't thinking for themselves.  There are some of those
here, as everywhere.  However, there are also *plenty* of smart people here.
Some of them at least, are able to see past this kind of thing.

> If your teaching techniques are not working, then you must
> learn new ones if you hope to break through. If you have
> no hope of breaking through, and yet still try, well ...

I may not have reached any agreement with Jeff, but I've received private
mail from people who have appreciated my posts.

> > > They will?  It should be clear to the CL programmer that there are
> > > cases that are just as much a problem in a Lisp-2.
> >
> > Thanks, I'm glad we're in agreement about that at least.
>
> This agreement unfortunately doesn't do very much good, since all it is
> really saying is that your examples are bad.  If you want to break
> through to us, construct examples where namespace-count issues can't
> possibly come up.  This way, the conversation can't get derailed
> by defocussing from the original intent of your examples.

What examples?  I don't believe I've constructed any examples myself.

I think I've explained what I was getting at about as well as I plan to.  As
it turns out, the post which started out this thread, where I called the
following a myth:

"coming up with a good macro system for Scheme has been harder (because of
the much greater likelihood of accidental variable capture in a single
namespace)"

...seems to have now been disowned by Jeff Dalton, who originally objected to
my calling it a myth.  So, unless there are any questions about why the
above quote is untrue, I'm happy to leave this where it stands, although I
don't doubt I'll find other things to respond to in this thread.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4ekqrrsxk.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> "coming up with a good macro system for Scheme has been harder (because of
> the much greater likelihood of accidental variable capture in a single
> namespace)"
> 
> ...seems to have now been disowned by Jeff Dalton

It hasn't been.

In another message, you had:

  It now sounds to me as though you no longer believe that, although
  you seem unwilling to acknowledge that directly.

Again no.

I just don't see the need to go through it all again, since I've
already explained the sense in which I think the claim is true.

-- jd
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5f76l$p57$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

>> They need excuses to continue using technology that is flexible, 
>> powerful, easy to understand, and works well?
> 
> I was talking about the CL community there, not the Scheme community.
> It's mainly in the CL community that these myths are repeatedly and
> consistently used as an excuse to avoid learning.
[...]

> I only hope this can be taken in the spirit in which it is meant, 
> which is to point out an area where lack of knowledge of something, 
> combined with competitiveness/defensiveness with the Scheme world, is
> leading to misunderstandings and wrong conclusions.

I think you are going too far here. It seems to me that you don't fully
understand the "spirit" of the CL "community", at least as far as I
understand it.

You have said several things in other postings in this thread that lead
me to believe this. For example:

>>> Otherwise, they'll just have to deal with those same hard to 
>>> understand details when writing reliable defmacros.
>> 
>> No.  They can use other techniques and avoid the cases those 
>> techniques don't handle.
> 
> So in other words, continually code around the limitations of 
> defmacro to avoid learning a system which solves those limitations 
> automatically?

And:

> You could eliminate all Lisp-1 problems with defmacro, and you'd 
> still be left with an important issue for Scheme, which is that 
> defmacro doesn't respect lexical scoping, whereas lexical scoping 
> provides the very foundation for the rest of Scheme.

So I am going to sidestep the whole macro issue for a moment and try to
explain the bigger context, as far as I perceive it.

It seems to me that Scheme is a "right thing" language whereas Common
Lisp is a "good enough" language. An important characteristic of a
"right thing" language is that its designers think they have found the
"one true way" how things should be done. In the case of Scheme, the
basis seems to be lambda calculus and lexical scoping. So as far as I
can tell, Scheme is essentially a single-paradigm language, with a
strong emphasis on a higher-order applicative programming style.

On the other hand, a "good enough" language acknowledges the fact that
different problems can be solved best in their own specific ways (most
of the time even in multiple ways). No single way fits all problems. An
essential issue for such a language is how to find a basis that can
serve as many problems as possible. The nice thing about Lisp is that
its basis - s-expressions + metacircular evaluation (programs = data) -
is flexible enough to be molded into anything you ever want. So as far
as I see things, Common Lisp is essentially a multi-paradigm language,
with no strong emphasis on any of the paradigms it provides, and even no
preference for particular paradigms that can be implemented on top of 
it. The fact that LOOP and FORMAT, for example, are part of ANSI CL is 
clear evidence for this - they provide two embedded languages that are 
extremely unlispy, but at the same time work extremely well for the 
problems they solve.

So how does the "multi-paradigm programmer" work? When she sees a
problem that needs to be solved, she tries to solve it in the most
promising way. When it turns out that it doesn't work, she has no
problem switching approaches and using something different that is
more appropriate. She stops when she has found a working solution that
is good enough for the problem at hand and not too complicated for
future maintenance.

I don't really understand how the "single-paradigm programmer" works,
because I am not one of those. But it seems to me that he enjoys the
fact that everything can be based on a few essential concepts, and all
the rest on top of it is just syntactic sugar at best. Statements that
identify someone as a single-paradigm programmer are, for example,
"objects are essentially closures", "everything is an object", "side
effects can be expressed as monads in a purely applicative fashion", and
so forth.

A multi-paradigm programmer thinks "interesting, but so what?" ;)

Next, Scheme and Common Lisp share a feature, to a certain degree
accidentally, i.e. lexical scoping (in Scheme for all definitions, in
Common Lisp for most definitions with a few exceptions, most notably
condition handlers and global variable definitions). For a
single-paradigm programmer, it may seem strange that Common Lisp doesn't
have lexical scoping for everything, especially not for macros. However,
if you take into account what I have said about the multi-paradigm
mindset, it should become clearer:

On the one hand, lexical scoping was introduced in Common Lisp because
when everything is dynamically scoped, you regularly get very nasty bugs
that are hard to track down and hard to solve by hand. Lexical scoping
solves these bugs in a very elegant way.

On the other hand, the use of defmacro doesn't lead to serious bugs most
of the time. It's not a problem. End of story.
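
The capture case being alluded to here, and the standard GENSYM workaround,
can be sketched in a few lines (the macro names are invented for
illustration; this is not code from the thread):

```lisp
;; A naive DEFMACRO can capture a variable from the calling code:
(defmacro swap-bad (a b)
  `(let ((tmp ,a))            ; TMP collides if the caller has its own TMP
     (setf ,a ,b ,b tmp)))

;; The conventional fix: generate a fresh, uninterned symbol with GENSYM.
(defmacro swap (a b)
  (let ((tmp (gensym "TMP")))
    `(let ((,tmp ,a))
       (setf ,a ,b ,b ,tmp))))
```

Calling (swap-bad x tmp) on a variable literally named TMP silently fails,
while the GENSYM version behaves correctly; in practice Lispers reach for
GENSYM reflexively, which is part of why capture rarely bites.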

+++

So when you say something like Common Lispers "continually code around
the limitations of defmacro to avoid learning a system which solves
those limitations automatically" you forget that a) they don't do this
continually, and b) the workarounds are simple and straightforward. For
example, when you happen to write code in which a local macro needs to
refer to entities in the lexical environment, it is just a matter of
moving things into the right packages, or using some other workaround,
and things are solved. This is just an example of switching approaches
and stopping when having found a solution that is good enough. Why put
more energy into a problem than strictly necessary?

I think that hygienic macros would only be widely accepted in the Common
Lisp "community" if they solved a pressing issue that is hard to solve
in any other way. (I am not interested in Turing equivalence here. The 
focus is on _hard_ to solve.)

Apart from that, defmacro is very flexible. For example, you can mimic 
the pattern matching part of syntax-rules by implementing 
destructuring-case on top of destructuring-bind to a degree that can be 
considered good enough. However, I also find it very handy to be able to 
use LOOP in macros in various ways...
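
A minimal sketch of what such a DESTRUCTURING-CASE might look like (the name
and clause shape are my assumption, not from the post); note that it even
uses LOOP in the macro body:

```lisp
;; Dispatch on the head of a form, destructuring the tail per clause.
(defmacro destructuring-case (expr &body clauses)
  (let ((form (gensym "FORM")))
    `(let ((,form ,expr))
       (case (car ,form)
         ,@(loop for (key lambda-list . body) in clauses
                 collect (if (eq key t)
                             `(t ,@body)       ; default clause
                             `(,key (destructuring-bind ,lambda-list
                                        (cdr ,form)
                                      ,@body))))))))

;; Example:
;; (destructuring-case '(move 1 2)
;;   (move (dx dy) (list :moved dx dy))
;;   (t () :unknown))
```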


As a last note, you wrote:

> The continued use of defmacro is evidence that the Scheme community
> finds it useful, but its continued exclusion from the standard
> indicates that it's not considered good enough to be part of the core
> definition of Scheme.

I think this is evidence that some members of the Scheme community use
Scheme as a multi-paradigm language and don't care whether its basis is
consistent or not.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <08OdnaWY2sSB2-bdRVn-jw@golden.net>
Pascal Costanza wrote:

> Next, Scheme and Common Lisp share a feature, to a certain degree
> accidentally, i.e. lexical scoping

Accidentally? Were the benefits of lexical scoping discovered 
independently in the Scheme and Lisp communities?

> On the other hand, the use of defmacro doesn't lead to serious bugs most
> of the time. It's not a problem. End of story.

I can think of a long list of "features" in C that don't lead to serious 
bugs most of the time, because programmers have gotten used to them.

When defmacro does lead to a serious bug, how much time can be wasted 
finding it? Does defmacro increase the chances that there are 
undiscovered bugs (or potential bugs) in production code?

> I think that hygienic macros would only be widely accepted in the Common
> Lisp "community" if they solved a pressing issue that is hard to solve
> in any other way. (I am not interested in Turing equivalence here. The 
> focus is on _hard_ to solve.)

This is like what Fortran users said about C, what C users said about 
C++, what C++ users said about Java.

Resolving a problem every day just because it only takes a few seconds, 
and running the risk that you'll forget, rather than fixing it once and 
never having to worry, is not a sign of Homo habilis.


-- 
Cameron MacKinnon
Toronto, Canada
From: Paul F. Dietz
Subject: Re: scheme seems neater
Date: 
Message-ID: <yYqdnSIl1f-_2ubd4p2dnA@dls.net>
Cameron MacKinnon wrote:

> Resolving a problem every day just because it only takes a few seconds, 
> and running the risk that you'll forget, rather than fixing it once and 
> never having to worry, is not a sign of Homo habilis.

In my experience, one doesn't write macros every day.  Also in my
experience, capture in macros simply isn't a significant problem in practice.

	Paul
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <IcadnYXJzI66iOHdRVn-ig@golden.net>
Paul F. Dietz wrote:
> Cameron MacKinnon wrote:
> 
>> Resolving a problem every day just because it only takes a few 
>> seconds, and running the risk that you'll forget, rather than fixing 
>> it once and never having to worry, is not a sign of Homo habilis.
> 
> In my experience, one doesn't write macros every day.  Also in my
> experience, capture in macros simply isn't a significant problem in 
> practice.

Those are fair points that indicate that there's a limited upside to 
improving the macro system.

Addressing the first one, I'll point out that every day, macros are 
written. Since one solution could be used by everyone, the code 
amortizes over a large user base.

You say capture isn't a significant problem in practice. Is that because 
you always remember to avoid it, or because the problem never even 
comes up with the vast majority of macros? Because if the former is 
true, I'd suggest that you're advocating inflicting pain on new users as 
a rite of passage, merely because you went through the same experience.

Now that we've talked about the limited benefits, what do you see as the 
disadvantages of improving the macro system?

-- 
Cameron MacKinnon
Toronto, Canada
From: Paul Dietz
Subject: Re: scheme seems neater
Date: 
Message-ID: <407C2CF5.24D6E68F@motorola.com>
Cameron MacKinnon wrote:

> You say capture isn't a significant problem in practice. Is that because
>   you always remember to avoid it, or because the problem never even
> comes up with the vast majority of macros?  Because if the former is
> true, I'd suggest that you're advocating inflicting pain on new users as
> a rite of passage, merely because you went through the same experience.

The motivation for not changing DEFMACRO is simply that it doesn't
need changing.  Why should lisp implementors and lisp users expend
the effort to complicate something that will, in the end, not have
any significant benefit (your unfounded imputation that this is
a great burden to new lisp users notwithstanding)?  This is not
a matter of wanting to sadistically torture new users, but rather
a matter of avoiding complex, unnecessary changes to the language.

	Paul
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5hbjh$am1$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> Now that we've talked about the limited benefits, what do you see as 
> the disadvantages of improving the macro system?

If you think that it's worthwhile to do something that has only limited
benefits, then just go ahead. No one is stopping you. Some people have
even shown interest in using a hygienic macro system in Common Lisp, so
you would even have an audience. Why does it matter to you that other
people are just not interested?


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Tayssir John Gabbour
Subject: Re: scheme seems neater
Date: 
Message-ID: <866764be.0404140210.4f2de939@posting.google.com>
Pascal Costanza <········@web.de> wrote in message news:<············@newsreader2.netcologne.de>...
> If you think that it's worthwhile to do something that has only limited
> benefits, then just go ahead. Noone is stopping you. Some people have
> even showed interest in using a hygienic macro system in Common Lisp, so
> you would even have an audience. Why does it matter to you that other
> people are just not interested?

In every other language community I've been in, change was effected by
threatening to take your money elsewhere, or loudly convincing users
to migrate to a competing language.  This also holds true for certain
software, such as from Microsoft, where the user is constrained from
various changes due to the software engineering used.

One must unlearn this when using lisp.  It ceases to work.  People
should notice they've been infantilized by these other technologies. 
I no doubt had to unlearn it.

Long threads like this, from "neophytes," make me think this is a
large problem.  The lisper-in-the-street usually doesn't need to care
about how uncaring she appears when some little computational thing
isn't currently written but could be.  Intelligent
people who've adjusted to the confines of a highly patriarchal
language, whose only freedom is to be spoonfed and to complain loudly,
will be initially offended by this.

I know this is not ontopic for a technical subject, but the problem
generating so many posts seems social and not technical.
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <99OdneK2l6KAxODdRVn-hQ@golden.net>
Tayssir John Gabbour wrote:
> In every other language community I've been in, change was effected by
> threatening to take your money elsewhere, or loudly convincing users
> to migrate to a competing language [...]

> One must unlearn this when using lisp.  It ceases to work.  People
> should notice they've been infantilized by these other technologies. 
> I no doubt had to unlearn it.
> 
> Long threads like this, from "neophytes," make me think this is a
> large problem.  The lisper-in-the-street usually doesn't need to care
> about how uncaring she appears when some little computational thing
> isn't currently written but could be.  Intelligent
> people who've adjusted to the confines of a highly patriarchal
> language, whose only freedom is to be spoonfed and to complain loudly,
> will be initially offended by this.
> 
> I know this is not ontopic for a technical subject, but the problem
> generating so many posts seems social and not technical.

This is a solid insight, and quite true as far as it goes. The sight of 
the Java community begging for crumbs of new features from Sun is rather 
sad. But your point of view is that of the seasoned Lisper.

People shopping for or starting out with a language don't quite see it 
this way. You are quite right that they are, in a sense, infants. They 
want, to a certain degree, to be spoon fed. How do you do x, y, z in 
Lisp? Hearing that there are several choices doesn't typically inspire 
them, it causes them to think "Oh great, now I'll have to spend time 
evaluating these competing approaches and, since I don't really know the 
language yet, I'm not qualified to understand the tradeoffs being made 
in each."

Here's an example of a young hothead on c.l.l  :-)    :

>  Cameron> It's not the dummies you need to worry about. It's the people who've
>  Cameron> heard such great things, decide to check it out and are appalled to
>  Cameron> discover that nearly a decade after "The Internet's a passing fad"
>  Cameron> Gates threw in the towel and wired his OS for TCP/IP, Lisp still has
>  Cameron> no standard network API. It was difficult for me to reconcile this
>  Cameron> with a living language.
> 
> I have a standard network API I use portably on two platforms and two
> different Lisp implementations.  It took me about two hours to write it.
> I didn't like the ones that were available for free downloading.

My response to this was not "Wow, that's so cool!" That an expert could 
find no cross-implementation API that he liked, and invested a few hours 
of his own time writing one (plus, presumably, a few hours evaluating 
the available ones) at hundreds of dollars per hour?

Sure, when one becomes a journeyman and can quickly spot flaws in 
others' libraries or write one's own well-designed ones, Lisp is grand. But 
how many people won't, on cursory inspection, be able to see beyond the 
fact that vital things aren't standardized? Like it or not, there's a 
bit of a taint associated with having to choose from a grab bag of third 
party tools, all written and documented in different styles, just to get 
started.

There are those who will say "well, we don't want the type of person who 
can't see beyond that." But expectations are set by how welcoming other 
languages are to the beginner, like it or not.

Lest someone point out that the commercial vendors do a wonderful job 
providing uniform, well documented, complete environments, I know. But 
that's not necessarily the first thing someone evaluating the language 
sees, and other views can be disconcerting.


-- 
Cameron MacKinnon
Toronto, Canada
From: Gareth McCaughan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87isg29zwx.fsf@g.mccaughan.ntlworld.com>
Cameron MacKinnon wrote:

> Here's an example of a young hothead on c.l.l  :-)    :
> 
> >  Cameron> It's not the dummies you need to worry about. It's the people who've
> >  Cameron> heard such great things, decide to check it out and are appalled to
> >  Cameron> discover that nearly a decade after "The Internet's a passing fad"
> >  Cameron> Gates threw in the towel and wired his OS for TCP/IP, Lisp still has
> >  Cameron> no standard network API. It was difficult for me to reconcile this
> >  Cameron> with a living language.
> > I have a standard network API I use portably on two platforms and two
> > different Lisp implementations.  It took me about two hours to write it.
> > I didn't like the ones that were available for free downloading.
> 
> My response to this was not "Wow, that's so cool!" That an expert
> could find no cross-implementation API that he liked, and invested a
> few hours of his own time writing one (plus, presumably, a few hours
> evaluating the available ones) at hundreds of dollars per hour?

Using someone else's API is not free; you need to spend time
learning how it works. If there are problems with it, you
need to spend time interacting with the vendor, which may
be much cheaper or infinitely more expensive than fixing
bugs in your own implementation. All this applies no matter
how expert you are and no matter how much you charge for
your time.

-- 
Gareth McCaughan
.sig under construc
From: Mike Kozlowski
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5k2ha$f21$1@reader2.panix.com>
In article <··············@g.mccaughan.ntlworld.com>,
Gareth McCaughan  <················@pobox.com> wrote:

>Using someone else's API is not free; you need to spend time
>learning how it works. If there are problems with it, you
>need to spend time interacting with the vendor, which may
>be much cheaper or infinitely more expensive than fixing
>bugs in your own implementation. All this applies no matter
>how expert you are and no matter how much you charge for
>your time.

Despite which, using a good API is still a significant net win over
rolling your own code, which is why programmers use them instead of
reinventing the wheel every couple of days.  And the out-of-the-box
experience for open source Common Lisp is terrible in this regard.
I'm trying to use CL to rewrite some of my little convenience programs
(my way of evaluating a new language/environment in a practical
fashion), and it's frustrating how narrow the standard CL libraries
are, and how difficult it is to find a set of good (free) third-party
APIs.

I can download the JDK and get regexes and XML parsing for free.  I
can download the .NET SDK and get regexes and XML parsing for free.  I
can download Perl, get regexes for free, and get irritated while
trying to figure out which of the many XML parsers on CPAN is the
"standard" one that I should be using.  I can download CLISP, get
neither regexes nor XML, search around on CLiki.net for XML libraries
that will actually work with CLISP (which rules out a surprisingly
large number of them), rule out the equally surprisingly large number
that kinda-sorta-vaguely support XML ("Well, no Unicode, but..."
doesn't cut it), and then maybe find something.

(Note: I think the CLOCC stuff ends up being what I need, but I'm not
sure; I hit some unrelated problems (specifically: disk quota) that
prevented me from being able to evaluate it to find out.)

I'm willing to believe that the problem is that I have the wrong CL
implementation, and that some better one comes bundled with piles of
libraries; I'm willing to believe that the problem is my cultural
unfamiliarity with CL conventions, and I should know that Site X is
the standard repository for good libraries; I'm not willing to believe
that I'm being foolish in looking for libraries.

-- 
Mike Kozlowski
http://www.klio.org/mlk/
From: Steven E. Harris
Subject: Re: scheme seems neater
Date: 
Message-ID: <q67zn9ebbah.fsf@L75001820.us.ray.com>
Mike Kozlowski <···@klio.org> writes:

> I can download CLISP, get neither regexes nor XML

"You're soaking in it."

If you invoke CLISP as "clisp -K full," you'll pull in the built-in
REGEXP module.¹ I happen to prefer CL-PPCRE², which is simple to
install³.


Footnotes: 
¹ http://clisp.sourceforge.net/impnotes.html#regexp
² http://www.weitz.de/cl-ppcre/
³ http://www.weitz.de/cl-ppcre/#install

-- 
Steven E. Harris        :: ········@raytheon.com
Raytheon                :: http://www.raytheon.com
From: Mike Kozlowski
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5k4qo$ftp$1@reader2.panix.com>
In article <···············@L75001820.us.ray.com>,
Steven E. Harris <········@raytheon.com> wrote:
>Mike Kozlowski <···@klio.org> writes:
>
>> I can download CLISP, get neither regexes nor XML
>
>"You're soaking in it."
>If you invoke CLISP as "clisp -K full," you'll pull in the built-in
>REGEXP module.

Ah, that's nice.  It turns out that it's not included with the version
of CLISP my ISP has installed, which is why I didn't notice that; I'm
not sure if that's a version thing (2.29), an OS thing (NetBSD), or
just a site configuration thing.

How about the XML?  Is the parser in CLOCC the best bet?

-- 
Mike Kozlowski
http://www.klio.org/mlk/
From: Steven E. Harris
Subject: Re: scheme seems neater
Date: 
Message-ID: <q67n05eb91n.fsf@L75001820.us.ray.com>
Mike Kozlowski <···@klio.org> writes:

> It turns out that it's not included with the version of CLISP my ISP
> has installed, which is why I didn't notice that; I'm not sure if
> that's a version thing (2.29), an OS thing (NetBSD), or just a site
> configuration thing.

Might that ISP be Panix? Panix has CLISP 2.29 installed on
NetBSD. That's the only 2.29 installation I can reach, and I confirmed
that the REGEXP module is not part of the "full" linking set. Sorry.

Given that, CL-PPCRE might be the best choice.

> How about the XML?  Is the parser in CLOCC the best bet?

I'll defer to others on this one, for I have not yet done any XML
processing with Common Lisp.

-- 
Steven E. Harris        :: ········@raytheon.com
Raytheon                :: http://www.raytheon.com
From: Mike Kozlowski
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5ke37$j64$1@reader2.panix.com>
In article <···············@L75001820.us.ray.com>,
Steven E. Harris <········@raytheon.com> wrote:
>Mike Kozlowski <···@klio.org> writes:
>
>> It turns out that it's not included with the version of CLISP my ISP
>> has installed, 
>
>Might that ISP be Panix?

That's the one, yep.

>Given that, CL-PPCRE might be the best choice.

Sounds like it; it looks pretty solid, at any rate.

-- 
Mike Kozlowski
http://www.klio.org/mlk/
From: ·········@random-state.net
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5k5ng$7dkej$1@midnight.cs.hut.fi>
Mike Kozlowski <···@klio.org> wrote:

> How about the XML?  Is the parser in CLOCC the best bet?

Not having used any of them, this is based on hearsay, but the
following ones come to mind before anything clocc has to offer:

 XMLS - http://www.common-lisp.net/project/xmls/
 CXML - http://www.common-lisp.net/project/cxml/
 CL-XML - http://www.cl-xml.org/

Cheers,

  -- Nikodemus
From: Mike Kozlowski
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5kdu0$j3u$1@reader2.panix.com>
In article <··············@midnight.cs.hut.fi>,
 <·········@random-state.net> wrote:
>Mike Kozlowski <···@klio.org> wrote:
>
>> How about the XML?  Is the parser in CLOCC the best bet?
>
>Not having used any of them, this is based on hearsay, but the
>following ones come ot mind before anything clocc has to offer:
>
> XMLS - http://www.common-lisp.net/project/xmls/

Not fully compliant with XML, and doesn't have any explanation of what
it's missing on the Web page.

> CL-XML - http://www.cl-xml.org/

Doesn't work with CLISP.

> CXML - http://www.common-lisp.net/project/cxml/

Probably the best bet, of those.  But I was vaguely hoping for a more
native API than DOM/SAX.  It's been my experience that DOM is highly
irritating in non-Javascript languages, and since I've heard so much
about how XML is just stripped-down s-expressions, it (perhaps
wrongly; I won't know until I try) seems to me that there should be a
more straightforward way to access XML.  XMLS seems to provide that,
but I'm highly wary of partial implementations.

Does anyone who's actually used any of these parsers have anything to
say about them?  Is the CLOCC xml.lisp really bad?  (There's
essentially no documentation for it, so I have no idea what the status
of it is.)

-- 
Mike Kozlowski
http://www.klio.org/mlk/
From: ·········@random-state.net
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5lhs4$7evs3$1@midnight.cs.hut.fi>
Mike Kozlowski <···@klio.org> wrote:

>> XMLS - http://www.common-lisp.net/project/xmls/

> Not fully compliant with XML, and doesn't have any explanation
> of what it's missing on the Web page.

Then it's probably a better idea to send email to the relevant
mailing list and ask there rather than here. Asking there will
probably lead to the webpage being updated as well.

The vibrant lisp communities have wild parties in the safe
environs of various mailing lists, and only arrive here on cll in
the mornings to show off their hangovers.

Cheers,

  -- Nikodemus
From: Jeremy Yallop
Subject: Re: scheme seems neater
Date: 
Message-ID: <slrnc7r4ib.aag.jeremy@maka.cl.cam.ac.uk>
Mike Kozlowski wrote:
> I can download CLISP, get neither regexes nor XML

Clisp has a (builtin) interface to POSIX regular expressions.

   http://clisp.cons.org/impnotes/modules.html#regexp

You might also try CL-PPCRE

   http://common-lisp.net/project/cl-ppcre

which is a rather fast implementation of Perl-compatible regular
expressions.
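
For instance, a quick CL-PPCRE session (assuming the library is already
loaded, e.g. via ASDF) looks like this:

```lisp
;; CL-PPCRE usage sketch; SCAN-TO-STRINGS returns the whole match
;; plus a vector of register captures as a second value.
(cl-ppcre:scan-to-strings "(\\d+)\\.(\\d+)" "version 2.29 on NetBSD")
;; => "2.29", #("2" "29")
```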

> I'm willing to believe that the problem is my cultural unfamiliarity
> with CL conventions

If you type

   (apropos "REGEX")

at a listener prompt you might find something useful.

Jeremy.
From: Antonio Menezes Leitao
Subject: Re: scheme seems neater
Date: 
Message-ID: <pan.2004.04.14.23.21.01.49857@evaluator.pt>
On Wed, 14 Apr 2004 19:13:46 +0000, Mike Kozlowski wrote:

> Despite which, using a good API is still a significant net win over
> rolling your own code, which is why programmers use them instead of
> reinventing the wheel every couple of days.  And the out-of-the-box
> experience for open source Common Lisp is terrible in this regard.
> I'm trying to use CL to rewrite some of my little convenience programs
> (my way of evaluating a new language/environment in a practical
> fashion), and it's frustrating how narrow the standard CL libraries
> are, and how difficult it is to find a set of good (free) third-party
> APIs.
> 
> I can download the JDK and get regexes and XML parsing for free.  I
> can download the .NET SDK and get regexes and XML parsing for free.  I
> can download Perl, get regexes for free, and get irritated while
> trying to figure out which of the many XML parsers on CPAN is the
> "standard" one that I should be using.  I can download CLISP, get
> neither regexes nor XML, search around on CLiki.net for XML libraries
> that will actually work with CLISP (which rules out a surprisingly
> large number of them), rule out the equally surprisingly large number
> that kinda-sorta-vaguely support XML ("Well, no Unicode, but..."
> doesn't cut it), and then maybe find something.
> 
> (Note: I think the CLOCC stuff ends up being what I need, but I'm not
> sure; I hit some unrelated problems (specifically: disk quota) that
> prevented me from being able to evaluate it to find out.)
> 
> I'm willing to believe that the problem is that I have the wrong CL
> implementation, and that some better one comes bundled with piles of
> libraries; I'm willing to believe that the problem is my cultural
> unfamiliarity with CL conventions, and I should know that Site X is
> the standard repository for good libraries; I'm not willing to believe
> that I'm being foolish in looking for libraries.

IMHO, as far as libraries are concerned, it is impossible for Common Lisp
to compete with Java.  We simply don't have the resources to do that. There
are so many Java programmers out there and so much investment in the Java
camp that it's impossible just to rewrite in Common Lisp all the Java
libraries that are being developed, much less to develop new ones from
scratch.

Just to give you an example, let's suppose you need to write a program
that reads data from an Excel spreadsheet.  You google a lot for Excel
and Common Lisp and you find nothing relevant.  You google a bit for Excel
and Java and you immediately find hits for "Java Excel API - A Java
API to read, write and modify Excel spreadsheets", "Jakarta POI - Java
API To Access Microsoft Format Files", etc.  If you really need to read
Excel files would you start writing your program in Common Lisp or in Java?

I consider myself a Lisper.  However, it is becoming impossible to keep
developing Common Lisp programs while, at the same time, the only
available APIs for the constantly changing outside world are being
written in Java.  Time to market means something in this world, and
waiting for a Common Lisp library to be developed, or developing one
myself, is not going to help in this regard.

After spending quite a long time developing in Java (but always
missing the times when I developed exclusively in Lisp (ZetaLisp,
Common Lisp or Scheme)) I decided to develop something that would combine
the advantages of Lisp with the advantages of Java.  The result is a
Common Lisp-like language that I call Linj (Linj is not Java).  You
write programs just as in Common Lisp, using lots of Common Lisp
features, but you can also access any Java library.  Extra bonus: the
output of compiling a Linj program is a Java program very similar to
what I would have written by hand.

Just to finish the Excel example, here is a Linj program that
reads a cell (row,col) on a given sheet on an Excel spreadsheet (using
the POI Java API):

(import org.apache.poi.poifs.filesystem.POIFSFileSystem)
(import org.apache.poi.hssf.model.*)
(import org.apache.poi.hssf.usermodel.*)
(import org.apache.poi.hssf.util.*)

(defun main (xls-name/string sheet-name/string row/string col/string)
  (let ((row (the short (1- (parse-integer row))))
	(col (the short (- (char col 0) #\A))))
    (let ((workbook (new 'HSSFWorkbook (new 'POIFSFileSystem (new 'file-input-stream xls-name)))))
      (let ((sheet (get-sheet workbook sheet-name)))
	(let ((cell (get-cell (get-row sheet row) col)))
	  (let ((text (get-string-cell-value cell)))
	    (format t "Cell (~A,~A) on sheet '~A' contains the text '~A'"
		    row col sheet-name text)))))))
From: Tayssir John Gabbour
Subject: Re: scheme seems neater
Date: 
Message-ID: <866764be.0404151641.772f490d@posting.google.com>
Cameron MacKinnon <··········@clearspot.net> wrote in message news:<······················@golden.net>...
> This is a solid insight, and quite true as far as it goes. The sight of 
> the Java community begging for crumbs of new features from Sun is rather 
> sad. But your point of view is that of the seasoned Lisper.

I hope I'm not considered a seasoned Lisper. ;)  I'm recent and mainly
here for the interesting ideas.


> My response to this was not "Wow, that's so cool!" That an expert could 
> find no cross-implementation API that he liked, and invested a few hours 
> of his own time writing one (plus, presumably, a few hours evaluating 
> the available ones) at hundreds of dollars per hour?

Communication interfaces are a different topic, since those aren't
purely computational.  I agree that this is unfortunate.

On one hand, I talk about lisp because it's just not interesting to
talk about programming relative to some language which isn't itself
programmable.

However, I will use whatever tool is needed to get the job done.

That might sound odd, because there is a difference between word and
action.  That is why my position is, "Lisp has lessons you can take
back to your own languages, to lean on your language designers about."
And I talk about things which can be transplanted, like lisp's
"error handling" system.  This is successful because a) it makes lisp
sound less outrageous, and b) it gives them something
useful.  People want to understand things on their own terms.  They
don't want to swallow lisp in one huge gulp.

And it removes the expectation that lisp must be everything to
everyone.  Let people see lisp as a source of good ideas instead,
because that's what it is.

(Incidentally, when Smith talked about an "invisible hand," I just
think he meant we should all be a little less heavy handed.  Pretend
there's an invisible hand, use a light touch...)


> Like it or not, there's a 
> bit of a taint associated with having to choose from a grab bag of third 
> party tools, all written and documented in different styles, in order to 
> get started.

Many agree.  But Fred Brooks claims there's a 9X effort multiplier on
making polished systems that are easy to use and complete.  There is a
time gap before the Free Software community can scale that multiplier.
Plus another because the relatively preconception-free Java guys needed
some time to look around.

Python's older and more productive than Java; but Java has more
libraries.  Asymmetries and nonlinear curves...
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5j8a9$qdm$1@f1node01.rhrz.uni-bonn.de>
Tayssir John Gabbour wrote:

> I know this is not ontopic for a technical subject, but the problem
> generating so many posts seems social and not technical.

Good point.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5glfj$o4u$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> Pascal Costanza wrote:
> 
>> Next, Scheme and Common Lisp share a feature, to a certain degree 
>> accidentally, i.e. lexical scoping
> 
> Accidentally? Were the benefits of lexical scoping discovered 
> independently in the Scheme and Lisp communities?

Not only there, but also in other communities. Some other Lisp dialects
didn't adopt it. Scheme has placed its main focus on lexical scoping. My
guess is that it was adopted in Common Lisp mainly because of the people
involved in setting the basic design choices.

>> On the other hand, the use of defmacro doesn't lead to serious bugs
>> most of the time. It's not a problem. End of story.
> 
> I can think of a long list of "features" in C that don't lead to
> serious bugs most of the time, because programmers have gotten used
> to them.
> 
> When defmacro does lead to a serious bug, how much time can be wasted
> finding it? Does defmacro increase the chances that there are 
> undiscovered bugs (or potential bugs) in production code?

Fans of statically typed languages use similar reasoning.  Still,
people make a conscious decision to use a dynamically typed one.

I can't think of any interesting language construct that is absolutely
fail safe. You seem to be thinking of programming as an activity that
shouldn't involve risks. I don't think this can be achieved. So whenever
you try to address a potential source of bugs, you should first check
whether it is _actually_ one, and why.

I am pretty sure that people write buggy programs with
syntax-rules/syntax-case. I don't think this necessarily means that
these macro systems suck, or that macro programming sucks at all. It
only proves that programming is hard.

>> I think that hygienic macros would only be widely accepted in the
>> Common Lisp "community" if they solved a pressing issue that is
>> hard to solve in any other way. (I am not interested in Turing
>> equivalence here. The focus is on _hard_ to solve.)
> 
> This is like what Fortran users said about C, what C users said about
> C++, what C++ users said about Java.

...and what Scheme users say about Haskell? ;)

> Resolving a problem every day just because it only takes a few
> seconds, and running the risk that you'll forget, rather than fixing
> it once and never having to worry, is not a sign of Homo habilis.

Name capture in macros is an important feature that several macro
libraries rely on. You need to understand the concept anyway when you 
want to understand the full range of possibilities of macros.
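The classic illustration is an anaphoric macro along the lines of On
Lisp's AIF, which captures the symbol IT on purpose (a minimal sketch):

```lisp
;; AIF deliberately "captures" IT so that the THEN branch can
;; refer to the value of the test expression.
(defmacro aif (test then &optional else)
  `(let ((it ,test))
     (if it ,then ,else)))

;; e.g. (aif (gethash key table) (use it) (complain))
```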



Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Paul F. Dietz
Subject: Re: scheme seems neater
Date: 
Message-ID: <J66dnW9cVdivR-bdRVn_iw@dls.net>
Pascal Costanza wrote:

> Not only there, but also in other communities. Some other Lisp dialects
> didn't adopt it. Scheme has placed its main focus on lexical scoping. My
> guess is that it was adopted in Common Lisp mainly because of the people
> involved in setting the basic design choices.

Lexical scoping can also be a big efficiency win.
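A compiler can resolve a lexical reference at compile time to a stack
slot or a closure cell, whereas a special variable needs a lookup of the
current dynamic binding on every access.  A minimal sketch of the
distinction:

```lisp
;; Lexical: N lives in a closed-over cell the compiler knows about.
(defun make-counter ()
  (let ((n 0))
    (lambda () (incf n))))

;; Dynamic (special): every reference to *COUNT* goes through the
;; current dynamic binding, which must be found at run time.
(defvar *count* 0)
(defun bump () (incf *count*))
```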

	Paul
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4y8p0sdje.fsf@tarn.inf.ed.ac.uk>
Pascal Costanza <········@web.de> writes:

> On the one hand, lexical scoping was introduced in Common Lisp because
> when everything is dynamically scoped, you regularly get very nasty bugs
> that are hard to track down and hard to solve by hand. Lexical scoping
> solves these bugs in a very elegant way.

That's part of the story, but it was also a very natural move.

Remember that in the MacLisp branch of the Lisp tree, "local" lexical
scoping was the norm in compiled code.  Only in the interpreter was it
all dynamic.

The move to full lexical scoping (with "upward and downward funargs"),
for both interpreted and compiled code, was a natural cleanup and
generalization.

Consistency between interpreter and compiler was a stronger,
or at least more talked-about, motivating issue than bugs.

-- jd
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4fzb8txt5.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:

> > But you've done as much as anyone to cast doubt on the importance of
> > lexical scoping, by saying that people continue to use defmacro.  If
> > continued use is supposed to count against the claim that Lisp-1
> > creates a significant problem for traditional macros, continued use
> > should also count against the claim that respecting lexical scoping is
> > an important issue, since people continue to use the sort of macros
> > that has Alan Bawden saying
> >
> >   "Good grief.  Do you -really- mean 'Common-Lisp style' in the sense
> >   of 'blind to issues of lexical scoping'?"
> >
> > There seems to be significant difference of opinion in the Scheme
> > community.
> 
> Not at all.  I've explained that in another post, but in summary, the fact
> that defmacro continues to be useful in certain circumstances and to certain
> people, doesn't affect the fact that no-one I'm aware of in the Scheme
> community wants it to be Scheme's standard, or only, macro system.

You again take the actual point and turn it into something more
extreme: whether anyone "wants it to be Scheme's standard, or only,
macro system".  This happens throughout the discussion.

> > > With this background, it should be clear that "hygiene" wasn't
> > > interesting because of its ability to address problems with
> > > macros in a Lisp-1, nor was it some kind of arbitrary feature
> > > that became fashionable or required because competing macro
> > > systems were offering it.  ...

> > There again you seem to be replying to claims that haven't been made.

> Both claims I refer to were made by you: 

No, they were not.

> the first originally by mikel and then implicitly by you in his
> defense,

What I was defending was "coming up with a good macro system for
Scheme has been harder", not what you say above, and I've explained
why I thought so and what I meant by it.  I don't want to get into
a further dispute about what mikel meant "originally".

> and the second I based on your
> statement "once hygienic macros came along, the old-style,
> accidental-capture-allowing ones were doomed, in Scheme at least."

Nothing about arbitrary or fashionable.  Not even "required because
competing macro systems were offering it".

> No doubt you weren't making the exact claim I'm addressing,

No doubt.

> but I'm covering
> all the bases here, which seems to have become necessary.

Then put it that way.  Set out some possibilities.  Say you're
not sure which one, if any, were meant by the people you're
disagreeing with.

Besides, you're not covering all the bases.  In particular,
you're not covering the point I was actually trying to make.

> > The point about Lisp-1 has always been that it makes certain
> > problems worse, and thus makes defmacro less suitable.
> > That's all.  Not that the Lisp-1 / Lisp-2 difference at that
> > point is why hygiene was interesting.
> 
> I'm glad to hear it.  So we're agreed that what I originally claimed
> is a myth, is in fact a myth (or simply incorrect).

I don't know what you think you originally claimed was a myth.
It seems to be different things at different points.

> > > I can't deny that because of the examples that tend to get used
> > > to illustrate the benefits of a hygienic and referentially
> > > transparent macro system, and because some people may have
> > > overstated that issue in order to "sell" hygienic macro systems,
> > > and because it apparently scores a point for Lisp-2 against
> > > Lisp-1, and because people may have been looking for convenient
> > > excuses to keep some piece of scary new technology off their
> > > mental radar, that there have been discussions in which Lisp-1
> > > has been invoked and accepted as an explanation of the need for
> > > hygienic macro systems.
>
> > They need excuses to continue using technology that is flexible,
> > powerful, easy to understand, and works well?
> 
> I was talking about the CL community there, not the Scheme
> community.

I know.  That's why my "they" was a reference to the CL people.

They're the ones you think "may have been looking for convenient
excuses", right?  Trying to keep "scary new technology off their
mental radar".  But they need no excuse to continue to use their
old, yet flexible, powerful, etc technology.  (I mean defmacro.)

That was the import of my rhetorical question.  I should have
made the point more clearly.

> It's mainly in the CL community that these myths are repeatedly
> and consistently used as an excuse to avoid learning.  You're
> doing it yourself, with your claims that hygiene are what makes
> syntax-case so impenetrable.

What?  You think it's the rule language?  But it's straightforward
and easy to understand.

I am telling you why some people perceive Scheme macros as complex.
If you want to continue as if no one thought that way, go ahead,
but then you're the one who is trying to avoid learning.

> > > Unfortunately, to explain that significance to the average CL
> > > programmer, if he doesn't immediately get it when I say "these macro
> > > systems respect lexical scoping in the target language", very soon
> > > I'll have to get into examples, and soon after that the Lisp-1 issue
> > > will come up, and then the CL programmer will say, "oh, so it's
> > > because Scheme is a Lisp-1!"
> >
> > They will?  It should be clear to the CL programmer that there are
> > cases that are just as much a problem in a Lisp-2.
> 
> Thanks, I'm glad we're in agreement about that at least.

Yes, we agree that your caricature of CL programmers is
incorrect.  ;>

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <zwWec.6302$zj3.5660@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:

> What I was defending was "coming up with a good macro system for
> Scheme has been harder", not what you say above, and I've explained
> why I thought so and what I meant by it.  I don't want to get into
> a further dispute about what mikel meant "originally".

But this was the basis for the discussion.  If you're saying you were only
defending "coming up with a good macro system for Scheme has been harder",
but not saying that the reason it was harder was, as mikel stated, "because
of the much greater likelihood of accidental variable capture in a single
namespace", then you were in fact in agreement with me.  The original
statement which I responded to was untrue, or at least, no-one has defended
it as being true, other than your apparently retracted objection to my
calling it a myth.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx44qrnyuya.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:
> 
> > What I was defending was "coming up with a good macro system for
> > Scheme has been harder", not what you say above, and I've explained
> > why I thought so and what I meant by it.  I don't want to get into
> > a further dispute about what mikel meant "originally".

Note the "what you say above" in the quote above.

> But this was the basis for the discussion.  If you're saying you were only
> defending "coming up with a good macro system for Scheme has been harder",
> but not saying that the reason it was harder was, as mikel stated, "because
> of the much greater likelihood of accidental variable capture in a single
> namespace", then you were in fact in agreement with me.

Sigh.  We have to go meta *again*.

Compare 

(1) "coming up with a good macro system for Scheme has been harder"
    "because of the much greater likelihood of accidental variable
    capture in a single namespace"

with

(2) "hygiene" wasn't interesting because of its ability to address
    problems with macros in a Lisp-1, nor was it some kind of
    arbitrary feature that became fashionable or required because
    competing macro systems were offering it.

Now, (2) was the "what you say above" I was referring to when I wrote:

  What I was defending was "coming up with a good macro system for
  Scheme has been harder", not what you say above, and I've explained
  why I thought so and what I meant by it.  I don't want to get into a
  further dispute about what mikel meant "originally".

You said I had made the claims in (2).  I replied that I had not.

Because I hadn't.

re the second part of (2), you wrote:

  and the second I based on your
  statement "once hygienic macros came along, the old-style,
  accidental-capture-allowing ones were doomed, in Scheme at least."

I pointed out that that contained

  Nothing about arbitrary or fashionable.  Not even "required because
  competing macro systems were offering it".

And so on.

Re (1), "I've explained why I thought so and what I meant by it".

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <SN6fc.6814$zj3.5679@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Jeff Dalton wrote:
> >
> > > What I was defending was "coming up with a good macro system for
> > > Scheme has been harder", not what you say above, and I've explained
> > > why I thought so and what I meant by it.  I don't want to get into
> > > a further dispute about what mikel meant "originally".
>
> Note the "what you say above" in the quote above.
>
> > But this was the basis for the discussion.  If you're saying you
> > were only defending "coming up with a good macro system for Scheme
> > has been harder", but not saying that the reason it was harder was,
> > as mikel stated, "because of the much greater likelihood of
> > accidental variable capture in a single namespace", then you were
> > in fact in agreement with me.
>
> Sigh.  We have to go meta *again*.

Actually, we don't.  You seem to be doing it because you don't want to
acknowledge that your very first response to me was apparently some kind of
misunderstanding.  You didn't actually disagree with what I was saying.
Apparently, in hindsight, since your original objection to my "it's a myth"
claim, you haven't yet said a word to defend the statement which I claimed
was a myth:

"the only real difference [Lisp-1] makes is that coming up with a good macro
system for
Scheme has been harder (because of the much greater likelihood of accidental
variable capture in a single namespace)."

You now want to insist on focusing on subsequent responses I made to you
when I (reasonably) believed you were defending the above statement.  That's
silly.  You're just wriggling because you don't want to admit the problem
with your very first response to me.

Anton
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <S8udnaCoedjooObdRVn-sA@golden.net>
I'm writing because I can't figure out what, exactly, you are arguing. 
If you appeared to be just another person who likes debate and must have 
the last word, I would put it down to that. But you've got deep 
knowledge of the history and the issues, yet seem to be defending 
defmacro over syntax-*, and I can't understand why.


Jeff Dalton wrote:

> The combination of defmacro, gensym, and packages is a good enough
> macro system for Common Lisp, but Scheme doesn't have packages,
> and even with all three together, accidental captures are still
> possible; and in Scheme they are more likely.  So traditional
> Lisp macros weren't good enough for Scheme.

and elsewhere:
> The argument is that when you combine 2 namespaces with packages
> and certain conventions for writing macros, you get a very usable
> system that is not plagued by the sorts of conflicts hygienic
> macros are designed to avoid, so that having 2 namespaces is
> not nearly as bad as the enemies of 2 namespaces would have us
> believe.

I worry when I see phrases like "good enough" and "certain conventions 
for writing macros". If you were arguing against investing effort in an 
as-yet-undiscovered superior solution, I could understand (though not 
agree with) the sentiment. But you seem to be arguing against using 
better, already existing (though perhaps not for CL) tools in favour of 
older, riskier ones.

Sure, Lispers learn how to avoid unintended capture in macros, just as 
users of any language quickly learn the traps of that language. But 
they're still traps. When your code doesn't work, you go down the 
possible causes. Having something like unintended capture on that list 
can, I understand, make Lispers break into a cold sweat, as it's 
sometimes a very difficult bug to find.

Am I missing something? Are there things that defmacro solves that 
syntax-* can't, or where defmacro is more expressive?


Part of your aversion to what I'd term progress is to the complexity of 
the implementation:

>> As I said, I don't deny that the perception exists.  The main point I'm
>> trying to make in this respect is that in a system like syntax-case, there
>> is very little *actual* complexity, from the user's perspective, related to
>> hygiene.  With syntax-rules, there's absolutely none.  The hygiene is all
>> taken care of under the hood.
> 
> But that is part of the perceived complexity.  Indeed, I think it is
> the greatest factor.  It's clear how traditional macros work.  It is
> very much harder to understand how hygienic macros work.
...
> It's the largely hidden mechanisms that ensure hygiene that
> make the whole thing seem so complex.
...
>> Call it a huge implementation chunk, if you like.  The point is it's
>> part of the implementation.  If people are complaining because the
>> implementation is hard to understand, that's an argument in favor of
>> using these systems!
> 
> Perhaps for some people; but many would see it was an argument
> against using them.
...
> Programmers like to understand what is happening in the
> implementation.
...
> They need excuses to continue using technology that is flexible,
> powerful, easy to understand, and works well?  Instead of technology
> that is complex, harder to understand, and doesn't seem to have
> ever been available in Common Lisp?

I understand how you feel. It's natural for many programmers (myself 
included) to want to understand what's going on all the way down to the 
bits. We've been burned before by buggy libraries which purport to solve 
our problems without us having to worry about the implementation.

But at some point, if you can't stay current and you won't trust 
implementations you don't understand, it's time to move into the 
embedded space (where you can understand everything), or recognize that 
you just won't be as productive as those using newer tools. I don't know 
that disparaging the newer tools to keep others at your level is very 
helpful.

I'm asserting that if the tools are just as expressive and less bug 
prone, then those using them are more productive, including debugging 
time and time to learn the tools.

Our development tools are the best examples of code reuse: They're not 
just something we'll use again in one or more projects in the future, 
they're something we use daily. It is here that advanced technology 
provides us the most leverage. The implementor does the work once, and 
every coder benefits, every day.

> But you've done as much as anyone to cast doubt on the importance of
> lexical scoping, by saying that people continue to use defmacro.  If
> continued use is supposed to count against the claim that Lisp-1
> creates a significant problem for traditional macros, continued use
> should also count against the claim that respecting lexical scoping is
> an important issue, since people continue to use the sort of macros
> that has Alan Bawden saying
> 
>   "Good grief.  Do you -really- mean 'Common-Lisp style' in the sense
>   of 'blind to issues of lexical scoping'?"
> 
> There seems to be significant difference of opinion in the Scheme
> community.

and elsewhere:
>>I think it's easy to see the appeal of such an approach to users and
>>designers of a language originally billed as "An Interpreter for Extended
>>Lambda Calculus" (the title of the first Scheme paper).  Defmacro disregards
>>the basic rules of lambda calculus and referential transparency, whereas the
>>"hygienic" systems respect those rules.
> 
> 
> Yes, it is easy to see the appeal.  So why the wide continued use
> of defmacro in Scheme that you've talked about?

I'm betting on a combination of habit/portability issues for those who 
use both Scheme and Lisp regularly (or port Lisp code to Scheme), and 
documentation/tutorial issues for those who use only Scheme. If the best 
macro tutorials use defmacro, people are going to learn defmacro.

In fact, I suspect that a lot of the reason lies specifically with On 
Lisp (which I haven't yet read, BTW). It's widely touted as a must-read, 
and no equivalent exists for Scheme (does it?). So a great book which 
teaches using technology that isn't state of the art slows adoption of 
the newer, better macro system.

I'm interested in why CL users haven't adopted newer macro technology. 
My guesses are, in no significant order:
  - reflexive antipathy to ideas originating in the Scheme world
  - once disciplined in avoiding traps, people tend to minimize them
  - the horror of debugging with two interacting macro systems
  - Scheme is used in teaching, where safe tools are prized, CL less so

Have I misunderstood your position? I can't see where you've pointed out 
superiorities in defmacro, nor deficiencies for the users of syntax-*.


-- 
Cameron MacKinnon
Toronto, Canada
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4u0zosaki.fsf@tarn.inf.ed.ac.uk>
Cameron MacKinnon <··········@clearspot.net> writes:

> I'm writing because I can't figure out what, exactly, you are
> arguing. If you appeared to be just another person who likes debate
> and must have the last word, I would put it down to that. But you've
> got deep knowledge of the history and the issues, yet seem to be
> defending defmacro over syntax-*, and I can't understand why.

I'll try to clarify.

For one thing, I didn't think the discussion was about which sort of
macro system was better, and so I haven't said much about that.

> Jeff Dalton wrote:
> 
> > The combination of defmacro, gensym, and packages is a good enough
> > macro system for Common Lisp, but Scheme doesn't have packages,
> > and even with all three together, accidental captures are still
> > possible; and in Scheme they are more likely.  So traditional
> > Lisp macros weren't good enough for Scheme.
> 
> and elsewhere:
> > The argument is that when you combine 2 namespaces with packages
> > and certain conventions for writing macros, you get a very usable
> > system that is not plagued by the sorts of conflicts hygienic
> > macros are designed to avoid, so that having 2 namespaces is
> > not nearly as bad as the enemies of 2 namespaces would have us
> > believe.

There I'm arguing that defmacro is a reasonably good way to do macros,
given some other things which are present in Common Lisp, and that the
situation in Scheme is different in ways that make defmacro less good.
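To make the convention concrete, here is the usual GENSYM idiom in
Common Lisp (a minimal sketch):

```lisp
;; Buggy: the expansion's VAL can shadow a VAL in the caller's code.
(defmacro my-or-bad (a b)
  `(let ((val ,a)) (if val val ,b)))
;; (let ((val 5)) (my-or-bad nil val)) => NIL, not 5

;; Conventional fix: GENSYM yields a fresh, uninterned symbol,
;; so no user variable can collide with it.
(defmacro my-or (a b)
  (let ((tmp (gensym)))
    `(let ((,tmp ,a)) (if ,tmp ,tmp ,b))))
;; (let ((val 5)) (my-or nil val)) => 5
```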

> I worry when I see phrases like "good enough" and "certain conventions
> for writing macros". If you were arguing against investing effort in
> an as-yet-undiscovered superior solution, I could understand (though
> not agree with) the sentiment. But you seem to be arguing against
> using better, already existing (though perhaps not for CL) tools in
> favour of older, riskier ones.

At least I'm not the one saying that defmacro "works fine" in
Scheme.  :)

Anyway, the risks are not all on one side.  There are risks
in using a more complex system.  The bugs prevented by hygienic
macros are not the only bugs, and they are probably not the
most common either.

> Sure, Lispers learn how to avoid unintended capture in macros, just as
> users of any language quickly learn the traps of that language. But
> they're still traps. When your code doesn't work, you go down the
> possible causes. Having something like unintended capture on that list
> can, I understand, make Lispers break into a cold sweat, as it's
> sometimes a very difficult bug to find.

I don't think most Lispers break into a cold sweat.

I think a more typical attitude is this one:

http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&selm=9310091717.AA29226%40inferno.lucid.com

Or even this (a response to Alan Bawden's "Good grief" remark
mentioned in other messages):

http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&selm=DAVIS.93Oct7095120%40passy.ilog.fr

> Part of your aversion to what I'd term progress is to the complexity
> of the implementation:

You are assuming -- I'm not sure why -- that I have an aversion to
Scheme-style hygienic macros; but, as I said a while back, and again
in a message earlier today, I would like to use such macros in Common
Lisp if they were available.

> I'm interested in why CL users haven't adopted newer macro
> technology. My guesses are, in no significant order:
>   - reflexive antipathy to ideas originating in the Scheme world
>   - once disciplined in avoiding traps, people tend to minimize them
>   - the horror of debugging with two interacting macro systems
>   - Scheme is used in teaching, where safe tools are prized, CL less so

Here are some reasons:

The "newer technology" is not available for CL.

At the time CL was being standardized and people in X3J13 were
considering this issue, hygienic macros were still quite new, and 
even the Schemers were having trouble deciding how to fit them into
their language and how to provide lower-level support as well as
the language for pattern-matching and template-filling.

Common Lisp macros work well enough that a better system
is not high on many wish-lists.  Other things are more important.

Defmacro is simple and easy to understand, significantly
more so than Scheme's hygienic macros.

Although the Scheme pattern language allows very nice definitions
in some cases, it is awkward in others, which decreases its
attractiveness.  Also, it isn't Lisp and involves a different
style of programming.

Common Lispers don't make a fetish of referential transparency.  :)

-- jd
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <zPudnd1VLv9zhuHdRVn-hQ@golden.net>
Jeff Dalton wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
>>I'm writing because I can't figure out what, exactly, you are
>>arguing.
> 
> I'll try to clarify.

Thank you, that helped a lot. The thread was long and winding, and I'd 
gotten somewhat dizzy following it all.

> I don't think most Lispers break into a cold sweat.
> 
> I think a more typical attitude is this one:
> 
> http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&selm=9310091717.AA29226%40inferno.lucid.com

"Millions on lines of code, even written in God's Own Favorite
Language, Scheme, will be bug-ridden - as bug-ridden as in any other
language."

I don't agree that programmers produce an equal proportion of bugs in 
any language. Languages vary both in their expressivity (power) and in 
their danger (how much rope they give you). Heck, even different 
implementations of the same language can have an impact, via diagnostic 
messages. I don't think the search for a powerful, safe language is futile.

"I have been writing macros for twenty years now, and for the last 19 of 
them I don't think I've ever introduced a name conflict bug."

I just don't like this attitude. It admits the problem, but says you'll 
eventually learn to live with it. I find this difficult to reconcile 
with the "Lisp is a local maximum" tenet.


> You are assuming -- I'm not sure why -- that I have an aversion to
> Scheme-style hygienic macros; but, as I said a while back, and again
> in a message earlier today, I would like to use such macros in Common
> Lisp if they were available.

I don't know how I got that impression either. Glad to see I was wrong.


-- 
Cameron MacKinnon
Toronto, Canada
From: Duane Rettig
Subject: Re: scheme seems neater
Date: 
Message-ID: <4n05fsnai.fsf@franz.com>
Cameron MacKinnon <··········@clearspot.net> writes:

>     "I have been writing macros for twenty years now, and for the last 19
>      of them I don't think I've ever introduced a name conflict bug."
> 
> I just don't like this attitude. It admits the problem, but says
> you'll eventually learn to live with it. I find this difficult to
> reconcile with the "Lisp is a local maximum" tenet.

Careful, here; if you're really true to yourself, and based on
other statements you've made in this thread about any interesting
language constructs not being failsafe, then the only logical
conclusion that this line of reasoning is going to lead you to is
the conclusion that you need to give up programming!

:-)

On a completely serious vein:  Yes, all programmers make mistakes,
and yes, some of these mistakes they learn to live with, while
others are viewed as fixable with a change of process.  I don't know
of anyone who sits completely on one side (live with all mistakes)
or the other (fix every mistake so that it never happens again);
generally, the mix is a hybrid based on experience and a belief
system about how often mistakes of a particular class are likely
to occur in the future.

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fzu0zn3iax.fsf@germany.igpm.rwth-aachen.de>
Cameron MacKinnon <··········@clearspot.net> writes:
> "I have been writing macros for twenty years now, and for the last 19
> of them I don't think I've ever introduced a name conflict bug."
>
> I just don't like this attitude. It admits the problem, but says
> you'll eventually learn to live with it.

Strange. For me this simply says that this name conflict bug is a
mythical beast, and that there is no need to learn to live with risks
that, to all practical effects, do not exist.
From: =?utf-8?b?QmrDtnJuIExpbmRiZXJn?=
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsllkz68v8.fsf@fnatte.nada.kth.se>
Cameron MacKinnon <··········@clearspot.net> writes:

> Jeff Dalton wrote:
> > Cameron MacKinnon <··········@clearspot.net> writes:
> >>I'm writing because I can't figure out what, exactly, you are
> >>arguing.
> > I'll try to clarify.
> 
> Thank you, that helped a lot. The thread was long and winding, and I'd
> gotten somewhat dizzy following it all.
> 
> > I don't think most Lispers break into a cold sweat.
> > I think a more typical attitude is this one:
> > http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&selm=9310091717.AA29226%40inferno.lucid.com

<snip>

> "I have been writing macros for twenty years now, and for the last 19
> of them I don't think I've ever introduced a name conflict bug."
> 
> I just don't like this attitude. It admits the problem, but says
> you'll eventually learn to live with it. I find this difficult to
> reconcile with the "Lisp is a local maximum" tenet.

It says you have to learn the tool. Don't you think the scheme macro
system takes some time to learn as well? When I started learning CL,
it took me a while to learn the syntax of various elements, such as
do, loop, etc.


Björn
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4wu4jz04v.fsf@tarn.inf.ed.ac.uk>
Cameron MacKinnon <··········@clearspot.net> writes:

> "I have been writing macros for twenty years now, and for the last 19
> of them I don't think I've ever introduced a name conflict bug."
> 
> I just don't like this attitude. It admits the problem, but says
> you'll eventually learn to live with it. I find this difficult to
> reconcile with the "Lisp is a local maximum" tenet.

It doesn't say you'll learn to live with it.  You learn enough to stop
making that kind of mistake; and that's something that happens with
many different aspects of programming, not only with CL macros.

Anyway, I think there are two different classes of potential bugs.

First, there are the ones the macro writer tries to prevent,
for instance by using gensym to create local variables introduced
in the macro's expansion.
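A minimal sketch of that technique (the WITH-TIMING macro and its timing code are illustrative, not from the thread):

```lisp
;; WITH-TIMING introduces a local variable in its expansion.  GENSYM
;; gives that variable a fresh, uninterned name, so it can neither
;; capture nor be captured by any variable in the caller's code.
(defmacro with-timing (&body body)
  (let ((start (gensym "START")))
    `(let ((,start (get-internal-real-time)))
       ,@body
       (- (get-internal-real-time) ,start))))

;; (with-timing (some-expensive-call))  ; => elapsed internal time units
```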

Many macros simply don't have that problem.  For example,
consider the macro DELAY, which is shorthand for a call
to MAKE-DELAY.

(defmacro delay (expr)
  `(make-delay #'(lambda () ,expr)))

In other cases, there may be a higher-level answer,
such as the ONCE-ONLY macro.  If the reason for a variable
is to hold a value so that it doesn't get computed more
than once, the ONCE-ONLY macro can handle the gensym etc
for you.
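The double-evaluation case that ONCE-ONLY addresses can be sketched like this (SQUARE is an illustrative name; ONCE-ONLY itself is a widely used utility, not part of the ANSI standard):

```lisp
;; Naive version: the argument form appears twice in the expansion,
;; so (naive-square (incf x)) increments X twice -- a classic bug.
(defmacro naive-square (x)
  `(* ,x ,x))

;; Manual fix: bind the value once to a GENSYM'd variable.  ONCE-ONLY
;; simply automates this gensym-and-bind pattern.
(defmacro square (x)
  (let ((val (gensym "VAL")))
    `(let ((,val ,x))
       (* ,val ,val))))
```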

The second category is potentially more worrying.  These
are the bugs the macro author doesn't try to prevent --
it's up to the person using the macro to avoid them.

For example, if someone using my DELAY macro doesn't know
that it expands into a call to MAKE-DELAY, and they redefine
MAKE-DELAY, then my macro will not work as I intended.

However, that problem also occurs with function definitions.
If I call MAKE-DELAY in a function, and someone's redefined
MAKE-DELAY, then my function will not work as I intended.

For the DELAY macro, there is an additional potential problem
of that sort: if someone makes a local definition of MAKE-DELAY
using FLET or LABELS, that will also make things go wrong for
my macro, even though it would not cause a problem for a function.

In earlier Lisps, which didn't have FLET or LABELS, there
wasn't this additional problem; but Common Lisp has FLET and LABELS.

The macro author could take steps to prevent this problem,
or the macro author could make it a "feature".

I think a typical approach in Common Lisp is to take advantage
of packages to manage the problem.  For some people, that's
just not good enough.  Suppose MAKE-DELAY isn't exported.
That doesn't absolutely stop someone from doing

  (flet ((somepack::make-delay (thunk)
           (i-will-kill-your-macro-and-eat-it)))
    ...)

But it also doesn't absolutely stop someone from doing

  (defun somepack::make-delay (thunk)
    (die-macro-writing-dog))

Since Lisp isn't always in the business of absolutely stopping
things, that sort of situation will not bother many Lisp
programmers; and in practice, this sort of flet / labels
problem doesn't stand out as a source of bugs.

-- jd
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5gjad$jmn$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> Sure, Lispers learn how to avoid unintended capture in macros, just 
> as users of any language quickly learn the traps of that language. 
> But they're still traps.

What makes you think so?

> Am I missing something? Are there things that defmacro solves that 
> syntax-* can't, or where defmacro is more expressive?

defmacro doesn't impose or favor a specific programming style for
writing macros.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hk70k9i3u.fsf@vserver.cs.uit.no>
Cameron MacKinnon <··········@clearspot.net> writes:

> Sure, Lispers learn how to avoid unintended capture in macros, just
> as users of any language quickly learn the traps of that
> language. But they're still traps. When your code doesn't work, you
> go down the possible causes. Having something like unintended
> capture on that list can, I understand, make Lispers break into a
> cold sweat, as it's sometimes a very difficult bug to find.

Have you experienced this yourself? I've written many CL macros over
the years, and have yet to be bitten by this problem, that I can
remember.

To me, the way scheme attacks the problem of unintended variable
capture looks like some hypothetical approach to solve the "problem"
of unintended divisions by zeros: Construct a new linguistic
sub-system so as to make it impossible to write arithmetic expressions
with potential divisions by zero. Such an approach might be feasible,
it might be possible to use it to write every legal expression, and
perhaps it would even catch a bug or two occasionally, and thus in
some sense be superior to what we currently have. I still wouldn't
want it. One reason for this is that it's "brain-clutter" far beyond
that of let vs. flet. Should such "improvements" accumulate in a
language, the result would likely be a complete mess. In practice,
arithmetic in Lisp works very well, just as defmacro works very well
in CL.

-- 
Frode Vatvedt Fjeld
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <V_ydneY_gtTrv-HdRVn-tA@golden.net>
Frode Vatvedt Fjeld wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
> 
>>Sure, Lispers learn how to avoid unintended capture in macros, just
>>as users of any language quickly learn the traps of that
>>language. But they're still traps. When your code doesn't work, you
>>go down the possible causes. Having something like unintended
>>capture on that list can, I understand, make Lispers break into a
>>cold sweat, as it's sometimes a very difficult bug to find.
> 
> Have you experienced this yourself? I've written many CL macros over
> the years, and have yet to be bitten by this problem, that I can
> remember.

No, I haven't. I haven't started studying macrology yet, so I only know 
from what I've read. And I have no reason to believe my experience won't 
eventually be like everyone else's: Learn to avoid the traps, and spend 
many years with no problems at all.

Or at least I'll remember my experience just as others do - "Nope, never 
had a problem that I can remember." It is human nature to forget our 
mistakes and remember our triumphs.

But if the computer can solve the problem so that I don't have to worry 
about it, and if I don't lose any power with the new solution, then my 
vote is for progress. If a language can help amateurs or newbies avoid 
mistakes without taking power away from advanced users, why not?


> To me, the way scheme attacks the problem of unintended variable
> capture looks like some hypothetical approach to solve the "problem"
...
> I still wouldn't
> want it. One reason for this is that it's "brain-clutter" far beyond
> that of let vs. flet. Should such "improvements" accumulate in a
> language, the result would likely be a complete mess. In practice,
> arithmetic in Lisp works very well, just as defmacro works very well
> in CL.

Well, CL already has enough clutter that a little bit more couldn't 
hurt, and the Scheme mindset is such that they've probably set a hard 
upper limit of 75 pages on their language spec. :-)  [OK, OK, it isn't 
really fair to exclude SRFIs.]

But, oddly enough, it seems that classical Lisp has stopped evolving 
with CL, while Scheme continues to mutate.

-- 
Cameron MacKinnon
Toronto, Canada
From: Matthew Danish
Subject: Re: scheme seems neater
Date: 
Message-ID: <20040413174854.GG25328@mapcar.org>
On Tue, Apr 13, 2004 at 01:25:10PM -0400, Cameron MacKinnon wrote:
> But, oddly enough, it seems that classical Lisp has stopped evolving 
> with CL, while Scheme continues to mutate.

Nah, it's just that Lispers have stopped endlessly fiddling with the
base language and instead moved on to fiddling with libraries and
language extensions, which are possible to do within the base language.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Kenny Tilton
Subject: Re: scheme seems neater
Date: 
Message-ID: <zVWec.37311$WA4.6241@twister.nyc.rr.com>
Cameron MacKinnon wrote:
> But, oddly enough, it seems that classical Lisp has stopped evolving 
> with CL, while Scheme continues to mutate.

I think the reason this silly thread has dragged on so long is that you 
can write that sentence without understanding that it explains everything.

The other reason it has dragged on so long is that you denigrate our 
reasoned choice as "stopping evolution", so even Kenny has dropped in to 
suggest you go...

...write some code. Cello could use some bindings to a text-to-speech 
library. Festival?

Or do what everyone else seems to be doing: write your own CL 
implementation.

:)

kt



-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hllkz91z2.fsf@vserver.cs.uit.no>
Cameron MacKinnon <··········@clearspot.net> writes:

> No, I haven't. I haven't started studying macrology yet, so I only
> know from what I've read. And I have no reason to believe my
> experience won't eventually be like everyone else's: Learn to avoid
> the traps, and spend many years with no problems at all.
>
> Or at least I'll remember my experience just as others do - "Nope,
> never had a problem that I can remember." It is human nature to
> forget our mistakes and remember our triumphs.

So you come in as a neophyte, insisting that there are problems that
you have yet to experience yourself, and which old-timers claim do not
exist. You conclude that the old-timers must display some mental
flaw. I hope you can forgive me for suggesting that such behavior is
perhaps slightly ridiculous.

> But if the computer can solve the problem so that I don't have to
> worry about it, and if I don't lose any power with the new solution,
> then my vote is for progress. If a language can help amateurs or
> newbies avoid mistakes without taking power away from advanced
> users, why not?

Why not is what I tried to explain below. The language would become a
mess, and "solve" extremely little. I believe such an addition to CL
would be much, much more likely to increase the learning threshold for
newbies than decrease it, all things considered.

> Well, CL already has enough clutter [..]

I strongly disagree with this statement, if you mean to imply that CL
has significant amounts of unnecessary operators and/or concepts.

> [..] that a little bit more couldn't hurt, [..]

And regardless of the truthfulness of your previous statement, this
statement I also strongly disagree with, at least so long as the
benefits of the clutter are unclear.

> [..] and the Scheme mindset is such that they've probably set a hard
> upper limit of 75 pages on their language spec. :-) [OK, OK, it
> isn't really fair to exclude SRFIs.]
>
> But, oddly enough, it seems that classical Lisp has stopped evolving
> with CL, while Scheme continues to mutate.

What is really odd is that schemers, who appear to be obsessed with
keeping the size of the language spec small and only standardize the
absolutely most technically fundamental operators[*], eagerly advocate
the addition of what to me appears to be a whole new sub-language just
to solve some very minor problem.  Well, of course, in a lisp-1, it's
not so minor, but still.

I'd be happy to see Common Lisp evolve, but not just for the sake of
evolving, and certainly not for "solving" non-existing problems. Maybe
the case is that Common Lisp as it stands is actually useful, and its
community is all busy doing actual useful programming, confident that
their platform of choice won't randomly mutate under their feet for
obscure reasons.

Also, I disagree with the notion that CL doesn't evolve. There's lots
of things happening. But that doesn't mean the language standard is
necessarily ripe for "modernization".


[*] "technically fundamental" as opposed to "fundamental", for example
    because Scheme dismisses the IMHO very fundamental difference
    between the operator and variable name-spaces, just because it's
    technically feasible to do so.

-- 
Frode Vatvedt Fjeld
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <hJudnYl0C7DioOHdRVn_iw@golden.net>
Frode Vatvedt Fjeld wrote:
> So you come in as a neophyte, insisting that there are problems that
> you have yet to experience yourself, and which old-timers claim do not
> exist. You conclude that the old-timers must display some mental
> flaw. I hope you can forgive me for suggesting that such behavior is
> perhaps slightly ridiculous.

I'm not insisting on anything that hasn't been advocated before. I 
certainly didn't come up with the idea that defmacro has flaws - it's a 
staple of conversation on c.l.l. I've been lurking here quite long 
enough to see the trends and the arguments. This particular problem is 
one that wizards have seen fit to solve. But, rather than admit that CL 
could perhaps learn a trick, old-timers feel it necessary to downplay 
the problem.

I haven't yet seen an old-timer claim that defmacro's problems don't 
exist. There's always an adjective or a modifier, such as "in practice", 
"significant", "major". Further on in your post, you call it a "very 
minor problem." So does it exist, or doesn't it? And if it is such a 
minor problem, how come it keeps coming up on c.l.l again and again and 
again. Don't try to blame this latest flare-up on me, either. I 
certainly didn't start this thread, nor jump into it until it had been 
going on for a long time.

I suspect that, had there been a context-aware macro system in use in 
some Lisp for several years before the CL standard was adopted, the 
standard would have incorporated it. But because the progress came 
later, CL's defenders seek to minimize it as a solution in search of a 
problem.

Is CL perfect? Are there ANY problems which, if mentioned in c.l.l, 
won't get a number of responses to the effect that the OP is imagining 
things?

> What is really odd is that schemers, who appear to be obsessed with
> keeping the size of the language spec small and only standardize the
> absolutely most technically fundamental operators[*], eagerly advocate
> the addition of what to me appears to be a whole new sub-language just
> to solve some very minor problem.  Well, of course, in a lisp-1, it's
> not so minor, but still.

CL has loop and format. Schemers felt that the problem was worth solving 
and solved it. Is defmacro a sublanguage, or is it straight Lisp?

I like the idea of Scheme's small spec. It encourages multiple 
implementations. There's several Schemes that run on the Java VM, for 
example.

Once a language specification gets beyond a certain size, Language 
Lawyers become necessary, and people become overwhelmed by the spec. 
There have been lots of threads on c.l.l where a person, an old-timer 
even, has had to ask for help because he couldn't find what he needed in 
the spec, even though it was there. Just recently, the group couldn't 
agree on how many namespaces CL has, even though namespace is used as a 
specific technical term in the standard.

> I'd be happy to see Common Lisp evolve, but not just for the sake of
> evolving, and certainly not for "solving" non-existing problems. Maybe
> the case is that Common Lisp as it stands is actually useful, and its
> community is all busy doing actual useful programming, confident that
> their platform of choice won't randomly mutate under their feet for
> obscure reasons.

For every problem, there's people who will loudly say that the problem 
doesn't exist and other people who will say it's not the highest priority.

The CL community has come to believe that standardization has to be a 
big, lengthy, costly and traumatic experience. This becomes a 
self-fulfilling prophecy, so people say the standard isn't ripe for 
modernization, notwithstanding that it excludes networking, Unicode, 
etcetera. The vendors know that the customers want and need these 
features, so they implement (incompatible versions of) them. And that, 
of course, will just add that much more to the pain when these features 
eventually become standardized. Every day, people are writing 
nonstandard code, thus increasing the cost of the position they'll 
protect when the standards committee is struck.


-- 
Cameron MacKinnon
Toronto, Canada
From: Thomas F. Burdick
Subject: Re: scheme seems neater
Date: 
Message-ID: <xcvk70jab8k.fsf@famine.OCF.Berkeley.EDU>
Cameron MacKinnon <··········@clearspot.net> writes:

> I haven't yet seen an old-timer claim that defmacro's problems don't 
> exist. There's always an adjective or a modifier, such as "in practice", 
> "significant", "major". Further on in your post, you call it a "very 
> minor problem." So does it exist, or doesn't it? And if it is such a 
> minor problem, how come it keeps coming up on c.l.l again and again and 
> again.

I don't suppose it's occurred to you that there are real, practical
benefits that come along with the potential problem of accidental name
capture.  If you spent some effort learning CL's macro system, you'd
have seen that.
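One standard illustration of such a benefit is the anaphoric AIF from Graham's "On Lisp" (mentioned elsewhere in this thread), which captures a name on purpose; the usage sketch below uses hypothetical names:

```lisp
;; AIF deliberately "captures" the symbol IT, binding it to the value
;; of the test form so the THEN branch can refer to the result without
;; naming it.  A fully hygienic expander would rename IT away and
;; break this idiom by design.
(defmacro aif (test then &optional else)
  `(let ((it ,test))
     (if it ,then ,else)))

;; (aif (gethash key table)
;;      (use it)            ; IT is the value found in the table
;;      (handle-missing))
```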

> Don't try to blame this latest flare-up on me, either. I 
> certainly didn't start this thread, nor jump into it until it had been 
> going on for a long time.

Anton and Jeff have been arguing back and forth intelligently.  By
your own admission, you don't even know what you're talking about!

> I suspect that, had there been a context-aware macro system in use in 
> some Lisp for several years before the CL standard was adopted, the 
> standard would have incorporated it.

What basis could you possibly have to suspect that?

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <htSdnXOfBJCdyeHdRVn-tA@golden.net>
Thomas F. Burdick wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
> 
> I don't suppose it's occurred to you that there are real, practical
> benefits that come along with the potential problem of accidental name
> capture.  If you spent some effort learning CL's macro system, you'd
> have seen that.

If you'd wanted to be part of the solution, you'd have pointed to some 
examples. I've read lots and lots on the subject, as has anyone who's 
spent significant time on c.l.l, and haven't seen too much in the way of 
examples where defmacro is superior. But I didn't eliminate the 
possibility. Here's an example, in this thread, of proof that it HAS 
occurred to me:

> Am I missing something? Are there things that defmacro solves that
> syntax-* can't, or where defmacro is more expressive?


>>I suspect that, had there been a context-aware macro system in use in 
>>some Lisp for several years before the CL standard was adopted, the 
>>standard would have incorporated it.
> 
> What basis could you possibly have to suspect that?

My faith in the rational intelligence of the members of the standard 
committee. If they'd been shown a working, time tested example of a 
superior solution, they'd have chosen it. What basis have you to believe 
otherwise?

-- 
Cameron MacKinnon
Toronto, Canada
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5hkfv$srd$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> Thomas F. Burdick wrote:
> 
>> Cameron MacKinnon <··········@clearspot.net> writes:
>>
>> I don't suppose it's occurred to you that there are real, practical
>> benefits that come along with the potential problem of accidental name
>> capture.  If you spent some effort learning CL's macro system, you'd
>> have seen that.
> 
> If you'd wanted to be part of the solution, you'd have pointed to some 
> examples.

You have already decided against reading Paul Graham's "On Lisp". What 
else can we do?

Here's a suggestion: Read Paul Graham's "On Lisp".


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <77OdnSKw_ePbweHdRVn-uQ@golden.net>
Pascal Costanza wrote:

> You have already decided against reading Paul Graham's "On Lisp". What 
> else can we do?
> 
> Here's a suggestion: Read Paul Graham's "On Lisp".

I don't have time to do everyone's homework here when they misunderstand 
my clearly written posts. Why don't you use google groups to review my 
postings that have mentioned On Lisp, and then admit that they don't 
support what you've said above.

-- 
Cameron MacKinnon
Toronto, Canada
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5htc7$fq9$2@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> Pascal Costanza wrote:
> 
>> You have already decided against reading Paul Graham's "On Lisp". 
>> What else can we do?
>> 
>> Here's a suggestion: Read Paul Graham's "On Lisp".
> 
> I don't have time to do everyone's homework here when they 
> misunderstand my clearly written posts. Why don't you use google 
> groups to review my postings that have mentioned On Lisp, and then 
> admit that they don't support what you've said above.

You mean like
http://groups.google.com/groups?selm=ldOdnTbUt6uiu-HdRVn-ug%40golden.net
in which you wrote:

> I think the time has come for me to learn macrology. The standard 
> text seems to be On Lisp, but I worry that if I only read that, I'll 
> end up defending defmacro just like all the other Luddites.
> 
> What are the best Scheme flavoured resources?

Maybe I have exaggerated things a bit, but it seems that you have
already made up your mind...


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5hido$o9n$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> I'm not insisting on anything that hasn't been advocated before. I 
> certainly didn't come up with the idea that defmacro has flaws - it's a 
> staple of conversation on c.l.l. I've been lurking here quite long 
> enough to see the trends and the arguments. This particular problem is 
> one that wizards have seen fit to solve. But, rather than admit that CL 
> could perhaps learn a trick, old-timers feel it necessary to downplay 
> the problem.

Take a look at common-lisp.net - does this give you the impression that 
people are not interested in extending Common Lisp?

> I haven't yet seen an old-timer claim that defmacro's problems don't 
> exist. There's always an adjective or a modifier, such as "in practice", 
> "significant", "major". Further on in your post, you call it a "very 
> minor problem." So does it exist, or doesn't it? And if it is such a 
> minor problem, how come it keeps coming up on c.l.l again and again and 
> again.

c.l.l is just a newsgroup. The fact that there doesn't exist a widely 
used hygienic macro system for Common Lisp is a much more interesting 
data point.

> Is CL perfect? Are there ANY problems which, if mentioned in c.l.l, 
> won't get a number of responses to the effect that the OP is imagining 
> things?

The CL standard is powerful enough so that extensions can be easily 
implemented on top of it. One important thing that is missing in the 
ANSI CL standard is, for example, a system definition facility. There 
exist two widely used defsystems, asdf and mk-defsystem. This is an 
example of a feature that Common Lispers think of as important. And 
guess what? It exists as a widely used library.

Again, the fact that a hygienic macro system doesn't exist for Common 
Lisp to the same degree as asdf, mk-defsystem, screamer, series, etc., 
should tell you that Common Lispers don't think of this as an important 
issue. See http://www.cliki.net/Library or 
http://www.common-lisp.net/projects.shtml for listings of things that 
they do find important.

> I like the idea of Scheme's small spec. It encourages multiple 
> implementations. There's several Schemes that run on the Java VM, for 
> example.

Which revision of the various RnRS? Including full call/cc and 
space-safe tail recursion? What SRFIs do they support? Are their 
implementations of syntax-rules bug-free?

It seems to me that the number of Scheme implementations that are 
actually faithful to the standard and provide a sufficiently broad range 
of language extensions for practical needs is relatively low in 
comparison to the vast number of toy implementations.

> Once a language specification gets beyond a certain size, Language 
> Lawyers become necessary, and people become overwhelmed by the spec. 
> There have been lots of threads on c.l.l where a person, an old-timer 
> even, has had to ask for help because he couldn't find what he needed in 
> the spec, even though it was there. Just recently, the group couldn't 
> agree on how many namespaces CL has, even though namespace is used as a 
> specific technical term in the standard.

Other people also don't agree on how many angels fit on the tip of a needle. 
This doesn't make it an important question.

> For every problem, there's people who will loudly say that the problem 
> doesn't exist and other people who will say it's not the highest priority.

Again, this is just a newsgroup. No one is stopping you from providing a 
hygienic macro system for Common Lisp if it has such a high priority for 
you.

> The CL community has come to believe that standardization has to be a 
> big, lengthy, costly and traumatic experience. This becomes a 
> self-fulfilling prophesy, so people say the standard isn't ripe for 
> modernization, notwithstanding that it excludes networking, Unicode, 
> etcetera. The vendors know that the customers want and need these 
> features, so they implement (incompatible versions of) them. And that, 
> of course, will just add that much more to the pain when these features 
> eventually become standardized. Every day, people are writing 
> nonstandard code, thus increasing the cost of the position they'll 
> protect when the standards committee is struck.

The ANSI CL standard is not being changed because the standardization 
process costs a huge amount of time and money. The benefits are not 
considered worthwhile enough, especially because the Common Lisp 
community lacks the financial resources, probably mainly due to its lack 
of popularity.

Nothing prevents you from actually being productive with Common Lisp 
implementations, thanks to the efforts of the vendors and the various 
open source projects.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <Q4SdnQuJCKRVx-HdRVn-jA@golden.net>
Pascal Costanza wrote:

> The CL standard is powerful enough so that extensions can be easily 
> implemented on top of it. One important thing that is missing in the 
> ANSI CL standard is, for example, a system definition facility. There 
> exist two widely used defsystems, asdf and mk-defsystem. This is an 
> example of a feature that Common Lispers think of as important. And 
> guess what? It exists as a widely used library.

Good point.

>> Just recently, the group 
>> couldn't agree on how many namespaces CL has, even though namespace is 
>> used as a specific technical term in the standard.
> 
> Other people also don't agree how many angels fit the tip of a needle. 
> This doesn't make it an important question.

Well, nobody said "the standard says there are x namespaces" even though 
the standard is deliberate and clear in its use of the word namespace, 
and even though "the standard says..." is usually accorded definitive 
status in this group. So why wouldn't anyone have used it to buttress 
their argument? And the question was debated on c.l.l, so someone 
thought it was important enough to debate.

> The ANSI CL standard is not being changed because the standardization 
> process costs a huge amount of time and money. The benefits are not 
> considered worthwhile enough, especially because the Common Lisp 
> community lacks the financial resources, probably mainly due to its lack 
> of popularity.

I don't think you and I disagree that much here. But the REASON that the 
process costs a huge amount of time and money is that everybody's got an 
old codebase full of extensions to CL to defend and maintain 
compatibility with. The vigour of the defense is commensurate with the 
size of the codebase, so this problem will be even bigger next year.

Do you have reason to believe that, sometime in the future, the desire 
to standardize will suddenly become greater than the annually increasing 
cost? Because, if I'm judging present trends correctly, cost is growing 
faster than desire.

> Nothing prevents you from actually being productive with Common Lisp 
> implementations, thanks to the efforts of the vendors and the various 
> open source projects.

I'm certainly not arguing otherwise.


-- 
Cameron MacKinnon
Toronto, Canada
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5ht6c$fq9$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> Pascal Costanza wrote:
> 
>> The ANSI CL standard is not being changed because the
>> standardization process costs a huge amount of time and money. The
>> benefits are not considered worthwhile enough, especially because
>> the Common Lisp community lacks the financial resources, probably
>> mainly due to its lack of popularity.
> 
> I don't think you and I disagree that much here. But the REASON that
> the process costs a huge amount of time and money is that everybody's
> got an old codebase full of extensions to CL to defend and maintain 
> compatibility with.

Again, I wonder where you get this from. Kent Pitman was involved in the
ANSI standardization and has explained the reasons in a number of
postings to c.l.l. They seem to paint a very different picture. But
maybe you know more about this than me. If you don't, just google for 
the relevant postings before drawing any conclusions.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <Wc2dnScqLIDl0ODdRVn-sQ@golden.net>
Pascal Costanza wrote:
> 
> Cameron MacKinnon wrote:
>> I don't think you and I disagree that much here. But the REASON that
>> the process costs a huge amount of time and money is that everybody's
>> got an old codebase full of extensions to CL to defend and maintain 
>> compatibility with.
> 
> Again, I wonder where you get this from. Kent Pitman was involved in the
> ANSI standardization and has explained the reasons in a number of
> postings to c.l.l. They seem to paint a very different picture. But
> maybe you know more about this than me. If you don't, just google for 
> the relevant postings before drawing any conclusions.

I've now read Kent's posts about ANSI being a monopoly and charging 
accordingly. But I don't need ANSI's imprimatur to give the standard 
legitimacy. So why use ANSI? Are there other reasons, in an age of 
ubiquitous and cheap global text, voice and video telecommunications, 
why the standardization process needs to be expensive?

Scheme's SRFI process and the IETF's RFC process show that 
standardization doesn't need to be hugely time consuming, process-bound 
or expensive.

-- 
Cameron MacKinnon
Toronto, Canada
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5jjbs$s0s$1@f1node01.rhrz.uni-bonn.de>
Cameron MacKinnon wrote:

> I've now read Kent's posts about ANSI being a monopoly and charging 
> accordingly. But I don't need ANSI's imprimatur to give the standard 
> legitimacy. So why use ANSI? Are there other reasons, in an age of 
> ubiquitous and cheap global text, voice and video telecommunications, 
> why the standardization process needs be expensive?
> 
> Scheme's SRFI process and the IETF's RFC process show that 
> standardization doesn't need to be hugely time consuming, process-bound 
> or expensive.

I have understood "evolving the CL standard" as a reference to the ANSI 
standard. Sorry for misinterpreting your statement.

Yes, the SRFI process is a good idea IMHO.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Jon Boone
Subject: Re: scheme seems neater
Date: 
Message-ID: <BD5D77E6.2A51%ipmonger@comcast.net>
On 2004-04-14 10:40, in article ······················@golden.net, "Cameron
MacKinnon" <··········@clearspot.net> wrote:

> 
> Scheme's SRFI process and the IETF's RFC process show that
> standardization doesn't need to be hugely time consuming, process-bound
> or expensive.

  You would be referring to the RFCs that define how the Internet operates
on a daily basis, such as the one for Border Gateway Protocol Version 4 - which
has never become a full standard.  The latest RFC on BGP-4 (1771) is still only
a Draft Standard after almost 10 years of use.

  So, you were saying that IETF-style standardization need not be costly or
time consuming?  Oh, yeah, or process bound.

  Perhaps you could spend some time at the IETF and then report back to us.

--jon
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4k70jyy28.fsf@tarn.inf.ed.ac.uk>
Cameron MacKinnon <··········@clearspot.net> writes:

> >> Just recently, the group couldn't agree on how many namespaces CL
> >> has, even though namespace is used as a specific technical term in
> >> the standard.

> > Other people also don't agree how many angels fit the tip of a
> > needle. This doesn't make it an important question.

> Well, nobody said "the standard says there are x namespaces" even
> though the standard is deliberate and clear in its use of the word
> namespace, and even though "the standard says..." is usually accorded
> definitive status in this group.

The question being discussed was not "how many namespaces does
the CL standard say there are in Common Lisp, in the sense of
`namespace' defined in the standard?"

> So why wouldn't anyone have used it to buttress their argument?

Why would they?  I would refer to the standard if the question
was about how programs or implementations had to behave in order
to conform to the standard; but it wasn't.  The standard is
not the last word on what words mean when discussing programming
languages, especially when not all of the languages are
Common Lisp.

> > The ANSI CL standard is not being changed because the
> > standardization process costs a huge amount of time and money. The
> > benefits are not considered worthwhile enough, especially because
> > the Common Lisp community lacks the financial resources, probably
> > mainly due to its lack of popularity.
> 
> I don't think you and I disagree that much here. But the REASON that
> the process costs a huge amount of time and money is that everybody's
> got an old codebase full of extensions to CL to defend and maintain
> compatibility with.

It costs too much regardless.  I used to have funding and time
to go to X3J13 meetings.  I couldn't get any funding for that
now even if the whole thing could be done quickly as standardizations
go; and other people are going to be in the same situation.

Maybe if we somehow hooked it up to XML ... :(

-- jd
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2had1f8smm.fsf@vserver.cs.uit.no>
Cameron MacKinnon <··········@clearspot.net> writes:

> I'm not insisting on anything that hasn't been advocated before. I
> certainly didn't come up with the idea that defmacro has flaws -
> it's a staple of conversation on c.l.l. I've been lurking here quite
> long enough to see the trends and the arguments. This particular
> problem is one that wizards have seen fit to solve. But, rather than
> admit that CL could perhaps learn a trick, old-timers feel it
> necessary to downplay the problem.

Heh, that's an interesting usage of "wizards" for expert schemers and
"feeling old-timers" for expert CL-ers.

Nothing stops anyone from writing some new and improved kind of
defmacro for CL and making it available for others to use. Perhaps it's
even been done, but to my knowledge no such thing has had any
substantial popularity. Maybe because no one who actually uses the
defmacro we've got has much trouble with it.

> I haven't yet seen an old-timer claim that defmacro's problems don't
> exist. There's always an adjective or a modifier, such as "in
> practice", "significant", "major". Further on in your post, you call
> it a "very minor problem." So does it exist, or doesn't it?

It's a "very minor problem" in the sense that it's an issue that one
needs to be aware of, and which might bite you if you aren't. This in
common with probably hundreds of similar issues. I don't really see
that any language mechanism is going to change those issues with
defmacro substantially, although they might be shifted somewhat.
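[To make the capture issue concrete for newer readers, here is the classic
scenario along with the conventional GENSYM idiom that avoids it. SWAP-BAD
and SWAP-GOOD are made-up names, purely for illustration:]

```lisp
;; A macro whose expansion introduces the literal symbol TMP:
(defmacro swap-bad (a b)
  `(let ((tmp ,a))
     (setf ,a ,b)
     (setf ,b tmp)))

;; (let ((tmp 1) (x 2)) (swap-bad tmp x)) fails to swap, because the
;; macro's TMP shadows the caller's variable of the same name.

;; The standard CL idiom: generate a fresh, uninterned symbol instead.
(defmacro swap-good (a b)
  (let ((tmp (gensym "TMP")))
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))
```

[This is the discipline being alluded to above: the pitfall is real, but
mechanical to avoid once one knows the GENSYM convention.]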

> And if it is such a minor problem, how come it keeps coming up on
> c.l.l again and again and again. Don't try to blame this latest
> flare-up on me, either. I certainly didn't start this thread, nor
> jump into it until it had been going on for a long time.

I'm not blaming anything on anyone; if I find some thread to be a
waste of bandwidth or worse, I certainly would not participate in it.

There are many issues that come up here again and again, such as
"lisp is a slow, interpreted language with only one data-structure"
and whatnot. This shouldn't tell you much about the speed of Lisp.

> I suspect that, had there been a context-aware macro system in use
> in some Lisp for several years before the CL standard was adopted,
> the standard would have incorporated it. But because the progress
> came later, CL's defenders seek to minimize it as a solution in
> search of a problem.

Then again, maybe there's an _actual reason_ why no one had developed
such a macro system for a lisp-2 at the time, or later. For example,
the fact that there is no problem in the first place. I'm not really
big on Lisp history, but I'd venture to say that at the time of CL
standardization, there's the evidence of 2-3 decades of programming
with defmacro as we know it without any improved macro system catching
on, even in a community very eager and able to experiment with
precisely this kind of meta-linguistic construct.

> Is CL perfect? Are there ANY problems which, if mentioned in c.l.l,
> won't get a number of responses to the effect that the OP is
> imagining things?

How many problems with CL have you mentioned in cll, exactly? And of
these, how many do you have zero first-hand experience with, as is the
case with defmacro?

> Is defmacro a sublanguage, or is it straight Lisp?

It is quite straight Lisp. This fact is IMHO one of the essential
beauties of Lisp.

> I like the idea of Scheme's small spec. It encourages multiple
> implementations. There's several Schemes that run on the Java VM,
> for example.

Well, this viewpoint is one I have long since accepted I will never,
ever fathom.

> Once a language specification gets beyond a certain size, Language
> Lawyers become necessary, and people become overwhelmed by the spec.

Well, I for one consider it more of a treasure chest where I sometimes
find even better ways of doing things, although this occurs less and
less frequently as I become more proficient. For me, programming is a
serious business, and I would neither want nor expect to avoid being
overwhelmed upon being introduced to the primary tool of this trade.

> There have been lots of threads on c.l.l where a person, an
> old-timer even, has had to ask for help because he couldn't find
> what he needed in the spec, even though it was there. Just recently,
> the group couldn't agree on how many namespaces CL has, even though
> namespace is used as a specific technical term in the standard.

You can't seriously hold up that discussion as some sort of problem
with CL or its spec.

> For every problem, there's people who will loudly say that the
> problem doesn't exist and other people who will say it's not the
> highest priority.

Really? I don't quite see how this is relevant to anything. I'd be
very happy to see wonderful general solutions to FFI, multithreading,
networking, whatever. I just don't know that they exist.

> The CL community has come to believe that standardization has to be
> a big, lengthy, costly and traumatic experience. This becomes a
> self-fulfilling prophesy, so people say the standard isn't ripe for
> modernization, notwithstanding that it excludes networking, Unicode,
> etcetera.

Well, you don't just "modernize" a language. There's got to be some
proposed solution that solves some problem, and with which there's
been positive experiences across implementations and so on. In the
case of hygienic macros, there's absolutely none of any of this.

> The vendors know that the customers want and need these features, so
> they implement (incompatible versions of) them.

I don't know of any vendor that offers some sort of hygienic macro
system.

> And that, of course, will just add that much more to the pain when
> these features eventually become standardized. Every day, people are
> writing nonstandard code, thus increasing the cost of the position
> they'll protect when the standards committee is struck.

How this viewpoint fits into the same brain that just a screenful
earlier stated that

> I like the idea of Scheme's small spec. It encourages multiple
> implementations. There's several Schemes that run on the Java VM,
> for example.

...without either exploding or imploding, I find kind of peculiar.

-- 
Frode Vatvedt Fjeld
From: Karl A. Krueger
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5hrbp$gv0$1@baldur.whoi.edu>
Frode Vatvedt Fjeld <······@cs.uit.no> wrote:
> Nothing stops anyone from writing some new and improved kind of
> defmacro for CL, and make it available for others to use. Perhaps it's
> even been done, but to my knowledge no such thing has had any
> substantial popularity. Maybe because no one that actually uses the
> defmacro we've got has much trouble with it.

Yeah, I don't get it.  If these people want us to use SYNTAX-FOO instead
of DEFMACRO, why don't they write up a portable SYNTAX-FOO, ASDF it, and
post it to CLiki?

I'm new enough that I've got nothing against learning it, but I'm not
likely to spend the effort to learn it if it doesn't run in one of the
Lisps that I use.

-- 
Karl A. Krueger <········@example.edu>
Woods Hole Oceanographic Institution
Email address is spamtrapped.  s/example/whoi/
"Outlook not so good." -- Magic 8-Ball Software Reviews
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <kQ%ec.6553$zj3.6129@newsread3.news.atl.earthlink.net>
Karl A. Krueger wrote:

> Frode Vatvedt Fjeld <······@cs.uit.no> wrote:
> > Nothing stops anyone from writing some new and improved kind of
> > defmacro for CL, and make it available for others to use. Perhaps it's
> > even been done, but to my knowledge no such thing has had any
> > substantial popularity. Maybe because no one that actually uses the
> > defmacro we've got has much trouble with it.
>
> Yeah, I don't get it.  If these people want us to use SYNTAX-FOO instead
> of DEFMACRO, why don't they write up a portable SYNTAX-FOO, ASDF it, and
> post it to CLiki?

The point is that if the majority of the community holds a negative
impression about such systems, there won't be much interest in developing
them in the first place, and very little incentive to do so.  Further, if
that negative impression is based on questionable facts or questionable
interpretations of the facts, the community may in fact be closing its eyes
to something that could be more useful than it realizes.

The current big subthread started when I took issue with a counter-factual
claim, and called it a myth.  My desire to correct erroneous claims doesn't
extend to a desire to implement a sophisticated macro system for CL just to
prove my point.

Anton
From: Wade Humeniuk
Subject: Re: scheme seems neater
Date: 
Message-ID: <K40fc.2832$dg7.1667@edtnps84>
Karl A. Krueger wrote:
> Frode Vatvedt Fjeld <······@cs.uit.no> wrote:
> 
>>Nothing stops anyone from writing some new and improved kind of
>>defmacro for CL, and make it available for others to use. Perhaps it's
>>even been done, but to my knowledge no such thing has had any
>>substantial popularity. Maybe because no one that actually uses the
>>defmacro we've got has much trouble with it.
> 
> 
> Yeah, I don't get it.  If these people want us to use SYNTAX-FOO instead
> of DEFMACRO, why don't they write up a portable SYNTAX-FOO, ASDF it, and
> post it to CLiki?
> 

The real reason they do not is that they know it is not as powerful.  DEFMACRO
is more than just a basic macro (substitution) system; it is a built-in user interface
to the compiler.  This allows totally new languages to be defined by allowing user
"pre-compilation" of Lisp-like expressions.  To be powerful it has to be dangerous,
as it directly plays with the nuts and bolts of a Lisp system.  Fortunately
most Lispers are not afraid of their Lisps.

Wade
From: B.B.
Subject: Re: scheme seems neater
Date: 
Message-ID: <DoNotSpamthegoat4-76F45D.22312513042004@library.airnews.net>
In article <···················@edtnps84>,
 Wade Humeniuk <····································@telus.net> wrote:

[...]

>The real reason they do not is that they know it is not as powerful.  DEFMACRO
>is more than just a basic macro (substitution) system, it is a builtin user 
>interface
>to the compiler.  This allows totally new languages to be defined by allowing 
>user
>"pre-compilation" of Lisp like expressions.  To be powerful it has to 
>dangerous
>as it directly plays with the nuts and bolts of a Lisp system.  Fortunately
>most Lispers are not afraid of their Lisps.

   I'm curious: what dangerous things can defmacro do?  I'm still 
learning it and the only thing I've done BAD so far is write an 
abomination that reduced to something like

(defmacro alpha ()
  '(beta))

(defmacro beta ()
  '(alpha))

(macroexpand '(alpha))

   So is that the kind of thing you're talking about?  Or is it 
something more evil like mangling a built-in facility?

-- 
B.B.           --I am not a goat!       thegoat4 at airmail.net
    Fire the stupid--Vote.
From: Wade Humeniuk
Subject: Re: scheme seems neater
Date: 
Message-ID: <wh4fc.1639$aD.1490@edtnps89>
B.B. wrote:
> 
>    I'm curious: what dangerous things can defmacro do?  I'm still 
> learning it and the only thing I've done BAD so far is write an 
> abomination that reduced to something like
> 
> (defmacro alpha ()
>   '(beta))
> 
> (defmacro beta ()
>   '(alpha))
> 
> (macroexpand '(alpha))
> 
>    So is that the kind of thing you're talking about?  Or is it 
> something more evil like mangling a built-in facility?
> 

Yes, that is one case where one is not protected from oneself,

though it's hardly dangerous...

CL-USER 1 > (defmacro alpha ()
   '(beta))

(defmacro beta ()
   '(alpha))

(macroexpand '(alpha))
ALPHA

CL-USER 2 >
BETA

CL-USER 3 >
Stack overflow (stack size 16000).
   1 (abort) Return to level 0.
   2 Return to top loop level 0.

Type :b for backtrace, :c <option number> to proceed,  or :? for other options

CL-USER 4 : 1 >

Much of the talk has been about the "dangers" of "inadvertent variable
capture".  As defmacro has the full functionality of the Lisp image
it is evaluated in, it is only as dangerous as the Lisp image.
If one does not see the Lisp image as dangerous then one will not see
defmacro as dangerous.  But if one sees the Lisp image as "too complex",
"not type safe", "too expressive", "not pure enough", "unhygienic", "messy",
then defmacro can become a focal point for one's fears.  Little
attempts are then made to bring "the beast" under control by changing
things like defmacro.  But one discovers that such a little
attempt is actually an attempt to tame the whole "wild beast".

Wade
From: Ray Dillinger
Subject: Re: scheme seems neater
Date: 
Message-ID: <407CF1D9.D7B8F234@sonic.net>
Wade Humeniuk wrote:

> Much of the talk has been about the "dangers" of "inadvertent variable
> capture".  As defmacro has the full functionality of the Lisp image
> it is evaluated in, then it is only as dangerous as the Lisp image.
> If one does not see the Lisp image as dangerous then one will not see
> defmacro as dangerous.  But if one sees the Lisp image as "too complex",
> "not type safe", "too expressive", "not pure enough", "unhygienic", "messy"
> then defmacro can become a focal point for one's fears. 

These are mischaracterizations.  For example, CL is in fact hygienic 
without its macros.  Defmacro is not a focus of vague fears caused 
by something else here; it is the sole source of this issue.

Similarly, it is entirely possible to write pure-functional code in 
CL, although it requires more verbosity and ugliness than it does 
in Scheme because you always have to jump through more hoops to call
a value or pass a function as a procedure due to the namespace 
duplication.  And *both* languages are too expressive if either of 
them is, since they have about the same expressiveness.  While a lot 
of the scheme community *does* regard CL (even without macros) as 
needlessly complicated and therefore "messy", that is not generally 
laid at the feet of its macrology; it arises from the redundant 
functions that CL coders have to use to manage their unnecessary 
multiple namespaces, and the additional contortions required in CL 
to do functional programming, again due to its double namespace. 

You sound like a man with an agenda to try and make CL'ers feel like
they're being insulted, or to try to pretend that those who like any
other lisp besides CL are deluded.  In fact neither is the case. 
What is true is that each camp looks at the other with preconceptions
that value its own language's strengths more;  That is why there is 
more than one camp in the first place, and more than one language.  
People value different strengths differently. 

define-syntax was developed as a method of "solving" the hygiene 
problem for a closer modeling of lambda calculus.  It did that 
fine; a hygienic macro means the same thing no matter what variables
are defined in the code that calls it.  But because intentional 
variable capture can in fact be useful in some circumstances, 
there are also syntax-case and defmacro, which allow intentional 
variable capture.
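[A standard textbook illustration of the hygiene being described here, in
Scheme's syntax-rules; this is not code from the thread:]

```scheme
;; MY-OR's expansion binds a temporary T, yet a caller's own T is safe:
;; syntax-rules renames the macro's T during expansion.
(define-syntax my-or
  (syntax-rules ()
    ((_) #f)
    ((_ e) e)
    ((_ e1 e2 ...) (let ((t e1)) (if t t (my-or e2 ...))))))

;; (let ((t 5)) (my-or #f t)) evaluates to 5, not #f; an unhygienic
;; expansion would have captured the caller's T and returned #f.
```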

Because all three macro systems are readily available for scheme, 
CL's macrology is in practice a strict subset of scheme's macrology. 
That said, it's annoying to have three different systems for doing
essentially the same thing; it's ironic that a bunch of people who 
thought multiple namespaces and most looping primitives cluttered 
up a language needlessly resort to using multiple macrologies.  In 
all fairness only one macrology is required by the standard, but 
whatever. At least one and usually both of the other two are provided
by the implementations. 

				Bear
From: Wade Humeniuk
Subject: Re: scheme seems neater
Date: 
Message-ID: <qSafc.17368$mn3.16082@clgrps13>
Ray Dillinger wrote:

> 
> These are mischaracterizations.  For example, CL is in fact hygienic 
> without its macros.  Defmacro is not a focus of vague fears caused 
> by something else here; it is the sole source of this issue.
> 
> Similarly, it is entirely possible to write pure-functional code in 
> CL, although it requires more verbosity and ugliness than it does 
> in Scheme because you always have to jump through more hoops to call
> a value or pass a function as a procedure due to the namespace 
> duplication.  And *both* languages are too expressive if either of 
> them is, since they have about the same expressiveness.  While a lot 
> of the scheme community *does* regard CL (even without macros) as 
> needlessly complicated and therefore "messy", that is not generally 
> laid at the feet of its macrology; it arises from the redundant 
> functions that CL coders have use to manage their unnecessary 
> multiple namespaces, and the additional contortions required in CL 
> to do functional programming, again due to its double namespace. 
> 

Is that "dangerous"? :)

> You sound like a man with an agenda to try and make CL'ers feel like
> they're being insulted, or to try to pretend that those who like any
> other lisp besides CL are deluded.  In fact neither is the case. 
> What is true is that each camp looks at the other with preconceptions
> that value its own language's strengths more;  That is why there is 
> more than one camp in the first place, and more than one language.  
> People value different strengths differently. 
> 

They are too smart for that, and I did not say that.  Is a man with
an agenda dangerous?  It is interesting that you call them preconceptions
when many have spent many years *learning*.  Cannot one have values
based on experience and thoughtful choice?  You make it sound like
they are prejudiced, tribal camps whose choices are based on irrational
forces.

> 
> Because all three macro systems are readily available for scheme, 
> CL's macrology is in practice a strict subset of scheme's macrology. 
> That said, it's annoying to have three different systems for doing
> essentially the same thing; it's ironic that a bunch of people who 
> thought multiple namespaces and most looping primitives cluttered 
> up a language needlessly resort to using multiple macrologies.  In 
> all fairness only one macrology is required by the standard, but 
> whatever. At least one and usually both of the other two are provided
> by the implementations. 

I guess they cannot make up their "collective" mind (whatever that is).

Wade
From: Paul F. Dietz
Subject: Re: scheme seems neater
Date: 
Message-ID: <142dnerPvrw6oODdRVn-hg@dls.net>
Wade Humeniuk wrote:

>  You make it sound like
> they are prejudiced, tribal camps whose choices are based on irrational
> forces.

"Remember, kids: people who disagree with you are not only wrong, they're evil!"

	Paul
From: Joe Marshall
Subject: Re: scheme seems neater
Date: 
Message-ID: <r7uqlk95.fsf@ccs.neu.edu>
"Paul F. Dietz" <·····@dls.net> writes:

> Wade Humeniuk wrote:
>
>>  You make it sound like
>> they are prejudiced, tribal camps whose choices are based on irrational
>> forces.
>
> "Remember, kids: people who disagree with you are not only wrong, they're evil!"

Corollary #1, for liberals:  They probably don't understand why they
are wrong.  Explain it to them.

Corollary #1, for conservatives:  Evil must be destroyed.
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <A41fc.6629$zj3.4784@newsread3.news.atl.earthlink.net>
Wade Humeniuk wrote:

> Karl A. Krueger wrote:

> > Yeah, I don't get it.  If these people want us to use SYNTAX-FOO instead
> > of DEFMACRO, why don't they write up a portable SYNTAX-FOO, ASDF it, and
> > post it to CLiki?
> >
>
> The real reason they do not is that they know it is not as powerful.
> DEFMACRO is more than just a basic macro (substitution) system, it is a
> builtin user interface to the compiler.  This allows totally new languages
> to be defined by allowing user "pre-compilation" of Lisp like expressions.
> To be powerful it has to dangerous as it directly plays with the nuts and
> bolts of a Lisp system.  Fortunately most Lispers are not afraid of their
> Lisps.

Syntax-case is essentially an extensible compiler front-end, which can act
globally on source code, and is aware of global context.  As such, it is
more powerful than DEFMACRO, which only acts locally.  This was covered in
recent threads, I think "three macro systems" or perhaps "Scheme macros".

Anton
From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fz7jwj9f6p.fsf@germany.igpm.rwth-aachen.de>
"Anton van Straaten" <·····@appsolutions.com> writes:
> Syntax-case is essentially an extensible compiler front-end, which can act
> globally on source code, and is aware of global context.  As such, it is
> more powerful than DEFMACRO, which only acts locally.

*Laugh!*

All your arguments have been countered successfully. Now you keep
making that claim like a mantra, in the hope that it comes true. Very
funny indeed.
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5ir31$5gl$1@newsreader2.netcologne.de>
Mario S. Mommer wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
> 
>>Syntax-case is essentially an extensible compiler front-end, which can act
>>globally on source code, and is aware of global context.  As such, it is
>>more powerful than DEFMACRO, which only acts locally.
[...]

> All your arguments have been countered successfully. Now you keep
> making that claim like a mantra, in the hope that it comes true. Very
> funny indeed.

Not quite. syntax-rules/syntax-case use, and keep track of, global 
information about variables in order to achieve what they achieve. This 
would require a code walker in Common Lisp. I still don't understand 
whether one can make use of the "built-in" code walker of syntax-case 
for one's own purposes, though. That would be interesting in some cases.

However, environment objects also keep track of global information in 
some Common Lisp implementations. I don't see why Anton keeps ignoring 
this bit. (environment objects are not part of ANSI Common Lisp, but 
syntax-case is also not part of R5RS...)


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Duane Rettig
Subject: Re: scheme seems neater
Date: 
Message-ID: <4isg2eemm.fsf@franz.com>
Pascal Costanza <········@web.de> writes:

> Not quite. syntax-rule/syntax-case use, and keep track of, global
> information about variables in order to achieve what they
> achieve. This would require a code walker in Common Lisp. I still
> don't understand whether one can make use of the "built-in" code
> walker of syntax-case for one's own purposes, though. That would be
> interesting in some cases.
> 
> 
> However, environment objects also keep track of global information in
> some Common Lisp implementations. I don't see why Anton keeps ignoring
> this bit. (environment objects are not part of ANSI Common Lisp, but
> syntax-case is also not part of R5RS...)

Environment objects are very much a part of ANSI Common Lisp.  Much is
written about these things in the spec, and there are quite a few
functions that accept environment objects as arguments (and there is
syntax available for all macros to accept these objects as arguments
during their macroexpansions).

What is missing from the CL spec is transparency of these objects.
Environment objects are opaque as CL is defined, though there was an
effort (as evidenced by "Syntactic-Environment-Access", described in
http://www.lispworks.com/reference/HyperSpec/Issues/iss343_w.htm
and in a snapshot given in CLtL2, pp 207-214), later retracted, to
provide such visibility into environments at macroexpand time.

As you may know, we at Franz Inc are working on what will eventually
become an open-source version of this access functionality.  But I
guess I shouldn't be talking about that in this discussion, because
that might upset the mental model of those who believe that CL isn't
evolving anymore :-)

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <1Mefc.7245$zj3.4553@newsread3.news.atl.earthlink.net>
Duane Rettig wrote:
> As you may know, we at Franz Inc are working on what will eventually
> become an open-source version of this access functionality.

I'm glad to hear that.  Have you ever given consideration to implementing a
hygienic macro system, or do you feel the market is completely uninterested?

> But I
> guess I shouldn't be talking about that in this discussion, because
> that might upset the mental model of those who believe that CL isn't
> evolving anymore :-)

I think we should set up a race to see whether R6RS can be released before
the next CL standard.  It'll be a close one (mathematical infinities being
what they are)...

Anton
From: Duane Rettig
Subject: Re: scheme seems neater
Date: 
Message-ID: <41xmqe7rm.fsf@franz.com>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Duane Rettig wrote:
> > As you may know, we at Franz Inc are working on what will eventually
> > become an open-source version of this access functionality.
> 
> I'm glad to hear that.  Have you ever given consideration to implementing a
> hygienic macro system, or do you feel the market is completely uninterested?

Hygienic system, or syntax-case?

My foremost goal in implementing the environments-access system is
to stay as close as possible to the original goal outlined in the committee
proposal (and described in CLtL2).  It has required some enhancements,
but the interface is for the most part recognizable as it stands.

As representing a CL vendor, I invite you to buy our product for an
early start, or else wait for the opensource version to come out,
and to determine whether these environments accessors are adequate for
implementing syntax-case on top of it.  Obviously, if it is not possible
to do this, then that might point to holes in the strategy for such
access.

If you still wish to pursue this, but do not want to bother using Allegro
CL, I can always be reached by email and we can discuss requirements.

> > But I
> > guess I shouldn't be talking about that in this discussion, because
> > that might upset the mental model of those who believe that CL isn't
> > evolving anymore :-)
> 
> I think we should set up a race to see whether R6RS can be released before
> the next CL standard.  It'll be a close one (mathematical infinities being
> what they are)...

Are you telling me that Cameron MacKinnon is wrong after all, and
that Scheme has also stopped evolving?

:-)

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Jens Axel Søgaard
Subject: Re: scheme seems neater
Date: 
Message-ID: <407d84e5$0$210$edfadb0f@dread11.news.tele.dk>
Duane Rettig wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:

>>I think we should set up a race to see whether R6RS can be released before
>>the next CL standard.  It'll be a close one (mathematical infinities being
>>what they are)...

> Are you telling me that Cameron MacKinnon is wrong after all, and
> that Scheme has also stopped evolving?

The committee is still out on that question.

-- 
Jens Axel Søgaard
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <IOhfc.8075$zj3.4159@newsread3.news.atl.earthlink.net>
Duane Rettig wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Duane Rettig wrote:
> > > As you may know, we at Franz Inc are working on what will eventually
> > > become an open-source version of this access functionality.
> >
> > I'm glad to hear that.  Have you ever given consideration to
> > implementing a hygienic macro system, or do you feel the market is
> > completely uninterested?
>
> Hygienic system, or syntax-case?

Either, I was just curious about a vendor perspective on the question, since
vendors often have a broader perspective than individual users.

> My foremost goal in implementing the environments-access system is
> to stay as close as possible to the original goal outlined in the
> committee proposal (and described in CLtL2).  It has required some
> enhancements, but the interface is for the most part recognizable as
> it stands.
>
> As representing a CL vendor, I invite you to buy our product for an
> early start, or else wait for the opensource version to come out,
> and to determine whether these environments accessors are adequate for
> implementing syntax-case on top of it.  Obviously, if it is not possible
> to do this, then that might point to holes in the strategy for such
> access.
>
> If you still wish to pursue this, but do not want to bother using Allegro
> CL, I can always be reached by email and we can discuss requirements.

Thanks.  I have a technical interest here, not a commercial one, and if it
becomes a burning desire, I'll let you know.  Scheme-on-CL, mmm....  :)

> > > But I
> > > guess I shouldn't be talking about that in this discussion, because
> > > that might upset the mental model of those who believe that CL isn't
> > > evolving anymore :-)
> >
> > I think we should set up a race to see whether R6RS can be released
> > before the next CL standard.  It'll be a close one (mathematical
> > infinities being what they are)...
>
> Are you telling me that Cameron MacKinnon is wrong after all, and
> that Scheme has also stopped evolving?

Scheme, and Scheme's standard, are two different things, which seems to be
similar to the case with CL.  The timescales connoted by the word
"evolution" seem appropriate for both language standards...

Anton
From: Bulent Murtezaoglu
Subject: Re: scheme seems neater
Date: 
Message-ID: <87smf63291.fsf@cubx.internal>
>>>>> "DR" == Duane Rettig <·····@franz.com> writes:
[...]
    DR> As you may know, we at Franz Inc are working on what will
    DR> eventually become an open-source version of this access
    DR> functionality.  [...]

I didn't know that.  Short googling and clicking around at franz.com 
didn't yield anything useful.  Is there a link you could point us to?

cheers,

BM
From: Duane Rettig
Subject: Re: scheme seems neater
Date: 
Message-ID: <465c2e8ci.fsf@franz.com>
Bulent Murtezaoglu <··@acm.org> writes:

> >>>>> "DR" == Duane Rettig <·····@franz.com> writes:
> [...]
>     DR> As you may know, we at Franz Inc are working on what will
>     DR> eventually become an open-source version of this access
>     DR> functionality.  [...]
> 
> I didn't know that.  Short googling and clicking around at franz.com 
> didn't yield anything useful.  Is there a link you could point us to?

Only previous discussions:

http://groups.google.com/groups?q=g:thl1854010111d&dq=&hl=en&lr=&ie=UTF-8&selm=4smmmkyhu.fsf%40beta.franz.com

and prior work by Howard Stearns:

http://groups.google.com/groups?selm=36E82309.26704DBA%40elwood.com&output=gplain


-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <4Jefc.7238$zj3.1991@newsread3.news.atl.earthlink.net>
Pascal Costanza wrote:
>
> Mario S. Mommer wrote:
>
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> >
> >>Syntax-case is essentially an extensible compiler front-end, which
> >>can act globally on source code, and is aware of global context.  As
> >>such, it is more powerful than DEFMACRO, which only acts locally.
[...]
> However, environment objects also keep track of global information in
> some Common Lisp implementations. I don't see why Anton keeps ignoring
> this bit.

I'm not ignoring it, I just don't think it competes, in the sense that
syntax-case has this knowledge integrated into its system.  I don't know
enough to say whether you could implement, say, syntax-rules using defmacro
& environment objects, but it would probably match the syntax-case
implementation in complexity.  I know the author of the syntax-rules port
for CL didn't seem to think it was possible.

Anton
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsd66b55lu.fsf@knatte.nada.kth.se>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Wade Humeniuk wrote:
> 
> > Karl A. Krueger wrote:
> 
> > > Yeah, I don't get it.  If these people want us to use SYNTAX-FOO instead
> > > of DEFMACRO, why don't they write up a portable SYNTAX-FOO, ASDF it, and
> > > post it to CLiki?
> > >
> >
> > > The real reason they do not is that they know it is not as powerful.
> > > DEFMACRO is more than just a basic macro (substitution) system, it
> > > is a builtin user interface to the compiler.  This allows totally
> > > new languages to be defined by allowing user "pre-compilation" of
> > > Lisp-like expressions.  To be powerful it has to be dangerous, as
> > > it directly plays with the nuts and bolts of a Lisp system.
> > > Fortunately most Lispers are not afraid of their Lisps.
> 
> Syntax-case is essentially an extensible compiler front-end, which can act
> globally on source code, and is aware of global context.  As such, it is
> more powerful than DEFMACRO, which only acts locally.  This was covered in
> recent threads, I think "three macro systems" or perhaps "Scheme macros".

Can you give an example of a macro which is possible to write using
syntax-case, but impossible with defmacro?


Björn
From: Joe Marshall
Subject: Re: scheme seems neater
Date: 
Message-ID: <8ygymzdb.fsf@ccs.neu.edu>
·······@nada.kth.se (Björn Lindberg) writes:

> Can you give an example of a macro which is possible to write using
> syntax-case, but impossible with defmacro?

That isn't the issue.

The issue is whether you can write a set of composable macros that
interact correctly with other composable macros.  For example, suppose
I write something like this:

  (loop for element in list
        do (with-foo (element)
             (when (magic-p element) (loop-finish))
             (hack element))
        finally (clean-up))

in order for this to work, WITH-FOO cannot itself expand into a LOOP
expression (or, in fact, into any code that might eventually expand
into a LOOP).
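
To make the hazard concrete, here is a sketch; the original post leaves
WITH-FOO unspecified, so this definition is invented:

```lisp
;; Hypothetical definition of WITH-FOO whose expansion contains a LOOP.
(defmacro with-foo ((var) &body body)
  (declare (ignore var))
  `(loop repeat 1            ; invented internal retry/iteration logic
         do (progn ,@body)))

;; With this definition, (LOOP-FINISH) in the user's body now refers to
;; WITH-FOO's inner LOOP, the innermost lexically enclosing one, so it
;; terminates that loop instead of the user's outer LOOP.
```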
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5jjvv$s0u$1@f1node01.rhrz.uni-bonn.de>
Joe Marshall wrote:

> ·······@nada.kth.se (Björn Lindberg) writes:
> 
>>Can you give an example of a macro which is possible to write using
>>syntax-case, but impossible with defmacro?
> 
> That isn't the issue.
> 
> The issue is whether you can write a set of composable macros that
> interact correctly with other composable macros.  For example, suppose
> I write something like this:
> 
>   (loop for element in list
>         do (with-foo (element)
>              (when (magic-p element) (loop-finish))
>              (hack element))
>         finally (clean-up))
> 
> in order for this to work, WITH-FOO cannot itself expand into a LOOP
> expression (or, in fact, into any code might eventually expand into a
> LOOP).

That's indeed a nasty example. Thanks for providing it.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Paul Dietz
Subject: Re: scheme seems neater
Date: 
Message-ID: <407D5CE5.62C42623@motorola.com>
Pascal Costanza wrote:
> 
> Joe Marshall wrote:

> > The issue is whether you can write a set of composable macros that
> > interact correctly with other composable macros.  For example, suppose
> > I write something like this:
> >
> >   (loop for element in list
> >         do (with-foo (element)
> >              (when (magic-p element) (loop-finish))
> >              (hack element))
> >         finally (clean-up))
> >
> > in order for this to work, WITH-FOO cannot itself expand into a LOOP
> > expression (or, in fact, into any code might eventually expand into a
> > LOOP).
> 
> That's indeed a nasty example. Thanks for providing it.


WITH-FOO can expand into a form that contains loops,
but the macro argument forms can't go into that loop
(they could be located outside, in flets, and called
from within the loop.)
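
A minimal sketch of that arrangement (the WITH-FOO definition and its
internal iteration are invented for illustration):

```lisp
;; WITH-FOO may contain a LOOP, as long as the user's body forms live
;; in an FLET defined outside it.  The FLET body is then lexically
;; inside the *user's* enclosing LOOP, so LOOP-FINISH there still
;; refers to the outer loop.
(defmacro with-foo ((var) &body body)
  (let ((fn (gensym "BODY")))
    `(flet ((,fn (,var) ,@body))
       (loop repeat 3        ; hypothetical internal iteration
             do (,fn ,var)))))
```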

This does suggest that LOOP should have had a 'named LOOP-FINISH'
feature (similar to the way you can change its block name.)

	Paul
From: Matthias
Subject: Re: scheme seems neater
Date: 
Message-ID: <36w65c1iq5o.fsf@hundertwasser.ti.uni-mannheim.de>
Pascal Costanza <········@web.de> writes:

> Joe Marshall wrote:
> > ·······@nada.kth.se (Björn Lindberg) writes:
> >
> >>Can you give an example of a macro which is possible to write using
> >>syntax-case, but impossible with defmacro?
> > That isn't the issue.
> > The issue is whether you can write a set of composable macros that
> > interact correctly with other composable macros.  For example, suppose
> > I write something like this:
> >   (loop for element in list
> >         do (with-foo (element)
> >              (when (magic-p element) (loop-finish))
> >              (hack element))
> >         finally (clean-up))
> > in order for this to work, WITH-FOO cannot itself expand into a LOOP
> > expression (or, in fact, into any code might eventually expand into a
> > LOOP).
> 
> That's indeed a nasty example. Thanks for providing it.

How would one solve or circumvent such problems in practice?  Assume I
build a CL-library and later want to change the implementation of
with-foo.

Matthias
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5lka7$j9v$1@newsreader2.netcologne.de>
Matthias wrote:

>>>The issue is whether you can write a set of composable macros that
>>>interact correctly with other composable macros.  For example, suppose
>>>I write something like this:
>>>  (loop for element in list
>>>        do (with-foo (element)
>>>             (when (magic-p element) (loop-finish))
>>>             (hack element))
>>>        finally (clean-up))
>>>in order for this to work, WITH-FOO cannot itself expand into a LOOP
>>>expression (or, in fact, into any code might eventually expand into a
>>>LOOP).
>>
>>That's indeed a nasty example. Thanks for providing it.
> 
> How would one solve or circumvent such problems in practice?  Assume I
> build a CL-library and later want to change the implementation of
> with-foo.

One obvious solution on the user side is this:

(loop for element in list
       do (flet ((my-loop-finish () (loop-finish)))
            (with-foo (element)
              (when (magic-p element) (my-loop-finish))
              (hack element)))
       finally (clean-up))

I don't think there is a straightforward way to solve this in the 
WITH-FOO macro, except by implementing a code walker that replaces 
symbols according to what Schemers call hygiene.

However, you could add a check in WITH-FOO whether the body form 
mentions LOOP-FINISH and generate a warning accordingly.
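
A rough sketch of that check, assuming a naive tree walk over the
unexpanded body (MENTIONS-LOOP-FINISH-P and the WITH-FOO expansion are
invented; the walk cannot see LOOP-FINISH introduced by later
macroexpansion):

```lisp
;; Naive scan of a source form for the symbol LOOP-FINISH.
(defun mentions-loop-finish-p (form)
  (typecase form
    (cons (or (mentions-loop-finish-p (car form))
              (mentions-loop-finish-p (cdr form))))
    (symbol (eq form 'loop-finish))))

;; Hypothetical WITH-FOO that warns before capturing LOOP-FINISH.
(defmacro with-foo ((var) &body body)
  (declare (ignore var))
  (when (mentions-loop-finish-p body)
    (warn "LOOP-FINISH in WITH-FOO body will terminate WITH-FOO's ~
           internal LOOP, not an enclosing one."))
  `(loop repeat 1 do (progn ,@body)))
```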


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hbrlt7f4o.fsf@vserver.cs.uit.no>
Pascal Costanza <········@web.de> writes:

> [..] However, you could add a check in WITH-FOO whether the body
> form mentions LOOP-FINISH and generate a warning accordingly.

I think the easiest thing is to avoid using loop in macro-expansions,
at least not as a wrapper around some body forms. The way I see it,
loop is all about convenient surface-syntax (which is why it's liberal
with slightly irregular syntax and implicit operator bindings); its
role is not to be used as a building block for other control
operators.

I believe the only other operator that has implicit operator bindings,
such as loop's loop-finish, is defmethod which installs
call-next-method and next-method-p. Can anyone think of other such
cases? With-hash-table-iterator also sets up operator bindings, but in
this case they are explicitly named.

Yes, this might be construed as a problem that needs fixing, but I
don't find that to be the case.
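
For reference, the explicit naming Frode mentions looks like this in
standard CL (the table contents are invented):

```lisp
;; WITH-HASH-TABLE-ITERATOR asks the user to name the generator macro
;; explicitly, so there is no implicitly bound operator to capture.
(let ((table (make-hash-table)))
  (setf (gethash :a table) 1
        (gethash :b table) 2)
  (with-hash-table-iterator (next-entry table)
    (loop (multiple-value-bind (more? key value) (next-entry)
            (unless more? (return))
            (format t "~S -> ~S~%" key value)))))
```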

-- 
Frode Vatvedt Fjeld
From: Joe Marshall
Subject: Re: scheme seems neater
Date: 
Message-ID: <y8oxr1wg.fsf@comcast.net>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:

> I believe the only other operator that has implicit operator bindings,
> such as loop's loop-finish, is defmethod which installs
> call-next-method and next-method-p. Can anyone think of other such
> cases? With-hash-table-iterator also sets up operator bindings, but in
> this case they are explicitly named.

The `implicit block' that is set up around various macros may
unexpectedly bind the block name NIL.  Again, as in the loop example:

(loop for element in list
      do (with-foo (element)
            (when (null element) (return))
            ..more processing...))

The RETURN is meant to exit the loop, but it might return to an
implicit block if WITH-FOO uses DO (or anything that expands to such).

-- 
~jrm
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2h3c757dei.fsf@vserver.cs.uit.no>
Joe Marshall <·············@comcast.net> writes:

> The `implicit block' that is set up around various macros may
> unexpectedly bind the block name NIL.  [..]

I agree that this is more of an issue, in the sense that one needs to
be aware of it when writing and using macros. However, any with-foo
macro, and certainly iterating macros, are expected to document their
use of the block scope. This issue is implicit in the `implicit block'
feature of CL, and I don't think any hygienic macro system would be
able to resolve this, because it can't possibly detect what the macro
programmer "really means" when it comes to providing a nil block or
not.

-- 
Frode Vatvedt Fjeld
From: Joe Marshall
Subject: Re: scheme seems neater
Date: 
Message-ID: <u0zlr1tu.fsf@comcast.net>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:

> I think the easiest thing is to avoid using loop in macro-expansions,
> at least not as a wrapper around some body forms. 

How?  You may not explicitly use loop, but are you going to fully
expand every macro that you use to check that it doesn't use LOOP
somewhere?

-- 
~jrm
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2h7jwh7do2.fsf@vserver.cs.uit.no>
Joe Marshall <·············@comcast.net> writes:

> How?  You may not explicitly use loop, but are you going to fully
> expand every macro that you use to check that it doesn't use LOOP
> somewhere?

Well, of the probably thousands of macros I depend on every day, I'm
pretty confident none of them expands in terms of loop, so long as I
don't do it myself.

-- 
Frode Vatvedt Fjeld
From: Matthias
Subject: Re: scheme seems neater
Date: 
Message-ID: <36wad1da4t4.fsf@hundertwasser.ti.uni-mannheim.de>
Pascal Costanza <········@web.de> writes:

> Matthias wrote:
>> How would one solve or circumvent such problems in practice?  Assume
>> I
>> build a CL-library and later want to change the implementation of
>> with-foo.
> 
> One obvious solution on the user side is this:
> 
> (loop for element in list
>        do (flet ((my-loop-finish () (loop-finish)))
>             (with-foo (element)
>               (when (magic-p element) (my-loop-finish))
>               (hack element)))
>        finally (clean-up))
> 
> I don't think there is a straightforward way to solve this in the
> WITH-FOO macro, except by implementing a code walker that replaces
> symbols according to what Schemers call hygiene.
> 
> However, you could add a check in WITH-FOO whether the body form
> mentions LOOP-FINISH and generate a warning accordingly.

But when I implement with-foo I don't know if its users are using it
within a loop, specifically, or within other macros which may interact
nastily with it or any of the macros with-foo is built on top of.

So in essence, it is the macro user's responsibility to macro-expand
every call completely and manually remove every name clash that may
arise.  And the user has to repeat this process every time some
non-trivial macro in some library code has changed.

On the other hand, one can in practice circumvent this potential
maintenance disaster by rigorous unit testing.  

Then again, if I sell a library and my users' code breaks
because I changed some "internals" it's still ugly.

---
PS: Thanks for these code snippets!  I think they helped me better 
    understand some issues discussed in this thread.
From: ·········@random-state.net
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5ls2s$7ecml$1@midnight.cs.hut.fi>
Matthias <··@spam.pls> wrote:

> But when I implement with-foo I don't know if its users are using it
> within a loop, specifically, or within other macros which may interact
> nastily with it or any of the macros with-foo is built on top of.

> So in essence, it is the macro user's responsibility to macro-expand
> every call completely and manually remove every name clash that may
> arise.  And the user has to repeat this process every time some
> non-trivial macro in some library code has changed.

Bull and FUD.

It's the macro-writer's responsibility to deal with this,
specifically in this case by not putting user code inside LOOPs,
or alternatively by documenting this and noting the loop-finish
"feature". Same goes for implicit blocks and tagbodies in macros:
if they are there, they should be documented.

So instead of

 `(loop ... ,@body ...)

you have

 (with-gensyms (body)
   `(flet ((,body () ,@body))
      (loop ... (,body) ...)))

simple as that.
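
Filled out, the pattern might look like this (WITH-GENSYMS as commonly
defined, e.g. in utility libraries; WITH-FOO and its internal
iteration are invented):

```lisp
;; A common WITH-GENSYMS utility, included for self-containedness.
(defmacro with-gensyms ((&rest names) &body body)
  `(let ,(mapcar (lambda (n) `(,n (gensym ,(string n)))) names)
     ,@body))

;; The user's body is compiled in an FLET *outside* the internal LOOP,
;; so LOOP-FINISH in it refers to whatever LOOP encloses the call site.
(defmacro with-foo (&body body)
  (with-gensyms (fn)
    `(flet ((,fn () ,@body))
       (loop repeat 3        ; hypothetical internal iteration
             do (,fn)))))
```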

Otoh, I do agree with the point that loop-finish should accept the
loop name as an optional argument. 

I also do think that having a hygienic macro system for CL would
be an interesting _experiment_, but my intuition is also that
syntax-rules and syntax-case as such would not be a good match for
Common Lisp. 

Cheers,

  -- Nikodemus
From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fzisg1lag9.fsf@germany.igpm.rwth-aachen.de>
·········@random-state.net writes:
> So instead of
>
>  `(loop ... ,@body ...)
>
> you have
>
>  (with-gensyms (body)
>    `(flet ((,body () ,@body))
>       (loop ... (,body) ...)))
>
> simple as that.

What a nice solution :)

So with the above technique, 

   (loop for element in list
         do (with-foo (element)
              (when (magic-p element) (loop-finish))
              (hack element))
         finally (clean-up))

the with-foo can use loop freely. Great!

Could this technique be used to guarantee "hygiene" everywhere?
From: Thomas F. Burdick
Subject: Re: scheme seems neater
Date: 
Message-ID: <xcvu0zl7zyv.fsf@famine.OCF.Berkeley.EDU>
Pascal Costanza <········@web.de> writes:

> Matthias wrote:
> 
> >>>The issue is whether you can write a set of composable macros that
> >>>interact correctly with other composable macros.  For example, suppose
> >>>I write something like this:
> >>>  (loop for element in list
> >>>        do (with-foo (element)
> >>>             (when (magic-p element) (loop-finish))
> >>>             (hack element))
> >>>        finally (clean-up))
> >>>in order for this to work, WITH-FOO cannot itself expand into a LOOP
> >>>expression (or, in fact, into any code might eventually expand into a
> >>>LOOP).
> >>
> >>That's indeed a nasty example. Thanks for providing it.
> > 
> > How would one solve or circumvent such problems in practice?  Assume I
> > build a CL-library and later want to change the implementation of
> > with-foo.
> 
> One obvious solution on the user side is this:
> 
> (loop for element in list
>        do (flet ((my-loop-finish () (loop-finish)))
>             (with-foo (element)
>               (when (magic-p element) (my-loop-finish))
>               (hack element)))
>        finally (clean-up))
> 
> I don't think there is a straightforward way to solve this in the 
> WITH-FOO macro, except by implementing a code walker that replaces 
> symbols according to what Schemers call hygiene.

You were so close!  The solution for the macro writer is your user
solution turned inside-out:

  (defmacro do-with-foo ((var foos) &body body)
    (let ((=fn (gensym))
          (=foos (gensym)))
      `(flet ((,=fn (,var) ,@body))
         (let ((,=foos ,foos))
           (loop for element in (list-foos ,=foos)
               do (,=fn element))))))

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5n7t2$km9$2@newsreader2.netcologne.de>
Thomas F. Burdick wrote:

> You were so close!  The solution for the macro writer is your user
> solution turned inside-out:
> 
>   (defmacro do-with-foo ((var foos) &body body)
>     (let ((=fn (gensym))
>           (=foos (gensym)))
>       `(flet ((,=fn (,var) ,@body))
>          (let ((,=foos ,foos))
>            (loop for element in (list-foos ,=foos)
>                do (,=fn element))))))

Yep, you're right. I gave up too early. ;)


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5r93m$mib$1@newsreader2.netcologne.de>
Pascal Costanza wrote:
> 
> Thomas F. Burdick wrote:
> 
>> You were so close!  The solution for the macro writer is your user
>> solution turned inside-out:
>>
>>   (defmacro do-with-foo ((var foos) &body body)
>>     (let ((=fn (gensym))
>>           (=foos (gensym)))
>>       `(flet ((,=fn (,var) ,@body))
>>          (let ((,=foos ,foos))
>>            (loop for element in (list-foos ,=foos)
>>                do (,=fn element))))))
> 
> 
> Yep, you're right. I gave up too early. ;)

Here is another example that I can think of that makes hygiene hard to 
achieve in CL. Assume you are using a third-party library that provides 
the following macro:

(defmacro with-two-references (&body body)
   `(let ((this 'some-value)
          (that 'some-other-value))
      ,@body))

You want to use it in your own macro, but only want to expose the 
binding (!) of THIS to user code, but not the binding of THAT. I can 
only see how to expose either none or both.

The following exposes none:

(defmacro with-frobnication-0 (&body body)
   (let ((fbody (gensym)))
     `(flet ((,fbody () ,@body))
        (with-two-references
          (,fbody)))))

The next exposes both:

(defmacro with-frobnication-2 (&body body)
   `(with-two-references
      ,@body))

Any ideas?


Pascal

P.S.: I regard this only as an intellectual puzzle, since I am not 
convinced yet that this is likely to occur in practice. Too many things 
have to accidentally happen at the same time here. (-> third-party 
library, original author cannot be contacted, need to expose the 
binding, not just the value, and so on...)

P.P.S.: What would a syntax-case solution look like for this case?

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Brian Mastenbrook
Subject: Re: scheme seems neater
Date: 
Message-ID: <170420041249184765%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@newsreader2.netcologne.de>, Pascal Costanza
<········@web.de> wrote:

> Here is another example that I can think of that makes hygiene hard to 
> achieve in CL. Assume you are using a third-party library that provides 
> the following macro:
> 
> (defmacro with-two-references (&body body)
>    `(let ((this 'some-value)
>           (that 'some-other-value))
>       ,@body))
> 
> You want to use it in your own macro, but only want to expose the 
> binding (!) of THIS to user code, but not the binding of THAT. I can 
> only see how to expose either none or both.
> 
> The following exposes none:
> 
> (defmacro with-frobnication-0 (&body body)
>    (let ((fbody (gensym)))
>      `(flet ((,fbody () ,@body))
>         (with-two-references
>           (,fbody)))))
> 
> The next exposes both:
> 
> (defmacro with-frobnication-2 (&body body)
>    `(with-two-references
>       ,@body))
> 
> Any ideas?

(asdf-install:install :common-idioms)
(require :common-idioms)
(use-package :common-idioms)

(defmacro with-frobnication-1 (&body body)
  (with-gensyms (env-name)
    `(let-env* ,env-name (that)
       (with-two-references
         (,env-name ,@body)))))

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5s43c$d9t$2@newsreader2.netcologne.de>
Brian Mastenbrook wrote:

>>Any ideas?
> 
> (asdf-install:install :common-idioms)
> (require :common-idioms)
> (use-package :common-idioms)
> 
> (defmacro with-frobnication-1 (&body body)
>   (with-gensyms (env-name)
>     `(let-env* ,env-name (that)
>        (with-two-references
>          (,env-name ,@body)))))

Cool.

BTW, I have checked out the common idioms website and it's not 
completely clear to me from the documentation what is portable and what 
is not. IIRC from earlier discussions, let-env* is portable, right?


Pascal

From: Brian Mastenbrook
Subject: Re: scheme seems neater
Date: 
Message-ID: <180420041356550069%NOSPAMbmastenbNOSPAM@cs.indiana.edu>
In article <············@newsreader2.netcologne.de>, Pascal Costanza
<········@web.de> wrote:

> Cool.
> 
> BTW, I have checked out the common idioms website and it's not 
> completely clear to me from the documentation what is portable and what 
> is not. IIRC from earlier discussions, let-env* is portable, right?
> 
> 
> Pascal

Yes, it is. let-env* is the version which requires that you supply the
captures at the let-env* site. let-env does this at the env application
site, which requires jumping through some non-portable hoops to pass
this information around at macroexpand time.

Sorry for the confusion.

-- 
Brian Mastenbrook
http://www.cs.indiana.edu/~bmastenb/
From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fzu0zihft0.fsf@germany.igpm.rwth-aachen.de>
Pascal Costanza <········@web.de> writes:
> Here is another example that I can think of that makes hygiene hard to
> achieve in CL. Assume you are using a third-party library that
> provides the following macro:
>
> (defmacro with-two-references (&body body)
>    `(let ((this 'some-value)
>           (that 'some-other-value))
>       ,@body))
>
> You want to use it in your own macro, but only want to expose the
> binding (!) of THIS to user code, but not the binding of THAT. I can
> only see how to expose either none or both.
>
> The following exposes none:
>
> (defmacro with-frobnication-0 (&body body)
>    (let ((fbody (gensym)))
>      `(flet ((,fbody () ,@body))
>         (with-two-references
>           (,fbody)))))
>
> The next exposes both:
>
> (defmacro with-frobnication-2 (&body body)
>    `(with-two-references
>       ,@body))
>
> Any ideas?

(without-having-tested

(defmacro with-frobnication-0ne (&body body)
    (let ((fbody (gensym)))
      `(flet ((,fbody (this) ,@body))
          (with-two-references
            (,fbody this)))))

)

Am I missing something?
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5rjep$d0i$1@newsreader2.netcologne.de>
Mario S. Mommer wrote:

> (without-having-tested
> 
> (defmacro with-frobnication-0ne (&body body)
>     (let ((fbody (gensym)))
>       `(flet ((,fbody (this) ,@body))
>           (with-two-references
>             (,fbody this)))))
> 
> )
> 
> Am I missing something?

"Missing" is a strong word. ;)

You are passing the value of THIS, not the binding. When I setf THIS to 
something else, the change isn't seen in the surrounding scope.
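
To spell out the distinction (an untested sketch with made-up names):

```lisp
;; Passing the value: CLOBBER gets a fresh binding of X, so the
;; assignment inside it is invisible to the caller.
(defun clobber (x) (setf x 42))

(let ((this 1))
  (clobber this)
  this)            ; => 1 -- only the value was passed

;; Exposing the binding itself: SYMBOL-MACROLET makes THIS an
;; alias for HIDDEN, so the setf reaches the original binding.
(let ((hidden 1))
  (symbol-macrolet ((this hidden))
    (setf this 42))
  hidden)          ; => 42 -- the binding was exposed
```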



Pascal

From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fzn05ad14o.fsf@germany.igpm.rwth-aachen.de>
Pascal Costanza <········@web.de> writes:
> Mario S. Mommer wrote:
>
>> (without-having-tested
>> (defmacro with-frobnication-0ne (&body body)
>>     (let ((fbody (gensym)))
>>       `(flet ((,fbody (this) ,@body))
>>           (with-two-references
>>             (,fbody this)))))
>> )
>> Am I missing something?
>
> "Missing" is a strong word. ;)
>
> You are passing the value of this, not the binding. When I setf this
> to something else, this isn't seen in the surrounding scope.

Gah, I see. You even marked the "binding" with a !.
From: Kalle Olavi Niemitalo
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ekqly247.fsf@Astalo.kon.iki.fi>
Pascal Costanza <········@web.de> writes:

> You want to use it in your own macro, but only want to expose the
> binding (!) of THIS to user code, but not the binding of THAT.

If the third-party macro took the names of the bindings as
arguments as in (with-two-references (this that) body...),
it would be easy to define a wrapper macro that substituted
gensyms for the names you don't want to expose.
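
Something like this, perhaps (untested; with-two-references* is a
hypothetical variant, not the actual library macro):

```lisp
;; Hypothetical variant of the library macro that takes the
;; binding names as arguments:
(defmacro with-two-references* ((this-name that-name) &body body)
  `(let ((,this-name 'some-value)
         (,that-name 'some-other-value))
     ,@body))

;; The wrapper hides THAT by substituting a gensym for its name,
;; while exposing THIS under its usual name:
(defmacro with-frobnication (&body body)
  `(with-two-references* (this ,(gensym "THAT"))
     ,@body))
```

(An implementation may warn about the unused gensym binding; an
ignorable declaration would silence that.)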

This seems analogous to C: if you provide printf() in a library,
you'd better provide vprintf() too, so that callers can wrap it.
From: John Thingstad
Subject: Re: scheme seems neater
Date: 
Message-ID: <opr6ie0nkdxfnb1n@news.chello.no>
On Wed, 14 Apr 2004 10:02:24 -0400, Joe Marshall <···@ccs.neu.edu> wrote:

> ·······@nada.kth.se (Björn Lindberg) writes:
>
>> Can you give an example of a macro which is possible to write using
>> syntax-case, but impossible with defmacro?
>
> That isn't the issue.
>
> The issue is whether you can write a set of composable macros that
> interact correctly with other composable macros.  For example, suppose
> I write something like this:
>
>   (loop for element in list
>         do (with-foo (element)
>              (when (magic-p element) (loop-finish))
>              (hack element))
>         finally (clean-up))
>
> in order for this to work, WITH-FOO cannot itself expand into a LOOP
> expression (or, in fact, into any code that might eventually expand into a
> LOOP).
>

Using a return within a loop is bad form.
Instead, use named loops:

(loop named outer
      for element in list
      do (with-foo (element)
           (when (magic-p element) (mystuff) (return-from outer nil))
           (hack element))
      finally (clean-up))


-- 
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
From: Joe Marshall
Subject: Re: scheme seems neater
Date: 
Message-ID: <ekqp5fbj.fsf@ccs.neu.edu>
John Thingstad <··············@chello.no> writes:

>
> Using a return within a loop is bad form:
> Instead use named loops as:
> (loop named outer
>        for element in list
>        do (with-foo (element)
>            (when (magic-p element) (mystuff) (return-from outer nil))
>            (hack element))
>        finally (clean-up))

This doesn't do the same thing as the call to (loop-finish).
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5mfo0$78b$1@newsreader2.netcologne.de>
John Thingstad wrote:

>> The issue is whether you can write a set of composable macros that
>> interact correctly with other composable macros.  For example, suppose
>> I write something like this:
>>
>>   (loop for element in list
>>         do (with-foo (element)
>>              (when (magic-p element) (loop-finish))
>>              (hack element))
>>         finally (clean-up))
>>
>> in order for this to work, WITH-FOO cannot itself expand into a LOOP
>> expression (or, in fact, into any code that might eventually expand into a
>> LOOP).
> 
> Using a return within a loop is bad form:
> Instead use named loops as:
> (loop named outer
>       for element in list
>       do (with-foo (element)
>           (when (magic-p element) (mystuff) (return-from outer nil))
>           (hack element))
>       finally (clean-up))

return-from doesn't execute the clean-up form.

BTW, I think this is strange - does anyone know why? There is probably 
some good reason, although I would have expected the finally part to be 
treated like the clean-up form of unwind-protect.


Pascal

From: Edi Weitz
Subject: Re: scheme seems neater
Date: 
Message-ID: <m34qrlds9e.fsf@bird.agharta.de>
On Thu, 15 Apr 2004 19:11:42 +0200, Pascal Costanza <········@web.de> wrote:

> BTW, I think this is strange - does anyone know why? Probably there
> is some good reason, although I have expected the finally part to be
> treated as the clean-up form in unwind-protect.

In 90% of the code I've seen, FINALLY is used together with RETURN. If
clauses like THEREIS or explicit transfers of control couldn't bypass
FINALLY, it would be quite useless, I think - at least for many typical
LOOP idioms.

Edi.
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5n7rl$km9$1@newsreader2.netcologne.de>
Edi Weitz wrote:

> On Thu, 15 Apr 2004 19:11:42 +0200, Pascal Costanza <········@web.de> wrote:
> 
>>BTW, I think this is strange - does anyone know why? Probably there
>>is some good reason, although I have expected the finally part to be
>>treated as the clean-up form in unwind-protect.
> 
> In 90% of the code I've seen FINALLY is used together with RETURN. If
> clauses like THEREIS or explicit transfers of control wouldn't be able
> to bypass FINALLY then it'd be quite useless, I think - at least for
> many typical LOOP idioms.

OK, this makes sense. Thanks for your clarification.


Pascal

From: Edi Weitz
Subject: Re: scheme seems neater
Date: 
Message-ID: <m38ygxdutk.fsf@bird.agharta.de>
On Thu, 15 Apr 2004 19:01:25 +0100, John Thingstad <··············@chello.no> wrote:

> On Wed, 14 Apr 2004 10:02:24 -0400, Joe Marshall <···@ccs.neu.edu> wrote:
>
>> The issue is whether you can write a set of composable macros that
>> interact correctly with other composable macros.  For example,
>> suppose I write something like this:
>>
>>   (loop for element in list
>>         do (with-foo (element)
>>              (when (magic-p element) (loop-finish))
>>              (hack element))
>>         finally (clean-up))
>>
>> in order for this to work, WITH-FOO cannot itself expand into a
>> LOOP expression (or, in fact, into any code that might eventually expand
>> into a LOOP).
>
> Using a return within a loop is bad form:

LOOP-FINISH is part of the ANSI standard and is there especially for
this task. (Although I agree with others that it might have been a
good idea to provide a 'named' form of LOOP-FINISH.)

> Instead use named loops as:
> (loop named outer
>        for element in list
>        do (with-foo (element)
>            (when (magic-p element) (mystuff) (return-from outer nil))
>            (hack element))
>        finally (clean-up))

That's bad advice because the two loops don't do the same thing. Your
loop won't CLEAN-UP.

  * (loop named foo
          for i below 10
          when (oddp i) do (return-from foo i)
          finally (return-from foo 42))

  1
  * (loop named foo
          for i below 10
          when (oddp i) do (loop-finish)
          finally (return-from foo 42))

  42

Cheers,
Edi.
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5iu99$b1e$1@newsreader2.netcologne.de>
Björn Lindberg wrote:

> Can you give an example of a macro which is possible to write using
> syntax-case, but impossible with defmacro?

You didn't ask me, so I won't answer this directly. ;)

One thing which I think is hard to achieve with CL's defmacro is the 
ability to record line numbers with syntax objects, as in Scheme. When a 
macro is expanded, the resulting code is internally augmented with 
information about which macro the code originally stems from. This can 
improve the quality of error messages AFAICS.

With defmacro this becomes harder, because you can reuse the same (eq) 
piece of code in different places.
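
A two-line example of the reuse problem (made up, untested):

```lisp
;; Both occurrences of ,FORM in the expansion are the very same
;; (EQ) cons cell, so a source location attached to that cons
;; would be ambiguous between the two places it appears.
(defmacro twice (form)
  `(progn ,form ,form))
```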


Pascal

From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <FEefc.7234$zj3.1812@newsread3.news.atl.earthlink.net>
Björn Lindberg wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
> > Syntax-case is essentially an extensible compiler front-end, which can
> > act globally on source code, and is aware of global context.  As such,
> > it is more powerful than DEFMACRO, which only acts locally.  This was
> > covered in recent threads, I think "three macro systems" or perhaps
> > "Scheme macros".
>
> Can you give an example of a macro which is possible to write using
> syntax-case, but impossible with defmacro?

That's easy.  Syntax-rules.

Anton
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcs4qrm5eyv.fsf@knatte.nada.kth.se>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Björn Lindberg wrote:
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> > > Syntax-case is essentially an extensible compiler front-end, which
> > > can act globally on source code, and is aware of global context.  As
> > > such, it is more powerful than DEFMACRO, which only acts locally.
> > > This was covered in recent threads, I think "three macro systems" or
> > > perhaps "Scheme macros".
> >
> > Can you give an example of a macro which is possible to write using
> > syntax-case, but impossible with defmacro?
> 
> That's easy.  Syntax-rules.

In what way? Would syntax-rules really be impossible to implement in
CL then? I would appreciate another example, since I'm not too
familiar with syntax-rules.


Björn
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <OrCfc.9956$zj3.4489@newsread3.news.atl.earthlink.net>
Björn Lindberg wrote:
> > > Can you give an example of a macro which is possible to write using
> > > syntax-case, but impossible with defmacro?
> >
> > That's easy.  Syntax-rules.
>
> In what way? Would syntax-rules really be impossible to implement in
> CL then? I would appreciate another example, since I'm not too
> familiar with syntax-rules.

Any example is going to be based on the same underlying capability, which is
that both syntax-case and syntax-rules are aware of a macro's surrounding
context in a way that defmacro alone is not, and they use that ability to
support hygienic and referentially transparent macros.

It's been pointed out that it might be possible to implement syntax-rules
using defmacro and environment objects.  No one's actually done this, though,
and I don't know enough to say how well it would work or what the pros &
cons would be.  However, such an implementation would certainly be more
complex than an equivalent syntax-case implementation, partly because the
defmacro version would have to implement behavior that already exists within
syntax-case.

Anton
From: Erann Gat
Subject: Re: scheme seems neater
Date: 
Message-ID: <gNOSPAMat-A91B98.14191015042004@nntp1.jpl.nasa.gov>
In article <···················@newsread3.news.atl.earthlink.net>,
 "Anton van Straaten" <·····@appsolutions.com> wrote:

> It's been pointed out that it might be possible to implement syntax-rules
> using defmacro and environment objects.  No one's actually done this

Dorai Sitaram claims to have done it:

http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

I'm having trouble downloading his code so I can't evaluate the claim at 
the moment.

E.
From: Simon Adameit
Subject: Re: scheme seems neater
Date: 
Message-ID: <pan.2004.04.15.21.16.30.459295@gmx.net>
On Thu, 15 Apr 2004 15:19:10 -0700, Erann Gat wrote:

> In article <···················@newsread3.news.atl.earthlink.net>,
>  "Anton van Straaten" <·····@appsolutions.com> wrote:
> 
>> It's been pointed out that it might be possible to implement syntax-rules
>> using defmacro and environment objects.  No one's actually done this
> 
> Dorai Sitaram claims to have done it:
> 
> http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html
> 
It seems like he only implemented the rule language but not hygiene. At
least he says so on his site.

"This Common Lisp implementation does not provide hygiene."
From: Björn Lindberg
Subject: Re: scheme seems neater
Date: 
Message-ID: <hcsd668yv42.fsf@tjatte.nada.kth.se>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Björn Lindberg wrote:
> > > > Can you give an example of a macro which is possible to write using
> > > > syntax-case, but impossible with defmacro?
> > >
> > > That's easy.  Syntax-rules.
> >
> > In what way? Would syntax-rules really be impossible to implement in
> > CL then? I would appreciate another example, since I'm not too
> > familiar with syntax-rules.
> 
> Any example is going to be based on the same underlying capability, which is
> that both syntax-case and syntax-rules are aware of a macro's surrounding
> context in a way that defmacro alone is not, and they use that ability to
> support hygienic and referentially transparent macros.

I would just like to see a realistic example of a useful macro that can
be implemented with syntax-rules but not with defmacro.

> It's been pointed out that it might be possible to implement syntax-rules
> using defmacro and environment objects.  No one's actually done this, though,
> and I don't know enough to say how well it would work or what the pros &
> cons would be.  However, such an implementation would certainly be more
> complex than an equivalent syntax-case implementation, partly because the
> defmacro version would have to implement behavior that already exists within
> syntax-case.

Isn't syntax-rules more or less a subset of syntax-case? If so, it is
hardly surprising that it is easier to implement syntax-rules in
syntax-case rather than defmacro. And the syntax-rules implementation
itself is likely more complex than a defmacro implementation, right?


Björn
From: John Thingstad
Subject: Re: scheme seems neater
Date: 
Message-ID: <opr6ie98ksxfnb1n@news.chello.no>
destructuring-bind would be used in the place of syntax-rules.


On Wed, 14 Apr 2004 17:36:37 GMT, Anton van Straaten 
<·····@appsolutions.com> wrote:

> Björn Lindberg wrote:
>> "Anton van Straaten" <·····@appsolutions.com> writes:
>> > Syntax-case is essentially an extensible compiler front-end, which can
>> > act globally on source code, and is aware of global context.  As such,
>> > it is more powerful than DEFMACRO, which only acts locally.  This was
>> > covered in recent threads, I think "three macro systems" or perhaps
>> > "Scheme macros".
>>
>> Can you give an example of a macro which is possible to write using
>> syntax-case, but impossible with defmacro?
>
> That's easy.  Syntax-rules.
>
> Anton
>
>



From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <OkCfc.9950$zj3.6470@newsread3.news.atl.earthlink.net>
John Thingstad wrote:
> destructuring-bind would be used in the place of syntax-rules.

That doesn't give you hygiene and referential transparency.

Anton
From: Cameron MacKinnon
Subject: Re: scheme seems neater
Date: 
Message-ID: <NKSdncm7hbJk1uDdRVn-iQ@golden.net>
Frode Vatvedt Fjeld wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
>>This particular
>>problem is one that wizards have seen fit to solve. But, rather than
>>admit that CL could perhaps learn a trick, old-timers feel it
>>necessary to downplay the problem.
> 
> Heh, that's an interesting usage of "wizards" for expert schemers and
> "feeling old-timers" for expert CL-ers.

I got 'old-timers' from your post. I used 'wizards' to indicate that the 
solution didn't just come from some disaffected person on the fringe, 
but from people central to the community. I wasn't trying to imply 
anything else.


> Well, you don't just "modernize" a language. There's got to be some
> proposed solution that solves some problem, and with which there's
> been positive experiences across implementations and so on. In the
> case of hygienic macros, there's absolutely none of any of this.

Do you agree that there are some things which could benefit from 
standardization? I'm less worried about the standardization of any one 
thing than about the complete lack of will, anywhere, for a 
standardization process.


>>And that, of course, will just add that much more to the pain when
>>these features eventually become standardized. Every day, people are
>>writing nonstandard code, thus increasing the cost of the position
>>they'll protect when the standards committee is struck.
> 
> How this viewpoint fits into the same brain that just a screenful
> earlier stated that
> 
>>I like the idea of Scheme's small spec. It encourages multiple
>>implementations. There's several Schemes that run on the Java VM,
>>for example.
> 
> ...without either exploding or imploding, I find kind of peculiar.

?? I accept CL for what it is: A large standard with lots of bells and 
whistles. There are also advantages to a minimalist approach. But once a 
standard has passed a certain size, there's no sense trying to keep it 
from growing bigger.

-- 
Cameron MacKinnon
Toronto, Canada
From: Tayssir John Gabbour
Subject: Re: scheme seems neater
Date: 
Message-ID: <866764be.0404140120.2f3a6838@posting.google.com>
Cameron MacKinnon <··········@clearspot.net> wrote in message news:<······················@golden.net>...
> But if the computer can solve the problem so that I don't have to worry 
> about it, and if I don't lose any power with the new solution, then my 
> vote is for progress. If a language can help amateurs or newbies avoid 
> mistakes without taking power away from advanced users, why not?
>
> But, oddly enough, it seems that classical Lisp has stopped evolving 
> with CL, while Scheme continues to mutate.

You have made one mistake, and it is a very big one.  You are taking
usenet as some barometer of the lisp community.

If there is one lesson in SICP, it's that programming languages are
programs.  So arguing X and Y about CL vs. Scheme is like arguing
Emacs vs. VI.  (In other words, Emacs is better than VI, same with CL
and Scheme respectively... ahem)  So you can find momentary annoyances
with one that can be made into a mountain, but with lisp you do get
the ability to improve the big ball of mud.

Better than begging Sun or Guido to put in some new loop.  And if you
crack open CLtL2, especially near the end there, there's a lot of cool
shit in lisp that really makes programming soo fucking decent.

Currently yes, you must look at Scheme and lisp to see which one
requires you to build more in order to write the programs you want. 
That may change with your preferred style, or specific programming
task, or whatnot.

Again, if you get yourself thinking usenet is full of violent baboons,
maybe you haven't found the lispers you can identify with.  And
unmoderated usenet will probably always have some fundamental
toxicity.
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4oepvyyk4.fsf@tarn.inf.ed.ac.uk>
Cameron MacKinnon <··········@clearspot.net> writes:

> And if it is such a minor problem, how come it keeps coming up
> on c.l.l again and again and again.

Does it keep coming up again and again?  It didn't used to.

> I suspect that, had there been a context-aware macro system in use in
> some Lisp for several years before the CL standard was adopted, the
> standard would have incorporated it.

Why do you think that?  There were plenty of things that were
in use on some Lisp etc that weren't adopted in CL.  Even things
that had been used in near relatives of Common Lisp were not
always adopted, and others, such as the LOOP of many keywords,
had trouble.

Indeed, context-aware macro systems were in use in some Lisps
for several years before the CL standard was adopted, though
those Lisps happened to be Schemes.

I think the CL standard might have incorporated Scheme-style
macros *if they had been available in Common Lisp*.

CLOS had an implementation in Common Lisp.  LOOP had an implementation
in Common Lisp.  The condition system had an implementation in Common
Lisp.  Hygienic macros didn't.

It wouldn't have been enough if the implementation had appeared
too close to the end of active X3J13 work, but we don't have to
work out how close was too close, because the implementation
never appeared at all.

-- jd
From: Mario S. Mommer
Subject: Re: scheme seems neater
Date: 
Message-ID: <fz3c779f3r.fsf@germany.igpm.rwth-aachen.de>
Jeff Dalton <····@tarn.inf.ed.ac.uk> writes:
> Cameron MacKinnon <··········@clearspot.net> writes:
>
>> And if it is such a minor problem, how come it keeps coming up
>> on c.l.l again and again and again.
>
> Does it keep coming up again and again?

It does. There is always some random troll that claims (without any
proof) that this is actually a real problem.
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <nkXec.6352$zj3.647@newsread3.news.atl.earthlink.net>
Frode Vatvedt Fjeld wrote:

> What is really odd is that schemers, who appear to be obsessed with
> keeping the size of the language spec small and only standardize the
> absolutely most technically fundamental operators[*], eagerly advocate
> the addition of what to me appears to be a whole new sub-language just
> to solve some very minor problem.

If otherwise intelligent people appear to be doing something really odd, you
have to consider the possibility that there might be method in their
madness.  In that case, your perception of oddity may actually be a
reflection of something you don't know.

> [*] "technically fundamental" as opposed to "fundamental", for example
>     because Scheme dismisses the IMHO very fundamental difference
>     between the operator and variable name-spaces, just because it's
>     technically feasible to do so.

It's also technically desirable to do so, if you're interested in
functional-style programming.  Note that functional languages like Haskell
and ML also have a single function/variable namespace, for the same sorts of
reasons.  Both of those languages would suffer a severe negative impact
under a dual-namespace system, as would one of the dominant styles of
programming in Scheme.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4fzb7yxpg.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Frode Vatvedt Fjeld wrote:
> 
> > What is really odd is that schemers, who appear to be obsessed with
> > keeping the size of the language spec small and only standardize the
> > absolutely most technically fundamental operators[*], eagerly advocate
> > the addition of what to me appears to be a whole new sub-language just
> > to solve some very minor problem.
> 
> If otherwise intelligent people appear to be doing something really odd,
> you have to consider the possibility that there might be method in their
> madness.

Or that they're in the grip of an ideology.

That has to be considered too.

> In that case, your perception of oddity may actually be a
> reflection of something you don't know.

So far no one's produced this thing we don't know.

> > [*] "technically fundamental" as opposed to "fundamental", for example
> >     because Scheme dismisses the IMHO very fundamental difference
> >     between the operator and variable name-spaces, just because it's
> >     technically feasible to do so.
> 
> It's also technically desirable to do so, if you're interested in
> functional-style programming.  Note that functional languages like Haskell
> and ML also have a single function/variable namespace, for the same sorts of
> reasons.  Both of those languages would suffer a severe negative impact
> under a dual-namespace system, as would one of the dominant styles of
> programming in Scheme.

I don't agree that it is a severe negative impact.  It is a
slight negative impact in some cases, and in others it makes
the code clearer and easier to read.

-- jd
From: Ray Dillinger
Subject: Re: scheme seems neater
Date: 
Message-ID: <407CF488.A502F054@sonic.net>
Jeff Dalton wrote:
> 
> "Anton van Straaten" <·····@appsolutions.com> writes:

> I don't agree that it is a severe negative impact.  It is a
> slight negative impact in some cases, and in others it makes
> the code clearer and easier to read.

And of course, you therefore program in CL.  

I personally find the duplication of functions caused by CL's 
split namespace to be really annoying, and all the code 
cluttered up with needless funcall and #' is miserable for 
me to read.  A function is just another value, why not treat
it as one?

And of course, I therefore program in scheme, at least when 
I'm doing anything remotely FP.  

Let us take a moment to enjoy the totally unexpected result 
that each of us prefers what we're used to!  :-)

In a world with no shortage of coffee or tea, it is extremely 
silly for a tea drinker to be annoyed that some like coffee, 
or vice versa.  We live in a world with no shortage of scheme
or CL; let's sit down and have a cup of $BEVERAGE together and
then enjoy using our $LISP.

				Bear
From: Paul F. Dietz
Subject: Re: scheme seems neater
Date: 
Message-ID: <WOmdnY7Gq6ZkguDd4p2dnA@dls.net>
Ray Dillinger wrote:

> I personally find the duplication of functions caused by CL's 
> split namespace to be really annoying, and all the code 
> cluttered up with needless funcall and #' is miserable for 
> me to read.

I looked through a 100+K line CL program that I work on.

FUNCALL appeared 349 times.

#' occurred more often.  However, I find it to be beneficial,
not ugly.

> A function is just another value, why not treat
> it as one?

W lk t b bl t lv th vwls n r vrbl nms.

	Paul
From: Erann Gat
Subject: Re: scheme seems neater
Date: 
Message-ID: <gNOSPAMat-1404040745140001@192.168.1.51>
In article <·················@sonic.net>, Ray Dillinger <····@sonic.net> wrote:

> Jeff Dalton wrote:
> > 
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> 
> > I don't agree that it is a severe negative impact.  It is a
> > slight negative impact in some cases, and in others it makes
> > the code clearer and easier to read.
> 
> And of course, you therefore program in CL.  
> 
> I personally find the duplication of functions caused by CL's 
> split namespace to be really annoying, and all the code 
> cluttered up with needless funcall and #' is miserable for 
> me to read.  A function is just another value, why not treat
> it as one?
> 
> And of course, I therefore program in scheme, at least when 
> I'm doing anything remotely FP.  

Not that I really want to get tangled up in this discussion, but I think
it's worth noting for the record that it is a simple matter to write CL
macros that allow CL to emulate a Lisp-1 so that one can, if one chooses
to, write functional-style code in CL without all the funcalls and #''s. 
(And not that it's really relevant, but I think it would be a lot harder
to write the corresponding macros that would allow Scheme to emulate a
Lisp-2.)
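
For instance, here is a crude, untested sketch of the Lisp-1 emulation
idea (FLET1 is a made-up name, not an actual macro from anyone's code):

```lisp
;; Bind one name in both namespaces at once, so the body can
;; write (add 1 2) as well as pass ADD around as a plain value.
(defmacro flet1 ((name value) &body body)
  (let ((v (gensym)))
    `(let ((,v ,value))
       (flet ((,name (&rest args) (apply ,v args)))
         (let ((,name ,v))
           ,@body)))))

;; (flet1 (add #'+)
;;   (list (add 1 2) (funcall add 3 4)))  ; => (3 7)
```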

E.
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <STefc.7268$zj3.1359@newsread3.news.atl.earthlink.net>
Erann Gat wrote:

> Not that I really want to get tangled up in this discussion, but I think
> it's worth noting for the record that it is a simple matter to write CL
> macros that allow CL to emulate a Lisp-1 so that one can, if one chooses
> to, write functional-style code in CL without all the funcalls and #''s.
> (And not that it's really relevant, but I think it would be a lot harder
> to write the corresponding macros that would allow Scheme to emulate a
> Lisp-2.)

In standard Scheme, perhaps.  Some Schemes allow you to redefine the
application operation, though, and that trivially enables emulation of a
Lisp-2.

However, I think this discussion is moot with respect to functional
languages unless someone can point to a primarily functional language that
uses a separate function/value namespace (CL doesn't count as primarily
functional).  That's not a coincidence, it's the natural consequence of the
requirements of functional programming.

Anton
From: Luke Gorrie
Subject: Re: scheme seems neater
Date: 
Message-ID: <lhfzb6v3q3.fsf@dodo.bluetail.com>
"Anton van Straaten" <·····@appsolutions.com> writes:

> However, I think this discussion is moot with respect to functional
> languages unless someone can point to a primarily functional language that
> uses a separate function/value namespace (CL doesn't count as primarily
> functional).  That's not a coincidence, it's the natural consequence of the
> requirements of functional programming.

Erlang?
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <cwffc.7393$zj3.4963@newsread3.news.atl.earthlink.net>
Luke Gorrie wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > However, I think this discussion is moot with respect to functional
> > languages unless someone can point to a primarily functional language that
> > uses a separate function/value namespace (CL doesn't count as primarily
> > functional).  That's not a coincidence, it's the natural consequence of the
> > requirements of functional programming.
>
> Erlang?

Ouch!  I'll have to check it out.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4n05exrwz.fsf@tarn.inf.ed.ac.uk>
Ray Dillinger <····@sonic.net> writes:

> Jeff Dalton wrote:
> > 
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> 
> > I don't agree that it is a severe negative impact.  It is a
> > slight negative impact in some cases, and in others it makes
> > the code clearer and easier to read.
> 
> And of course, you therefore program in CL.  

But it's not why I program in CL.

> I personally find the duplication of functions caused by CL's 
> split namespace to be really annoying, and all the code 
> cluttered up with needless funcall and #' is miserable for 
> me to read.

I find it often makes the code clearer.

Anyway, you shouldn't need #' with lambda these days.

> A function is just another value, why not treat
> it as one?

I do.  And one operation on a function is FUNCALL, just as
one operation on a cons is CAR.  Sometimes having the call
operation written out is helpful; other times it's not,
and I just want to write (F X).  Those "times" happen to
line up fairly well with what happens in CL.

If I want a variable to have a function as its value, I can do that.
Indeed, I often do: with functional arguments.
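
For example (an editorial sketch of the point, not jd's code), a functional
argument lives in the variable namespace and is invoked with FUNCALL:

```lisp
;; The parameter F is an ordinary variable holding a function object;
;; FUNCALL is the explicit call operation on it, just as CAR is an
;; operation on a cons.
(defun apply-twice (f x)
  (funcall f (funcall f x)))

(apply-twice #'1+ 3)                   ; => 5
(apply-twice (lambda (n) (* n n)) 3)   ; => 81
```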

> And of course, I therefore program in scheme, at least when 
> I'm doing anything remotely FP.  
> 
> Let us take a moment to enjoy the totally unexpected result 
> that each of us prefers what we're used to!  :-)

That's a common assumption in cases like this: that people prefer
something just because they're used to it.

But I've programmed a fair amount in Scheme.  I've even implemented
Scheme.  When I had a free hand to design a Lisp years ago, it was
a Lisp-1 very like Scheme.  I was also involved in EuLisp,
another Lisp-1.

I've put in plenty of time in both sorts of languages.

I have used Common Lisp more often, but for a variety of reasons,
not all of which would make me like it.

> In a world with no shortage of coffee or tea, it is extremely 
> silly for a tea drinker to be annoyed that some like coffee, 
> or vice versa.  We live in a world with no shortage of scheme
> or CL; let's sit down and have a cup of $BEVERAGE together and
> then enjoy using our $LISP.

There's much to what you say, but I don't think it all comes
down to random personal preference and what one's used to.

Despite seemingly obvious problems with Lisp-2, it has survived,
and I think that's because there's something good about it,
even though "it's not everyone's cup of tea".

Cheers.

-- jd
From: Frode Vatvedt Fjeld
Subject: Re: scheme seems neater
Date: 
Message-ID: <2hvfk36j7h.fsf@vserver.cs.uit.no>
"Anton van Straaten" <·····@appsolutions.com> writes:

> It's also technically desirable to do so [single namespace], if
> you're interested in functional-style programming.  Note that
> functional languages like Haskell and ML also have a single
> function/variable namespace, for the same sorts of reasons.  Both of
> those languages would suffer a severe negative impact under a
> dual-namespace system, as would one of the dominant styles of
> programming in Scheme.

Could you inform me of what this dominant style of Scheme programming
is, and how it requires single namespace?

-- 
Frode Vatvedt Fjeld
From: Dan Schmidt
Subject: Re: scheme seems neater
Date: 
Message-ID: <uvfk2k5sq.fsf@turangalila.harmonixmusic.com>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:

| "Anton van Straaten" <·····@appsolutions.com> writes:
| 
| > It's also technically desirable to do so [single namespace], if
| > you're interested in functional-style programming.  Note that
| > functional languages like Haskell and ML also have a single
| > function/variable namespace, for the same sorts of reasons.  Both of
| > those languages would suffer a severe negative impact under a
| > dual-namespace system, as would one of the dominant styles of
| > programming in Scheme.
| 
| Could you inform me of what this dominant style of Scheme
| programming is,

He did: functional-style programming.

| and how it requires single namespace?

To be precise, he said that it would 'suffer a severe negative impact' with
two namespaces, not that it requires a single namespace.

We can quibble over just how severe the impact is, but if your style
of programming involves tons of passing around functions and calling
them, it's annoying to have to throw in funcalls everywhere.

For a published example, try translating the code in Structure and
Interpretation of Classical Mechanics into Common Lisp.  You can do it
(I assume), but it would be a lot less elegant to read and write.  (In
fact, they had to tweak Scheme even more to make their system
sufficiently elegant for their purposes.)

(If you care where I'm coming from, I like both Common Lisp and Scheme,
and do most of my lispy programming in Common Lisp and my functional
programming in OCaml.)

-- 
http://www.dfan.org
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5hcum$dmq$1@newsreader2.netcologne.de>
Cameron MacKinnon wrote:

> I haven't started studying macrology yet, so I only know from what 
> I've read. And I have no reason to believe my experience won't 
> eventually be like everyone else's: Learn to avoid the traps, and 
> spend many years with no problems at all.
[...]

> But if the computer can solve the problem so that I don't have to 
> worry about it, and if I don't lose any power with the new solution,
> then my vote is for progress. If a language can help amateurs or 
> newbies avoid mistakes without taking power away from advanced users,
> why not?

You could help newbies much better by not portraying things as much more
dangerous than they actually are - especially things you admit you don't
yet know well enough yourself.

As an experienced macrologist, you will have to know about name capture,
because it's an important concept in your arsenal; otherwise you're
missing an important point. Why lull newbies into a false sense of
safety in this regard?
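
For a concrete example of capture used deliberately (the textbook anaphoric
IF, not Cameron's code), note how the macro intentionally injects the
symbol IT into the caller's scope:

```lisp
;; AIF deliberately captures the symbol IT, binding it to the value of
;; the test form so the branches can refer to it.
(defmacro aif (test then &optional else)
  `(let ((it ,test))
     (if it ,then ,else)))

;; Usage: IT names the non-NIL result of the test.
(aif (position #\a "banana")
     (format nil "first a at ~d" it)
     "no a")                           ; => "first a at 1"
```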

> But, oddly enough, it seems that classical Lisp has stopped evolving 
> with CL, while Scheme continues to mutate.

You are continually using judgmental language. Why didn't you say the
following?

"Classical Lisp has stabilized with CL, which its users can rely on, while
Scheme is a moving target that still struggles to reach a stable basis."

Note that this doesn't reflect my actual opinion! However, I wonder
where you derive your conclusions from.


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <99Xec.6350$zj3.2844@newsread3.news.atl.earthlink.net>
Pascal Costanza wrote:
> As an experienced macrologist, you will have to know about name capture,
> because it's an important concept in your arsenal. Otherwise you're
> missing an important point. Why mislead newbies into a false illusion of
> safety in this regard?

It's worth noting that with syntax-case, name capture is a "safe" operation
in the sense that it requires that the lexical level at which capture
should occur be specified.  You can't accidentally capture a name from a
level other than the one you specify.

Anton
From: Kenny Tilton
Subject: Re: scheme seems neater
Date: 
Message-ID: <7KYec.37799$WA4.31033@twister.nyc.rr.com>
Anton van Straaten wrote:
> Pascal Costanza wrote:
> 
>>As an experienced macrologist, you will have to know about name capture,
>>because it's an important concept in your arsenal. Otherwise you're
>>missing an important point. Why mislead newbies into a false illusion of
>>safety in this regard?
> 
> 
> It's worth noting that with syntax-case, name capture is a "safe" operation
> in the sense that it requires that the lexical level at which capture should
> occur to be specified.  You can't accidentally capture a name from a level
> other than the one you specify.

Can you define level in nice easy terms an application programmer would 
understand? I use variable capture all the time, usually of the 
smalltalkian 'self so central to cells, and my macros and me don't give 
a rat's ass where they capture it, as long as they do.

In the most extreme applications of this I get cases where some function
with a self parameter spawns an instance with a rule which wants to
refer to the spawning self. But every rule gets self inserted into its 
lexical scope by my handy-dandy rule macros, which shadows the spawning 
self, and on rare occasions the rule on the spawned instance wants to 
refer to the spawner. So then somewhere I have to say:

     (let ((spawner self))
         (make-instance 'thing :visible (c? (^enabled spawner))))

Mind you, it would be three minutes work to cook up a version of c? 
which allowed one to specify the variable to which the instance will be 
bound, but it comes up about twice a year so I have never bothered.

CLers are very practical people, you know.

:)

kt

-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <xX0fc.6622$zj3.2546@newsread3.news.atl.earthlink.net>
Kenny Tilton wrote:
>
>
> Anton van Straaten wrote:
...
> > It's worth noting that with syntax-case, name capture is a "safe"
> > operation in the sense that it requires that the lexical level at which
> > capture should occur to be specified.  You can't accidentally capture a
> > name from a level other than the one you specify.
>
> Can you define level in nice easy terms an application programmer would
> understand?

Yeah, sorry, I wasn't very specific.  I'm talking about a situation where,
for example, a macro uses another macro in its definition, and these nested
definitions may continue for a number of levels (it might be recursive).
If, in one of these nested macros, you want to insert an identifier to
capture a variable that was generated by a macro a few levels higher, then
with syntax-case you need to have a syntactic token that came from that
higher level, essentially to prove that you know what you're doing, and of
course to help the macro system do the right thing.

Some might consider this a b&d kind of thing, which to some extent it is.
But in fact, it imposes no cost in simple situations, since the default
behavior is simply to capture from the currently appropriate level.  In more
complex situations, it means you have less to worry about, in exchange for
having to be explicit about what you're trying to capture.

> I use variable capture all the time, usually of the
> smalltalkian 'self so central to cells, and my macros
> and me don't give a rat's ass where they capture it,
> as long as they do.

That's a pretty straightforward situation, where you're not using 'self' for
any other purpose, for example, and you probably aren't nesting those macros
much.  This, of course, is a big reason why defmacro works as well as it
does, because plenty of situations are similarly simple to deal with.

> In the most extreme applications of this I get cases where some function
>   with a self parameter spawns an instance with a rule which wants to
> refer to the spawning self. But every rule gets self inserted into its
> lexical scope by my handy-dandy rule macros, which shadows the spawning
> self, and on rare occasions the rule on the spawned instance wants to
> refer to the spawner. So then somewhere I have to say:
>
>      (let ((spawner self))
>          (make-instance 'thing :visible (c? (^enabled spawner))))
>
> Mind you, it would be three minutes work to cook up a version of c?
> which allowed one to specify the variable to which the instance will be
> bound, but it comes up about twice a year so I have never bothered.

That sounds like it could be a case where syntax-case could be useful.  One
of these days, we'll have to do a Scheme version of Cello...

BTW, what would happen if your macros had to work with some other system of
macros that also used 'self'?  Is there any possibility for undesirable
interaction?  That's the kind of thing which syntax-case makes it easy to
completely rule out - you can design macros which won't break under the
wrong circumstances.

Of course, one reason that's not such a big deal in practice is that
Sourceforge is not full of Lisp projects, and there's not as much mixing and
matching of libraries of macros as there would be in an alternate
Lisp-dominated universe.  The problem with 100% solutions is they can seem
over-the-top right up until the day you really, really need them.

> CLers are very practical people, you know.

This from a guy who's writing a GUI on top of OpenGL?  I'm not quite ready
to concede the practicality of that, even though its coolness is not in
doubt.

Anton
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5irif$6hb$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

> BTW, what would happen if your macros had to work with some other system of
> macros that also used 'self'?  Is there any possibility for undesirable
> interaction?

No, since Kenny's self is in fact cells:self. The self of another macro 
library would be other-package:self. No undesirable interactions here. 
That's what CL's package system was designed for.

> That's the kind of thing which syntax-case makes it easy to
> completely rule out - you can design macros which won't break under the
> wrong circumstances.

Given two third-party Scheme macro libraries whose source code I cannot 
access, and that happen to inject the name "self" into my code for 
different purposes, how does syntax-case ensure that I can disambiguate 
the two different "self"s?


Pascal

From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <9Pefc.7255$zj3.6918@newsread3.news.atl.earthlink.net>
Pascal Costanza wrote:
>
> Anton van Straaten wrote:
>
> > BTW, what would happen if your macros had to work with some other
> > system of macros that also used 'self'?  Is there any possibility for
> > undesirable interaction?
>
> No, since Kenny's self is in fact cells:self. The self of another macro
> library would be other-package:self. No undesirable interactions here.
> That's what CL's package system was designed for.

Thanks.

> > That's the kind of thing which syntax-case makes it easy to
> > completely rule out - you can design macros which won't break under the
> > wrong circumstances.
>
> Given two third-party Scheme macro libraries whose source code I cannot
> access, and that happen to inject the name "self" into my code for
> different purposes, how does syntax-case ensure that I can disambiguate
> the two different "self"s?

Each 'self' is associated with a lexical context specified at the time it
was injected.  There's no ambiguity.

Anton
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5kgkr$g4g$1@newsreader2.netcologne.de>
Anton van Straaten wrote:

>>Given two third-party Scheme macro libraries whose source code I cannot
>>access, and that happen to inject the name "self" into my code for
>>different purposes, how does syntax-case ensure that I can disambiguate
>>the two different "self"s?
> 
> Each 'self' is associated with a lexical context specified at the time it
> was injected.  There's no ambiguity.

We are probably talking about different things here. Here is what I mean 
in Common Lisp:

(defpackage "FOO"
   (:export "TEST" "SELF"))

(defpackage "BAR"
   (:export "TEST" "SELF"))

(in-package "FOO")

(defmacro test (&body body)
   `(let ((self t))
      ,@body))

(in-package "BAR")

(defmacro test (&body body)
   `(let ((self nil))
      ,@body))

(in-package "CL-USER")

? (foo:test
     (bar:test
       (print foo:self)
       (print bar:self)))

T
NIL

I can disambiguate between the two selfs in that example. Of course, 
that's trivial, because packages mainly just rename symbols and give 
several ways to access them. So it's still possible to say the following:

(use-package "FOO")

? (test
     (print self))
T


You seemed to be suggesting that syntax-case lets you disambiguate symbols
in a similar way. However, I can't imagine how this could work.


Pascal

From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <XiCfc.9947$zj3.8664@newsread3.news.atl.earthlink.net>
Pascal Costanza wrote:
>
>
> Anton van Straaten wrote:
>
> >>Given two third-party Scheme macro libraries whose source code I cannot
> >>access, and that happen to inject the name "self" into my code for
> >>different purposes, how does syntax-case ensure that I can disambiguate
> >>the two different "self"s?
> >
> > Each 'self' is associated with a lexical context specified at the time
> > it was injected.  There's no ambiguity.
>
> We are probably talking about different things here.

Yes, you're right.  I was talking about references to the variable injected
by the macro.  Any reference to a variable injected by a syntax-case macro
has to be associated with a lexical context, so that there's no ambiguity
resulting from the context in which the macro is expanded.  This allows you
to have macros that expand to something like:

  (foo:test (bar:test (print self) (print self)))

...where each 'self' could be a different one, depending on which macro
inserted the reference.  The distinction between the different 'self's would
be visible at the syntax object level, or in a suitably decorated source
representation.

Anton
From: Kenny Tilton
Subject: Re: scheme seems neater
Date: 
Message-ID: <h1gfc.17665$mX.5962250@twister.nyc.rr.com>
Anton van Straaten wrote:
> That sounds like it could be a case where syntax-case could be useful.  One
> of these days, we'll have to do a Scheme version of Cello...

Cells would be a nice start on that. Especially if it was not just a 
literal translation and was as Schemish as possible.

> 
> BTW, what would happen if your macros had to work with some other system of
> macros that also used 'self'?

If it was using self in a Smalltalkian way (and it better be) there 
would be no problem. self is in a sense a reserved word in an OO-based 
model-building library. I would worry about any application trying to 
use two model-building hierarchies. Put another way, I would /not/ worry 
about whether Cells worked in such a beast.

>>CLers are very practical people, you know.
> 
> 
> This from a guy who's writing a GUI on top of OpenGL?  I'm not quite ready
> to concede the practicality of that, even though its coolness is not in
> doubt.

Yikes. You think OpenGL is an impractical way of achieving portability?! 
Or did you mean that it has been a wicked learning curve for me: FFI, 
OpenGL, Glut, and that I was lucky to find a nice font lib for OpenGL. 
Or that now I have to also provide a friendly layer for programmers, so 
/they/ do not have to fight it out with OpenGL? Or...?


kt


From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <zShfc.8080$zj3.7653@newsread3.news.atl.earthlink.net>
Kenny Tilton wrote:

>
> Anton van Straaten wrote:
> > This from a guy who's writing a GUI on top of OpenGL?  I'm not quite
> > ready to concede the practicality of that, even though its coolness is
> > not in doubt.
>
> Yikes. You think OpenGL is an impractical way of achieving portability?!
> Or did you mean that it has been a wicked learning curve for me: FFI,
> OpenGL, Glut, and that I was lucky to find a nice font lib for OpenGL.
> Or that now I have to also provide a friendly layer for programmers, so
> /they/ do not have to fight it out with OpenGL? Or...?

Yes.  :)

Anton
From: Kenny Tilton
Subject: ImpractiCello [was Re: scheme seems neater]
Date: 
Message-ID: <Rpnfc.39986$WA4.4873@twister.nyc.rr.com>
Anton van Straaten wrote:
> Kenny Tilton wrote:
> 
> 
>>Anton van Straaten wrote:
>>
>>>This from a guy who's writing a GUI on top of OpenGL?  I'm not quite
>>>ready to concede the practicality of that, even though its coolness is
>>>not in doubt.
>>
>>Yikes. You think OpenGL is an impractical way of achieving portability?!
>>Or did you mean that it has been a wicked learning curve for me: FFI,
>>OpenGL, Glut, and that I was lucky to find a nice font lib for OpenGL.
>>Or that now I have to also provide a friendly layer for programmers, so
>>/they/ do not have to fight it out with OpenGL? Or...?
> 
> 
> Yes.  :)

OpenGL is not portable? Mind you, I have seen the hardcore ogl people 
sweating over which card with which capabilities was installed, but a 
GUI can be done with a pretty small and predictable subset. I hope.

Aside from that, it all depends on where one wants to put their energy. 
I could put my effort into interfacing to gtk (and then I still have to 
do the FFI) and save myself some trouble, but then the framework is not 
in Lisp where users can get to it to customize or debug. So perhaps such 
a project gets off to a quicker start, but then one pays a price 
forever. I would rather work hard for a few months than have to live 
with gtk forever. The glut was a nice compromise in that it provided a 
quick win and can be redone as pure Lisp if the project otherwise succeeds.

As for providing a friendly api for those not strong on opengl, that is 
how I work even if I am the only one using the code. Anything hard ends 
up behind a friendly face. So that is no extra hardship.

kt

From: Bulent Murtezaoglu
Subject: Re: ImpractiCello
Date: 
Message-ID: <87oept2m6o.fsf@cubx.internal>
>>>>> "KT" == Kenny Tilton <·······@nyc.rr.com> writes:
[...]
    KT> OpenGL is not portable? Mind you, I have seen the hardcore ogl
    KT> people sweating over which card with which capabilities was
    KT> installed, but a GUI can be done with a pretty small and
    KT> predictable subset. I hope. [...]

Based on my limited experience, I can tell you that if you stick with OGL
1.1 and read the spec for what is required, you should be OK.  The
people who sweat over capabilities are either doing fancy stuff or
they belong to the "try and see, if it works you did the right thing"
school.  The latter fails often when you move from SW-only to accelerated
GL drivers, as some of the specified limits and such are ignored in SW
implementations but exist in HW.  (Going the other way you lose
performance, of course.)

cheers,

BM
From: Anton van Straaten
Subject: Re: ImpractiCello [was Re: scheme seems neater]
Date: 
Message-ID: <yfCfc.9945$zj3.7781@newsread3.news.atl.earthlink.net>
Kenny Tilton wrote:

> >>Yikes. You think OpenGL is an impractical way of achieving portability?!
> >>Or did you mean that it has been a wicked learning curve for me: FFI,
> >>OpenGL, Glut, and that I was lucky to find a nice font lib for OpenGL.
> >>Or that now I have to also provide a friendly layer for programmers, so
> >>/they/ do not have to fight it out with OpenGL? Or...?
> >
> >
> > Yes.  :)
>
> OpenGL is not portable? Mind you, I have seen the hardcore ogl people
> sweating over which card with which capabilities was installed, but a
> GUI can be done with a pretty small and predictable subset. I hope.
>
> Aside from that, it all depends on where one wants to put their energy.
> I could put my effort into interfacing to gtk (and then I still have to
> do the FFI) and save myself some trouble, but then the framework is not
> in Lisp where users can get to it to customize or debug. So perhaps such
> a project gets off to a quicker start, but then one pays a price
> forever. I would rather work hard for a few months than have to live
> with gtk forever. The glut was a nice compromise in that it provided a
> quick win and can be redone as pure Lisp if the project otherwise
> succeeds.
>
> As for providing a friendly api for those not strong on opengl, that is
> how I work even if I am the only one using the code. Anything hard ends
> up behind a friendly face. So that is no extra hardship.

I think I should give you some more rope and see what other problems you
mention.  :)

No, really, all I meant is that writing any new GUI from scratch (even with
what OpenGL gives you) seems ambitious, and not necessarily what most people
would consider "practical".  To some extent, whether something is practical
depends on how well it succeeds - in hindsight, it may seem practical, where
at first, few other than the people doing it may think it's practical.

Anton
From: Kenny Tilton
Subject: Re: ImpractiCello [was Re: scheme seems neater]
Date: 
Message-ID: <QNDfc.40057$WA4.36350@twister.nyc.rr.com>
Anton van Straaten wrote:
> Kenny Tilton wrote:
> 
> 
>>>>Yikes. You think OpenGL is an impractical way of achieving portability?!
>>>>Or did you mean that it has been a wicked learning curve for me: FFI,
>>>>OpenGL, Glut, and that I was lucky to find a nice font lib for OpenGL.
>>>>Or that now I have to also provide a friendly layer for programmers, so
>>>>/they/ do not have to fight it out with OpenGL? Or...?
>>>
>>>
>>>Yes.  :)
>>
>>OpenGL is not portable? Mind you, I have seen the hardcore ogl people
>>sweating over which card with which capabilities was installed, but a
>>GUI can be done with a pretty small and predictable subset. I hope.
>>
>>Aside from that, it all depends on where one wants to put their energy.
>>I could put my effort into interfacing to gtk (and then I still have to
>>do the FFI) and save myself some trouble, but then the framework is not
>>in Lisp where users can get to it to customize or debug. So perhaps such
>>a project gets off to a quicker start, but then one pays a price
>>forever. I would rather work hard for a few months than have to live
>>with gtk forever. The glut was a nice compromise in that it provided a
>>quick win and can be redone as pure Lisp if the project otherwise
>>succeeds.
> 
>>As for providing a friendly api for those not strong on opengl, that is
>>how I work even if I am the only one using the code. Anything hard ends
>>up behind a friendly face. So that is no extra hardship.
> 
> 
> I think I should give you some more rope and see what other problems you
> mention.  :)

? I mentioned that the first time. But I am sure I could think of more 
work required if you think anything requiring work is impractical. :)

> 
> No, really, all I meant is that writing any new GUI from scratch (even with
> what OpenGL gives you) seems ambitious, ...

Oh, I see. Well, this is my third GUI, so it's not from scratch, it's 
more of a port. And you forget (time to resume the saturation 
marketing?), I have Cells.

> would consider "practical".  To some extent, whether something is practical
> depends on how well it succeeds - in hindsight, it may seem practical, where
> at first, few other than the people doing it may think it's practical.

Yeah, at first I felt like I was on a wild goose chase. But with the 
thing running under two lisps and two OSes, one of which is a kissing 
cousin of Mac OS X, I think we can already say, "Why didn't anyone think 
of this before?".

kt

From: Anton van Straaten
Subject: Re: ImpractiCello [was Re: scheme seems neater]
Date: 
Message-ID: <gvEfc.10214$zj3.2074@newsread3.news.atl.earthlink.net>
Kenny Tilton wrote:
>
> > I think I should give you some more rope and see what other problems you
> > mention.  :)
>
> ? I mentioned that the first time. But I am sure I could think of more
> work required if you think anything requiring work is impractical. :)

I was just amused by the detail that my one-word reply elicited.  ;)   As
for requiring work, isn't practicality almost by definition doing something
with the least possible work - or is that laziness, I forget?

> > No, really, all I meant is that writing any new GUI from scratch
> > (even with what OpenGL gives you) seems ambitious, ...
>
> Oh, I see. Well, this is my third GUI, so it's not from scratch, it's
> more of a port. And you forget (time to resume the saturation
> marketing?), I have Cells.

That's different - I didn't know this was a case of "third time's the
charm"!

> > would consider "practical".  To some extent, whether something is
> > practical depends on how well it succeeds - in hindsight, it may seem
> > practical, where at first, few other than the people doing it may
> > think it's practical.
>
> Yeah, at first I felt like I was on a wild goose chase. But with the
> thing running under two lisps and two OSes, one of which is a kissing
> cousin of Mac OS X, I think we can already say, "Why didn't anyone think
> of this before?".

Must be because no-one else is as practical as you?

Anton
From: Kenny Tilton
Subject: Re: ImpractiCello [was Re: scheme seems neater]
Date: 
Message-ID: <b9Ffc.40065$WA4.31276@twister.nyc.rr.com>
Anton van Straaten wrote:

> Kenny Tilton wrote:
> 
>>>I think I should give you some more rope and see what other problems you
>>>mention.  :)
>>
>>? I mentioned that the first time. But I am sure I could think of more
>>work required if you think anything requiring work is impractical. :)
> 
> 
> I was just amused by the detail that my one-word reply elicited.  ;)   As
> for requiring work, isn't practicality almost by definition doing something
> with the least possible work - or is that laziness, I forget?

Well, I already answered this one, too. I prefer working hard for a few
months to get to a Lisp-all-the-way-down solution to forever dancing to
the tune of (and being unable to debug) a C++ library. And now you know
that I am not starting from scratch, which really does change the equation.

> 
> That's different - I didn't know this was a case of "third time's the
> charm"!

I guess you haven't been to my web site to check out the screenshots 
from the Cello precursors. The first two times were charms as well, they 
just weren't portable. So the key was breaking out of the OS trap, and 
that meant either building atop someone else's portable layer (and
getting stuck with a C framework) or... OpenGL. btw,
I did look for a nice low-level "C" library that would not force an 
application framework on me, I just did not find one. I also asked about 
CLIM as a candidate substrate. Google for the Lisp body count from those 
discussions.

You know, I am surprised to hear a go-it-alone "I can do it all with 
lambda calculus" Schemer picking on me for rolling my own. I thought 
starting from nothing was a virtue in your camp. You guys won't even let 
an object system into your language. How practical is that?

:)

kt

-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Anton van Straaten
Subject: Re: ImpractiCello [was Re: scheme seems neater]
Date: 
Message-ID: <UCGfc.10410$zj3.5102@newsread3.news.atl.earthlink.net>
Kenny Tilton wrote:

> I guess you haven't been to my web site to check out the screenshots
> from the Cello precursors.

No, I've only seen some of the OpenGL shots.  And I downloaded Cells and a
pdf or two, last year.

> The first two times were charms as well, they
> just weren't portable. So the key was breaking out of the OS trap, and
> that meant either building atop someone else's work of creating a
> portable layer (and getting stuck with a C framework) or... OpenGL. btw,
> I did look for a nice low-level "C" library that would not force an
> application framework on me, I just did not find one.

You make it sound so rational...

> You know, I am surprised to hear a go-it-alone "I can do it all with
> lambda calculus" Schemer picking on me for rolling my own. I thought
> starting from nothing was a virtue in your camp. You guys won't even let
> an object system into your language. How practical is that?

That's the whole point!  If you won't call me practical, why should I go
easy on you when you start heading full tilt for the windmill??

Anton
From: Kenny Tilton
Subject: Lisp seems more practical [was Re: ImpractiCello [was Re: scheme seems neater]]
Date: 
Message-ID: <SgKfc.40132$WA4.26198@twister.nyc.rr.com>
Anton van Straaten wrote:

> Kenny Tilton wrote:
> 
> 
>>I guess you haven't been to my web site to check out the screenshots
>>from the Cello precursors.
> 
> 
> No, I've only seen some of the OpenGL shots.  And I downloaded Cells and a
> pdf or two, last year.

Look, if you people are not going to study up on my work, how do you 
expect to make any progress?!

One thing you'll find in there is the Story of the Makeovers, in which I 
was able to produce a completely new GUI for the CliniSys application in 
about a week. Cells (and the GUIs that love them) rock.

> 
> You make it sound so rational...

<checks wallet>

> 
> 
>>You know, I am surprised to hear a go-it-alone "I can do it all with
>>lambda calculus" Schemer picking on me for rolling my own. I thought
>>starting from nothing was a virtue in your camp. You guys won't even let
>>an object system into your language. How practical is that?
> 
> 
> That's the whole point!  If you won't call me practical, why should I go
> easy on you when you start heading full tilt for the windmill??

The key is to be practical about being practical. A gtk wrapper would be 
wrong, and re-inventing OpenGL (the Schemer's approach) would be wrong.

Portacello3 was resurrected tonight on Lispworks, will be released 
tomorrow. Somewhat cleaner install for (+or (+and :win32 (+or :allegro 
:lispworks)) (+and :allegro :linux)), but still not a release for the 
faint of heart.

kt


-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4smf7z03e.fsf@tarn.inf.ed.ac.uk>
Cameron MacKinnon <··········@clearspot.net> writes:

> But, oddly enough, it seems that classical Lisp has stopped evolving
> with CL, while Scheme continues to mutate.

What are these new developments in Scheme?

-- jd
From: Ray Dillinger
Subject: Re: scheme seems neater
Date: 
Message-ID: <407CF931.AA77E290@sonic.net>
Jeff Dalton wrote:
> 
> Cameron MacKinnon <··········@clearspot.net> writes:
> 
> > But, oddly enough, it seems that classical Lisp has stopped evolving
> > with CL, while Scheme continues to mutate.
> 
> What are these new developments in Scheme?
> 

Well, I hear there's a committee working on R6RS these days; 
lots of new libraries and things have been developed and 
placed in the scheme repository and in the SRFI's.  A lot 
more scheme code is portable now thanks to the SRFI system. 

There's a .net port now for people who are into that.  
Several implementations have standard bindings to the entire 
library of Java and/or compile to JVM bytecodes.  Scheme 
also has a number of free compilers that compile to native 
machine code (mostly by way of C, but in a few cases 
directly).

And people are experimenting....  One thing that's undergoing
some experimentation these days is character sets.  People 
have been floating all kinds of proposals, including NFC as 
character set (where char->integer may return a bignum if the 
character is actually several codepoints), 16 and 24-bit unicode
as character sets, naming rules, escaping rules, etc.  R6RS is 
expected to change the rules about character handling in a way
that allows unicode's frankly bizarre case mappings to be 
legal.  There are more divided opinions than there were a 
couple years ago on whether case-sensitivity in identifiers 
is a good or bad thing.  

A few implementations now allow dynamic user control of the 
precision of inexact reals and complex numbers; for example, 
you can say you want 512 bits of mantissa, and get it.  I 
don't recall anybody having this as recently as 5 years 
past. 

Hmmm, I've probably missed a bunch of stuff.  But off the 
top of my head, those are the major areas where Scheme is 
developing some new facilities.

				Bear
From: Matthias
Subject: Re: scheme seems neater
Date: 
Message-ID: <36wad1ej22y.fsf@hundertwasser.ti.uni-mannheim.de>
Ray Dillinger <····@sonic.net> writes:

> Jeff Dalton wrote:
> > 
> > Cameron MacKinnon <··········@clearspot.net> writes:
> > 
> > > But, oddly enough, it seems that classical Lisp has stopped evolving
> > > with CL, while Scheme continues to mutate.
> > 
> > What are these new developments in Scheme?
[...]
> Hmmm, I've probably missed a bunch of stuff.  But off the 
> top of my head, those are major points of scheme developing
> some new facilities.

One of the interesting developments I saw & played with recently is a
different way to do FFI:
http://www.call-with-current-continuation.org/manual/manual-Z-H-49.html#%_sec_6.7
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <ofXec.6351$zj3.1913@newsread3.news.atl.earthlink.net>
Frode Vatvedt Fjeld wrote:

> To me, the way scheme attacks the problem of unintended variable
> capture looks like some hypothetical approach to solve the "problem"
> of unintended divisions by zeros

It must indeed appear that way, if you believe that "attacking the problem
of unintended variable capture" is the primary goal.  But in fact, what a
system like syntax-case offers is a consistent way to deal with all
syntactic transformation in the language, so that language implementations
become nothing but a set of macro definitions on top of a small core
language.

If you think of syntax-case as a generalized abstraction of an extensible
front-end compiler, you'd be closer to the mark.  Although you could build
compilers on top of defmacro, when it comes to building compilers, there are
reasons to prefer a system which provides some guarantees about referential
transparency.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4brlvywt7.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Frode Vatvedt Fjeld wrote:
> 
> > To me, the way scheme attacks the problem of unintended variable
> > capture looks like some hypothetical approach to solve the "problem"
> > of unintended divisions by zeros
> 
> It must indeed appear that way, if you believe that "attacking the problem
> of unintended variable capture" is the primary goal.

It is the primary goal, regardless of whether you describe it
as "attacking the problem of unintended variable capture" or as
"referential transparency".  It's what distinguishes hygienic systems
from non-hygienic ones and supposedly makes them better than defmacro.
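For anyone following along, the standard illustration of what hygiene buys
is a macro that introduces a temporary binding -- a sketch, assuming an
R5RS Scheme with syntax-rules (swap! is the usual textbook example, not
anything from this codebase):

```scheme
;; A hygienic swap!: the macro's tmp cannot collide with the caller's.
(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))

(define tmp 1)   ; a user variable that happens to be named tmp
(define x 2)
(swap! tmp x)    ; works anyway: the expander renames the template's tmp
;; tmp => 2, x => 1
```

With a defmacro-style expander, the same definition would silently rebind
the caller's tmp, which is why defmacro writers reach for gensym.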

>  But in fact, what a
> system like syntax-case offers is a consistent way to deal with all
> syntactic transformation in the language, so that language implementations
> become nothing but a set of macro definitions on top of a small core
> language.

Defmacro is also a consistent way to deal with all syntactic
transformation in the language, so that language implementations
become nothing but a set of macro definitions on top of a small
core language.

Moreover, Scheme was nothing but a set of macro definitions on top of
a small core language before syntax-case and friends were invented.

(Really there ought to be some data structures in there too.)

> If you think of syntax-case as a generalized abstraction of an extensible
> front-end compiler, you'd be closer to the mark.  Although you could build
> compilers on top of defmacro, when it comes to building compilers, there are
> reasons to prefer a system which provides some guarantees about referential
> transparency.

There we go: back to referential transparency as the reason to prefer.
That is what's supposedly the big deal.  I don't understand why you
keep denying it.

Anyway, there are reasons to prefer writing compilers in Lisp rather
than in some other language.  Sure, syntax-case lets you get out of
the other language and back into Lisp, but then you're writing in
Lisp.

It's easier to just work directly in Lisp in the first place.

I've written a number of compiler-like macros, and I'd much rather
do it that way than using syntax-case.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <%F6fc.6813$zj3.1353@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Frode Vatvedt Fjeld wrote:
> >
> > > To me, the way scheme attacks the problem of unintended variable
> > > capture looks like some hypothetical approach to solve the "problem"
> > > of unintended divisions by zeros
> >
> > It must indeed appear that way, if you believe that "attacking the
> > problem of unintended variable capture" is the primary goal.
>
> It is the primary goal, regardless of whether you describe it
> as "attacking the problem of unintended variable capture" or as
> "referential transparency".  It's what distinguishes hygienic systems
> from non-hygienic ones and supposedly makes them better than defmacro.

When you deal with lambdas in Lisp, do you talk about accidental variable
capture related to the lexical variables they define?  Since it doesn't
happen, I presume not much.  Do you think that the lexically scoped lambda's
main contribution should be described as "attacking the problem of
unintended variable capture"?  Perhaps one might do that, given the funarg
problem history, but even so, does that description adequately capture what
the lexically-scoped lambda offers, in your mind?

> >  But in fact, what a
> > system like syntax-case offers is a consistent way to deal with all
> > syntactic transformation in the language, so that language
> > implementations become nothing but a set of macro definitions on top
> > of a small core language.
>
> Defmacro is also a consistent way to deal with all syntactic
> transformation in the language, so that language implementations
> become nothing but a set of macro definitions on top of a small
> core language.

My understanding is that Lisp implementations don't do this to the same
extent.  I've only ever looked at CMUCL and CLISP, though.

Regarding the issue of wanting full Lisp during the macro expansion phase:
for efficient separate compilation, there are good reasons *not* to want
full Lisp.  That kind of efficiency may not matter in all cases, but it's a
consideration nevertheless.

> > If you think of syntax-case as a generalized abstraction of an
> > extensible front-end compiler, you'd be closer to the mark.  Although
> > you could build compilers on top of defmacro, when it comes to
> > building compilers, there are reasons to prefer a system which
> > provides some guarantees about referential transparency.
>
> There we go: back to referential transparency as the reason to prefer.
> That is what's supposedly the big deal.  I don't understand why you
> keep denying it.

I don't deny that referential transparency is a big deal.  The problem I see
relates to the conclusions people seem to draw from statements like "attacks
the problem of unintended variable capture" and terms like "hygiene".  I
think they end up missing the point, and comments I see here confirm that.

Apparently you don't actually miss that point, because you've said you'd
like to be able to use hygienic macros in CL.  Afaict by the statements of
others here, you're an exception in that respect.

Regarding the syntax-rules implementation for CL, the author comments on the
hygiene question here:
http://groups.google.com/groups?selm=9l8qvu%245m%241%40news.gte.com
He seems to think hygiene would not be that difficult to achieve.  However,
he makes the distinction, common in Scheme at least, between macro hygiene
and referential transparency, and says the latter wouldn't be possible with
his design.

We haven't really been following that distinction in our discussion, but
it's worth mentioning:  hygiene has to do with avoiding having the macro
capture variables referenced in the macro arguments, and referential
transparency has to do with free variables in macro definitions being
associated with bindings in the macro definition context.  The subtlety of
this distinction is yet another reason to talk about broader and ultimately
more important issues, like preservation of lexical scoping, unless you're
discussing the macro implementation.
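
To make that distinction concrete -- a sketch, assuming an R5RS Scheme;
times-ten and helper are made-up names for illustration:

```scheme
(define (helper x) (* x 10))     ; visible where the macro is defined

(define-syntax times-ten
  (syntax-rules ()
    ((_ e) (helper e))))         ; helper is free in the template

;; Referential transparency: even though the caller shadows helper,
;; the template's free helper still refers to the definition above.
(let ((helper (lambda (x) 0)))
  (times-ten 4))                 ; => 40, not 0
```

Hygiene is the mirror image: identifiers the macro *introduces* can't
capture the caller's bindings; referential transparency is about
identifiers the macro *references* keeping their definition-site meaning.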

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4y8oyxvth.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:

> > It is the primary goal, regardless of whether you describe it
> > as "attacking the problem of unintended variable capture" or as
> > "referential transparency".  It's what distinguishes hygienic systems
> > from non-hygienic ones and supposedly makes them better than defmacro.

> When you deal with lambdas in Lisp, do you talk about accidental variable
> capture related to the lexical variables they define?

If someone asks "why lexical scoping?" or "why doesn't Common Lisp
use the same scoping rules as earlier Lisps?" or "how is lexical
scoping better?" or "what problems does lexical scoping solve, or
is it a solution in search of a problem?" or "what's so good about
lexical scoping?", then such issues naturally come up.

> Since it doesn't
> happen, I presume not much.  Do you think that the lexically scoped lambda's
> main contribution should be described as "attacking the problem of
> unintended variable capture"?  Perhaps one might do that, given the funarg
> problem history, but even so, does that description adequately capture what
> the lexically-scoped lambda offers, in your mind?

When lexical scoping came along in Scheme, there was much talk
of referential transparency and so on.

Now that the capture problems have been solved, and people have
lived in the post-solution world, the problems have receded in
people's minds; but if we want to understand why something was
done, we have to go back to what they thought at the time.

Moreover, there are now a number of different hygienic systems.
The focus naturally moved to the differences between those systems,
and their relative advantages and disadvantages.  They have
also changed people's thinking in other ways.

Anyway, not all hygienic macro systems have a rule language,
and not all have a nice ADT that programmers can use.
If we want to understand "why hygienic macros?" we have to
look at what they have in common (though not only at that).

> > Defmacro is also a consistent way to deal with all syntactic
> > transformation in the language, so that language implementations
> > become nothing but a set of macro definitions on top of a small
> > core language.
> 
> My understanding is that Lisp implementations don't do this to the same
> extent.  I've only ever looked at CMUCL and CLISP, though.

Defmacro doesn't have to be used as a consistent way etc.
Neither does syntax-case.

But the idea of a core plus macros was present in CL.
There was a relatively small set of (24) "special forms"
and no way to define more, only macros.  Implementations
often "cheated" in various ways, unfortunately, making it
difficult to write portable code-walkers, among other things.
Nonetheless, that was one of the selling points.

(I'll say something about the rest later.)

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <QkEfc.10202$zj3.6456@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:
> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > Jeff Dalton wrote:
>
> > > It is the primary goal, regardless of whether you describe it
> > > as "attacking the problem of unintended variable capture" or as
> > > "referential transparency".  It's what distinguishes hygienic systems
> > > from non-hygienic ones and supposedly makes them better than defmacro.
>
> > When you deal with lambdas in Lisp, do you talk about accidental
> > variable capture related to the lexical variables they define?
>
> If someone asks "why lexical scoping?" or "why doesn't Common Lisp
> use the same scoping rules as earlier Lisps?" or "how is lexical
> scoping better?" or "what problems does lexical scoping solve, or
> is it a solution in search of a problem?" or "what's so good about
> lexical scoping?", then such issues naturally come up.

I agree, and I've found it impossible to avoid them myself here in this
thread.  But still, I see it as a focus on mechanism, rather than what the
mechanism allows you to achieve.

An analogy I consider reasonably close is that programmers new to
object-orientation will often listen to an explanation of some OO system and
then come up with some variety of a question like "so it's just a hash table
of functions, then?"  Which, while it may be technically accurate as far as
it goes, is nevertheless a bit too narrow a view to fully explain the
utility of the system.

Something similar seems to happen with the question of hygiene, both in the
respect that lexical scoping has implications that go beyond "preventing
unintended variable capture", and in the sense that "hygienic" macro systems
also have a collection of other features which go beyond the hygiene issue.

> Anyway, not all hygienic macro systems have a rule language,
> and not all have a nice ADT that programmers can use.
> If we want to understand "why hygienic macros?" we have to
> look at what they have in common (though not only at that).

Yes, but I think you have to be careful to avoid being too reductionist
about it.  If you're going to reduce the description of a system down to a
sentence, you should be aware of what's being thrown out to achieve that,
and you can't extrapolate back from that sentence to the original system,
without putting back in the stuff you threw out.

> But the idea of a core plus macros was present in CL.
> There was a relatively small set of (24) "special forms"
> and no way to define more, only macros.  Implementations
> often "cheated" in various ways, unfortunately, making it
> difficult to write portable code-walkers, among other things.
> Nonetheless, that was one of the selling points.

Is there a list of the 24 somewhere?  The "CL core spec" would be something
I'd find interesting, but it seems rather buried within the larger standards
and references.  Syntax-case uses about 7 core forms: function application,
LAMBDA, BEGIN, IF, DEFINE, SET!, QUOTE; plus, of course, constant & variable
references.
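
As a sketch of how a derived form reduces to that small core (my-let is a
made-up name, used here to avoid shadowing the built-in let):

```scheme
;; LET is not in the core: it expands into LAMBDA plus application.
(define-syntax my-let
  (syntax-rules ()
    ((_ ((name val) ...) body ...)
     ((lambda (name ...) body ...) val ...))))

(my-let ((x 1) (y 2))
  (+ x y))          ; => 3, i.e. ((lambda (x y) (+ x y)) 1 2)
```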

Anton
From: Rob Warnock
Subject: Re: scheme seems neater
Date: 
Message-ID: <x9WdndFV_tRTS-LdRVn-iQ@speakeasy.net>
Anton van Straaten <·····@appsolutions.com> wrote:
+---------------
| Jeff Dalton wrote:
| > But the idea of a core plus macros was present in CL.
| > There was a relatively small set of (24) "special forms"
| > and no way to define more, only macros.  ...
| 
| Is there a list of the 24 somewhere?  The "CL core spec" would be
| something I'd find interesting, but it seems rather buried within
| the larger standards and references. 
+---------------

Barely two weeks ago, someone posed the question "Are there any
'special operators' in Lisp?" (as opposed to macros), to which I
posted an answer ["Yes; and no, not necessarily (but maybe)"].
The key references were the CLHS "3.1.2.1.2.1 Special Forms",
which has a list of 25 (not 24) special operators. But...

CLHS "3.1.2.1.2.2 Macro Forms" also says:

    An implementation is free to implement a Common Lisp special
    operator as a macro.

And conversely:

    An implementation is free to implement any macro operator as a
    special operator, but only if an equivalent definition of the
    macro is also provided.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Pascal Costanza
Subject: Re: scheme seems neater
Date: 
Message-ID: <c5oqn2$ln0$1@f1node01.rhrz.uni-bonn.de>
Anton van Straaten wrote:

> Is there a list of the 24 somewhere?  The "CL core spec" would be something
> I'd find interesting, but it seems rather buried within the larger standards
> and references.  Syntax-case uses about 7 core forms: function application,
> LAMBDA, BEGIN, IF, DEFINE, SET!, QUOTE; plus, of course, constant & variable
> references.

(do-external-symbols (symbol 'common-lisp)
   (when (special-operator-p symbol)
     (print symbol)))

I don't know if this helps in this context, but there is also a paper by 
Henry Baker about Common Lisp special operators at 
http://home.pipeline.com/~hbaker1/MetaCircular.html


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Rob Warnock
Subject: Re: scheme seems neater
Date: 
Message-ID: <CfOdnS6lFIQ4-h_d4p2dnA@speakeasy.net>
Pascal Costanza  <········@web.de> wrote:
+---------------
| Anton van Straaten wrote:
| > Is there a list of the 24 somewhere?
| 
| (do-external-symbols (symbol 'common-lisp)
|    (when (special-operator-p symbol)
|      (print symbol)))
+---------------

(*sigh*) This will show what *ONE IMPLEMENTATION* chose to do,
not what the ANSI Standard specifies. As I've posted several times
recently, the relevant passages --  CLHS "3.1.2.1.2.1 Special Forms"
and CLHS "3.1.2.1.2.2 Macro Forms" -- give considerable latitude to
implementations to choose which are "really" special operators and
which are "just" macros.

And anyway, it's 25, not 24.  ;-}

+---------------
| I don't know if this helps in this context, but there is also a paper by 
| Henry Baker about Common Lisp special operators at 
| http://home.pipeline.com/~hbaker1/MetaCircular.html
+---------------

While that's certainly an interesting paper (and perhaps even useful to
an implementor), it doesn't cover all of the special forms in the CLHS.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Jeff Dalton
Subject: How to think about Scheme macros, was Re: scheme seems neater
Date: 
Message-ID: <fx4r7uqxtwl.fsf_-_@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> > There we go: back to referential transparency as the reason to prefer.
> > That is what's supposedly the big deal.  I don't understand why you
> > keep denying it.

> I don't deny that referential transparency is a big deal.  The
> problem I see relates to the conclusions people seem to draw from
> statements like "attacks the problem of unintended variable capture"
> and terms like "hygiene".  I think they end up missing the point,
> and comments I see here confirm that.

Ok.  I think that is the more interesting part of the discussion.
How should we think about Scheme macros *now*?

I appreciate the efforts you've made, though some things have
not yet been as clearly explained as they perhaps might be.

For example, re hygiene vs other notions, and thinking back to some
things said earlier, this is helpful:

> Regarding the syntax-rules implementation for CL, the author
> comments on the hygiene question here:

> http://groups.google.com/groups?selm=9l8qvu%245m%241%40news.gte.com
> He seems to think hygiene would not be that difficult to achieve.  However,
> he makes the distinction, common in Scheme at least, between macro hygiene
> and referential transparency, and says the latter wouldn't be possible with
> his design.

BTW, I agree with what (I think) Dave Bakhash is quoted as saying
in that article:

  I have not used those macros, but I am also not too eager to try.
  The description says that they fail in one of the major pluses of
  'define-syntax' and relatives, because they are not hygienic
  (i.e. they do not avoid variable capturing).

  (I have heard of that emulation before, and thought that they would
  be hygienic, so I was somewhat disappointed.)

I had already ported a similar rule language to CL.

It was nice for some simple macros, but that was about it.

I definitely didn't want to use it for anything complex,
because it looked like it would be harder to keep track of
what was what and to ensure that names referred to the right
things than if I was using defmacro.

> We haven't really been following that distinction in our discussion, but
> it's worth mentioning:  hygiene has to do with avoiding having the macro
> capture variables referenced in the macro arguments, and referential
> transparency has to do with free variables in macro definitions being
> associated with bindings in the macro definition context.  The subtlety of
> this distinction is yet another reason to talk about broader and ultimately
> more important issues, like preservation of lexical scoping, unless you're
> discussing the macro implementation.

It is useful to have the distinction set out.

Still, they are related issues, and I think it makes sense to
apply "hygiene" to the lot as a shorthand.

Or is there another word you might suggest?

In any case, if you want to call the whole issue cluster "preservation
of lexical scoping", and to treat my comments on "hygiene" as about
"preservation of lexical scoping" instead, then I don't think that
would affect the substance of most of what I've said.

In any case, I think that *by far* the best thing anyone could
do to increase the understanding of Scheme macros in the CL world
would be to provide an implementation for CL -- one that had
both the rules and a lower level and that handled both hygiene and
referential transparency.  (Hygiene is probably more important.)

It needn't handle all of CL, but it should be clear enough how
it could be extended.

I don't think it works to say "you can try them in Scheme".

The possibility of trying them in Scheme has been there for
years, and it hasn't made much difference.

In a separate message, re perceived complexity you say:

  It's not that easy to understand how most compilers work.
  Syntax-case is a compiler.

  ...

  How well do "many" understand the internal workings of their
  compiler?  I think many people have a reasonable mental model of it,
  but don't know all the details.  The same goes for syntax-case

There are problems with that analogy.

* There are also interpreters.
* They don't understand all the details of fancy optimizations, but
  they do understand compilers well enough.
* They do not have a similar "reasonable mental model" of
  how Scheme macros work.

Also this, even better from my point of view:

  The implementation of any system which implements lexical scoping is
  much harder to understand than the implementation of defmacro.  I
  don't see that stopping people from using LAMBDA

The implementation of lexical scoping in Lisp is very easy
to understand.

Anyone who can understand the simple interpreter in the Lisp 1.5
book could understand the version that implements lexical scoping.

It's even easy to implement it yourself by writing a simple
interpreter.

None of that is true of Scheme macros.

Perhaps *some* people find the workings of Scheme macros intuitive
and easy to understand, but in my experience that is not a typical
reaction.

Syntactic closures came the closest to being intuitive, but
there was some problem with combining them with the rule
language that makes the whole thing unclear again.

Perhaps people who use other languages don't care about
understanding in this implementational sense, but Lisp
programmers do care.

-- jd
From: Anton van Straaten
Subject: Re: How to think about Scheme macros, was Re: scheme seems neater
Date: 
Message-ID: <2lEfc.10203$zj3.367@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
...
> > I don't deny that referential transparency is a big deal.  The
> > problem I see relates to the conclusions people seem to draw from
> > statements like "attacks the problem of unintended variable capture"
> > and terms like "hygiene".  I think they end up missing the point,
> > and comments I see here confirm that.
>
> Ok.  I think that is the more interesting part of the discussion.
> How should we think about Scheme macros *now*?

Good question.  If I could answer it in a few sentences, I would have done
that already.  The main thing that I think anyone interested should take
from what I'm saying is that there are some interesting and useful ideas
there.  Having used all three of the systems we've discussed, I don't find
it possible to pick one and say that I'm willing to completely forgo the
features the others give.

Given syntax-case, syntax-rules is theoretically redundant, but when all you
need is a rule-based purely hygienic macro, using syntax-rules instead of
syntax-case communicates that requirement concisely, and makes the macro a
little smaller.

If forced to choose between defmacro and syntax-case, I'd give up defmacro
in a heartbeat, even though I do recognize its benefits in terms of
simplicity and conciseness in many circumstances.

If I were a language implementor, I'd do exactly what many of them have
done, which is implement defmacro in terms of syntax-case.  Regardless of
the explanations of why that's easy, that's still a selling point of
syntax-case, to me.

Implicit in saying that I can't pick a single system is that none of them
are perfect.  But that's all the more reason to want some choice.

> BTW, I agree with what (I think) Dave Bakhash is quoted as saying
> in that article:
>
>   I have not used those macros, but I am also not too eager to try.
>   The description says that they fail in one of the major pluses of
>   'define-syntax' and relatives, because they are not hygienic
>   (i.e. they do not avoid variable capturing).
>
>   (I have heard of that emulation before, and thought that they would
>   be hygienic, so I was somewhat disappointed.)
>
> I had already ported a similar rule language to CL.
>
> It was nice for some simple macros, but that was about it.
>
> I definitely didn't want to use it for anything complex,
> because it looked like it would be harder to keep track of
> what was what and to ensure that names referred to the right
> things than if I was using defmacro.

I don't doubt it.  While a hygienic system doesn't have to be rule-based,
I'd guess that a purely rule-based system almost certainly has to
be hygienic.

> > We haven't really been following that distinction in our discussion, but
> > it's worth mentioning:  hygiene has to do with avoiding having the macro
> > capture variables referenced in the macro arguments, and referential
> > transparency has to do with free variables in macro definitions being
> associated with bindings in the macro definition context.  The subtlety of
> this distinction is yet another reason to talk about broader and ultimately
> more important issues, like preservation of lexical scoping, unless you're
> discussing the macro implementation.
>
> It is useful to have the distinction set out.
>
> Still, they are related issues, and I think it makes sense to
> apply "hygiene" to the lot as a shorthand.
>
> Or is there another word you might suggest?

Hmm... lexically intelligent macros?  :)  Sticking to the time-honored
academic tradition of picking names whose negation results in a pejorative,
like "hygiene" and "proper".  To be more PC, lexically-aware macros could
work.  Of course, calling them lexical still doesn't address their other
features.
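The two properties can be made concrete side by side (a sketch in R5RS-style Scheme; TEN-TIMES and HELPER are invented names):

```scheme
;; Referential transparency: HELPER is free in the macro's template,
;; so it refers to the binding at the macro's definition site...
(define (helper x) (* x 10))

(define-syntax ten-times
  (syntax-rules ()
    ((_ e) (helper e))))

;; ...even if the caller shadows HELPER locally:
(let ((helper (lambda (x) 0)))
  (ten-times 4))    ; => 40, not 0
```

Hygiene is the mirror image: bindings the template introduces cannot capture variables appearing in the caller's arguments.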

But as you've gathered, I would emphasize lexical scope over hygiene,
because I think the implications of lexical scope with respect to ordinary
functions are better understood intuitively by most people, than the
implications of hygiene with respect to macros.  By "implications", I mean
that given lexical scope, you can program in useful ways that are more
difficult without lexical scope.  There are certain problems you don't have
to work around, or even really think about, and that's liberating.

> In any case, if you want to call the whole issue cluster "preservation
> of lexical scoping", and to treat my comments on "hygiene" as about
> "preservation of lexical scoping" instead, then I don't think that
> would affect the substance of most of what I've said.

OK.  I think I understood that for the most part, but I was misunderstanding
the point you were trying to make based on that (re the "it's a myth"
question).

> In any case, I think that *by far* the best thing anyone could
> do to increase the understanding of Scheme macros in the CL world
> would be to provide an implementation for CL -- one that had
> both the rules and a lower level and that handled both hygiene and
> referential transparency.  (Hygiene is probably more important.)
>
> It needn't handle all of CL, but it should be clear enough how
> it could be extended.
>
> I don't think it works to say "you can try them in Scheme".
>
> The possibility of trying them in Scheme has been there for
> years, and it hasn't made much difference.

I agree.

> The implementation of lexical scoping in Lisp is very easy
> to understand.
>
> Anyone who can understand the simple interpreter in the Lisp 1.5
> book could understand the version that implements lexical scoping.
>
> It's even easy to implement it yourself by writing a simple
> interpreter.
>
> None of that is true of Scheme macros.

An interesting point.  If you ignore "extraneous" features of Scheme macros
for the moment, and focus only on the hygiene-maintaining algorithm, then I
see two reasons for what you say:

Reason 1.  It's very easy to implement lambda calculus substitution
yourself, at least as easy as the Lisp 1.5 interpreter, and possibly easier.
Chris Barker does it in 10 lines of admittedly rather dense Scheme, at the
bottom of this page:
http://ling.ucsd.edu/~barker/Lambda/
...but it can of course be done more readably, in a few more lines.
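For instance, a more readable (if naive) version might look like this; it renames bound variables unconditionally, which wastes a few names but keeps the capture case simple (my sketch, not Barker's code):

```scheme
;; Terms: a symbol, (lambda var body), or an application (rator rand).
;; FRESH generates a new name each call, e.g. y -> y.1, y.2, ...
(define fresh
  (let ((n 0))
    (lambda (v)
      (set! n (+ n 1))
      (string->symbol
       (string-append (symbol->string v) "." (number->string n))))))

;; (subst e v a) computes e[v := a], avoiding variable capture.
(define (subst e v a)
  (cond ((symbol? e) (if (eq? e v) a e))
        ((eq? (car e) 'lambda)
         (let ((x (cadr e)) (body (caddr e)))
           (if (eq? x v)
               e                            ; V is rebound here; stop
               (let ((x* (fresh x)))        ; rename the binder so it
                 (list 'lambda x*           ; can't capture frees in A
                       (subst (subst body x x*) v a))))))
        (else (list (subst (car e) v a)
                    (subst (cadr e) v a)))))

;; The free Y in the argument is not captured by the binder:
;; (subst '(lambda y (f y)) 'f '(lambda z y))
;;   => (lambda y.1 ((lambda z y) y.1))
```

The renaming step at each binder is exactly what a hygienic expander does to identifiers a macro introduces.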

Once you've written such a program yourself, you basically know how pure
hygienic macro expansion works.  The only reason more people don't know this
is that it hasn't been emphasized or communicated that much, outside of
formal CS courses.  The problem is not that it's difficult, the problem is
which approaches people are already familiar with, and what's mostly being
communicated outside of CS courses, and in the Lisp world.

Reason 2.  The selective breaking of hygiene does introduce more complexity.
However, with an understanding of purely hygienic macro expansion, breaking
hygiene doesn't require that much handwaving to understand.

> Perhaps *some* people find the workings of Scheme macros intuitive
> and easy to understand, but in my experience that is not a typical
> reaction.

I'm guessing you've seen Peter Housel's series of articles, "An introduction
to macro expansion algorithms", but in case not:
http://groups.google.com/groups?selm=01H1GHN8IG0Y91VY9G%40delphi.com
http://groups.google.com/groups?selm=01H1QCA2CU3M90P2IS%40delphi.com
http://groups.google.com/groups?selm=01H2GRCZLH5E90OG1N%40delphi.com
http://groups.google.com/groups?selm=01H4847MVN08937B1A%40delphi.com

Unfortunately, unless there are other parts around somewhere, they end before
they get into the most recent stuff, but they do give a good explanation of
the technical evolution of hygienic macros.

> Perhaps people who use other languages don't care about
> understanding in this implementational sense, but Lisp
> programmers do care.

Everyone cares about some variety of this, because they have to: they have
to have some model of the system they're using, something which maps it onto
something else they already understand and perhaps see as more basic.  The
question is just what that other thing is.

For many programmers, it's a mental model of the machine, mediated by the OS
and internal language features: things like memory locations, stacks etc.
Lisp programmers tend by default to have a slightly different preferred
model, and in the case of syntax transformation, that naturally centers
entirely around lists.

This goes back to the point I made earlier, which is that I think the syntax
ADT is a prime suspect in making syntax-case seem foreign.  There's an Alan
Bawden paper called "Quasiquotation in Lisp" which essentially discusses the
"synergy between quasiquotation and S-expressions", particularly related to
"program-generating programs".  It's the loss of this synergy that's really
unfortunate for syntax-case, IMO.  People say things about how defmacro
matches the way they think, and I think this is why.  [However, I don't
think the solution is to say that syntax-case went down the wrong road: a
better solution would be to make Lisp/Scheme capable of handling other ADTs
as well as it does lists.]

Back to mental models: many Scheme programmers tend to use a model of their
language that's more closely connected to the lambda calculus, even if only
informally: to the notion of functional abstraction as a really basic and
low-level feature.  The anti-minimalists like to make fun of this, they
don't see any point in being minimal, etc.  Actually, there's a huge point:
if you can easily and naturally express other constructs in terms of lambda,
including things like control flow, it gives you enormous leverage in terms
of understanding new things.  You have a small silver bullet which works
well to model more complex things, and to help you understand the essence of
those things.

It's a tool for understanding and analyzing systems.  It's not the only one
you'll ever need, but it's a surprisingly widely applicable one - although
perhaps not so surprising, when its very basic nature is considered.  Lambda
calculus is not an arbitrary system, it's the most general form of what's
left when you strip away everything but the notion of giving names to
variables in an expression, which no useful computer language can escape.
It's such a simple concept that we teach it to high school students, as the
basis for algebra.

With this mental model - or even a watered-down version of it obtained
intuitively from using Scheme - hygienic macro expansion seems very natural,
because it *is* lambda calculus.  The rules are the same, and the
lexically-scoped result has the same properties.  What you call
"understanding in an implementational sense" is still present, it's only
that the preferred implementation model is different.

Anton
From: Pascal Costanza
Subject: Re: How to think about Scheme macros, was Re: scheme seems neater
Date: 
Message-ID: <c5oq85$ns2$1@f1node01.rhrz.uni-bonn.de>
Anton van Straaten wrote:

> While a hygienic system doesn't have to be rule-based,
> I'd guess that a purely rule-based system almost certainly has to
> be hygienic.

Why? What's wrong with destructuring-case?

BTW, this is one of the things that confuse me wrt 
syntax-rules/syntax-case - they combine too many things into one 
construct, i.e. pattern matching + hygiene + referential transparency. I 
don't like the pattern matching stuff.

> This goes back to the point I made earlier, which is that I think the syntax
> ADT is a prime suspect in making syntax-case seem foreign.  There's an Alan
> Bawden paper called "Quasiquotation in Lisp" which essentially discusses the
> "synergy between quasiquotation and S-expressions", particularly related to
> "program-generating programs".  It's the loss of this synergy that's really
> unfortunate for syntax-case, IMO.  People say things about how defmacro
> matches the way they think, and I think this is why.

Exactly.

> Back to mental models: many Scheme programmers tend to use a model of their
> language that's more closely connected to the lambda calculus, even if only
> informally: to the notion of functional abstraction as a really basic and
> low-level feature.  The anti-minimalists like to make fun of this, they
> don't see any point in being minimal, etc.  Actually, there's a huge point:
> if you can easily and naturally express other constructs in terms of lambda,
> including things like control flow, it gives you enormous leverage in terms
> of understanding new things.  You have a small silver bullet which works
> well to model more complex things, and to help you understand the essence of
> those things.

I prefer to think in terms of control flow and operational semantics.


Pascal

-- 
ECOOP 2004 Workshops - Oslo, Norway
*1st European Lisp and Scheme Workshop, June 13*
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
*2nd Post-Java Workshop, June 14*
http://prog.vub.ac.be/~wdmeuter/PostJava04/
From: Jeff Dalton
Subject: Re: How to think about Scheme macros, was Re: scheme seems neater
Date: 
Message-ID: <fx4pt9z3k1l.fsf@todday.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Reason 1.  It's very easy to implement lambda calculus substitution
> yourself, at least as easy as the Lisp 1.5 interpreter, and possibly
> easier.

I'm not convinced.  I think lambda calculus reduction is
harder to understand.  Environments were a big advance, IMO.

In this discussion, Scheme is seeming more and more like a
dual language.  The macro system is like a compiler, or it's
like lambda calculus substitution, both different from the
ordinary Scheme interpreter or compiler.

Syntactic closures had looked like they might restore a unified
view, but they seem to have fallen from favor.

> Chris Barker does it in 10 lines of admittedly rather dense Scheme, at the
> bottom of this page:
> http://ling.ucsd.edu/~barker/Lambda/
> ...but it can of course be done more readably, in a few more lines.

That's a great page, but I don't think it gives a clear
explanation.

> Once you've written such a program yourself, you basically know how pure
> hygienic macro expansion works.

Perhaps so, but the step from the one to the other isn't obvious.

> Back to mental models: many Scheme programmers tend to use a model of their
> language that's more closely connected to the lambda calculus, even if only
> informally: to the notion of functional abstraction as a really basic and
> low-level feature.

Yes, but from the start Scheme was explained via interpreters
that used environments, rather than substitution.

In simple cases, substitution can be easier to understand.
But once renaming enters the picture, substitution looks messy.

> The anti-minimalists like to make fun of this, they
> don't see any point in being minimal, etc.  Actually, there's a huge point:
> if you can easily and naturally express other constructs in terms of lambda,
> including things like control flow, it gives you enormous leverage in terms
> of understanding new things.  You have a small silver bullet which works
> well to model more complex things, and to help you understand the essence of
> those things.

When the "lambda the ultimate" papers first appeared, they presented
ways of thinking and of doing that were very appealing to Lisp
programmers, IMO.

If there's been a slip between Lisp and Scheme culture since then,
I think Scheme macros, which seemed to come with a lot of ideology,
were part of the reason.

> It's a tool for understanding and analyzing systems.  It's not the only one
> you'll ever need, but it's a surprisingly widely applicable one - although
> perhaps not so surprising, when its very basic nature is considered.  Lambda
> calculus is not an arbitrary system, it's the most general form of what's
> left when you strip away everything but the notion of giving names to
> variables in an expression, which no useful computer language can escape.
> It's such a simple concept that we teach it to high school students, as the
> basis for algebra.

> With this mental model - or even a watered-down version of it obtained
> intuitively from using Scheme - hygienic macro expansion seems very natural,
> because it *is* lambda calculus.

I don't agree that hygienic expansion seems natural from a Scheme
point of view.  It's renaming rather than environments.

Until hygienic macros came along, there was no need to have
renaming schemes as part of the way you thought with and
about Scheme.

-- jd
From: Tayssir John Gabbour
Subject: Re: scheme seems neater
Date: 
Message-ID: <866764be.0404161653.4acf1978@posting.google.com>
"Anton van Straaten" <·····@appsolutions.com> wrote in message news:<···················@newsread3.news.atl.earthlink.net>...
> When you deal with lambdas in Lisp, do you talk about accidental variable
> capture related to the lexical variables they define?  Since it doesn't
> happen, I presume not much.  Do you think that the lexically scoped lambda's
> main contribution should be described as "attacking the problem of
> unintended variable capture"?  Perhaps one might do that, given the funarg
> problem history, but even so, does that description adequately capture what
> the lexically-scoped lambda offers, in your mind?

Perhaps the point is that lisp is rather nihilistic, which makes it
multiparadigm, but certain support structures need to be built up to
better support some paradigms.
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx41xmt9jl1.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> If Lisp-1 were really such a problem for defmacro, it would be
> difficult to explain this continued usage.

I suggested an explanation in an earlier message: hygienic macros
haven't been made easy enough to use (and perhaps also not easy enough
to put into implementations).

You have also suggested reasons why defmacro is used, even
in the very same paragraph that says "If Lisp-1 were really
such a problem ..."

  People writing portable Scheme often use defmacro because it's more
  powerful than syntax-rules, but also sufficiently portable between
  implementations.

The point is that Lisp-1 makes a problem with using defmacro worse,
not that Lisp-1 makes defmacro completely unusable or forces
people to develop hygienic macros.

Other factors can therefore outweigh the greater problem of
unintended capture, and defmacro can continue to be used.
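For what it's worth, the kind of capture at issue is easy to exhibit (a sketch using the non-hygienic DEFINE-MACRO many Schemes provide; MAKE-PAIR is an invented name):

```scheme
;; The expansion calls the standard procedure LIST:
(define-macro (make-pair a b)
  `(list ,a ,b))

;; In a Lisp-1, an innocent local variable shadows the one and only
;; binding of LIST, breaking the expansion:
(let ((list '(1 2 3)))
  (make-pair 4 5))   ; error: LIST is no longer a procedure

;; In a Lisp-2 like CL, (LIST 4 5) in the expansion reads the
;; function cell of LIST, which an ordinary LET cannot shadow.
```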

-- jd
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx44qrxhuxq.fsf@todday.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> > I can see why you think that way, but in practice about the only real
> > difference it makes is that coming up with a good macro system for
> > Scheme has been harder (because of the much greater likelihood of
> > accidental variable capture in a single namespace).
> 
> This is a myth.

No it isn't.

The scheme designers had a very hard time coming up with a macro
system good enough to put in the language, and one of the reasons
ordinary Lisp defmacro-style macros were not as acceptable for
Scheme as they were for other Lisps was the greater danger of
unintended captures in Scheme.

> Most Schemes provide a defmacro just like CL's that works just fine,
> as far as defmacro goes.

Yes, but the point wasn't about the difficulty of coming up with
any macro system for scheme; it was about the difficulty of coming
up with a good one.

>  Scheme macro systems are addressing other issues - in
> particular, providing a macro system that integrates with the lexical
> structure of the underlying language, rather than being entirely unaware of
> that structure.

Do you think it has nothing to do with accidental captures?

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <GUDcc.12000$NL4.777@newsread3.news.atl.earthlink.net>
"Jeff Dalton" <····@todday.inf.ed.ac.uk> wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
>
> > > I can see why you think that way, but in practice about the only real
> > > difference it makes is that coming up with a good macro system for
> > > Scheme has been harder (because of the much greater likelihood of
> > > accidental variable capture in a single namespace).
> >
> > This is a myth.
>
> No it isn't.

Is too.

> The scheme designers had a very hard time coming up with a macro
> system good enough to put in the language, and one of the reasons
> ordinary Lisp defmacro-style macros were not as acceptable for
> Scheme as they were for other Lisps was the greater danger of
> unintended captures in Scheme.

Do you have any references for where this "reason" was discussed?

> > Most Schemes provide a defmacro just like CL's that works just fine,
> > as far as defmacro goes.
>
> Yes, but the point wasn't about the difficulty of coming up with
> any macro system for scheme; it was about the difficulty of coming
> up with a good one.

Defmacro works very well in Scheme, and plenty of people use it.  That
negates the premise that an increased chance of accidental capture is the
reason for going after better macro systems.

The difficulty of coming up with a good macro system similarly has nothing
to do with an increased chance for accidental capture.  A macro system which
interacts with the lexical structure of programs turns out to be much less
trivial than defmacro, for reasons that have nothing to do with Lisp-1 or
Lisp-2.

For the purposes of these macro systems, *any* accidental capture was
considered undesirable in principle, for the reasons I've mentioned and
others.  CL is no better in this respect - accidental capture can occur, and
that was considered undesirable.

> >  Scheme macro systems are addressing other issues - in
> > particular, providing a macro system that integrates with the lexical
> > structure of the underlying language, rather than being entirely unaware
> > of that structure.
>
> Do you think it has nothing to do with accidental captures?

The degree to which a Lisp-1 increases the chance of accidental captures has
no bearing on the desire for a more sophisticated macro system.  That's
precisely the myth that I'm debunking.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4wu4q5mv1.fsf@todday.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> "Jeff Dalton" <····@todday.inf.ed.ac.uk> wrote:
> 
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> >
> > > > I can see why you think that way, but in practice about the only real
> > > > difference it makes is that coming up with a good macro system for
> > > > Scheme has been harder (because of the much greater likelihood of
> > > > accidental variable capture in a single namespace).
> > >
> > > This is a myth.
> >
> > No it isn't.
> 
> Is too.

Is not.

> > The scheme designers had a very hard time coming up with a macro
> > system good enough to put in the language, and one of the reasons
> > ordinary Lisp defmacro-style macros were not as acceptable for
> > Scheme as they were for other Lisps was the greater danger of
> > unintended captures in Scheme.
> 
> Do you have any references for where this "reason" was discussed?

Just my memories of what people said.

Do you have some reason to doubt it?

> > > Most Schemes provide a defmacro just like CL's that works just fine,
> > > as far as defmacro goes.
> >
> > Yes, but the point wasn't about the difficulty of coming up with
> > any macro system for scheme; it was about the difficulty of coming
> > up with a good one.
> 
> Defmacro works very well in Scheme, and plenty of people use it.  That
> negates the premise that an increased chance of accidental capture is the
> reason for going after better macro systems.

No it doesn't.  Such macros don't work as well in Scheme as
in CL, and once hygienic macros came along, the old-style,
accidental-capture-allowing ones were doomed, in Scheme at least.
Notice how they didn't make it into the revised report or the
standard.

> > Do you think it has nothing to do with accidental captures?
> 
> The degree to which a Lisp-1 increases the chance of accidental captures has
> no bearing on the desire for a more sophisticated macro system.  That's
> precisely the myth that I'm debunking.

On what basis?  Your own opinion?

In any case, I was replying to the "myth" claim about the passage
quoted at the top of this message, which was not about the desire.

The combination of defmacro, gensym, and packages is a good enough
macro system for Common Lisp, but Scheme doesn't have packages,
and even with all three together, accidental captures are still
possible; and in Scheme they are more likely.  So traditional
Lisp macros weren't good enough for Scheme.
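For reference, the GENSYM part of that combination looks like this (a sketch using the DEFINE-MACRO and GENSYM facilities many implementations provide):

```scheme
;; GENSYM protects the binding the macro introduces: TMP here holds a
;; freshly generated, uninterned name, so it cannot collide with
;; anything in the caller's code.
(define-macro (swap! a b)
  (let ((tmp (gensym)))
    `(let ((,tmp ,a))
       (set! ,a ,b)
       (set! ,b ,tmp)))))
;; What GENSYM does not protect are free references in the expansion,
;; e.g. helper procedures it calls: in a Lisp-1 a caller's local
;; binding can still capture those, and there are no packages to
;; narrow the risk.
```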

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <T0idc.1052$l75.69@newsread2.news.atl.earthlink.net>
Jeff Dalton wrote:
> In any case, I was replying to the "myth" claim about the passage
> quoted at the top of this message, which was not about the desire.

That passage does contain a myth, which is just a variation of what the
discussion had moved on to.  I'll clarify, with your original claim:

> > > The scheme designers had a very hard time coming up with a macro
> > > system good enough to put in the language

This part is true.  The reason for that, though, was that they had high
standards, imposing the same kinds of requirements on macros that are
offered by the semantics of the functional core of Scheme.  Simple textual
(or list-based) manipulation of source code without regard to the semantic
properties of the target language was not considered good enough.

> > > and one of the reasons
> > > ordinary Lisp defmacro-style macros were not as acceptable for
> > > Scheme as they were for other Lisps was the greater danger of
> > > unintended captures in Scheme.

This part is a myth, and I doubt you can come up with a single piece of
evidence showing that the (alleged) *greater* danger of unintended capture
in Scheme made defmacro-style macros any less acceptable for Scheme than
they would be otherwise.  The issue was simply that there was any danger at
all - the issue was syntax manipulation without what were considered basic
semantics safeguards.  If the Scheme designers had been working with a
Lisp-2, they would have followed the exact same route.  If you want evidence
of that, read almost any of the macro papers here:
http://library.readscheme.org/page3.html

Also, note what Will Clinger wrote here on March 7, in the thread "Scheme
macros":

"Both Kohlbecker's algorithm and the Clinger-Rees "Macros That Work"
algorithm were developed with Common Lisp in mind.  X3J13 rejected this
macro technology, but there has never been any doubt in my mind that this
technology could be made to work in Common Lisp."

> > Do you have any references for where this "reason" was discussed?
>
> Just my memories of what people said.
>
> Do you have some reason to doubt it?

Yes.  I doubt it because it makes no sense in the context of decades of the
history of development of macros, based on the above papers, among other
sources, and also based on my knowledge of the macro systems in question,
specifically experience with defmacro, syntax-rules, and syntax-case, plus
indirect knowledge of some of the other systems described in the above
papers.

> > The degree to which a Lisp-1 increases the chance of accidental captures has
> > no bearing on the desire for a more sophisticated macro system.  That's
> > precisely the myth that I'm debunking.
>
> On what basis?  Your own opinion?

See above.  I don't recall having seen a claim like that made in any of the
macro system papers - I can't swear such a claim doesn't exist somewhere,
but in the context of what those macro systems are actually trying to
achieve, it would seem like a marginal reason at best, and it's in flat
contradiction to what many of the papers say.  Most of them are talking
about an absolute issue, i.e. whether or not the macros have sound semantic
properties with respect to the target language; not a relative issue, i.e.
whether the semantic problems with defmacro were relatively greater or less
in different circumstances.

It was the very existence of the semantic problems with defmacro that was
being addressed; a partial solution was not considered good enough.

> The combination of defmacro, gensym, and packages is a good enough
> macro system for Common Lisp, but Scheme doesn't have packages,
> and even with all three together, accidental captures are still
> possible; and in Scheme they are more likely.  So traditional
> Lisp macros weren't good enough for Scheme.

The fact that defmacro is still widely used in Scheme negates your
conclusion from a practical perspective.  From the standards perspective, I
agree that traditional Lisp macros weren't good enough to be *standardized*
for Scheme, because they don't satisfy the kinds of semantic properties
important to the designers of Scheme.  Those semantic properties aren't
satisfied in Common Lisp, either.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4n05l7tie.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:
> > In any case, I was replying to the "myth" claim about the passage
> > quoted at the top of this message, which was not about the desire.
> 
> That passage does contain a myth, which is just a variation of what the
> discussion had moved on to.  I'll clarify, with your original claim:
> 
> > > > The scheme designers had a very hard time coming up with a macro
> > > > system good enough to put in the language
> 
> This part is true.  The reason for that, though, was that they had high
> standards, imposing the same kinds of requirements on macros that are
> offered by the semantics of the functional core of Scheme.  Simple textual
> (or list-based) manipulation of source code without regard to the semantic
> properties of the target language was not considered good enough.

At one point (I think it was the Snowbird meeting in 198x), there
was a rule-based hygienic system and there was syntactic closures,
some people wanted one, others wanted the other, no agreement,
and so some people were chosen to find a way to combine the two.
This proved more difficult than had been expected.  That's part
of the "hard time" I was talking about.

> > > > and one of the reasons
> > > > ordinary Lisp defmacro-style macros were not as acceptable for
> > > > Scheme as they were for other Lisps was the greater danger of
> > > > unintended captures in Scheme.
> 
> This part is a myth,

How do you get to speak for the Scheme community, which includes,
among other people, me?

> and I doubt you can come up with a single piece of
> evidence showing that the (alleged) *greater* danger of unintended capture
> in Scheme ...

*Alleged* greater?   You don't even believe that part?

> If the Scheme designers had been working with a
> Lisp-2, they would have followed the exact same route.

Some of them, yes, especially the ones most keen on hygienic macros.

> If you want evidence of that, read almost any of the macro papers here:
> http://library.readscheme.org/page3.html
> 
> Also, note what Will Clinger wrote here on March 7, in the thread "Scheme
> macros":
> 
> "Both Kohlbecker's algorithm and the Clinger-Rees "Macros That Work"
> algorithm were developed with Common Lisp in mind.  X3J13 rejected this
> macro technology, but there has never been any doubt in my mind that this
> technology could be made to work in Common Lisp."

That's just "could be made to work".  And probably it could be.
Has anyone done it?  I once ported a Scheme implementation of
extend-syntax (written by R. Kent Dybvig), but it didn't include
the hygienic part.

If anyone has done Scheme-style macros for CL, I would like
to use them.

> > > Do you have any references for where this "reason" was discussed?
> >
> > Just my memories of what people said.
> >
> > Do you have some reason to doubt it?
> 
> Yes.  I doubt it because it makes no sense in the context of decades
> of the history of development of macros,

I don't agree.

> based on the above papers, among other
> sources, and also based on my knowledge of the macro systems in question,
> specifically experience with defmacro, syntax-rules, and syntax-case, plus
> indirect knowledge of some of the other systems described in the above
> papers.

But not, for instance, on what people said, back when hygienic macros
came along and they were trying to decide on a macro system for
Scheme?  Minutes of meetings?  E-mail?  It sounds like you're basing
it on what you know of the macro systems, and on some papers.

Issues often assume a more theoretical form in papers.

> > > The degree to which a Lisp-1 increases the chance of accidental
> > > captures has no bearing on the desire for a more sophisticated
> > > macro system.

That's definitely false in the case of Common Lisp, which you have now
brought in.  That CL is not a Lisp-1 is one of the reasons why
traditional macros are more acceptable for Common Lisp, and the use
of traditional macros in CL is one reason why it would be difficult to
make CL a Lisp-1.

> ... I can't swear such a claim doesn't exist somewhere,
> but in the context of what those macro systems are actually trying to
> achieve, it would seem like a marginal reason at best, and it's in flat
> contradiction to what many of the papers say.  Most of them are talking
> about an absolute issue, i.e. whether or not the macros have sound semantic
> properties with respect to the target language; not a relative issue,

That's the sort of thing people say in papers.

> It was the very existence of the semantic problems with defmacro that was
> being addressed; a partial solution was not considered good enough.

Sure, some people did think that way.  And people motivated to
devise hygienic macro systems tend to be among them.  But they
were not the only people involved.

> > The combination of defmacro, gensym, and packages is a good enough
> > macro system for Common Lisp, but Scheme doesn't have packages,
> > and even with all three together, accidental captures are still
> > possible; and in Scheme they are more likely.  So traditional
> > Lisp macros weren't good enough for Scheme.
> 
> The fact that defmacro is still widely used in Scheme negates your
> conclusion from a practical perspective.

If anything, it shows that ordinary users and implementors have
a different view than the Scheme designers you've talked about,
and that hygienic macros haven't been made easy enough to use
(and perhaps also not easy enough to put into implementations).

What might negate my conclusion from a practical perspective
is evidence that Scheme programmers are just as happy with
defmacro as Common Lisp programmers; but that would also be
further evidence of their differences from the designers.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <TQHdc.2730$zj3.586@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:

> At one point (I think it was the Snowbird meeting in 198x), there
> was a rule-based hygienic system and there was syntactic closures,
> some people wanted one, others wanted the other, no agreement,
> and so some people were chosen to find a way to combine the two.
> This proved more difficult than had been expected.  That's part
> of the "hard time" I was talking about.

But as a reason for this "hard time", you said: "one of the reasons ordinary
Lisp defmacro-style macros were not as acceptable for Scheme as they were
for other Lisps was the greater danger of unintended captures in Scheme."  I
don't see how the Snowbird situation demonstrates this.

> > > > > and one of the reasons
> > > > > ordinary Lisp defmacro-style macros were not as acceptable for
> > > > > Scheme as they were for other Lisps was the greater danger of
> > > > > unintended captures in Scheme.
> >
> > This part is a myth,
>
> How do you get to speak for the Scheme community, which includes,
> among other people, me?

Communities can believe in myths, that doesn't make them any less mythical.
I don't think the Scheme community in general does believe this myth, but we
could go and post the question in comp.lang.scheme and get some feedback on
that, if you like.

I can't deny that there may have been people who thought that it was
necessary to pursue hygienic macros because of a "greater danger of
unintended captures in Scheme".  However, this position seems difficult to
defend technically, and I have yet to see a remotely convincing argument for
it.  I also haven't seen any examples of macro system designers claiming
this as a motivation.

> > and I doubt you can come up with a single piece of
> > evidence showing that the (alleged) *greater* danger of unintended
> > capture in Scheme ...
>
> *Alleged* greater?   You don't even believe that part?

In a reply just posted to Pascal Costanza, I go into details of that.  The
point is that normal Scheme practices for dealing with a single namespace
are sufficient to deal with the problem, in most cases.  The "greater
danger" is greatly overstated; practical evidence of this is the continued
defmacro use in Scheme.

> > If the Scheme designers had been working with a
> > Lisp-2, they would have followed the exact same route.
>
> Some of them, yes, especially the ones most keen on hygienic macros.

Can you give examples of others who were less keen on hygienic macros?

> > If you want evidence of that, read almost any of the macro papers here:
> > http://library.readscheme.org/page3.html
> >
> > Also, note what Will Clinger wrote here on March 7, in the thread
> > "Scheme macros":
> >
> > "Both Kohlbecker's algorithm and the Clinger-Rees "Macros That Work"
> > algorithm were developed with Common Lisp in mind.  X3J13 rejected this
> > macro technology, but there has never been any doubt in my mind that
> > this technology could be made to work in Common Lisp."
>
> That's just "could be made to work".  And probably it could be.
> Has anyone done it?  I once ported a Scheme implementation of
> extend-syntax (written by R. Kent Dybvig), but it didn't include
> the hygienic part.
>
> If anyone has done Scheme-style macros for CL, I would like
> to use them.

You're in luck - try out Dorai Sitaram's syntax-rules for Common Lisp:
http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

> > based on the above papers, among other
> > sources, and also based on my knowledge of the macro systems in
> > question, specifically experience with defmacro, syntax-rules, and
> > syntax-case, plus indirect knowledge of some of the other systems
> > described in the above papers.
>
> But not, for instance, on what people said, back when hygienic macros
> came along and they were trying to decide on a macro system for
> Scheme?  Minutes of meetings?  E-mail?  It sounds like you're basing
> it on what you know of the macro systems, and on some papers.

I've read many messages in the RnRS author's list, and older messages on
comp.lang.scheme.  I've never even seen a hint at the idea that Scheme's
Lisp-1-ness was a driving issue behind the need for hygienic macros.  The
only place that comes up consistently is in comp.lang.lisp.

> Issues often assume a more theoretical form in papers.

And issues which can't be supported with technical arguments don't make it
into papers.

> > > > The degree to which a Lisp-1 increases the chance of accidental
> > > > captures has no bearing on the desire for a more sophisticated
> > > > macro system.
>
> That's definitely false in the case of Common Lisp, which you have now
> brought in.  That CL is not a Lisp-1 is one of the reasons why
> traditional macros are more acceptable for Common Lisp

This is just the inverse of the same myth.  There seem to be all sorts of
reasons that the CL community is uninterested in other macro systems, and
one of those reasons certainly appears to be the *belief* that CL doesn't
need a hygienic system because it's a Lisp-1.  However, that belief has no
technical basis, and is based on a misunderstanding of the goals of hygienic
macro systems.

> and the use of traditional macros in CL is one reason
> why it would be difficult to make CL a Lisp-1.

It would be difficult to make CL a Lisp-1 because existing code is already
written to rely on being a Lisp-2.  This has nothing to do with which macro
system it uses.  Again, the use of traditional macros in Scheme is evidence
of this.

> > ... I can't swear such a claim doesn't exist somewhere,
> > but in the context of what those macro systems are actually trying to
> > achieve, it would seem like a marginal reason at best, and it's in flat
> > contradiction to what many of the papers say.  Most of them are talking
> > about an absolute issue, i.e. whether or not the macros have sound
> > semantic properties with respect to the target language; not a
> > relative issue,
>
> That's the sort of thing people say in papers.

No, there's more to it than that.  There's a basic semantic issue here, a
significant problem that was being solved.  The problem was not "the greater
chance of accidental capture", but rather the problem of how to design macro
systems that had an understanding of their target language, so that macros
could be written safely without requiring programmers to manually manage the
interaction with the program context in which a macro is expanded.

Perhaps quoting another "paper" won't help here, but in R5RS it says: "All
macros defined using the pattern language are 'hygienic' and 'referentially
transparent' and thus preserve Scheme's lexical scoping".  The important bit
in that sentence is "thus preserve Scheme's lexical scoping".  That is the
goal of these systems, and Lisp-2 in no way helps achieve that goal.
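That "preserve lexical scoping" guarantee can be seen in a few lines of
standard R5RS code (a minimal sketch; `swap!' is just an illustrative
macro name):

```lisp
;; Scheme: a hygienic macro written in the syntax-rules pattern language.
(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))

;; Even if the caller also uses the name `tmp', hygiene keeps the
;; macro's temporary distinct, so lexical scoping is preserved:
(define tmp 1)
(define x 2)
(swap! tmp x)   ; tmp => 2, x => 1
```

With defmacro, the same macro would silently misbehave on exactly this
call, which is the kind of semantic problem the hygienic systems set out
to eliminate rather than merely reduce.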

> > It was the very existence of the semantic problems with defmacro that
> > were being addressed; a partial solution was not considered good enough.
>
> Sure, some people did think that way.  And people motivated to
> devise hygienic macro systems tend to be among them.  But they
> were not the only people involved.

So, who would be examples of people who were thinking otherwise?  I'm
genuinely curious, because I'm interested in the history.

> > The fact that defmacro is still widely used in Scheme negates your
> > conclusion from a practical perspective.
>
> If anything, it shows that ordinary users and implementors have
> a different view than the Scheme designers you've talked about,
> and that hygienic macros haven't been made easy enough to use
> (and perhaps also not easy enough to put into implementations).

That's a different discussion.  My personal perspective is that choice is
good, and I've used all three macro systems commonly available in Scheme.
All have pros and cons.

> What might negate my conclusion from a practical perspective
> is evidence that Scheme programmers are just as happy with
> defmacro as Common Lisp programmers;

I think there is such evidence.  However, there's a fairly strong bias
in the Scheme community away from defmacro, for the kinds of reasons I've
covered, plus the fact that R5RS standardizes syntax-rules.  One Scheme
implementor I've spoken with prefers defmacro but has slowly been moving
towards the hygienic systems because, essentially, of peer and user
pressure.

> but that would also be
> further evidence of their differences from the designers.

Again, a different discussion.  I think there's little question that
defmacro offers a kind of simplicity that e.g. syntax-case lacks.  That
doesn't mean defmacro is superior in all respects.  I think closing oneself
off to alternatives is a pity, especially if that decision is based on a
myth.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4ad1h9lfw.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:
> 
> > At one point (I think it was the Snowbird meeting in 198x), there
> > was a rule-based hygienic system and there was syntactic closures,
> > some people wanted one, others wanted the other, no agreement,
> > and so some people were chosen to find a way to combine the two.
> > This proved more difficult than had been expected.  That's part
> > of the "hard time" I was talking about.
> 
> But as a reason for this "hard time", you said: "one of the reasons ordinary
> Lisp defmacro-style macros were not as acceptable for Scheme as they were
> for other Lisps was the greater danger of unintended captures in
> Scheme."  I don't see how the Snowbird situation demonstrates this.

It wasn't supposed to demonstrate that.  It was supposed
to illustrate the difficulties on the branch in which defmacro
isn't good enough.  The difficulty of devising a good enough
macro system for Scheme was partly that defmacro -- which was
good enough for many Lisps -- wasn't good enough for Scheme,
and partly that developing a good enough alternative was
difficult.

> > > This part is a myth,
> >
> > How do you get to speak for the Scheme community, which includes,
> > among other people, me?
> 
> Communities can believe in myths, that doesn't make them any less
> mythical.

That's not an answer to my question.

> I can't deny that there may have been people who thought that it was
> necessary to pursue hygienic macros because of a "greater danger of
> unintended captures in Scheme".

That isn't what I said.

> > That's just "could be made to work".  And probably it could be.
> > Has anyone done it?  I once ported a Scheme implementation of
> > extend-syntax (written by R. Kent Dybvig), but it didn't include
> > the hygienic part.

Note that the above means that if I merely wanted the rule
language (without hygiene), I already had it.

> > If anyone has done Scheme-style macros for CL, I would like
> > to use them.
> 
> You're in luck - try out Dorai Sitaram's syntax-rules for Common Lisp:
> http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

As I mentioned in another thread:

  According to that page:

    This Common Lisp implementation does _not_ provide hygiene.

> I've read many messages in the RnRS author's list, and older messages on
> comp.lang.scheme.  I've never even seen a hint at the idea that Scheme's
> Lisp-1-ness was a driving issue behind the need for hygienic macros.  The
> only place that comes up consistently is in comp.lang.lisp.

"Driving issue"?  You keep exaggerating the claim.  I'm not saying
the greater danger of unintended capture was the driving issue or
that it, by itself, made it "necessary to pursue hygienic macros".

> > > > > The degree to which a Lisp-1 increases the chance of accidental
> > > > > captures has no bearing on the desire for a more sophisticated
> > > > > macro system.
> >
> > That's definitely false in the case of Common Lisp, which you have now
> > brought in.  That CL is not a Lisp-1 is one of the reasons why
> > traditional macros are more acceptable for Common Lisp
> 
> This is just the inverse of the same myth.  There seem to be all sorts of
> reasons that the CL community is uninterested in other macro systems, and
> one of those reasons certainly appears to be the *belief* that CL doesn't
> need a hygienic system because it's a Lisp-1.  However, that belief has no
> technical basis, and is based on a misunderstanding of the goals of hygienic
> macro systems.

Presumably you mean "the *belief* that CL doesn't need a hygienic
system because it's a Lisp-2".

Anyway, that isn't the belief, and it isn't what I said.  I said
"one of the reasons".  The belief in the CL Community is that
traditional macros work well enough, partly because of packages,
partly because CL is a Lisp-2, and partly because of various
macro-writing techniques.

Now, what "misunderstanding of the goals of hygienic
macro systems" do you have in mind?

> > and the use of traditional macros in CL is one reason
> > why it would be difficult to make CL a Lisp-1.
> 
> It would be difficult to make CL a Lisp-1 because existing code is already
> written to rely on being a Lisp-2.  This has nothing to do with which macro
> system it uses.

The existing code doesn't include macro definitions?

> Again, the use of traditional macros in Scheme is evidence of this.

The Schemers were never happy enough with traditional macros to
make them standard, and when Common Lispers considered what it would
be like writing macros in a Lisp-1, it became *a* reason not to make
Common Lisp a Lisp-1.

> > That's the sort of thing people say in papers.
> 
> No, there's more to it than that.  There's a basic semantic issue here, a
> significant problem that was being solved.  The problem was not "the greater
> chance of accidental capture", but rather the problem of how to design macro
> systems that had an understanding of their target language, so that macros
> could be written safely without requiring programmers to manually manage the
> interaction with the program context in which a macro is expanded.

Next you'll be telling me that hygienic macro systems had nothing
to do with hygiene or referential transparency or unintentional
name capture.  But they did; that's what they were about.

You seem to think that those things can be relegated to unimportance
just by focusing on a different way of saying it: preserving lexical
scoping.

> Perhaps quoting another "paper" won't help here, but in R5RS it says: "All
> macros defined using the pattern language are 'hygienic' and 'referentially
> transparent' and thus preserve Scheme's lexical scoping".  The important bit
> in that sentence is "thus preserve Scheme's lexical scoping".  That is the
> goal of these systems, and Lisp-2 in no way helps achieve that goal.

It reduces the number of cases in which you need special magic
to avoid an unintentional capture aka violation of lexical
scoping.
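For concreteness, here is the kind of case Lisp-2 removes (a sketch in
Common Lisp; `double-all' is a hypothetical macro):

```lisp
(defmacro double-all (xs)
  `(mapcar (lambda (x) (* 2 x)) ,xs))

;; Lisp-2: the expansion's free reference to MAPCAR lives in the
;; function namespace, so a caller's variable of the same name is
;; harmless.
(let ((mapcar '(1 2 3)))
  (double-all mapcar))          ; => (2 4 6)
```

In a Lisp-1 with defmacro, the analogous expansion tries to apply the
caller's list as if it were a function, and gensym is no help: the
captured name is one the caller wrote, not one the macro introduced.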

> > > It was the very existence of the semantic problems with defmacro
> > > that were being addressed; a partial solution was not considered
> > > good enough.
> >
> > Sure, some people did think that way.  And people motivated to
> > devise hygienic macro systems tend to be among them.  But they
> > were not the only people involved.
> 
> So, who would be examples of people who were thinking otherwise?  I'm
> genuinely curious, because I'm interested in the history.

How about all the people who were happy to implement and use
traditional macros in Scheme?

> > > The fact that defmacro is still widely used in Scheme negates your
> > > conclusion from a practical perspective.
> >
> > If anything, it shows that ordinary users and implementors have
> > a different view than the Scheme designers you've talked about,
> > and that hygienic macros haven't been made easy enough to use
> > (and perhaps also not easy enough to put into implementations).
> 
> That's a different discussion.

It connects to the previous point.

You seem to want it both ways: Schemers were maximalist about
defmacro's problems ("a partial solution was not considered
good enough"), yet have continued to happily use defmacro
(showing, supposedly, that it works well in Scheme despite
Scheme being a Lisp-1).

> > What might negate my conclusion from a practical perspective
> > is evidence that Scheme programmers are just as happy with
> > defmacro as Common Lisp programmers;
> 
> I think there is such evidence.  However, there's a fairly strong bias
> in the Scheme community away from defmacro, for the kinds of reasons I've
> covered, plus the fact that R5RS standardizes syntax-rules.  One Scheme
> implementor I've spoken with prefers defmacro but has slowly been moving
> towards the hygienic systems because, essentially, of peer and user
> pressure.

> > but that would also be
> > further evidence of their differences from the designers.

> Again, a different discussion.

No, because you were using papers about macro systems as evidence
for what the Scheme community as a whole thought.  Yet there are
clearly people in the Scheme community who think differently from
the way those papers suggest.

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <AbCec.5349$zj3.485@newsread3.news.atl.earthlink.net>
There seems to have been some confusion in this part of the thread, which
I'll try to rectify below.

Jeff Dalton wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
> > I can't deny that there may have been people who thought that
> > it was necessary to pursue hygienic macros because of a
> > "greater danger of unintended captures in Scheme".
>
> That isn't what I said.

Actually, it is what you said, first indirectly and then possibly directly,
depending on interpretation.  I responded to "mikel", who wrote:

> I can see why you think that way, but in practice about
> the only real difference it makes is that coming up with
> a good macro system for Scheme has been harder
> (because of the much greater likelihood of accidental
> variable capture in a single namespace).

I responded:
> This is a myth.

You responded:
> No it isn't.

By contradicting me here, you implicitly agreed with mikel, who drew an
explicit, strong connection between the "greater likelihood of accidental
variable capture in a single namespace", and "coming up with a good macro
system for Scheme has been harder".  By your latest post, it seems as though
you're not actually agreeing with mikel, in which case it seems we're in
agreement that the connection mikel drew isn't accurate.

What you wrote to back up your statement "No it isn't" was:

> The scheme designers had a very hard time coming up with a
> macro system good enough to put in the language, and one of
> the reasons ordinary Lisp defmacro-style macros were not as
> acceptable for Scheme as they were for other Lisps was the
> greater danger of unintended captures in Scheme.

I interpreted this paragraph in the context of your implicit agreement with
mikel.  Since you continued from the claim about the "hard time coming up
with a macro system good enough to put in the language" with a comma and the
word "and", I assumed you were drawing a connection there.  It now sounds as
though you consider those to be weakly related claims, since you wrote:

> "Driving issue"?  You keep exaggerating the claim.  I'm not saying
> the greater danger of unintended capture was the driving issue or
> that it, by itself, made it "necessary to pursue hygienic macros".

What you wrote here is almost identical to mikel's claim, and I haven't been
exaggerating that at all.  You objected to my calling mikel's claim a myth,
and that's been the basis for this part of the exchange.  Since it now seems
you're "not saying" what mikel said, which I said was a myth, we actually
may not have enough disagreement on this point to be worth discussing.
Well, I can dream...

> > > > > and one of the reasons
> > > > > ordinary Lisp defmacro-style macros were not as acceptable for
> > > > > Scheme as they were for other Lisps was the greater danger of
> > > > > unintended captures in Scheme.
> > > >
> > > > This part is a myth,
> > >
> > > How do you get to speak for the Scheme community, which includes,
> > > among other people, me?
> >
> > Communities can believe in myths, that doesn't make them any less
> > mythical.
>
> That's not an answer to my question.

To answer your question, I wasn't really speaking for the Scheme community,
and didn't say that I was.  I was speaking about both the Scheme
standardization process and the apparent motivations of macro system
designers.  In both of those cases, the reason that ordinary Lisp
defmacro-style macros were not as acceptable for Scheme as they were for
other Lisps was because Scheme has a tradition of wanting better solutions
than the kind defmacro offers.  I don't think it's possible to make the
case that, had the Lisp-1 issue not existed, defmacro-style macros would
have been acceptable, or even more acceptable, either for the Scheme
standard, or as the only macro solution in Scheme.

> Next you'll be telling me that hygienic macro systems had nothing
> to do with hygiene or referential transparency or unintentional
> name capture.  But they did; that's what they were about.
>
> You seem to think that those things can be relegated to unimportance
> just by focusing on a different way of saying it: preserving lexical
> scoping.

It's not just a different way of saying it: correct interaction with lexical
scoping is the goal, for which hygiene is an implementation issue, and not
the only issue, either.  I'm not relegating hygiene to unimportance, or
saying that "hygienic macro systems had nothing to do with hygiene or
referential transparency or unintentional name capture."  I'm saying that
the hygiene issue exists in a broader context, and focusing on "hygiene" as
the goal of these systems runs the risk of missing their point.

The lambda calculus parallel I drew in another post is relevant: hygiene is
a condition which needs to hold for the system to function properly; it is
achieved via alpha renaming, in the service of referential transparency
for lexically scoped functions.  Perhaps you want to argue that
"hygiene" is a shortcut term which means all that, but judging by posts I
see here about hygiene, I don't think that's a widespread understanding.

> You seem to want it both ways: Schemers were maximalist about
> defmacro's problems ("a partial solution was not considered
> good enough"), yet have continued to happily use defmacro
> (showing, supposedly, that it works well in Scheme despite
> Scheme being a Lisp-1).

There's no contradiction here, but I may not have made the reasons clear
enough.  Schemers were maximalist about defmacro's problems in the sense
that defmacro wasn't good enough as a final solution for macros in Scheme,
the way it seems to be with CL.  OTOH, there've been various reasons to
continue using defmacro: macro systems were a moving target, both in the
standards and in implementations; the standardized systems weren't the most
powerful systems, etc.  Today, not all Scheme implementations offer
syntax-case as standard, so defmacro may still be the best option in some
cases, for something more powerful than syntax-rules.

None of this negates the fact that Scheme's designers and standardizers were
maximalist about defmacro's problems: in fact, this situation is largely
*because* Scheme's designers and standardizers were maximalist about
defmacro's problems, which made macro systems a moving target, slow to
standardize, and meant that the most recent standard macro system was
deliberately a restricted one (which made it easier to standardize).

And yes, continued use of defmacro today, for whatever reason, demonstrates
that defmacro works fine in a Lisp-1.  That's not going to bring defmacro
any closer to being the core macro system for Scheme, in the standard.
There isn't any conflict between those two positions.

> > Again, a different discussion.
>
> No, because you were using papers about macro systems as evidence
> for that the Scheme community as a whole thought.  Yet there are
> clearly people in the Scheme community that think differently from
> the way those papers suggest.

No doubt, but I'm not sure of the relevance of that.  I haven't been talking
about the Scheme community as a whole, I've been talking about what I
understand to be the motivations of the designers and people on the
standards committees, and I've provided a fair amount of supporting evidence
for that.

The continued use of defmacro is evidence that the Scheme community finds it
useful, but its continued exclusion from the standard indicates that it's
not considered good enough to be part of the core definition of Scheme.
Afaict, the community certainly agrees with that.  Actually, any member of
the Scheme community is free to submit a SRFI for defmacro, which I suspect
would have been done already were it not for the fact that defmacro
already has a pretty good de facto standard, based on its Lisp ancestry.

Anton
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4brlwtwkq.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> There seems to have been some confusion in this part of the thread, which
> I'll try to rectify below.
> 
> Jeff Dalton wrote:
> 
> > "Anton van Straaten" <·····@appsolutions.com> writes:
> > > I can't deny that there may have been people who thought that
> > > it was necessary to pursue hygienic macros because of a
> > > "greater danger of unintended captures in Scheme".
> >
> > That isn't what I said.
> 
> Actually, it is what you said

No, it isn't.  Why would I say something I don't think is true?

Besides, it's *obviously* not true.  How could a greater
danger make it *necessary* to pursue hygenic macros?  Scheme
programmers could have decided to live with the problem instead,
as they did when using defmacro before hygienic macros came along.

You want to clear up confusion?  There's a clarification.

> By contradicting me here, you implicitly agreed with mikel, who drew an
> explicit, strong connection between the "greater likelihood of accidental
> variable capture in a single namespace", and "coming up with a good macro
> system for Scheme has been harder".  By your latest post, it seems as though
> you're not actually agreeing with mikel, in which case it seems we're in
> agreement that the connection mikel drew isn't accurate.

So which is supposedly the myth?  That "it was necessary to pursue
hygienic macros because of a `greater danger of unintended captures in
Scheme'"?  Or that the "greater likelihood" made "coming up with a
good macro system for Scheme" harder?  (They're not the same.)

> What you wrote to back up your statement "No it isn't" was:
> ...
> I interpreted this paragraph in the context of your implicit
> agreement with mikel.

And since then a lot has been said which should have clarified
what I meant.

> > "Driving issue"?  You keep exaggerating the claim.  I'm not saying
> > the greater danger of unintended capture was the driving issue or
> > that it, by itself, made it "necessary to pursue hygienic macros".
> 
> What you wrote here is almost identical to mikel's claim, and I
> haven't been exaggerating that at all.

If he said anything about necessity, it wasn't in the part I
quoted, and it wasn't anything I defended.  I didn't understand
him as saying it was the driving issue either, and I haven't
said it was the driving issue.

Here endeth the meta-discussion.

(At least for now.)

-- jd
From: Anton van Straaten
Subject: Re: scheme seems neater
Date: 
Message-ID: <xnWec.6294$zj3.1346@newsread3.news.atl.earthlink.net>
Jeff Dalton wrote:

> "Anton van Straaten" <·····@appsolutions.com> writes:
> > By contradicting me here, you implicitly agreed with mikel, who drew an
> > explicit, strong connection between the "greater likelihood of
> > accidental variable capture in a single namespace", and "coming up
> > with a good macro system for Scheme has been harder".  By your latest
> > post, it seems as though you're not actually agreeing with mikel, in
> > which case it seems we're in agreement that the connection mikel drew
> > isn't accurate.
>
> So which is supposedly the myth?  That "it was necessary to pursue
> hygienic macros because of a `greater danger of unintended captures in
> Scheme'"?  Or that the "greater likelihood" made "coming up with a
> good macro system for Scheme" harder?  (They're not the same.)

You're right, they're not the same.  The point that I stated was a myth is
the latter: the "greater likelihood" of capture made "coming up with a good
macro system for Scheme" harder.  You objected to my calling this a myth.
It now sounds to me as though you no longer believe that, although you seem
unwilling to acknowledge that directly.

It seems to me that much of the rest of the thread involved a
misunderstanding based on this.  You wrote:

> And since then a lot has been said which should have clarified
> what I meant.

But at no point did you retract your implicit agreement with mikel's claim.
I had to interpret your statements through that lens, and may have
misinterpreted some of them because of that.

> > What you wrote here is almost identical to mikel's claim, and I
> > haven't been exaggerating that at all.
>
> If he said anything about necessity, it wasn't in the part I
> quoted, and it wasn't anything I defended.  I didn't understand
> him as saying it was the driving issue either, and I haven't
> said it was the driving issue.

Forget necessity.  We had moved on in the thread, and perhaps I used too
strong a word.  You defended his statement, and you no longer seem willing
to do that - or do you still think his original statement was valid?

Anton
From: Jeff Dalton
Subject: Finally, a clear statement of the "myth", was Re: scheme seems neater
Date: 
Message-ID: <fx48ygyzc14.fsf_-_@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> Jeff Dalton wrote:

> > So which is supposedly the myth?  That "it was necessary to pursue
> > hygienic macros because of a `greater danger of unintended captures in
> > Scheme'"?  Or that the "greater likelihood" made "coming up with a
> > good macro system for Scheme" harder?  (They're not the same.)

> You're right, they're not the same.  The point that I stated was a myth is
> the latter: the "greater likelihood" of capture made "coming up with a good
> macro system for Scheme" harder.

At least that has finally been clarified.  At many points,
you seemed to be treating various stronger statements as
the myth.

I tried to point out that those stronger statements were not
(so far as I knew) what anyone was defending.

So here is the supposed myth:

  the "greater likelihood" of capture made "coming up with a good
  macro system for Scheme" harder.

Note that it doesn't even contain anything about Lisp-1, though
the original did, so presumably it's really still included.

In any case, the single namespace is one thing that creates more
opportunities for accidental capture, but not the only thing.

Anyway, against my better judgment, I'll go back over some things in
some detail, in the hope of clearing them up.  I hope we don't
then have to repeat the whole previous discussion.

1. Is there a greater likelihood?

Yes.

2. Does it make much difference in practice?

No one really knows.  No one's done a proper study.  Perhaps
Scheme programmers were choosing different names for variables
and functions already, because of the single namespace, in
effect creating ad hoc separate namespaces.  Just for example.
There are many possibilities.

But the connection needn't be direct.  For example, the single
namespace encourages styles of programming that use more local
functions.  FLET and LABELS made the capture problem more
serious in Common Lisp (making defmacro less good than it would
otherwise have been).
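A sketch of the FLET problem, with hypothetical names (EMIT and
REPORT are made up for the example):

```lisp
;; The expansion of REPORT calls the global function EMIT.
(defun emit (s) (format t "~a~%" s))

(defmacro report (x)
  `(emit (format nil "value: ~a" ,x)))

;; A local FLET silently captures the EMIT that the expansion
;; meant to call:
(flet ((emit (s) (declare (ignore s)) (error "captured!")))
  (report 42))   ; signals an error instead of printing
```

With more local functions in play, there are simply more names for a
macro expansion to collide with.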

Scheme doesn't have any standard way to manage large numbers
of names.  (No packages, no modules, ...)  And so unintended
redefinitions are a greater problem.  Defmacro accentuates
the problem by providing yet more ways for things to go wrong.

3. Was it harder to come up with a good macro system for Scheme?

Yes.  It took someone's PhD, and then a lot more work.
None of that would have been necessary if traditional Lisp
macros (aka "defmacro") had been a good system for Scheme.

4. Is there a causal link from (1) to (3)?

Yes.  One of the reasons defmacro wasn't a good macro system
for Scheme was the greater likelihood of unintended capture.
That blocked the easier, defmacro, path to a good macro system.

Note that I am not saying greater likelihood was *the* reason it was
harder to develop a good macro system for Scheme, or that it made it
*necessary* to develop hygienic macros, or any of the other stronger
claims that have appeared here and there in the discussion.

You have argued that defmacro can't have significant
problems in Scheme, because people use it.  But that doesn't
follow.  People use many things that have significant problems.
C++ for example.  :)

You have also argued that the designers of macro systems
didn't care about greater likelihood, at least in their papers
about macro systems.  But much of the rationale for "safe"
macros was about name capture, referential transparency,
hygiene, etc -- different ways to talk about the same cluster
of issues.  None of that provides much of a reason unless
name capture was a problem.

The same happens with ordinary lexical scoping.  Why bother?
Cue examples of unintended capture, MAPCAR and FUNCTION,
and so on.  If it hadn't been a problem, the "FUNARG device"
wouldn't have been invented.

Anything that makes the problem worse increases the pressure
to find a solution.  In Common Lisp, the pressure was too low;
in Scheme, it was higher.  There are a number of reasons for
the difference, and the greater likelihood of unintentional
capture was one of them.  

Note that "harder because" doesn't mean harder *only* because.

It doesn't even have to be the most important reason.

The original claim, as I understand it, was that the Lisp-1 /
Lisp-2 difference didn't make much difference in practice,
with "about the only real difference" being that coming up
with a macro system was harder because of the greater likelihood
of accidental capture.

Nothing about greater likelihood being the main reason it
was harder, or that the greater likelihood loomed large in
the minds of Scheme programmers or macro system designers
or anything of the sort.

-- jd
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx43c78tt3l.fsf@tarn.inf.ed.ac.uk>
"Anton van Straaten" <·····@appsolutions.com> writes:

> > Next you'll be telling me that hygienic macro systems had nothing
> > to do with hygiene or referential transparency or unintentional
> > name capture.  But they did; that's what they were about.
> >
> > You seem to think that those things can be relegated to unimportance
> > just by focusing on a different way of saying it: preserving lexical
> > scoping.
> 
> It's not just a different way of saying it: correct interaction with lexical
> scoping is the goal, for which hygiene is an implementation issue, and not
> the only issue, either.  I'm not relegating hygiene to unimportance, or
> saying that "hygienic macro systems had nothing to do with hygiene or
> referential transparency or unintentional name capture."  I'm saying that
> the hygiene issue exists in a broader context, and focusing on "hygiene" as
> the goal of these systems runs the risk of missing their point.

I think it would help if you said what you mean by "hygiene",
because it seems to me that hygiene *is* their point, and it also
seemed that way from what you said about the lambda calculus.
The only thing that doesn't fit is when you say it's an
"implementation issue", or something like that.

> The lambda calculus parallel I drew in another post is relevant: hygiene is
> a condition which needs to be achieved for proper functioning of the system,
> it's achieved via alpha renaming, in the service of achieving referential
> transparency for lexically scoped functions.  Perhaps you want to argue that
> "hygiene" is a shortcut term which means all that, but judging by posts I
> see here about hygiene, I don't think that's a widespread understanding.

Then what do you think those people mean by "hygiene"?

> > You seem to want it both ways: Schemers were maximalist about
> > defmacro's problems ("a partial solution was not considered
> > good enough"), yet have continued to happily use defmacro
> > (showing, supposedly, that it works well in Scheme despite
> > Scheme being a Lisp-1).
> 
> There's no contradiction here, but I may not have made the reasons clear
> enough.  Schemers were maximalist about defmacro's problems in the sense
> that defmacro wasn't good enough as a final solution for macros in Scheme,
> the way it seems to be with CL.

But you didn't just say they used it.  The wide and continued use
was supposed to show defmacro worked well in a Lisp-1.  That's
quite different from people using it because "macro systems were a
moving target", or because not all Schemes offer syntax-case, or
because syntax-rules is too restrictive.

> None of this negates the fact that Scheme's designers and
> standardizers were maximalist about defmacro's problems:

Some of them were, and some of them weren't.

> And yes, continued use of defmacro today, for whatever reason,
> demonstrates that defmacro works fine in a Lisp-1.

But it doesn't work fine in Scheme.  It has problems that are
solved by the hygienic systems.  And Alan Bawden's 

  "Good grief.  Do you -really- mean 'Common-Lisp style' in the sense
  of 'blind to issues of lexical scoping'?"

sure doesn't suggest that it works fine.

> That's not going to bring defmacro
> any closer to being the core macro system for Scheme, in the standard.
> There isn't any conflict between those two positions.

The conflict is between what seems to be a very negative view
of defmacro among the maximalists and the wide and continued
"works fine" use.

-- jd
From: Sunnan
Subject: Re: scheme seems neater
Date: 
Message-ID: <87ad1p6sbz.fsf@handgranat.org>
··············@hotmail.com (robbie carlton) writes:
> Question is, why?

The reason is historical and political.  The double namespace has
been around since Lisp 1.5, and when Common Lisp came around it was
the only politically acceptable choice, since "Common Lisp was the
result of a compromise between a number of dialects of Lisp", all of
them with this issue.  See http://www.dreamsongs.com/Separation.html
for a summary of the technical issues involved.

-- 
One love,
Sunnan
From: John Thingstad
Subject: Re: scheme seems neater
Date: 
Message-ID: <opr51iljs2xfnb1n@news.chello.no>
On 6 Apr 2004 06:06:51 -0700, robbie carlton <··············@hotmail.com> 
wrote:

> Hi. Probs stupid question from someone relatively new to Lisp. In
> scheme a symbol has only one visible binding at anytime, whereas in CL
> a symbol can have a variable value a function value, a property list,
> a documentation string, and probs some other junk I forgot. Question
> is, why? Doesn't the CL way just promote messy unreadable code. Also,
> the Scheme way means function definitions are much more pretty, and
> consistent, ie assigning a symbol to a function literal. It just seems
> nicer. I understand Paul Grahams arc is going to include some of that
> schemeyness, but probably won't be around for a decade. Am I just
> wrong, or is Scheme just more elegant than CL?

funcall and apply can be used to call a function stored in a
variable, e.g. (funcall func arg)

(setf (symbol-function name) func) sets a symbol's function binding

More of these duplicates: makunbound, fmakunbound

getting the picture?

-- 
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
From: Jeff Dalton
Subject: Re: scheme seems neater
Date: 
Message-ID: <fx4y8p9umby.fsf@todday.inf.ed.ac.uk>
··············@hotmail.com (robbie carlton) writes:

> Hi. Probs stupid question from someone relatively new to Lisp. In
> scheme a symbol has only one visible binding at anytime, whereas in CL
> a symbol can have a variable value a function value, a property list,
> a documentation string, and probs some other junk I forgot.

That is mixing up two different (sets of) issues: ones
relating to symbols as data objects and ones relating to
symbols as identifiers in code.

Scheme, like Common Lisp, has symbols as data objects, and
all kinds of things might be associated with symbols in Scheme
programs.  A Scheme program might, for example, have a table
that maps symbols to property lists.

However, Scheme doesn't have that particular mapping built-in,
while Common Lisp does.

Some Scheme implementations might build in property lists for
symbols; I don't think there's anything in RnRS or the IEEE
standard that forbids it.

When symbols are used as identifiers in code, Scheme has only
one "namespace" for them.  Symbols as identifiers are always
ordinary variables that can have any object as their value.
Among those objects are functions.

Common Lisp also allows variables to have functions as values.

In Common Lisp, however, there is a second namespace in which
symbols as identifiers are mapped specifically to functions.

In Common Lisp, there are also two more namespaces: one for
block names, another for go tags.

(Are there any more than that?  There didn't used to be.)

The two sets of issues meet in the top-level environment.
In Common Lisp, symbols as data objects are used as
variables and function names in the top level env.

That is, you can call a function -- symbol-value -- on a symbol and
get its top-level value as a variable, and you can call another
function -- symbol-function -- to get its top-level value as
a function.  (There aren't any top-level block names
or go tags.)
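Concretely (FROB is a made-up name; the accessors are standard CL):

```lisp
;; One symbol, two top-level cells:
(defvar frob 41)            ; the variable namespace
(defun frob (x) (+ x 1))    ; the function namespace

(symbol-value 'frob)                 ; => 41
(funcall (symbol-function 'frob) 1)  ; => 2
```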

> Question is, why?

Partly history; but also it's fairly natural for names to
have different interpretations in different contexts and
to have different uses.  For instance, it is fairly natural
to use "list" as a variable name without expecting that to
interfere with calling "list" as a function.
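For instance (FIRSTS is a made-up function name):

```lisp
;; LIST as a variable and as a function on the same line:
(defun firsts (list)
  (list (car list)))

(firsts '(1 2 3))   ; => (1)
```

In a Lisp-1, the parameter LIST would shadow the function, and the
body would try to call a list.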

Many programming languages have multiple namespaces
of various sorts.

> Doesn't the CL way just promote messy unreadable code.

No.

> Also, the Scheme way means function definitions are much more
> pretty,

Some are, others aren't.

> and consistent, ie assigning a symbol to a function literal. It just
> seems nicer. I understand Paul Grahams arc is going to include some
> of that schemeyness, but probably won't be around for a decade. Am I
> just wrong, or is Scheme just more elegant than CL?

I think it's fair to say Scheme is more elegant, but elegance
is not the only desirable property.

-- jd
From: Tim Bradshaw
Subject: Re: scheme seems neater
Date: 
Message-ID: <ey3r7v0emu7.fsf@cley.com>
* robbie carlton wrote:
> Am I just
> wrong, or is Scheme just more elegant than CL?

You are exactly right.  BCPL is also more elegant than C. All
languages are more elegant than Perl (this is a lie, but you don't
want to think about the ones that aren't).  Practically any processor
design is more elegant than any x86-family CPU.  Plan 9 is more
elegant than Unix.  Nothing was more elegant than ITS (what's ITS? you
say). Lisp is more elegant than Python. Python is less elegant than
Perl.  English is less elegant than French.  Latin is more elegant
than either, except in its numbers.

--tim